Quasi-Monte Carlo Method
In numerical analysis, the quasi-Monte Carlo method is a method for numerical integration and solving some other problems using low-discrepancy sequences (also called quasi-random sequences or sub-random sequences). This is in contrast to the regular Monte Carlo method or Monte Carlo integration, which are based on sequences of pseudorandom numbers. Monte Carlo and quasi-Monte Carlo methods are stated in a similar way. The problem is to approximate the integral of a function f as the average of the function evaluated at a set of points x_1, ..., x_N:

\int_{[0,1]^s} f(u)\,du \approx \frac{1}{N}\sum_{i=1}^{N} f(x_i).

Since we are integrating over the s-dimensional unit cube, each x_i is a vector of s elements. The difference between quasi-Monte Carlo and Monte Carlo is the way the x_i are chosen. Quasi-Monte Carlo uses a low-discrepancy sequence such as the Halton sequence, the Sobol sequence, or the Faure sequence, whereas Monte Carlo uses a pseudorandom sequenc ...
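As a minimal sketch of the difference (assuming NumPy and SciPy's scipy.stats.qmc module are available; the integrand, dimension and sample size below are illustrative), both estimators average f over N points in the unit cube, and only the choice of points differs:

```python
# Sketch: estimating an s-dimensional integral with plain Monte Carlo versus
# quasi-Monte Carlo (scrambled Sobol' points). Assumes NumPy and SciPy >= 1.7.
import numpy as np
from scipy.stats import qmc

s = 5          # dimension of the unit cube (illustrative)
N = 2**12      # number of sample points (power of 2 suits Sobol')

def f(x):
    # Product of 3*x_i^2 over the s coordinates; its exact integral over [0,1]^s is 1.
    return np.prod(3.0 * x**2, axis=1)

# Monte Carlo: pseudorandom points in [0,1]^s
rng = np.random.default_rng(0)
mc_estimate = f(rng.random((N, s))).mean()

# Quasi-Monte Carlo: scrambled Sobol' low-discrepancy points in [0,1]^s
x_qmc = qmc.Sobol(d=s, scramble=True, seed=0).random(N)
qmc_estimate = f(x_qmc).mean()

print("exact value :", 1.0)
print("Monte Carlo :", mc_estimate)
print("quasi-MC    :", qmc_estimate)
```

With a smooth integrand like this one, the quasi-Monte Carlo estimate typically lands closer to the exact value than the pseudorandom estimate for the same N.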


Pseudorandom
A pseudorandom sequence of numbers is one that appears to be statistically random, despite having been produced by a completely deterministic and repeatable process. Background The generation of random numbers has many uses, such as for random sampling, Monte Carlo methods, board games, or gambling. In physics, however, most processes, such as gravitational acceleration, are deterministic, meaning that they always produce the same outcome from the same starting point. Some notable exceptions are radioactive decay and quantum measurement, which are both modeled as being truly random processes in the underlying physics. Since these processes are not practical sources of random numbers, people use pseudorandom numbers, which ideally have the unpredictability of a truly random sequence, despite being generated by a deterministic process. In many applications, the deterministic process is a computer algorithm called a pseudorandom number generator, which must first be prov ...
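As a toy illustration (the Park-Miller linear congruential generator is used here only because its constants are classical and compact; this is not a recommendation for practical use), a short deterministic recurrence already produces output that looks random yet is exactly repeatable:

```python
# Toy pseudorandom number generator: a linear congruential generator (LCG)
# with the classic Park-Miller "minimal standard" parameters. Fully
# deterministic and repeatable, yet the output looks statistically random.
# Not suitable for cryptography or serious simulation work.

M = 2**31 - 1      # modulus (a Mersenne prime)
A = 16807          # multiplier

def lcg(seed, n):
    """Return n pseudorandom floats in (0, 1) from a deterministic recurrence."""
    state = seed
    out = []
    for _ in range(n):
        state = (A * state) % M
        out.append(state / M)
    return out

# Same seed -> exactly the same "random-looking" sequence every time.
print(lcg(12345, 5))
print(lcg(12345, 5))
```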


Curse Of Dimensionality
The curse of dimensionality refers to various phenomena that arise when analyzing and organizing data in high-dimensional spaces that do not occur in low-dimensional settings such as the three-dimensional physical space of everyday experience. The expression was coined by Richard E. Bellman when considering problems in dynamic programming. Dimensionally cursed phenomena occur in domains such as numerical analysis, sampling, combinatorics, machine learning, data mining and databases. The common theme of these problems is that when the dimensionality increases, the volume of the space increases so fast that the available data become sparse. In order to obtain a reliable result, the amount of data needed often grows exponentially with the dimensionality. Also, organizing and searching data often relies on detecting areas where objects form groups with similar properties; in high dimensional data, however, all objects appear to be sparse and dissimilar in many ways, which prevents ...
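A small numerical sketch of this sparsity effect (the grid spacing and sample sizes are arbitrary illustrations, and the nearest-neighbour computation assumes NumPy):

```python
# Sketch: why the amount of data needed grows exponentially with dimension.
# To place sample points at most 0.1 apart along each axis of the unit cube
# [0,1]^d, a regular grid needs 10 points per axis, i.e. 10**d points in total.
for d in (1, 2, 3, 5, 10, 20):
    print(f"d = {d:2d}: grid points needed = 10**{d} = {10**d:,}")

# Related effect: with a fixed budget of N random points, the typical distance
# to the nearest neighbour grows with d, so every point looks isolated.
import numpy as np
rng = np.random.default_rng(0)
N = 200
for d in (2, 10, 50):
    x = rng.random((N, d))
    diff = x[:, None, :] - x[None, :, :]          # brute-force pairwise differences
    dist = np.sqrt((diff**2).sum(-1))
    np.fill_diagonal(dist, np.inf)                # ignore distance to self
    print(f"d = {d:2d}: mean nearest-neighbour distance = {dist.min(axis=1).mean():.3f}")
```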


Harald Niederreiter
Harald G. Niederreiter (born June 7, 1944) is an Austrian mathematician known for his work in discrepancy theory, algebraic geometry, quasi-Monte Carlo methods, and cryptography. Education and career Niederreiter was born on June 7, 1944, in Vienna, and grew up in Salzburg. He began studying mathematics at the University of Vienna in 1963, and finished his doctorate there in 1969, with a thesis on discrepancy in compact abelian groups supervised by Edmund Hlawka. He began his academic career as an assistant professor at the University of Vienna, but soon moved to Southern Illinois University. During this period he also visited the University of Illinois at Urbana-Champaign, Institute for Advanced Study, and University of California, Los Angeles. In 1978 he moved again, becoming the head of a new mathematics department at the University of the West Indies in Jamaica. In 1981 he returned to Austria for a post at the Austrian Academy of Sciences, where from 1989 to 2000 h ...


Markov Chain Monte Carlo
In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution. Various algorithms exist for constructing chains, including the Metropolis–Hastings algorithm. Application domains MCMC methods are primarily used for calculating numerical approximations of multi-dimensional integrals, for example in Bayesian statistics, computational physics, computational biology and computational linguistics. In Bayesian statistics, the recent development of MCMC methods has made it possible to compute large hierarchical models that require integrations over hundreds to thousands of unknown parameters. In rare e ...
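A minimal random-walk Metropolis-Hastings sampler, written here as a sketch with a standard normal target (the step size, chain length and burn-in below are illustrative choices, not tuned recommendations):

```python
# Minimal random-walk Metropolis-Hastings sampler (a sketch, not production
# code): draw samples whose long-run distribution is a standard normal.
import math
import random

random.seed(0)

def target_density(x):
    # Unnormalised standard normal density; MCMC only needs density ratios.
    return math.exp(-0.5 * x * x)

def metropolis(n_samples, step=1.0, x0=0.0):
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)       # symmetric random-walk proposal
        accept_prob = min(1.0, target_density(proposal) / target_density(x))
        if random.random() < accept_prob:
            x = proposal                             # accept the move
        samples.append(x)                            # otherwise keep the current state
    return samples

chain = metropolis(50_000)
burned = chain[5_000:]                               # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(f"sample mean = {mean:.3f} (target 0), sample variance = {var:.3f} (target 1)")
```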


Discrepancy Theory
In mathematics, discrepancy theory describes the deviation of a situation from the state one would like it to be in. It is also called the theory of irregularities of distribution. This refers to the theme of classical discrepancy theory, namely distributing points in some space such that they are evenly distributed with respect to some (mostly geometrically defined) subsets. The discrepancy (irregularity) measures how far a given distribution deviates from an ideal one. Discrepancy theory can be described as the study of inevitable irregularities of distributions, in measure-theoretic and combinatorial settings. Just as Ramsey theory elucidates the impossibility of total disorder, discrepancy theory studies the deviations from total uniformity. A significant event in the history of discrepancy theory was the 1916 paper of Weyl on the uniform distribution of sequences in the unit interval. Theorems Discrepancy theory is based on the following classic theorems: * ...
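As a concrete sketch of what "discrepancy" measures in the simplest setting (the closed form below is the standard one-dimensional star discrepancy; the van der Corput construction and the sample size are illustrative):

```python
# Sketch: the star discrepancy of a finite point set in [0, 1), via the standard
# one-dimensional closed form, compared for a van der Corput (low-discrepancy)
# sequence versus pseudorandom points.
import random

def star_discrepancy_1d(points):
    """D*_N for points in [0,1): max over sorted points of max(i/N - x_(i), x_(i) - (i-1)/N)."""
    xs = sorted(points)
    n = len(xs)
    return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

def van_der_corput(n, base=2):
    """First n terms of the base-b van der Corput sequence (digit reversal)."""
    seq = []
    for k in range(1, n + 1):
        x, denom = 0.0, 1.0
        while k:
            k, digit = divmod(k, base)
            denom *= base
            x += digit / denom
        seq.append(x)
    return seq

random.seed(0)
N = 1024
print("van der Corput:", star_discrepancy_1d(van_der_corput(N)))
print("pseudorandom  :", star_discrepancy_1d([random.random() for _ in range(N)]))
```

The evenly spread van der Corput points achieve a much smaller star discrepancy than pseudorandom points of the same length, which is the sense in which their irregularity is lower.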


Low-discrepancy Sequences
In mathematics, a low-discrepancy sequence is a sequence with the property that for all values of N, its subsequence x_1, ..., x_N has a low discrepancy. Roughly speaking, the discrepancy of a sequence is low if the proportion of points in the sequence falling into an arbitrary set B is close to proportional to the measure of B, as would happen on average (but not for particular samples) in the case of an equidistributed sequence. Specific definitions of discrepancy differ regarding the choice of B (hyperspheres, hypercubes, etc.) and how the discrepancy for every B is computed (usually normalized) and combined (usually by taking the worst value). Low-discrepancy sequences are also called quasirandom sequences, due to their common use as a replacement of uniformly distributed random numbers. The "quasi" modifier is used to denote more clearly that the values of a low-discrepancy sequence are neither random nor pseudorandom, but such sequences share so ...
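A short comparison sketch, assuming SciPy >= 1.7 for its scipy.stats.qmc generators and discrepancy measure (the dimension, sample size and choice of the centered-L2 criterion are illustrative):

```python
# Sketch: how evenly different 2-D point sets fill the unit square, measured by
# SciPy's (centered L2) discrepancy for Halton, Sobol' and pseudorandom points.
import numpy as np
from scipy.stats import qmc

d, N = 2, 512
halton = qmc.Halton(d=d, seed=0).random(N)
sobol = qmc.Sobol(d=d, scramble=True, seed=0).random(N)
rng = np.random.default_rng(0)
pseudo = rng.random((N, d))

# Smaller value = more evenly distributed.
for name, pts in [("Halton", halton), ("Sobol'", sobol), ("pseudorandom", pseudo)]:
    print(f"{name:12s} discrepancy = {qmc.discrepancy(pts):.6f}")
```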


Computational Physics
Computational physics is the study and implementation of numerical analysis to solve problems in physics for which a quantitative theory already exists. Historically, computational physics was the first application of modern computers in science, and is now a subset of computational science. It is sometimes regarded as a subdiscipline (or offshoot) of theoretical physics, but others consider it an intermediate branch between theoretical and experimental physics - an area of study which supplements both theory and experiment. Overview In physics, different theories based on mathematical models provide very precise predictions on how systems behave. Unfortunately, it is often the case that solving the mathematical model for a particular system in order to produce a useful prediction is not feasible. This can occur, for instance, when the solution does not have a closed-form expression, or is too complicated. In such cases, numerical approximations are required. Computational phy ...
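As a minimal illustration of falling back on numerical approximation when no closed form is available (the large-angle pendulum is a standard example; the step size and the simple integrator below are illustrative, not a recommendation):

```python
# Sketch: the large-angle pendulum, theta'' = -(g/L) * sin(theta), has no
# elementary closed-form solution, so its motion is obtained by stepping the
# equations numerically (here with a fixed-step semi-implicit Euler scheme).
import math

g, L = 9.81, 1.0           # gravity (m/s^2) and pendulum length (m)
theta, omega = 1.5, 0.0    # initial angle (rad) and angular velocity (rad/s)
dt, t_end = 1e-3, 2.0      # time step and final time (s)

for _ in range(int(t_end / dt)):
    omega += -(g / L) * math.sin(theta) * dt   # update velocity first...
    theta += omega * dt                        # ...then position (semi-implicit Euler)

print(f"theta({t_end:.1f} s) = {theta:.4f} rad")
```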


Biology Monte Carlo Method
Biology Monte Carlo methods (BioMOCA) have been developed at the University of Illinois at Urbana-Champaign to simulate ion transport in an electrolyte environment through ion channels or nano-pores embedded in membranes. It is a 3-D particle-based Monte Carlo simulator for analyzing and studying the ion transport problem in ion channel systems or similar nanopores in wet/biological environments. The simulated system consists of a protein forming an ion channel (or an artificial nanopore such as a carbon nanotube, CNT), with a membrane (i.e. a lipid bilayer) that separates two ion baths on either side. BioMOCA is based on two methodologies, namely Boltzmann transport Monte Carlo (BTMC), which builds on the Boltzmann equation [C. Jacoboni, P. Lugli, The Monte Carlo Method for Semiconductor Device Simulation, Springer Verlag, New York (1989)], and particle-particle-particle-mesh (P3M) [R. Hockney, J. Eastwood, Computer Simulation Using Particles, McGraw-Hill, New York (1981)]. The first one uses Monte Carlo met ...
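The following is only a rough, hypothetical sketch of the free-flight-and-scatter loop that Boltzmann transport Monte Carlo methods use in general; it is not BioMOCA's implementation, and every parameter in it is an illustrative placeholder:

```python
# Rough, hypothetical sketch of a Boltzmann-transport-Monte-Carlo style loop:
# a particle flies ballistically under a force for a random free-flight time,
# then undergoes a random scattering event. NOT BioMOCA code; all values below
# are illustrative placeholders.
import math
import random

random.seed(0)

mass = 1.0          # placeholder particle mass
force = 0.5         # placeholder constant force (e.g. from an applied field)
scatter_rate = 2.0  # placeholder total scattering rate (events per unit time)

x, v, t = 0.0, 0.0, 0.0
for _ in range(1000):
    # Free-flight time drawn from an exponential distribution.
    dt = -math.log(1.0 - random.random()) / scatter_rate
    # Ballistic free flight under the force during dt.
    x += v * dt + 0.5 * (force / mass) * dt**2
    v += (force / mass) * dt
    t += dt
    # Scattering event: crudely re-thermalize the velocity.
    v = random.gauss(0.0, 1.0)

print(f"after t = {t:.2f}: x = {x:.2f} (illustrative drift under the force)")
```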


Quasi-Monte Carlo Methods In Finance
High-dimensional integrals in hundreds or thousands of variables occur commonly in finance. These integrals have to be computed numerically to within a threshold \epsilon. If the integral is of dimension d then in the worst case, where one has a guarantee of error at most \epsilon, the computational complexity is typically of order \epsilon^{-d}. That is, the problem suffers the curse of dimensionality. In 1977 P. Boyle, University of Waterloo, proposed using Monte Carlo (MC) to evaluate options [Boyle, P. (1977), Options: a Monte Carlo approach, J. Financial Economics, 4, 323-338]. Starting in early 1992, J. F. Traub, Columbia University, and a graduate student at the time, S. Paskov, used quasi-Monte Carlo (QMC) to price a collateralized mortgage obligation with parameters specified by Goldman Sachs. Even though it was believed by the world's leading experts that QMC should not be used for high-dimensional integration, Paskov and Traub found that QMC beat MC by one to three orders of m ...
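A hedged sketch of the MC-versus-QMC comparison on a finance-style high-dimensional integral (all market parameters are illustrative, and the example assumes NumPy and SciPy; the d-step path discretization is what makes the integral d-dimensional):

```python
# Sketch: pricing a European call by simulating a d-step geometric Brownian
# motion path, so the price is a d-dimensional integral. Plain Monte Carlo uses
# pseudorandom normals; quasi-Monte Carlo maps scrambled Sobol' points through
# the inverse normal CDF. Black-Scholes gives the exact reference value.
import numpy as np
from scipy.stats import qmc, norm

S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # illustrative market data
d = 64                  # time steps -> dimension of the integral
N = 2**14               # number of simulated paths
dt = T / d

def price_from_normals(z):
    """Discounted mean call payoff, given an (N, d) array of standard normals."""
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    ST = S0 * np.exp(log_paths[:, -1])
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

# Plain Monte Carlo
rng = np.random.default_rng(0)
mc_price = price_from_normals(rng.standard_normal((N, d)))

# Quasi-Monte Carlo: Sobol' points in [0,1)^d mapped to normals via the inverse CDF
u = qmc.Sobol(d=d, scramble=True, seed=0).random(N)
qmc_price = price_from_normals(norm.ppf(u))

# Black-Scholes closed form for reference
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"Black-Scholes: {bs_price:.4f}")
print(f"Monte Carlo  : {mc_price:.4f}")
print(f"quasi-MC     : {qmc_price:.4f}")
```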




Monte Carlo Methods In Finance
Monte Carlo methods are used in corporate finance and mathematical finance to value and analyze (complex) instruments, portfolios and investments by simulating the various sources of uncertainty affecting their value, and then determining the distribution of their value over the range of resultant outcomes. This is usually done with the help of stochastic asset models. The advantage of Monte Carlo methods over other techniques increases as the dimensions (sources of uncertainty) of the problem increase. Monte Carlo methods were first introduced to finance in 1964 by David B. Hertz through his Harvard Business Review article, discussing their application in corporate finance. In 1977, Phelim Boyle pioneered the use of simulation in derivative valuation in his seminal Journal of Financial Economics paper. This article discusses typical financial problems in which Monte Carlo methods are used. It also touches on the use of so-called "quasi-random" methods such as the use of So ...
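A minimal sketch of the "distribution of outcomes" idea under an assumed geometric Brownian motion asset model (all numbers are illustrative; only NumPy is required):

```python
# Sketch: Monte Carlo view of an asset's value in one year under a simple
# geometric-Brownian-motion model. Rather than a single number, the simulation
# yields a whole distribution of possible outcomes.
import numpy as np

rng = np.random.default_rng(42)
S0, mu, sigma, T = 100.0, 0.07, 0.25, 1.0   # spot, drift, volatility, horizon (illustrative)
n_paths = 100_000

z = rng.standard_normal(n_paths)
ST = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

print(f"mean value        : {ST.mean():.2f}")
print(f"5th / 95th pctile : {np.percentile(ST, 5):.2f} / {np.percentile(ST, 95):.2f}")
print(f"prob. of a loss   : {(ST < S0).mean():.1%}")
```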


Randomization
Randomization is the process of making something random. Randomization is not haphazard; instead, a random process is a sequence of random variables describing a process whose outcomes do not follow a deterministic pattern, but follow an evolution described by probability distributions. For example, a random sample of individuals from a population refers to a sample where every individual has a known probability of being sampled. This would be contrasted with nonprobability sampling, where arbitrary individuals are selected. In various contexts, randomization may involve: generating a random permutation of a sequence (such as when shuffling cards); selecting a random sample of a population (important in statistical sampling); allocating experimental units via random assignment to a treatment or control condition; generating random numbers (random number generation); or transforming a data stream (such as when using a scrambler in telecommunications). Applications Rando ...
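A small sketch of these randomization operations using Python's standard random module (the populations and group labels are placeholders):

```python
# Sketch of common randomization operations with Python's standard library.
import random

random.seed(0)

# 1. Random permutation (e.g. shuffling cards)
deck = list(range(1, 11))
random.shuffle(deck)
print("shuffled:", deck)

# 2. Random sample from a "population"
population = [f"person_{i}" for i in range(100)]
print("sample  :", random.sample(population, 5))

# 3. Random assignment of experimental units to treatment or control
units = [f"unit_{i}" for i in range(8)]
assignment = {u: random.choice(["treatment", "control"]) for u in units}
print("groups  :", assignment)

# 4. Random number generation
print("uniform :", [round(random.random(), 3) for _ in range(3)])
```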


CiteSeer
CiteSeerX (formerly called CiteSeer) is a public search engine and digital library for scientific and academic papers, primarily in the fields of computer and information science. CiteSeer's goal is to improve the dissemination of and access to academic and scientific literature. As a non-profit service that can be freely used by anyone, it has been considered part of the open access movement that is attempting to change academic and scientific publishing to allow greater access to scientific literature. CiteSeer freely provided Open Archives Initiative metadata of all indexed documents and linked indexed documents when possible to other sources of metadata such as DBLP and the ACM Portal. To promote open data, CiteSeerX shares its data for non-commercial purposes under a Creative Commons license. CiteSeer is considered a predecessor of academic search tools such as Google Scholar and Microsoft Academic Search. CiteSeer-like engines and archives usually only harvest documents ...