Orlicz Space
In mathematical analysis, and especially in real, harmonic, and functional analysis, an Orlicz space is a type of function space which generalizes the ''L^p'' spaces. Like the ''L^p'' spaces, they are Banach spaces. The spaces are named for Władysław Orlicz, who was the first to define them in 1932. Besides the ''L^p'' spaces, a variety of function spaces arising naturally in analysis are Orlicz spaces. One such space, ''L'' \log^+ ''L'', which arises in the study of Hardy–Littlewood maximal functions, consists of measurable functions ''f'' such that the integral :\int_X |f(x)| \log^+ |f(x)| \,dx < \infty. Here \log^+ is the positive part of the logarithm. Also included in the class of Orlicz spaces are many of the most important ...
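For orientation, here is the general construction that the excerpt above only alludes to (standard, not stated there): an Orlicz space is built from a convex Young function \Phi in place of the power t \mapsto t^p, with the Luxemburg norm
:\|f\|_\Phi = \inf\left\{ k > 0 : \int_X \Phi\!\left(\frac{|f(x)|}{k}\right) \, d\mu \le 1 \right\}.
Taking \Phi(t) = t^p recovers the ''L^p'' norm, while \Phi(t) = t \log^+ t gives the space ''L'' \log^+ ''L'' mentioned above.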


Mathematical Analysis
Analysis is the branch of mathematics dealing with continuous functions, limits, and related theories, such as differentiation, integration, measure, infinite sequences, series, and analytic functions. These theories are usually studied in the context of real and complex numbers and functions. Analysis evolved from calculus, which involves the elementary concepts and techniques of analysis. Analysis may be distinguished from geometry; however, it can be applied to any space of mathematical objects that has a definition of nearness (a topological space) or specific distances between objects (a metric space). History Mathematical analysis formally developed in the 17th century during the Scientific Revolution, but many of its ideas can be traced back to earlier mathematicians. Early results in analysis were i ...



Almost Everywhere
In measure theory (a branch of mathematical analysis), a property holds almost everywhere if, in a technical sense, the set for which the property holds takes up nearly all possibilities. The notion of "almost everywhere" is a companion notion to the concept of measure zero, and is analogous to the notion of ''almost surely'' in probability theory. More specifically, a property holds almost everywhere if it holds for all elements in a set except a subset of measure zero, or equivalently, if the set of elements for which the property holds is conull. In cases where the measure is not complete, it is sufficient that the set be contained within a set of measure zero. When discussing sets of real numbers, the Lebesgue measure is usually assumed unless otherwise stated. The term ''almost everywhere'' is abbreviated ''a.e.''; in older literature ''p.p.'' is used, standing for the equivalent French phrase ''presque partout''. A set with full measure is one whose complement i ...
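As a standard illustration (not part of the excerpt above): with respect to Lebesgue measure on \mathbb{R}, the indicator function of the rationals is zero almost everywhere,
:\mathbf{1}_{\mathbb{Q}}(x) = 0 \quad \text{for a.e. } x \in \mathbb{R},
because the exceptional set \mathbb{Q} is countable and hence has Lebesgue measure zero.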


Moment-generating Function
In probability theory and statistics, the moment-generating function of a real-valued random variable is an alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables. However, not all random variables have moment-generating functions. As its name implies, the moment-generating function can be used to compute a distribution’s moments: the ''n''th moment about 0 is the ''n''th derivative of the moment-generating function, evaluated at 0. In addition to real-valued distributions (univariate distributions), moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases. The moment-generating func ...
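As a concrete worked formula (standard, not taken from the excerpt): for a real-valued random variable ''X'',
:M_X(t) = \operatorname{E}\left[ e^{tX} \right], \qquad \operatorname{E}[X^n] = \left. \frac{d^n}{dt^n} M_X(t) \right|_{t=0};
for example, a standard normal random variable has M_X(t) = e^{t^2/2}, and differentiating twice at t = 0 gives \operatorname{E}[X^2] = 1.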


Sub-Gaussian Distribution
In probability theory, a sub-Gaussian distribution is a probability distribution with strong tail decay. Informally, the tails of a sub-Gaussian distribution are dominated by (i.e. decay at least as fast as) the tails of a Gaussian. This property gives sub-Gaussian distributions their name. Formally, the probability distribution of a random variable ''X'' is called sub-Gaussian if there is a positive constant ''C'' such that for every t \geq 0, : \operatorname{P}(|X| \geq t) \leq 2 \exp(-t^2/C^2). Sub-Gaussian properties Let ''X'' be a random variable. The following conditions are equivalent: # \operatorname{P}(|X| \geq t) \leq 2 \exp(-t^2/K_1^2) for all t \geq 0, where K_1 is a positive constant; # \operatorname{E}\exp(X^2/K_2^2) \leq 2, where K_2 is a positive constant; # \operatorname{E}|X|^p \leq 2K_3^p \Gamma\left(\frac{p}{2}+1\right) for all p \geq 1, where K_3 is a positive constant. ''Proof''. (1)\implies(3) By the layer cake representation, \begin{align} \operatorname{E}|X|^p &= \int_0^\infty \operatorname{P}(|X ...
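For context (a standard fact, not in the excerpt above): condition (2) says precisely that ''X'' has finite Orlicz norm with respect to the Young function \psi_2(t) = e^{t^2} - 1, namely
:\|X\|_{\psi_2} = \inf\left\{ k > 0 : \operatorname{E} \exp\!\left( X^2 / k^2 \right) \le 2 \right\} < \infty,
which is how sub-Gaussian random variables connect to the Orlicz spaces that are the subject of this page.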



Moment (mathematics)
In mathematics, the moments of a function are certain quantitative measures related to the shape of the function's graph. If the function represents mass density, then the zeroth moment is the total mass, the first moment (normalized by total mass) is the center of mass, and the second moment is the moment of inertia. If the function is a probability distribution, then the first moment is the expected value, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis. The mathematical concept is closely related to the concept of moment in physics. For a distribution of mass or probability on a bounded interval, the collection of all the moments (of all orders, from 0 to \infty) uniquely determines the distribution (Hausdorff moment problem). The same is not true on unbounded intervals (Hamburger moment problem). In the mid-nineteenth century, Pafnuty Chebyshev became the first person to think systematic ...
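As a standard worked formula (the excerpt is cut off before stating it): the ''n''th moment of a real-valued function f(x) of a real variable about a value ''c'' is
:\mu_n = \int_{-\infty}^{\infty} (x - c)^n \, f(x) \, dx,
so for a probability density f, taking c = 0 and n = 1 gives the mean, while taking c equal to the mean and n = 2 gives the variance.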



Homogeneous Function
In mathematics, a homogeneous function is a function of several variables such that, if all its arguments are multiplied by a scalar, then its value is multiplied by some power of this scalar, called the degree of homogeneity, or simply the ''degree''; that is, if ''k'' is an integer, a function of ''n'' variables is homogeneous of degree ''k'' if :f(sx_1,\ldots, sx_n)=s^k f(x_1,\ldots, x_n) for every x_1, \ldots, x_n, and s\ne 0. For example, a homogeneous polynomial of degree ''k'' defines a homogeneous function of degree ''k''. The above definition extends to functions whose domain and codomain are vector spaces over a field ''F'': a function f : V \to W between two ''F''-vector spaces is ''homogeneous'' of degree k if :f(s\mathbf{v}) = s^k f(\mathbf{v}) for all nonzero s \in F and \mathbf{v} \in V. This definition is often further generalized to functions whose domain is not ''V'', but a cone in ''V'', that is, a subset ''C'' of ''V'' such that \mathbf{v}\in C implies s\mathbf{v}\in C for every nonzero scalar ''s''. In the case of functions of several real variables and real vecto ...
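As a minimal check of the definition (standard example, not from the excerpt): f(x, y) = x^2 + y^2 is homogeneous of degree 2, since
:f(sx, sy) = (sx)^2 + (sy)^2 = s^2 (x^2 + y^2) = s^2 f(x, y) for every s \ne 0.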



Random Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set \{H, T\}) to a measurable space, often the real numbers (e.g., \{-1, 1\}, in which 1 corresponds to H and -1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random var ...
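Making the coin-flip example explicit (a standard illustration): a random variable is a measurable map from the sample space to the reals, here
:X : \{H, T\} \to \mathbb{R}, \qquad X(H) = 1, \quad X(T) = -1, \qquad \operatorname{P}(X = 1) = \operatorname{P}(X = -1) = \tfrac{1}{2}
for a fair coin.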


Trudinger Inequality
In mathematical analysis, Trudinger's theorem or the Trudinger inequality (also sometimes called the Moser–Trudinger inequality) is a result of functional analysis on Sobolev spaces. It is named after Neil Trudinger (and Jürgen Moser). It provides an inequality between a certain Sobolev space norm and an Orlicz space norm of a function. The inequality is a limiting case of Sobolev embedding and can be stated as the following theorem: Let \Omega be a bounded domain in \mathbb{R}^n satisfying the cone condition. Let mp=n and p>1. Set : A(t)=\exp\left( t^{n/(n-m)} \right)-1. Then there exists the embedding : W^{m,p}(\Omega)\hookrightarrow L_A(\Omega) where : L_A(\Omega)=\left\{ u \text{ measurable on } \Omega : \int_\Omega A\!\left( \frac{|u(x)|}{k} \right) dx \le 1 \text{ for some } k>0 \right\}. The space L_A(\Omega) is an example of an Orlicz space.
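For context (a standard remark, not part of the excerpt): the exponent n/(n-m) is what survives when the usual Sobolev exponent degenerates. In the most common case m = 1 and p = n, the embedding W^{1,n}(\Omega) \hookrightarrow L^q(\Omega) holds for every finite q but fails for q = \infty, and Trudinger's theorem upgrades this to exponential integrability with
:A(t) = \exp\left( t^{n/(n-1)} \right) - 1.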




Lipschitz Domain
In mathematics, a Lipschitz domain (or domain with Lipschitz boundary) is a domain in Euclidean space whose boundary is "sufficiently regular" in the sense that it can be thought of as locally being the graph of a Lipschitz continuous function. The term is named after the German mathematician Rudolf Lipschitz. Definition Let n \in \mathbb{N}. Let \Omega be a domain of \mathbb{R}^n and let \partial\Omega denote the boundary of \Omega. Then \Omega is called a Lipschitz domain if for every point p \in \partial\Omega there exists a hyperplane H of dimension n-1 through p, a Lipschitz-continuous function g : H \rightarrow \mathbb{R} over that hyperplane, and reals r > 0 and h > 0 such that * \Omega \cap C = \left\{ x + t\vec{n} : x \in B_r(p) \cap H,\ g(x) < t < h \right\} * (\partial\Omega) \cap C = \left\{ x + t\vec{n} : x \in B_r(p) \cap H,\ t = g(x) \right\} where :\vec{n} is a unit vector that is normal to H, :B_r(p) := \{ x : \|x - p\| < r \} is the open ball of radius r, :C := \left\{ x + t\vec{n} : x \in B_r(p) \cap H,\ -h < t < h \right\}. In other words, at each point of its boundary, \Omega is locally the set of points located above the graph of some Lipschitz function. ...
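As a standard illustration (not part of the excerpt above): the open unit square in \mathbb{R}^2 is a Lipschitz domain, since near a corner its boundary is, in suitable coordinates, the graph of the Lipschitz function
:g(x) = |x|,
whereas a domain with an outward cusp, such as \{(x, y) : 0 < x < 1,\ 0 < y < x^2\}, fails the definition at the origin because no Lipschitz graph can describe the zero-angle cusp there.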



Bounded Set
:''"Bounded" and "boundary" are distinct concepts; for the latter see boundary (topology). A circle in isolation is a boundaryless bounded set, while the half plane is unbounded yet has a boundary. In mathematical analysis and related areas of mathematics, a set is called bounded if it is, in a certain sense, of finite measure. Conversely, a set which is not bounded is called unbounded. The word 'bounded' makes no sense in a general topological space without a corresponding metric Metric or metrical may refer to: * Metric system, an internationally adopted decimal system of measurement * An adjective indicating relation to measurement in general, or a noun describing a specific type of measurement Mathematics In mathem .... A bounded set is not necessarily a closed set and vise versa. For example, a subset ''S'' of a 2-dimensional real space R''2'' constrained by two parabolic curves ''x''2 + 1 and ''x''2 - 1 defined in a Cartesian coordinate system is a closed but is not b ...



Open Set
In mathematics, open sets are a generalization of open intervals in the real line. In a metric space (a set along with a distance defined between any two points), open sets are the sets that, with every point ''P'', contain all points that are sufficiently near to ''P'' (that is, all points whose distance to ''P'' is less than some value depending on ''P''). More generally, one defines open sets as the members of a given collection of subsets of a given set, a collection that has the property of containing every union of its members, every finite intersection of its members, the empty set, and the whole set itself. A set in which such a collection is given is called a topological space, and the collection is called a topology. These conditions are very loose, and allow enormous flexibility in the choice of open sets. For example, ''every'' subset can be open (the discrete topology), or no set can be open except the space itself and the empty set (the indiscrete topology). In practice, however, ...
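As the canonical example (standard, supplied here for concreteness): in a metric space (M, d) the open ball
:B(x, \varepsilon) = \{ y \in M : d(x, y) < \varepsilon \}
is an open set, and a subset of M is open exactly when it contains such a ball around each of its points.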



Norm (mathematics)
In mathematics, a norm is a function from a real or complex vector space to the non-negative real numbers that behaves in certain ways like the distance from the origin: it commutes with scaling, obeys a form of the triangle inequality, and is zero only at the origin. In particular, the Euclidean distance of a vector from the origin is a norm, called the Euclidean norm, or 2-norm, which may also be defined as the square root of the inner product of a vector with itself. A seminorm satisfies the first two properties of a norm, but may be zero for vectors other than the origin. A vector space with a specified norm is called a normed vector space. In a similar manner, a vector space with a seminorm is called a ''seminormed vector space''. The term pseudonorm has been used for several related meanings. It may be a synonym of "seminorm". A pseudonorm may satisfy the same axioms as a norm, with the equality replaced by an inequality "\leq" in the homogeneity axiom. It can also re ...
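Spelling out the properties named above (standard axioms, added for reference): a norm \|\cdot\| on a real or complex vector space satisfies
:\|x\| = 0 \iff x = 0, \qquad \|\alpha x\| = |\alpha| \, \|x\|, \qquad \|x + y\| \le \|x\| + \|y\|,
and the Euclidean norm of x = (x_1, \ldots, x_n) is \|x\|_2 = \sqrt{x_1^2 + \cdots + x_n^2} = \sqrt{\langle x, x \rangle}.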