Shearer's Inequality
Shearer's inequality, also known as Shearer's lemma, is an inequality in information theory relating the entropy of a set of random variables to the entropies of a collection of subsets of those variables. It is named for the mathematician James B. Shearer.

Concretely, it states that if ''X''1, ..., ''X''''d'' are random variables and ''S''1, ..., ''S''''n'' are subsets of \{1, \dots, d\} such that every integer between 1 and ''d'' lies in at least ''r'' of these subsets, then

H(X_1, \dots, X_d) \leq \frac{1}{r} \sum_{i=1}^{n} H\big((X_j)_{j \in S_i}\big),

where H is entropy and (X_j)_{j \in S_i} is the Cartesian product of the random variables X_j with indices ''j'' in S_i.

Combinatorial version

Let \mathcal{F} be a family of subsets of [n] (possibly with repeats) with each i \in [n] included in at least t members of \mathcal{F}. Let \mathcal{A} be another set of subsets of [n]. Then

|\mathcal{A}| \leq \prod_{F \in \mathcal{F}} |\operatorname{trace}_F(\mathcal{A})|^{1/t},

where \operatorname{trace}_F(\mathcal{A}) = \{A \cap F : A \in \mathcal{A}\} is the set of possible intersections of elements of \mathcal{A} with F.
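The entropy version can be checked numerically on small examples. Below is a minimal Python sketch (the helper names entropy and marginal, and the choice of distribution, are illustrative): it draws a random joint distribution over three binary variables and verifies the bound with the covering family \{1,2\}, \{2,3\}, \{1,3\}, for which every index lies in r = 2 of the subsets.

import itertools
import math
import random

def entropy(dist):
    # Shannon entropy (in bits) of a dict mapping outcomes to probabilities.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, indices):
    # Marginal distribution of the coordinates listed in `indices`.
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        out[key] = out.get(key, 0.0) + p
    return out

# A random joint distribution over three binary variables X1, X2, X3.
random.seed(0)
outcomes = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

# Each index 0, 1, 2 lies in exactly r = 2 of these subsets.
subsets = [(0, 1), (1, 2), (0, 2)]
r = 2

lhs = entropy(joint)
rhs = sum(entropy(marginal(joint, s)) for s in subsets) / r
print(f"H(X1,X2,X3) = {lhs:.4f} <= {rhs:.4f}")
assert lhs <= rhs + 1e-9  # Shearer's bound holds

With the singleton family \{1\}, \{2\}, \{3\} and r = 1, the same check reduces to the subadditivity of entropy.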



Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in the case of abstraction from nature, some basic properties that are considered true starting points of ...



Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual informat ...
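The coin and die comparison is easy to make precise; computing the entropy (in bits) of each uniform distribution gives

H_{\text{coin}} = -\sum_{i=1}^{2} \tfrac{1}{2} \log_2 \tfrac{1}{2} = 1 \text{ bit}, \qquad H_{\text{die}} = -\sum_{i=1}^{6} \tfrac{1}{6} \log_2 \tfrac{1}{6} = \log_2 6 \approx 2.585 \text{ bits},

so the die roll indeed carries more information than the coin flip.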



Entropy (information theory)
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable X, which takes values in the alphabet \mathcal{X} and is distributed according to p: \mathcal{X} \to [0, 1], the entropy is \Eta(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x) = \mathbb{E}[-\log p(X)], where \Sigma denotes the sum over the variable's possible values. The choice of base for \log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base ''e'' gives "natural units" (nats), and base 10 gives units of "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is also referred to as Shannon entropy. Shannon's theory defi ...
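The definition translates almost directly into code. A minimal Python sketch (the function name shannon_entropy and the list-of-probabilities input are illustrative choices, not a standard API):

import math

def shannon_entropy(probs, base=2.0):
    # H(X) = -sum over x of p(x) * log p(x); terms with p(x) = 0 contribute nothing.
    # Base 2 gives bits, base e gives nats, base 10 gives hartleys.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))            # fair coin: 1.0 bit
print(shannon_entropy([1/6] * 6))             # fair die: log2(6), about 2.585 bits
print(shannon_entropy([0.5, 0.5], math.e))    # the same coin in nats: ln 2, about 0.693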


James B
James is a common English language surname and given name: *James (name), the typically masculine first name James * James (surname), various people with the last name James James or James City may also refer to: People * King James (other), various kings named James * Saint James (other) * James (musician) * James, brother of Jesus Places Canada * James Bay, a large body of water * James, Ontario United Kingdom * James College, a college of the University of York United States * James, Georgia, an unincorporated community * James, Iowa, an unincorporated community * James City, North Carolina * James City County, Virginia ** James City (Virginia Company) ** James City Shire * James City, Pennsylvania * St. James City, Florida Arts, entertainment, and media * ''James'' (2005 film), a Bollywood film * ''James'' (2008 film), an Irish short film * ''James'' (2022 film), an Indian Kannada ...



Random Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set \{H, T\}) to a measurable space, often the real numbers (e.g., \{-1, 1\}, in which 1 corresponds to H and -1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random var ...
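The coin example above can be written out concretely. A minimal Python sketch, where the dictionary X playing the role of the random variable and the fair-coin probabilities are illustrative assumptions:

import random

# Sample space of a coin flip, and a random variable X mapping outcomes to real numbers.
sample_space = ["H", "T"]
X = {"H": 1, "T": -1}

# Simulate the underlying random event, then read off the value X takes on it.
outcome = random.choice(sample_space)
print(outcome, X[outcome])

# Expectation of X under the fair-coin probabilities: 0.5 * 1 + 0.5 * (-1) = 0.
print(sum(0.5 * X[w] for w in sample_space))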



Cartesian Product
In mathematics, specifically set theory, the Cartesian product of two sets ''A'' and ''B'', denoted ''A''×''B'', is the set of all ordered pairs (''a'', ''b'') where ''a'' is in ''A'' and ''b'' is in ''B''. In terms of set-builder notation, that is A \times B = \{(a, b) : a \in A \text{ and } b \in B\}. A table can be created by taking the Cartesian product of a set of rows and a set of columns. If the Cartesian product rows × columns is taken, the cells of the table contain ordered pairs of the form (row value, column value). One can similarly define the Cartesian product of ''n'' sets, also known as an ''n''-fold Cartesian product, which can be represented by an ''n''-dimensional array, where each element is an ''n''-tuple. An ordered pair is a 2-tuple or couple. More generally still, one can define the Cartesian product of an indexed family of sets. The Cartesian product is named after René Descartes, whose formulation of analytic geometry gave rise to the concept, which is further generalized in terms of direct product. Examples A deck of cards An ...
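The ''n''-fold product described here is what Python's itertools.product computes; a brief sketch (the example sets, and the deck-of-cards illustration echoing the entry's example section, are illustrative):

import itertools

# Cartesian product of two sets: all ordered pairs (a, b) with a in A and b in B.
A = {1, 2}
B = {"x", "y"}
print(sorted(itertools.product(A, B)))   # [(1, 'x'), (1, 'y'), (2, 'x'), (2, 'y')]

# n-fold Cartesian product: each element is an n-tuple. A standard deck of cards
# is the product of 13 ranks and 4 suits, giving 52 (rank, suit) pairs.
ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = list(itertools.product(ranks, suits))
print(len(deck))   # 52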


Lovász Local Lemma
In probability theory, if a large number of events are all independent of one another and each has probability less than 1, then there is a positive (possibly small) probability that none of the events will occur. The Lovász local lemma allows one to relax the independence condition slightly: as long as the events are "mostly" independent from one another and aren't individually too likely, then there will still be a positive probability that none of them occurs. It is most commonly used in the probabilistic method, in particular to give existence proofs. There are several different versions of the lemma. The simplest and most frequently used is the symmetric version given below. A weaker version was proved in 1975 by László Lovász and Paul Erdős in the article ''Problems and results on 3-chromatic hypergraphs and some related questions''. In 2020, Robin Moser and Gábor Tardos received the Gödel Prize for their algorithmic version of the Lovász L ...
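The symmetric version referred to above is commonly stated as follows (reproduced here since the snippet is cut off): if each event A_i occurs with probability at most p, each A_i is independent of all but at most d of the other events, and e p (d+1) \leq 1 (with e the base of the natural logarithm), then \Pr\left(\bigcap_i \overline{A_i}\right) > 0, i.e. with positive probability none of the events occurs.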