Constraint in information theory is the degree of statistical dependence between or among variables.
Garner [ Garner W R (1962). ''Uncertainty and Structure as Psychological Concepts'', John Wiley & Sons, New York.] provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint) with applications to pattern recognition and psychology.
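Constraint in the two-variable case can be measured as the mutual information between the variables, which is zero when they are statistically independent and positive when one variable constrains the other. The following sketch estimates mutual information from observed pairs using empirical (plug-in) probabilities; the function name and estimator are illustrative, not taken from the sources above.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Estimate the mutual information I(X;Y) in bits from a list of
    (x, y) observations, using empirical (plug-in) probabilities."""
    n = len(pairs)
    joint = Counter(pairs)                      # empirical joint distribution
    px = Counter(x for x, _ in pairs)           # marginal of X
    py = Counter(y for _, y in pairs)           # marginal of Y
    mi = 0.0
    for (x, y), count in joint.items():
        p_xy = count / n
        # Each term compares the joint probability to the product of marginals.
        mi += p_xy * log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

# Fully dependent variables (y always equals x): constraint is maximal,
# I(X;Y) = H(X) = 1 bit for a uniform binary X.
dependent = [(0, 0), (1, 1)] * 50
# Independent variables (all four combinations equally likely): I(X;Y) = 0.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25

print(mutual_information(dependent))    # 1.0
print(mutual_information(independent))  # 0.0
```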
See also
* Mutual information
* Total correlation
* Interaction information
References
{{compsci-stub}}