Brevity Law
Brevity Law
In linguistics, the brevity law (also called Zipf's law of abbreviation) is a linguistic law stating that the more frequently a word is used, the shorter it tends to be, and vice versa: the less frequently a word is used, the longer it tends to be. This statistical regularity is found in natural languages and other natural systems and is claimed to hold as a general rule. The brevity law was originally formulated by the linguist George Kingsley Zipf in 1945 as a negative correlation between the frequency of a word and its length. Analyzing a written corpus of American English, he showed that average word length, measured in phonemes, fell as frequency of occurrence increased. Similarly, in a Latin corpus he found a negative correlation between the number of syllables in a word and the frequency of its appearance. This observation says that the most frequent words in a language are the shortest, e.g. the most common words ...
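A minimal sketch of how one might check the brevity law on a text: count word frequencies, then correlate each distinct word's frequency with its length and expect a negative coefficient. The sample sentence is purely illustrative; real corpora show the effect far more clearly.

```python
# Sketch: correlate word frequency with word length (brevity law check).
# statistics.correlation requires Python 3.10+.
from collections import Counter
import statistics

text = ("the quick brown fox jumps over the lazy dog "
        "the dog barks and the fox runs away").lower()

counts = Counter(text.split())
freqs = list(counts.values())            # frequency of each distinct word
lengths = [len(word) for word in counts] # length of each distinct word

r = statistics.correlation(freqs, lengths)
print(f"correlation(frequency, length) = {r:.3f}")  # expected to be negative
```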


Linguistics
Linguistics is the scientific study of human language. It is called a scientific study because it entails a comprehensive, systematic, objective, and precise analysis of all aspects of language, particularly its nature and structure. Linguistics is concerned with both the cognitive and social aspects of language. It is considered a scientific field as well as an academic discipline; it has been classified as a social science, natural science, cognitive science (Thagard, Paul, "Cognitive Science", The Stanford Encyclopedia of Philosophy, Fall 2008 Edition, Edward N. Zalta (ed.)), or part of the humanities. Traditional areas of linguistic analysis correspond to phenomena found in human linguistic systems, such as syntax (rules governing the structure of sentences); semantics (meaning); morphology (structure of words); phonetics (speech sounds and equivalent gestures in sign languages); phonology (the abstract sound system of a particular language); and pragmatics (how social context ...


Acoustic Communication In Animals
Animal communication is the transfer of information from one or a group of animals (sender or senders) to one or more other animals (receiver or receivers) that affects the current or future behavior of the receivers. Information may be sent intentionally, as in a courtship display, or unintentionally, as in the transfer of scent from predator to prey. Information may be transferred to an "audience" of several receivers. Animal communication is a rapidly growing area of study in disciplines including animal behavior, sociology, neurology and animal cognition. Many aspects of animal behavior, such as symbolic name use, emotional expression, learning and sexual behavior, are being understood in new ways. When the information from the sender changes the behavior of a receiver, the information is referred to as a "signal". Signalling theory predicts that for a signal to be maintained in the population, both the sender and receiver should usually receive some benefit from the interact ...


Principle Of Least Effort
The principle of least effort is a broad theory that covers diverse fields from evolutionary biology to webpage design. It postulates that animals, people, and even well-designed machines will naturally choose the path of least resistance or "effort". It is closely related to many other similar principles: see Principle of least action or other articles listed below. This principle is perhaps best known, or at least best documented, among researchers in the field of library and information science. Their principle states that an information-seeking client will tend to use the most convenient search method in the least exacting mode available. Information-seeking behavior stops as soon as minimally acceptable results are found. This theory holds true regardless of the user's proficiency as a searcher, or their level of subject expertise. Also, this theory takes into account the user's previous information-seeking experience. The user will use the tools that are most familiar and easy to use that ...


Pareto Distribution
The Pareto distribution, named after the Italian civil engineer, economist, and sociologist Vilfredo Pareto, is a power-law probability distribution that is used in the description of social, quality control, scientific, geophysical, actuarial, and many other types of observable phenomena; the principle originally applied to describing the distribution of wealth in a society, fitting the trend that a large portion of wealth is held by a small fraction of the population. The Pareto principle or "80-20 rule", stating that 80% of outcomes are due to 20% of causes, was named in honour of Pareto, but the concepts are distinct, and only Pareto distributions with shape parameter α = log₄5 ≈ 1.16 precisely reflect it. Empirical observation has shown that this 80-20 distribution fits a wide range of cases, including natural phenomena and human activities. Definitions: If X is a random variable with a Pareto (Type I) distribution, then the probability that X is ...
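A small sketch, not a definitive treatment: the Pareto Type I survival function, plus a check that the shape parameter α = log₄5 ≈ 1.16 reproduces the 80-20 rule. The share held by the richest fraction p follows from the Lorenz curve of a Pareto Type I distribution (valid for α > 1); the scale parameter x_m is a hypothetical value used only for illustration.

```python
# Sketch of Pareto Type I properties and the 80-20 check.
import math

def pareto_survival(x, x_m, alpha):
    """P(X > x) for a Pareto Type I variable with scale x_m and shape alpha."""
    return (x_m / x) ** alpha if x >= x_m else 1.0

def top_share(p, alpha):
    """Share of the total held by the richest fraction p (requires alpha > 1)."""
    return p ** (1.0 - 1.0 / alpha)

alpha = math.log(5, 4)                     # log base 4 of 5, ~1.161
print(round(top_share(0.20, alpha), 3))    # ~0.80: top 20% hold about 80%
print(pareto_survival(2.0, 1.0, alpha))    # chance of exceeding twice the minimum
```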


Benford's Law
Benford's law, also known as the Newcomb–Benford law, the law of anomalous numbers, or the first-digit law, is an observation that in many real-life sets of numerical data, the leading digit is likely to be small (Arno Berger and Theodore P. Hill, "Benford's Law Strikes Back: No Simple Explanation in Sight for Mathematical Gem", 2011). In sets that obey the law, the number 1 appears as the leading significant digit about 30% of the time, while 9 appears as the leading significant digit less than 5% of the time. If the digits were distributed uniformly, they would each occur about 11.1% of the time. Benford's law also makes predictions about the distribution of second digits, third digits, digit combinations, and so on. Benford's law for base 10 is one of infinitely many cases of a generalized law regarding numbers expressed in arbitrary (integer) bases, which rules out the possibility that the phenomenon might be an artifact of the base-10 number system ...
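A minimal sketch of the first-digit probabilities under Benford's law in base 10, P(d) = log10(1 + 1/d); the roughly 30% and under-5% figures quoted above fall out directly.

```python
# Benford's law first-digit probabilities in base 10.
import math

benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
for d, p in benford.items():
    print(f"leading digit {d}: {p:.3%}")
# leading digit 1: ~30.1%, leading digit 9: ~4.6%; a uniform
# distribution would give each digit about 11.1%.
```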


Bradford's Law
Bradford's law is a pattern first described by Samuel C. Bradford in 1934 that estimates the exponentially diminishing returns of searching for references in science journals. One formulation is that if journals in a field are sorted by number of articles into three groups, each with about one-third of all articles, then the number of journals in each group will be proportional to 1:n:n². There are a number of related formulations of the principle. In many disciplines this pattern is called a Pareto distribution. As a practical example, suppose that a researcher has five core scientific journals for his or her subject. Suppose that in a month there are 12 articles of interest in those journals. Suppose further that in order to find another dozen articles of interest, the researcher would have to go to an additional 10 journals. Then that researcher's Bradford multiplier bm is 2 (i.e. 10/5). For each new dozen articles, that researcher will need to look in bm times ...
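A sketch of the worked example above under its stated assumptions: 5 core journals yield the first dozen relevant articles, and with a Bradford multiplier bm of 2, each further dozen requires bm times as many additional journals as the previous zone.

```python
# Bradford zones for the example: core_journals = 5, multiplier b_m = 2.
core_journals = 5
b_m = 2          # Bradford multiplier from the example (10 / 5)
zones = 4        # how many dozens of articles we want to collect

journals_in_zone = core_journals
total_journals = 0
for zone in range(1, zones + 1):
    total_journals += journals_in_zone
    print(f"zone {zone}: {journals_in_zone} journals, "
          f"{total_journals} searched in total")
    journals_in_zone *= b_m
# zone 1: 5, zone 2: 10, zone 3: 20, zone 4: 40 -- diminishing returns.
```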




Menzerath's Law
Menzerath's law, or Menzerath–Altmann law (named after Paul Menzerath and Gabriel Altmann), is a linguistic law according to which the increase of the size of a linguistic construct results in a decrease of the size of its constituents, and vice versa. E.g., the longer a sentence (measured in terms of the number of clauses) the shorter the clauses (measured in terms of the number of words), or: the longer a word (in syllables or morphs) the shorter the syllables or morphs in sounds. According to Altmann (1980), it can be mathematically stated as: y = a \cdot x^b \cdot e^{-cx} where:
* y is the constituent size (e.g. syllable length)
* x is the size of the linguistic construct that is being inspected (e.g. number of syllables per word)
* a, b, c are the parameters
The law can be explained by the assumption that linguistic segments contain information about their structure (besides the information that needs to be communicated). The assumption that the length of the structure information is in ...
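A minimal sketch of evaluating the Menzerath–Altmann relation y = a·x^b·e^(−cx). The parameter values below are illustrative only; in practice a, b, and c would be estimated from corpus data.

```python
# Menzerath-Altmann relation with hypothetical (not fitted) parameters.
import math

def menzerath_altmann(x, a, b, c):
    """Mean constituent size y for a construct of size x."""
    return a * (x ** b) * math.exp(-c * x)

# Hypothetical reading: mean syllable length (in sounds) as a function
# of word length (in syllables).
a, b, c = 4.0, -0.2, 0.05
for syllables in range(1, 6):
    print(syllables, round(menzerath_altmann(syllables, a, b, c), 3))
# The predicted mean syllable length decreases as words get longer.
```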


Heaps' Law
In linguistics, Heaps' law (also called Herdan's law) is an empirical law which describes the number of distinct words in a document (or set of documents) as a function of the document length (the so-called type-token relation). It can be formulated as: V_R(n) = Kn^\beta where V_R is the number of distinct words in an instance text of size n, and K and \beta are free parameters determined empirically. With English text corpora, typically K is between 10 and 100, and \beta is between 0.4 and 0.6. The law is frequently attributed to Harold Stanley Heaps, but was originally discovered by Gustav Herdan. Under mild assumptions, the Herdan–Heaps law is asymptotically equivalent to Zipf's law concerning the frequencies of individual words within a text. This is a consequence of the fact that the type-token relation (in general) of a homogeneous text can be derived from the distribution of its types. Heaps' law means that as more instance text is gathered, there will be diminishing returns in t ...
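A minimal sketch of Heaps' law as a vocabulary-growth predictor. The values K = 44 and β = 0.5 are illustrative choices within the typical ranges quoted above, not fitted constants.

```python
# Heaps' law: predicted vocabulary size V_R(n) = K * n**beta.
def heaps_vocabulary(n_tokens, K=44.0, beta=0.5):
    """Predicted number of distinct words in a text of n_tokens tokens."""
    return K * n_tokens ** beta

for n in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{n:>9} tokens -> ~{heaps_vocabulary(n):,.0f} distinct words")
# Growth slows as the text lengthens: the diminishing returns the law describes.
```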


Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from the roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information ...
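A minimal sketch of the coin-versus-die example above: the Shannon entropy of a uniform distribution over k outcomes is log2(k) bits, so the die carries more uncertainty than the coin.

```python
# Shannon entropy of a discrete distribution, in bits.
import math

def entropy(probabilities):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [0.5, 0.5]
fair_die = [1 / 6] * 6

print(f"fair coin: {entropy(fair_coin):.3f} bits")   # 1.000 bit
print(f"fair die : {entropy(fair_die):.3f} bits")    # ~2.585 bits
# Identifying the die's outcome resolves more uncertainty than the coin's.
```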


Data Compression
In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Typically, a device that performs data compression is referred to as an encoder, and one that performs the reversal of the process (decompression) as a decoder. The process of reducing the size of a data file is often referred to as data compression. In the context of data transmission, it is called source coding; encoding done at the source of the data before it is stored or transmitted. Source coding should not be confused with channel coding, for error detection and correction or line coding, the means for mapping data onto a signal. ...
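A minimal sketch of lossless compression using Python's standard zlib module (a DEFLATE encoder/decoder): highly redundant input compresses well, and decompression restores the original bytes exactly, illustrating that no information is lost.

```python
# Lossless compression round-trip with zlib (DEFLATE).
import zlib

original = b"abcabcabc" * 100            # highly redundant data
compressed = zlib.compress(original)     # encoder
restored = zlib.decompress(compressed)   # decoder

print(len(original), "->", len(compressed), "bytes")
assert restored == original              # no information lost (lossless)
```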

