Dependency
The fundamental mechanism of operator grammar is the dependency constraint: certain words (operators) require that one or more other words (arguments) be present in an utterance. In the sentence ''John wears boots'', the operator ''wears'' requires the presence of two arguments, such as ''John'' and ''boots''. (This definition of dependency differs from other uses of the term, such as in dependency grammar.)
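The dependency constraint can be sketched computationally. The following is a minimal illustration, not Harris's formalism: a hypothetical lexicon records how many arguments each operator requires, and an utterance is checked against those requirements. All names (`LEXICON`, `satisfies_dependency`) are invented for this sketch.

```python
# Hypothetical lexicon: "N" marks a zero-level argument word,
# "O" marks an operator together with its required argument count.
LEXICON = {
    "John": ("N", 0),
    "boots": ("N", 0),
    "wears": ("O", 2),   # ''wears'' requires two arguments
    "falls": ("O", 1),   # ''falls'' requires one argument
}

def satisfies_dependency(words):
    """Check that the operators in the utterance have enough argument words."""
    operators = [w for w in words if LEXICON[w][0] == "O"]
    arguments = [w for w in words if LEXICON[w][0] == "N"]
    needed = sum(LEXICON[op][1] for op in operators)
    return len(arguments) >= needed

print(satisfies_dependency(["John", "wears", "boots"]))  # True
print(satisfies_dependency(["John", "wears"]))           # False
```

This only counts arguments; a fuller model would also track which positions each argument fills.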
Likelihood
The dependency constraint creates a structure (syntax) in which any word of the appropriate class can serve as an argument for a given operator. The likelihood constraint places additional restrictions on this structure by making some operator/argument combinations more likely than others. Thus, ''John wears hats'' is more likely than ''John wears snow'', which in turn is more likely than ''John wears vacation''. The likelihood constraint creates meaning (semantics) by defining each word in terms of the words it can take as arguments, or of which it can be an argument. Each word has a unique set of words with which it has been observed to occur, called its selection. The coherent selection of a word is the subset of its selection for which the dependency relation has above-average likelihood. Words that are similar in meaning have similar coherent selections. This approach to meaning is self-organizing in that no external system is necessary to define what words mean. Instead, the meaning of a word is determined by its usage within a population of speakers. Patterns of frequent use are observable and therefore learnable. New words can be introduced at any time and defined through usage. In this sense, link grammar can be viewed as a kind of operator grammar, in that the linkage of words is determined entirely by their context and each selection is assigned a log-likelihood.
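The definitions of selection and coherent selection translate directly into a small sketch. The co-occurrence counts below are invented for illustration, and the threshold (above the mean count) is one simple reading of "above average likelihood":

```python
from collections import Counter

# Hypothetical (operator, argument) co-occurrence counts observed in a corpus.
counts = Counter({
    ("wears", "hats"): 50,
    ("wears", "boots"): 40,
    ("wears", "snow"): 3,
    ("wears", "vacation"): 1,
})

def selection(operator):
    """The selection of an operator: every argument observed with it."""
    return {arg for (op, arg) in counts if op == operator}

def coherent_selection(operator):
    """Arguments whose co-occurrence count exceeds the operator's average."""
    pairs = {arg: c for (op, arg), c in counts.items() if op == operator}
    avg = sum(pairs.values()) / len(pairs)
    return {arg for arg, c in pairs.items() if c > avg}

print(sorted(selection("wears")))           # all four arguments
print(sorted(coherent_selection("wears")))  # only 'boots' and 'hats'
```

Comparing the coherent selections of two words then gives a usage-based measure of their similarity in meaning.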
Reduction
The reduction constraint acts on high-likelihood combinations of operators and arguments and produces more compact forms. Certain reductions allow words to be omitted completely from an utterance. For example, ''I expect John to come'' is reducible to ''I expect John'', because ''to come'' is highly likely under ''expect''. The sentence ''John wears boots and John wears hats'' can be reduced to ''John wears boots and hats'', because repetition of the first argument ''John'' under the operator ''and'' is highly likely. ''John reads things'' can be reduced to ''John reads'', because the argument ''things'' has high likelihood of occurring under any operator. Other reductions shorten words, creating pronouns, suffixes and prefixes (morphology). ''John wears boots and John wears hats'' can be reduced to ''John wears boots and he wears hats'', where the pronoun ''he'' is a reduced form of ''John''. Suffixes and prefixes can be obtained by appending other freely occurring words, or variants of these. ''John is able to be liked'' can be reduced to ''John is likeable''. ''John is thoughtful'' is reduced from ''John is full of thought'', and ''John is anti-war'' from ''John is against war''. Modifiers are the result of several of these kinds of reductions, which give rise to adjectives, adverbs, and prepositional phrases.
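One of these reductions, zeroing the repeated subject and operator under ''and'', can be sketched as a toy string rewrite. This is purely illustrative and assumes a flat subject-verb-object word order; it is not a general implementation of Harris's reduction system:

```python
def reduce_conjunction(sentence):
    """Reduce 'X V A and X V B' to 'X V A and B' when subject and verb repeat."""
    left, _, right = sentence.partition(" and ")
    lw, rw = left.split(), right.split()
    # Zero the second clause's subject and verb only if they repeat the first's.
    if len(lw) >= 2 and rw[:2] == lw[:2]:
        return " ".join(lw + ["and"] + rw[2:])
    return sentence

print(reduce_conjunction("John wears boots and John wears hats"))
# 'John wears boots and hats'
print(reduce_conjunction("John wears boots and Mary wears hats"))
# unchanged: the subjects differ, so nothing is zeroed
```

Because the omitted words are recoverable from the remaining clause, the reduced sentence is a paraphrase of the full one.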
Information
The importance of reductions in operator grammar is that they separate sentences containing reduced forms from those that do not (base sentences). All reductions are paraphrases: they remove no information, only make sentences more compact. Thus the base sentences contain all the information of the language, and the reduced sentences are variants of these. Base sentences are made up of simple words without modifiers and largely without affixes, e.g. ''snow falls'', ''sheep eat grass'', ''John knows sheep eat grass'', ''that sheep eat snow surprises John''. Each operator in a sentence contributes information according to its likelihood of occurrence with its arguments. Highly expected combinations carry low information; rare combinations carry high information. The precise contribution of an operator is determined by its selection, the set of words with which it occurs with high frequency. The arguments ''boots'', ''hats'', ''sheep'', ''grass'' and ''snow'' differ in meaning according to the operators for which they can appear with high likelihood in first or second argument position. For example, ''snow'' is expected as the first argument of ''fall'' but not of ''eat'', while the reverse is true of ''sheep''. Similarly, the operators ''eat'', ''devour'', ''chew'' and ''swallow'' differ in meaning to the extent that the arguments they select and the operators that select them differ. Operator grammar predicts that the information carried by a sentence is the accumulation of the contributions of each argument and operator. The increment of information that a given word adds to a new sentence is determined by how it was used before. In turn, new usages stretch or even alter the information content associated with a word. Because this process is based on high-frequency usage, the meanings of words are relatively stable over time, but can change in accordance with the needs of a linguistic community.
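The relation between likelihood and information can be made concrete with the standard information-theoretic measure, -log2 of a combination's probability. The counts and the choice of this particular measure are assumptions of this sketch, not part of Harris's text:

```python
import math
from collections import Counter

# Hypothetical (operator, first-argument) co-occurrence counts.
pair_counts = Counter({
    ("fall", "snow"): 90, ("fall", "sheep"): 10,
    ("eat", "sheep"): 80, ("eat", "snow"): 2,
})

def info_bits(operator, argument):
    """Information contributed by one operator/argument combination, in bits."""
    total = sum(c for (op, _), c in pair_counts.items() if op == operator)
    p = pair_counts[(operator, argument)] / total
    return -math.log2(p)

# The expected combination carries little information; the rare one carries much.
print(round(info_bits("fall", "snow"), 2))
print(round(info_bits("eat", "snow"), 2))
```

Summing `info_bits` over every operator/argument pair in a sentence would give the accumulated information the paragraph describes.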
Bibliography
* Harris, Zellig (1991). ''A Theory of Language and Information: A Mathematical Approach''. Oxford University Press. ISBN 0-19-824224-7.