Mildly Context-sensitive Grammar
In computational linguistics, the term mildly context-sensitive grammar formalisms refers to several grammar formalisms that have been developed in an effort to provide adequate descriptions of the syntactic structure of natural language. Every mildly context-sensitive grammar formalism defines a class of mildly context-sensitive grammars (the grammars that can be specified in the formalism), and therefore also a class of mildly context-sensitive languages (the formal languages generated by the grammars).

Background

By 1985, several researchers in descriptive and mathematical linguistics had provided evidence against the hypothesis that the syntactic structure of natural language can be adequately described by context-free grammars. Riny Huybregts. "The Weak Inadequacy of Context-Free Phrase Structure Grammars". In Ger de Haan, Mieke Trommelen, and Wim Zonneveld, editors, ''Van periferie naar kern'', pages 81–99. Foris, Dordrecht, The Netherlands, 1984. Stuart M. Shieber. Evidenc ...
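For concreteness (an added illustration, not part of the excerpt above), the constructions at issue in that debate are usually modelled by formal languages with crossing or counting dependencies, such as

L_1 = \{ a^n b^n c^n \mid n \ge 0 \} \qquad L_2 = \{ ww \mid w \in \{a,b\}^* \}

Neither language is context-free (this can be shown with the pumping lemma for context-free languages), but both are generated by mildly context-sensitive formalisms such as tree-adjoining grammar.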



Computational Linguistics
Computational linguistics is an interdisciplinary field concerned with the computational modelling of natural language, as well as the study of appropriate computational approaches to linguistic questions. In general, computational linguistics draws upon linguistics, computer science, artificial intelligence, mathematics, logic, philosophy, cognitive science, cognitive psychology, psycholinguistics, anthropology and neuroscience, among others.

Sub-fields and related areas

Traditionally, computational linguistics emerged as an area of artificial intelligence performed by computer scientists who had specialized in the application of computers to the processing of a natural language. With the formation of the Association for Computational Linguistics (ACL) and the establishment of independent conference series, the field consolidated during the 1970s and 1980s. The Association for Computational Linguistics defines computational linguistics as: The term "comp ...


Pumping Lemma For Context-free Languages
Pumping may refer to: * The operation of a pump, for moving a liquid from one location to another ** The use of a breast pump for extraction of milk * Pumping (audio), a creative misuse of dynamic range compression * Pumping (computer systems), the number of times data is transmitted per clock cycle * Pumping (oil well), injecting chemicals into a wellbore * Pumping (noise reduction), an unwanted artifact of some noise reduction systems * Pumping lemma, in the theory of formal languages * Gastric lavage, cleaning the contents of the stomach * Optical pumping, in which light is used to raise electrons from a lower energy level to a higher one * Pump (skateboarding), a technique for accelerating without pushing off the ground, done by turning or on a transition such as a ramp or quarter pipe * "Pumping" (My Heart) ...
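The sense relevant here is the pumping lemma for context-free languages. For reference (this statement is added and is not part of the excerpt above), it says:

If L is a context-free language, then there is an integer p \ge 1 such that every string s \in L with |s| \ge p can be written as s = uvwxy with |vwx| \le p, |vx| \ge 1, and u v^i w x^i y \in L for every i \ge 0.

It is this lemma that is used to show, for example, that \{ a^n b^n c^n \mid n \ge 0 \} is not context-free.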


Range Concatenation Grammar
Range concatenation grammar (RCG) is a grammar formalism developed by Pierre Boullier in 1998 as an attempt to characterize a number of phenomena of natural language, such as Chinese numbers and German word order scrambling, which are outside the bounds of the mildly context-sensitive languages. From a theoretical point of view, any language that can be parsed in polynomial time belongs to the subset of RCG called positive range concatenation grammars, and conversely. Though intended as a variant on Groenink's literal movement grammars (LMGs, a formalism intended to characterize certain extraposition phenomena of natural language such as topicalization and cross-serial dependency), RCGs treat the grammatical process more as a proof than as a production. Whereas LMGs produce a terminal string from a start predicate, RCGs aim to reduce a start predicate (which predicates ...
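As a minimal illustration of the clause notation (an added sketch, not taken from the excerpt; the language chosen is deliberately simple and well inside the mildly context-sensitive class), a positive RCG for \{ a^n b^n c^n \mid n \ge 0 \} can be written as:

S(XYZ) → A(X, Y, Z)
A(aX, bY, cZ) → A(X, Y, Z)
A(ε, ε, ε) → ε

Here X, Y and Z are variables bound to ranges (substrings) of the input; a string is accepted when the start predicate S, applied to the whole input, can be reduced to the empty clause body ε.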






Minimalist Grammar
Minimalist grammars are a class of formal grammars that aim to provide a more rigorous, usually proof-theoretic, formalization of the Chomskyan Minimalist program than is normally provided in the mainstream Minimalist literature. A variety of particular formalizations exist, most of them developed by Edward Stabler, Alain Lecomte, Christian Retoré, or combinations thereof.

Lecomte and Retoré's extensions of the Lambek Calculus

Lecomte and Retoré (2001) introduce a formalism that modifies the core of the Lambek Calculus to allow for movement-like processes to be described without resort to the combinatorics of combinatory categorial grammar (CCG). The formalism is presented in proof-theoretic terms. Differing only slightly in notation fr ...
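For a flavour of Stabler's version of the formalism (a sketch added here; the lexical items are invented for illustration), lexical items are strings of features, and the merge operation checks a selector feature =x against a matching category feature x:

the :: =N D        dog :: N
merge(the :: =N D, dog :: N) = [the dog] : D

The derived expression [the dog] has cancelled the =N/N pair and exposes the category feature D, so it can in turn be selected by an item bearing =D.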


Linear Context-free Rewriting System
Generalized context-free grammar (GCFG) is a grammar formalism that expands on context-free grammars by adding potentially non-context-free composition functions to rewrite rules. Head grammar (and its weak equivalents) is an instance of such a GCFG which is known to be especially adept at handling a wide variety of non-CF properties of natural language.

Description

A GCFG consists of two components: a set of composition functions that combine string tuples, and a set of rewrite rules. The composition functions all have the form f(\langle x_1, ..., x_m \rangle, \langle y_1, ..., y_n \rangle, ...) = \gamma, where \gamma is either a single string tuple, or some use of a (potentially different) composition function which reduces to a string tuple. Rewrite rules look like X \to f(Y, Z, ...), where Y, Z, ... are string tuples or non-terminal symbols. The rewrite semantics of GCFGs is fairly straightforward. An occurrence of a non-terminal symbol is rewritten using rewrite rules as in ...
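As a worked example of this machinery (added here for illustration; the particular grammar is not from the excerpt), the non-context-free language \{ a^n b^n c^n \mid n \ge 1 \} can be generated with two composition functions over triples of strings,

f(\langle x_1, x_2, x_3 \rangle) = \langle x_1 x_2 x_3 \rangle
g(\langle x_1, x_2, x_3 \rangle) = \langle a x_1, b x_2, c x_3 \rangle

together with the rewrite rules S \to f(A), A \to g(A), and A \to g(\langle \varepsilon, \varepsilon, \varepsilon \rangle). For instance, S \to f(A) \to f(g(g(\langle \varepsilon, \varepsilon, \varepsilon \rangle))) = f(\langle aa, bb, cc \rangle) = \langle aabbcc \rangle. Because each composition function uses every argument component exactly once, this grammar is also a linear context-free rewriting system.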


Combinatory Categorial Grammar
Combinatory categorial grammar (CCG) is an efficiently parsable, yet linguistically expressive grammar formalism. It has a transparent interface between surface syntax and underlying semantic representation, including predicate–argument structure, quantification and information structure. The formalism generates constituency-based structures (as opposed to dependency-based ones) and is therefore a type of phrase structure grammar (as opposed to a dependency grammar). CCG relies on combinatory logic, which has the same expressive power as the lambda calculus, but builds its expressions differently. The first linguistic and psycholinguistic arguments for basing the grammar on combinators were put forth by Steedman and Szabolcsi. More recent prominent proponents of the approach are Pauline Jacobson and Jason Baldridge. In these new approaches, the combinator B (the compositor) is useful in creating long-distance dependencies, as in "Who do you think Mary is talking about?" and the ...
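To make the role of B concrete (an added sketch; the category assignments are the usual textbook ones, not quoted from the excerpt), forward composition combines two functional categories:

X/Y   Y/Z   ⇒_B   X/Z

For example, with might := (S\NP)/(S\NP) and eat := (S\NP)/NP, composition yields might eat := (S\NP)/NP, a constituent that is still looking for its object NP; this is what allows an extracted object such as "who" to be related to a verb arbitrarily deep inside the clause.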


Linear Indexed Grammar
Indexed grammars are a generalization of context-free grammars in that nonterminals are equipped with lists of ''flags'', or ''index symbols''. The language produced by an indexed grammar is called an indexed language.

Definition

Modern definition by Hopcroft and Ullman

In contemporary publications following Hopcroft and Ullman (1979), an indexed grammar is formally defined as a 5-tuple ''G'' = ⟨''N'',''T'',''F'',''P'',''S''⟩ where
* ''N'' is a set of variables or nonterminal symbols,
* ''T'' is a set ("alphabet") of terminal symbols,
* ''F'' is a set of so-called ''index symbols'', or ''indices'',
* ''S'' ∈ ''N'' is the ''start symbol'', and
* ''P'' is a finite set of ''productions''.
In productions as well as in derivations of indexed grammars, a string ("stack") ''σ'' ∈ ''F''* of index symbols is attached to every nonterminal symbol ''A'' ∈ ''N'', de ...
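As an illustration of how the index stacks are used (an example added here, not part of the excerpt; it is a plain indexed grammar rather than a linear one), the non-context-free language \{ a^n b^n c^n \mid n \ge 1 \} is generated by the productions below, where the bracketed string is the index stack and ''σ'' stands for its remainder:

S[σ] → T[gσ]
T[σ] → T[fσ]
T[σ] → A[σ] B[σ] C[σ]
A[fσ] → a A[σ]     B[fσ] → b B[σ]     C[fσ] → c C[σ]
A[gσ] → a          B[gσ] → b          C[gσ] → c

Each application of T[σ] → T[fσ] pushes one f; copying the resulting stack to A, B and C then forces the three blocks of terminals to have the same length.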




Carl Pollard
Carl Jesse Pollard (born June 28, 1947) is a Professor of Linguistics at the Ohio State University. He is the inventor of head grammar and higher-order grammar, as well as co-inventor of head-driven phrase structure grammar (HPSG). He is currently also working on convergent grammar (CVG). He has written numerous books and articles on formal syntax and semantics. He received his Ph.D. from Stanford University.


Head Grammar
Head grammar (HG) is a grammar formalism introduced in Carl Pollard (1984) (Pollard, C. 1984. ''Generalized Phrase Structure Grammars, Head Grammars, and Natural Language''. Ph.D. thesis, Stanford University, CA.) as an extension of the context-free grammar class of grammars. Head grammar is therefore a type of phrase structure grammar, as opposed to a dependency grammar. The class of head grammars is a subset of the linear context-free rewriting systems. One typical way of defining head grammars is to replace the terminal strings of CFGs with indexed terminal strings, where the index denotes the "head" word of the string. Thus, for example, a CF rule such as A \to abc might instead be A \to (abc, 0), where the 0th terminal, the ''a'', is the head of the resulting terminal string. For convenience of notation, such a rule could be written as just the terminal string, with the head terminal denoted by some sort of mark, as in A \to \widehat{a}bc. Two fundamental operations are then ad ...
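For another instance of the notation used above (an added gloss, not from the excerpt): since indexing starts at 0, a rule such as B \to (xyz, 2) marks the third terminal as the head and would be abbreviated B \to xy\widehat{z}.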


Tree-adjoining Grammar
Tree-adjoining grammar (TAG) is a grammar formalism defined by Aravind Joshi. Tree-adjoining grammars are somewhat similar to context-free grammars, but the elementary unit of rewriting is the tree rather than the symbol. Whereas context-free grammars have rules for rewriting symbols as strings of other symbols, tree-adjoining grammars have rules for rewriting the nodes of trees as other trees (see tree (graph theory) and tree (data structure)).

History

TAG originated in investigations by Joshi and his students into the family of adjunction grammars (AG), the "string grammar" of Zellig Harris. AGs handle exocentric properties of language in a natural and effective way, but do not have a good characterization of endocentric constructions; the converse is true of rewrite grammars, or phrase-structure grammar (PSG). In 1969, Joshi introduced a family of grammars that exploits this complementarity by mixing the two types of rules. A few very simple rewrite rules suffice to genera ...
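To illustrate adjunction, the operation that distinguishes TAG from context-free rewriting (an added example; the trees are invented for illustration), consider an initial tree for "Mary laughed" and an auxiliary tree for the adverb "loudly", written in bracket notation with * marking the foot node:

α:  [S [NP Mary] [VP [V laughed]]]
β:  [VP [VP*] [Adv loudly]]

Adjoining β at the VP node of α excises the VP subtree, puts β in its place, and reattaches the excised subtree at the foot node, giving

[S [NP Mary] [VP [VP [V laughed]] [Adv loudly]]]

whose yield is "Mary laughed loudly". Repeating the adjunction licenses arbitrarily many adverbs without adding any elementary trees.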


Embedded Pushdown Automaton
An embedded pushdown automaton or EPDA is a computational model for parsing languages generated by tree-adjoining grammars (TAGs). It is similar to the pushdown automaton used for parsing context-free grammars, but instead of using a plain stack to store symbols, it has a stack of iterated stacks that store symbols, giving TAGs a generative capacity between context-free and context-sensitive grammars, that is, a class within the mildly context-sensitive grammars. Embedded pushdown automata should not be confused with nested stack automata, which have more computational power.

History and applications

EPDAs were first described by K. Vijay-Shanker in his 1988 doctoral thesis. They have since been applied to more complete descriptions of classes of mildly context-sensitive grammars and have had important roles in refining the Chomsky hierarchy. Various subgrammars, such as the linear indexed grammar, can thus be defined. While natural languages have traditionally been analyzed using context-free grammars (s ...
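A rough sketch of the storage structure only, in Python (the class and method names are invented here, and the move format is a deliberate simplification of Vijay-Shanker's definition, not the automaton itself):

# Illustrative sketch of an EPDA-style store: a stack of stacks.
# A move may rewrite the top of the current top stack and push whole
# new stacks below or above it; an emptied stack is discarded.

class NestedStack:
    def __init__(self, bottom_symbol="Z0"):
        # The store is a list of stacks; each stack is a list of symbols,
        # with the last element acting as that stack's top.
        self.stacks = [[bottom_symbol]]

    def top_symbol(self):
        # The symbol inspected when choosing a move.
        return self.stacks[-1][-1]

    def apply(self, new_below, rewrite_top, new_above):
        # Pop the inspected symbol and push rewrite_top in its place,
        # insert the stacks in new_below immediately beneath the current
        # top stack, and push the stacks in new_above above it.
        top = self.stacks.pop()                        # current top stack
        top.pop()                                      # consume inspected symbol
        top.extend(rewrite_top)                        # rewrite its top
        self.stacks.extend(list(s) for s in new_below)
        if top:                                        # drop the stack if empty
            self.stacks.append(top)
        self.stacks.extend(list(s) for s in new_above)

    def empty(self):
        return not self.stacks


# Tiny usage example: push an auxiliary stack above the current one,
# work on it, then fall back to the original stack.
store = NestedStack()
store.apply([], ["A"], [["B"]])   # top stack becomes [A]; new stack [B] above it
assert store.top_symbol() == "B"
store.apply([], [], [])           # pop B; that stack empties and is discarded
assert store.top_symbol() == "A"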