Indexed Language
Indexed languages are a class of formal languages discovered by Alfred Aho; they are described by indexed grammars and can be recognized by nested stack automata. Indexed languages are a proper subset of context-sensitive languages. They qualify as an abstract family of languages (furthermore a full AFL) and hence satisfy many closure properties. However, they are not closed under intersection or complement. The class of indexed languages is of practical importance as a computationally affordable generalization of context-free languages, since indexed grammars can describe many of the nonlocal constraints occurring in natural languages. Gerald Gazdar (1988) and Vijay-Shanker (1987) introduced a mildly context-sensitive grammar formalism now known as linear indexed grammars (LIGs). Linear indexed grammars have additional restrictions relative to IG. LIGs are weakly equivalent (generate the same language class) to tree adjoining grammars. Examples The following languages are indexed, but are not context-free: \{a^n b^n c^n d^n \mid n \geq 1\}, \{a^n b^m c^n d^m \mid m,n \geq 0\}. These two languages ...
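To make the indexed-grammar mechanism concrete, here is a minimal Python sketch (an illustration under assumed notation, not taken from the article): nonterminals carry a stack of index symbols, a push production adds an index for every generated a, a copy production hands the whole stack to both children, and pop productions consume one index per b and per c, so the sketch enumerates the related indexed language \{a^n b^n c^n \mid n \geq 0\}. The production encoding and the helper names (expand, generate) are assumptions made for this example.

```python
# Minimal sketch (illustrative, not from the article): enumerating strings of an
# indexed grammar for {a^n b^n c^n : n >= 0}.  A nonterminal carries a stack of
# index symbols; the "copy" production hands the whole stack to every nonterminal
# on the right-hand side, push/pop productions add or remove the top index.
from collections import deque

# Productions (Gazdar-style notation), with index symbol 'f':
#   S[s]  -> a S[f s]        (push an index for each generated 'a')
#   S[s]  -> B[s] C[s]       (copy the stack to B and C)
#   B[fs] -> b B[s]          (pop one index per 'b')
#   C[fs] -> c C[s]          (pop one index per 'c')
#   B[]   -> eps,  C[] -> eps
def expand(symbol, stack):
    """Return the possible right-hand sides for (nonterminal, stack)."""
    results = []
    if symbol == "S":
        results.append(["a", ("S", stack + ("f",))])      # push
        results.append([("B", stack), ("C", stack)])      # copy
    elif symbol in ("B", "C"):
        if stack:                                         # pop
            results.append([symbol.lower(), (symbol, stack[:-1])])
        else:
            results.append([])                            # erase
    return results

def generate(max_len):
    """Breadth-first enumeration of all terminal strings up to max_len."""
    words = set()
    queue = deque([[("S", ())]])
    while queue:
        form = queue.popleft()
        terminals = [x for x in form if isinstance(x, str)]
        if len(terminals) > max_len:
            continue
        i = next((k for k, x in enumerate(form) if not isinstance(x, str)), None)
        if i is None:                                     # all terminals: a word
            words.add("".join(form))
            continue
        sym, stack = form[i]
        for rhs in expand(sym, stack):
            queue.append(form[:i] + rhs + form[i + 1:])
    return words

if __name__ == "__main__":
    got = generate(9)
    expected = {"a" * n + "b" * n + "c" * n for n in range(4)}
    print(got == expected, sorted(got))   # True ['', 'aaabbbccc', 'aabbcc', 'abc']
```

The copying of the full index stack to both B and C is what ties the number of b's and c's to the number of a's; a plain context-free grammar has no analogue of this shared counter.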



Formal Language
In logic, mathematics, computer science, and linguistics, a formal language is a set of strings whose symbols are taken from a set called an "alphabet". The alphabet of a formal language consists of symbols that concatenate into strings (also called "words"). Words that belong to a particular formal language are sometimes called ''well-formed words''. A formal language is often defined by means of a formal grammar such as a regular grammar or context-free grammar. In computer science, formal languages are used, among other things, as the basis for defining the grammar of programming languages and formalized versions of subsets of natural languages, in which the words of the language represent concepts that are associated with meanings or semantics. In computational complexity theory, decision problems are typically defined as formal languages, and complexity classes are defined as the sets of the formal languages that can be parsed by machines with limited computational power. In ...
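As a tiny illustration (not from the article), a formal language is just a set of strings over an alphabet, and a finite description such as a grammar or a regular expression can stand in for the possibly infinite set. The alphabet, the chosen language, and the helper name member below are assumptions made for this sketch.

```python
# Minimal sketch (illustrative): a formal language as a set of strings over an
# alphabet, described finitely by a regular expression standing in for a grammar.
import itertools
import re

ALPHABET = ("a", "b")
# The language over {a, b} of all words containing an even number of a's.
DESCRIPTION = re.compile(r"(b*ab*a)*b*\Z")

def member(word):
    """Membership test: is `word` a well-formed word of the language?"""
    return DESCRIPTION.match(word) is not None

# Enumerate the words of length <= 3 that belong to the language.
words = ("".join(p) for n in range(4) for p in itertools.product(ALPHABET, repeat=n))
print([w for w in words if member(w)])  # ['', 'b', 'aa', 'bb', 'aab', 'aba', 'baa', 'bbb']
```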


Mildly Context-sensitive Language
In computational linguistics, the term mildly context-sensitive grammar formalisms refers to several grammar formalisms that have been developed in an effort to provide adequate descriptions of the syntactic structure of natural language. Every mildly context-sensitive grammar formalism defines a class of mildly context-sensitive grammars (the grammars that can be specified in the formalism), and therefore also a class of mildly context-sensitive languages (the formal languages generated by the grammars). Background By 1985, several researchers in descriptive and mathematical linguistics had provided evidence against the hypothesis that the syntactic structure of natural language can be adequately described by context-free grammars. Riny Huybregts, "The Weak Inadequacy of Context-Free Phrase Structure Grammars", in Ger de Haan, Mieke Trommelen, and Wim Zonneveld, editors, ''Van periferie naar kern'', pages 81–99. Foris, Dordrecht, The Netherlands, 1984. Stuart M. Shieber. Evide ...



Pumping Lemma For Context-free Languages
In computer science, in particular in formal language theory, the pumping lemma for context-free languages, also known as the Bar-Hillel lemma, is a lemma that gives a property shared by all context-free languages and generalizes the pumping lemma for regular languages. The pumping lemma can be used to construct a refutation by contradiction that a specific language is ''not'' context-free. Conversely, the pumping lemma does not suffice to guarantee that a language ''is'' context-free; there are other necessary conditions, such as Ogden's lemma, or the Interchange lemma. Formal statement If a language L is context-free, then there exists some integer p \geq 1 (called a "pumping length") such that every string s in L with |s| \geq p can be written as s = uvwxy with substrings u, v, w, x and y, such that: 1. |vx| \geq 1, 2. |vwx| \leq p, and 3. uv^n wx^n y \in L for all n \geq 0. Below is a formal expression of the Pumping Lemma. \begin ...
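As an illustration of how the lemma is used in a refutation by contradiction (a sketch with assumed helper names, not part of the lemma itself), the Python fragment below enumerates every decomposition s = uvwxy of a string from \{a^n b^n c^n\} that satisfies conditions 1 and 2 and checks condition 3 only for n = 0..3. No decomposition survives for the hypothetical pumping length p = 3 used here; a full proof argues the same way for every p.

```python
# Minimal sketch (illustrative): "pumping" a decomposition s = u v w x y and
# checking condition 3 of the lemma against a target language, here
# L = {a^n b^n c^n}.  Every split satisfying conditions 1 and 2 fails
# condition 3, which is the shape of the contradiction argument.
def in_language(s):
    """Membership in L = {a^n b^n c^n : n >= 0}."""
    n = len(s) // 3
    return s == "a" * n + "b" * n + "c" * n

def decompositions(s, p):
    """All splits s = u v w x y with |vx| >= 1 and |vwx| <= p (conditions 1-2)."""
    n = len(s)
    for i in range(n + 1):                       # u = s[:i]
        for j in range(i, n + 1):                # v = s[i:j]
            for k in range(j, n + 1):            # w = s[j:k]
                for m in range(k, min(i + p, n) + 1):   # x = s[k:m], y = s[m:]
                    if (j - i) + (m - k) >= 1:
                        yield s[:i], s[i:j], s[j:k], s[k:m], s[m:]

def pumps_ok(u, v, w, x, y, times=3):
    """Condition 3, checked only for n = 0..times: u v^n w x^n y stays in L."""
    return all(in_language(u + v * n + w + x * n + y) for n in range(times + 1))

p = 3                                   # a hypothetical pumping length
s = "a" * p + "b" * p + "c" * p         # a string of L with |s| >= p
print(any(pumps_ok(*d) for d in decompositions(s, p)))   # False: no split survives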


Tom Maibaum
Thomas Stephen Edward Maibaum, FRSA (Fellow of the Royal Society of Arts), is a computer scientist. Maibaum has a Bachelor of Science (B.Sc.) undergraduate degree in pure mathematics from the University of Toronto, Canada (1970), and a Doctor of Philosophy (Ph.D.) in computer science from Queen Mary and Royal Holloway Colleges, University of London, England (1974). Maibaum has held academic posts at Imperial College, London, King's College London (UK) and McMaster University (Canada). His research interests have concentrated on the theory of specification, together with its application in different contexts, in the general area of software engineering. From 1996 to 2005, he was involved with developing international standards in programming and informatics, as a member of the International Federation for Information Processing (IFIP) Working Group 2.1 on Algorithmic Languages and Calculi, which specified, maintains, and supports the programming languages ALGOL 60 and ALGOL 6 ...


Sheila Greibach
Sheila Adele Greibach (born 6 October 1939 in New York City) is an American researcher in formal languages, automata, compiler theory and computer science. She is an Emeritus Professor of Computer Science at the University of California, Los Angeles; her notable work includes collaborations with Seymour Ginsburg and Michael A. Harrison on context-sensitive parsing using the stack automaton model. Besides establishing the normal form (Greibach normal form) for context-free grammars in 1965, she also investigated properties of W-grammars, pushdown automata, and decidability problems. Early career Greibach earned an A.B. degree (summa cum laude) in Linguistics and Applied Mathematics from Radcliffe College in 1960, and an A.M. degree two years later. In 1963, she was awarded a PhD at Harvard University, advised by Anthony Oettinger, with a PhD thesis entitled "Inverses of Phrase Structure Generators". She continued to work at Harvard at the Division of E ...




Michael J
Michael may refer to: People * Michael (given name), a given name * Michael (surname), including a list of people with the surname Michael Given name * Michael (bishop elect), English 13th-century Bishop of Hereford elect * Michael (Khoroshy) (1885–1977), cleric of the Ukrainian Orthodox Church of Canada * Michael Donnellan (fashion designer) (1915–1985), Irish-born London fashion designer, often referred to simply as "Michael" * Michael (footballer, born 1982), Brazilian footballer * Michael (footballer, born 1983), Brazilian footballer * Michael (footballer, born 1993), Brazilian footballer * Michael (footballer, born February 1996), Brazilian footballer * Michael (footballer, born March 1996), Brazilian footballer * Michael (footballer, born 1999), Brazilian football ...


Jeffrey Ullman
Jeffrey David Ullman (born November 22, 1942) is an American computer scientist and the Stanford W. Ascherman Professor of Engineering, Emeritus, at Stanford University. His textbooks on compilers (various editions are popularly known as the dragon book), theory of computation (also known as the Cinderella book), data structures, and databases are regarded as standards in their fields. He and his long-time collaborator Alfred Aho are the recipients of the 2020 Turing Award, generally recognized as the highest distinction in computer science. ("ACM Turing Award Honors Innovators Who Shaped the Foundations of Programming Language Compilers and Algorithms", retrieved March 31, 2021.) Career Ullman received a Bachelor of S ...


John Hopcroft
John Edward Hopcroft (born October 7, 1939) is an American theoretical computer scientist. His textbooks on theory of computation (also known as the Cinderella book) and data structures are regarded as standards in their fields. He is a professor emeritus at Cornell University, co-director of the Center on Frontiers of Computing Studies at Peking University, and the director of the John Hopcroft Center for Computer Science at Shanghai Jiao Tong University. Early life and education Hopcroft received a Bachelor of Science with a major in electrical engineering from Seattle University in 1961. He received a Master of Science in electrical engineering in 1962 and a Doctor of Philosophy in electrical engineering in 1964, both from Stanford University. Hopcroft is the grandson of Jacob Nist, who established the Seattle-Tacoma Box Company in 1889. Career and honor He worked for three years at Princeton University and since then has been at Cornell University. In addition to his ...


Tree Adjoining Grammars
Tree-adjoining grammar (TAG) is a grammar formalism defined by Aravind Joshi. Tree-adjoining grammars are somewhat similar to context-free grammars, but the elementary unit of rewriting is the tree rather than the symbol. Whereas context-free grammars have rules for rewriting symbols as strings of other symbols, tree-adjoining grammars have rules for rewriting the nodes of trees as other trees (see tree (graph theory) and tree (data structure)). History TAG originated in investigations by Joshi and his students into the family of adjunction grammars (AG), the "string grammar" of Zellig Harris. AGs handle exocentric properties of language in a natural and effective way, but do not have a good characterization of endocentric constructions; the converse is true of rewrite grammars, or phrase-structure grammar (PSG). In 1969, Joshi introduced a family of grammars that exploits this complementarity by mixing the two types of rules. A few very simple rewrite rules suffice to genera ...
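Because the elementary operation here is tree rewriting rather than string rewriting, a toy implementation can make adjunction concrete. The following Python sketch is an illustration under assumed encodings (tuple trees, a "*" foot marker, names like adjoin), not the formalism's official machinery; it performs two adjunctions of an auxiliary tree in the spirit of the classic TAG analysis of \{a^n b^n c^n d^n\}, with adjunction constraints omitted.

```python
# Minimal sketch (illustrative, not from the article): TAG-style adjunction on
# toy trees.  A tree is (label, children); a leaf has an empty children tuple.
# An auxiliary tree has a foot node whose label is its root label plus "*";
# adjunction at a node splices the auxiliary tree in and re-hangs the displaced
# subtree under the foot.  Adjunction constraints are omitted for brevity.
FOOT = "*"

def adjoin(tree, path, aux):
    """Adjoin auxiliary tree `aux` at the node of `tree` addressed by `path`
    (a tuple of child indices); returns a new tree."""
    label, children = tree
    if path:                                          # descend to the target
        i, rest = path[0], path[1:]
        kids = list(children)
        kids[i] = adjoin(children[i], rest, aux)
        return (label, tuple(kids))
    assert label == aux[0], "target label must match the auxiliary root"
    return _fill_foot(aux, tree)                      # displaced subtree -> foot

def _fill_foot(aux, subtree):
    label, children = aux
    if label.endswith(FOOT):
        return subtree
    return (label, tuple(_fill_foot(c, subtree) for c in children))

def yield_of(tree):
    """Concatenate leaf labels, i.e. the derived string."""
    label, children = tree
    return label if not children else "".join(yield_of(c) for c in children)

# Toy trees in the spirit of the classic TAG for {a^n b^n c^n d^n}:
ALPHA = ("S", (("", ()),))                            # initial tree: S -> eps
BETA = ("S", (("a", ()),
              ("S", (("b", ()), ("S*", ()), ("c", ()))),
              ("d", ())))                             # auxiliary tree

t1 = adjoin(ALPHA, (), BETA)       # adjoin at the root
t2 = adjoin(t1, (1,), BETA)        # adjoin at the inner S node
print(yield_of(t1), yield_of(t2))  # abcd aabbccdd
```

Each adjunction wraps one more a...d and b...c pair around the previous material, which is how a TAG captures cross-serial counting that a context-free grammar cannot.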



Equivalence (formal Languages)
In formal language theory, weak equivalence of two grammars means they generate the same set of strings, i.e. that the formal language they generate is the same. In compiler theory the notion is distinguished from strong (or structural) equivalence, which additionally means that the two parse trees are reasonably similar in that the same semantic interpretation can be assigned to both. Vijay-Shanker and Weir (1994) demonstrate that Linear Indexed Grammars, Combinatory Categorial Grammars, Tree-adjoining Grammars, and Head Grammars are weakly equivalent formalisms, in that they all define the same string languages. On the other hand, if two grammars generate the same set of derivation trees (or more generally, the same set of abstract syntactic objects), then the two grammars are strongly equivalent. Chomsky (1963) introduces the notion of strong equivalence, and argues that only strong equivalence is relevant when comparing grammar formalisms. Kornai and Pullum (1990) Kornai, ...
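As a small illustration (an assumption-laden sketch, not from the article), the two context-free grammars below generate exactly the same string language a^+, yet assign different derivation trees to each string, so they are weakly but not strongly equivalent. The helper names are invented for the example.

```python
# Minimal sketch (illustrative): two CFGs that are weakly equivalent (same string
# language, here a+) but not strongly equivalent (different derivation trees).
# G1: S -> a S | a   (right-branching)      G2: S -> S a | a   (left-branching)
def derive_g1(n):
    """Derivation tree of a^n under G1, as nested tuples."""
    return ("S", ("a",)) if n == 1 else ("S", ("a", derive_g1(n - 1)))

def derive_g2(n):
    """Derivation tree of a^n under G2."""
    return ("S", ("a",)) if n == 1 else ("S", (derive_g2(n - 1), "a"))

def yield_of(tree):
    """The string spelled out by a derivation tree."""
    if isinstance(tree, str):
        return tree
    _, children = tree
    return "".join(yield_of(c) for c in children)

# Weak equivalence: the generated strings agree (checked here up to length 5).
same_strings = all(yield_of(derive_g1(n)) == yield_of(derive_g2(n)) == "a" * n
                   for n in range(1, 6))
# Strong equivalence fails: the trees for "aaa" differ.
print(same_strings)   # True
print(derive_g1(3))   # ('S', ('a', ('S', ('a', ('S', ('a',))))))
print(derive_g2(3))   # ('S', (('S', (('S', ('a',)), 'a')), 'a'))
```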




Gerald Gazdar
Gerald James Michael Gazdar, FBA (born 24 February 1950) is a British linguist and computer scientist. Education He was educated at Heath Mount School, Bradfield College, the University of East Anglia (BA, 1970) and the University of Reading (MA, PhD). Career and research Gazdar was appointed a lecturer at the University of Sussex in 1975, and became Professor of Computational Linguistics there in 1985. He retired in 2002. Gazdar defined Linear Indexed Grammars and pioneered, along with his colleagues Ewan Klein, Geoffrey Pullum and Ivan Sag, the framework of Generalized Phrase Structure Grammar (GPSG), a constraint-based phrase structure formalism for describing the syntax and semantics of natural languages. ...


Alfred Aho
Alfred Vaino Aho (born August 9, 1941) is a Canadian computer scientist best known for his work on programming languages, compilers, and related algorithms, and his textbooks on the art and science of computer programming. Aho was elected into the National Academy of Engineering in 1999 for his contributions to the fields of algorithms and programming tools. He and his long-time collaborator Jeffrey Ullman are the recipients of the 2020 Turing Award, generally recognized as the highest distinction in computer science. Career Aho received a B.A.Sc. (1963) in Engineering Physics from the University of Toronto, then an M.A. (1965) and Ph.D. (1967) in Electrical Engineering/Computer Science from Princeton University. He conducted research at Bell Labs from 1967 to 1991, and again from 1997 to 2002 as Vice President of the Computing Sciences Research Center. Since 1995, he has held the Lawrence Gussman Professorship in Computer Science at Columbia University. He served as chair ...