Shifting (syntax)
In syntax, shifting occurs when two or more constituents appearing on the same side of their common head exchange positions, in effect yielding a non-canonical order. The most widely acknowledged type of shifting is heavy NP shift, but shifting involving a heavy NP is just one manifestation of the shifting mechanism. Shifting occurs in most if not all European languages, and it may in fact be possible in all natural languages, including sign languages. Shifting is not inversion, and inversion is not shifting, but the two mechanisms are similar insofar as both are present in languages like English that have relatively strict word order. The theoretical analysis of shifting varies in part depending on the theory of sentence structure one adopts. If one assumes relatively flat structures, shifting does not result in a discontinuity. Shifting is often motivated by the relative weight of the constituents involved. The weight of a constituent is determined by a number of facto ...
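
As a rough illustration (not part of the article), the weight-driven reordering described above can be sketched as a toy procedure in Python. The function names and the word-count measure of weight are hypothetical simplifications, not a claim about any particular analysis:

# Toy sketch of weight-driven shifting (hypothetical; illustration only).
# Post-head sister constituents are linearized lightest-first, so that a
# heavy NP ends up shifted to the end of the clause (heavy NP shift).

def weight(constituent):
    """Approximate 'weight' as the number of words in the constituent."""
    return len(constituent.split())

def shift(head, constituents):
    """Linearize a head and its post-head sisters, lightest first."""
    return " ".join([head] + sorted(constituents, key=weight))

# Canonical: "put [the letters that arrived from the lawyer] [on the table]"
# Shifted:   "put [on the table] [the letters that arrived from the lawyer]"
print(shift("put", ["the letters that arrived from the lawyer", "on the table"]))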




Syntax
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.

Etymology
The word ''syntax'' comes from the Ancient Greek ''sýntaxis'', "coordination", which consists of ''syn'', "together", and ''táxis'', "ordering".

Topics
The field of syntax contains a number of topics that a syntactic theory is often designed to handle. The relation between the topics is treated differently in different theories, and some of them may not be considered to be distinct but instead to be derived from one another (e.g. word order can be seen as the result of movement rules derived from grammatical relations). Se ...



Phrase Structure Grammar
The term phrase structure grammar was originally introduced by Noam Chomsky as the term for grammars of the kind studied previously by Emil Post and Axel Thue (Post canonical systems). Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as ''constituency grammars''. The defining trait of phrase structure grammars is their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.

Constituency relation
In linguistics, phrase structure grammars are all those grammars that are based on the constituency relation, as opposed to the dependency relation associated with dependency grammars; hence the alternative name ''constituency grammars''. Any of several related theories for the parsing of natural language qualify as constituency grammars, and most of them have been develope ...
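
As an informal aside (not from the article), the constituency relation can be made concrete with a toy context-free grammar in Python; the grammar and helper below are hypothetical and deliberately minimal:

# Minimal sketch of a context-free phrase structure grammar (hypothetical).
# Each rule rewrites a mother category as its daughter constituents.

GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["chased"]],
}

def generate(symbol="S"):
    """Expand a category top-down, always taking a rule's first option."""
    if symbol not in GRAMMAR:      # terminal word
        return [symbol]
    words = []
    for daughter in GRAMMAR[symbol][0]:
        words.extend(generate(daughter))
    return words

print(" ".join(generate()))  # -> the dog chased the dog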



Wh-movement
In linguistics, wh-movement (also known as wh-fronting, wh-extraction, or wh-raising) is the formation of syntactic dependencies involving interrogative words. An example in English is the dependency formed between ''what'' and the object position of ''doing'' in "What are you doing?" Interrogative forms are sometimes known within English linguistics as ''wh-words'', such as ''what, when, where, who'', and ''why'', but also include other interrogative words, such as ''how''. This dependency has been used as a diagnostic tool in syntactic studies, as it can be observed to interact with other grammatical constraints. In languages with wh-movement, sentences or clauses with a wh-word show a non-canonical word order that places the wh-word (or phrase containing the wh-word) at or near the front of the sentence or clause ("''Who'' are you thinking about?") instead of the canonical position later in the sentence ("I am thinking about ''you''"). Leaving the wh-word in its canonical positi ...
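
To picture the non-canonical word order, wh-fronting can be caricatured as a mechanical reordering of tokens; the sketch below is a deliberate oversimplification that ignores subject-auxiliary inversion, pied-piping, and island constraints:

# Deliberately simplified sketch of wh-fronting (illustration only; it
# ignores subject-auxiliary inversion, pied-piping, and island constraints).

WH_WORDS = {"what", "when", "where", "who", "why", "how"}

def front_wh(tokens):
    """Move the first wh-word in the clause to clause-initial position."""
    for i, word in enumerate(tokens):
        if word.lower() in WH_WORDS:
            return [tokens[i]] + tokens[:i] + tokens[i + 1:]
    return tokens  # no wh-word: canonical order (or wh-in-situ)

# Canonical object position: "you are thinking about who"
print(" ".join(front_wh(["you", "are", "thinking", "about", "who"])))
# -> who you are thinking about  (auxiliary inversion is not modeled)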


Ray Jackendoff
Ray Jackendoff (born January 23, 1945) is an American linguist. He is professor of philosophy, Seth Merrin Chair in the Humanities and, with Daniel Dennett, co-director of the Center for Cognitive Studies at Tufts University. He has always straddled the boundary between generative linguistics and cognitive linguistics, committed both to the existence of an innate universal grammar (an important thesis of generative linguistics) and to giving an account of language that is consistent with the current understanding of the human mind and cognition (the main purpose of cognitive linguistics). Jackendoff's research deals with the semantics of natural language, its bearing on the formal structure of cognition, and its lexical and syntactic expression. He has conducted extensive research on the relationship between conscious awareness and the computational theory of mind, on syntactic theory, and, with Fred Lerdahl, on musical cognition, culminating in their generative theory of ton ...



Dependency Grammar
Dependency grammar (DG) is a class of modern grammatical theories that are all based on the dependency relation (as opposed to the ''constituency relation'' of phrase structure) and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called ''dependencies''. Dependency grammar differs from phrase structure grammar in that, while it can identify phrases, it tends to overlook phrasal nodes. A dependency structure is determined by the relation between a word (a head) and its dependents. Dependency structures are flatter than phrase structures, in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages ...
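
The directed links themselves are easy to write down explicitly; the following sketch (a hypothetical encoding, with informal relation labels) represents a dependency structure for a short clause with the finite verb as root:

# Sketch of a dependency structure as directed head -> dependent links
# (hypothetical encoding; the relation labels follow no specific scheme).

sentence = ["the", "dog", "chased", "the", "cat"]

# (dependent_index, head_index, relation); head_index None marks the root
dependencies = [
    (0, 1, "det"),     # the    <- dog
    (1, 2, "subj"),    # dog    <- chased
    (2, None, "root"), # chased is the structural center of the clause
    (3, 4, "det"),     # the    <- cat
    (4, 2, "obj"),     # cat    <- chased
]

for dep, head, rel in dependencies:
    head_word = sentence[head] if head is not None else "ROOT"
    print(f"{sentence[dep]:>7} --{rel}--> {head_word}")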



Head-final
In linguistics, head directionality is a proposed parameter that classifies languages according to whether they are head-initial (the head of a phrase precedes its complements) or head-final (the head follows its complements). The head is the element that determines the category of a phrase: for example, in a verb phrase, the head is a verb. Head-initial languages are therefore "VO" languages, and head-final languages are "OV" languages. Some languages are consistently head-initial or head-final at all phrasal levels. English is considered to be strongly head-initial (verbs precede their objects, for example), while Japanese is an example of a language that is consistently head-final. In certain other languages, such as German and Gbe, examples of both types of head direction occur. Various theories have been proposed to explain such variation. Head directionality is connected with the type of branching that predominates in a language: head-initial structures are ''right-branching'' ...
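
As an informal illustration (not from the article), the parameter can be caricatured as a single linearization switch; real languages, as noted above, can mix both settings across phrase types:

# Sketch of head directionality as a linearization switch (hypothetical
# simplification; languages like German and Gbe mix both settings).

def linearize(head, complement, head_initial=True):
    """Order a head and its complement according to the directionality setting."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

print(linearize("eat", "sushi", head_initial=True))        # head-initial (VO)
print(linearize("taberu", "sushi-o", head_initial=False))  # head-final (OV)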



Constituent (linguistics)
In syntactic analysis, a constituent is a word or a group of words that functions as a single unit within a hierarchical structure. The constituent structure of sentences is identified using ''tests for constituents''. These tests apply to a portion of a sentence, and the results provide evidence about the constituent structure of the sentence. Many constituents are phrases. A phrase is a sequence of one or more words (in some theories two or more) built around a head lexical item and working as a unit within a sentence. A word sequence is shown to be a phrase/constituent if it exhibits one or more of the behaviors discussed below. The analysis of constituent structure is associated mainly with phrase structure grammars, although dependency grammars also allow sentence structure to be broken down into constituent parts.

Tests for constituents in English
Tests for constituents are diagnostics used to identify sentence structure. There are numerous tests for constituents that are co ...
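
One way to make "group of words functioning as a single unit" concrete is to read the constituents off a phrase structure tree as the yields of its subtrees; the nested-tuple encoding below is a hypothetical illustration, not a standard format:

# Sketch: constituents as the yields of subtrees of a phrase structure tree
# (hypothetical nested-tuple encoding; leaves are plain strings).

tree = ("S",
        ("NP", ("Det", "the"), ("N", "dog")),
        ("VP", ("V", "chased"), ("NP", ("Det", "the"), ("N", "cat"))))

def yield_of(node):
    """Return the words dominated by a node."""
    if isinstance(node, str):
        return [node]
    words = []
    for child in node[1:]:
        words.extend(yield_of(child))
    return words

def constituents(node):
    """List every (category, yield) pair in the tree."""
    if isinstance(node, str):
        return []
    result = [(node[0], " ".join(yield_of(node)))]
    for child in node[1:]:
        result.extend(constituents(child))
    return result

for category, words in constituents(tree):
    print(f"{category:>3}: {words}")  # e.g. "NP: the dog", "VP: chased the cat"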



Syntactic Category
A syntactic category is a syntactic unit that theories of syntax assume. Word classes, largely corresponding to traditional parts of speech (e.g. noun, verb, preposition, etc.), are syntactic categories. In phrase structure grammars, the ''phrasal categories'' (e.g. noun phrase, verb phrase, prepositional phrase, etc.) are also syntactic categories. Dependency grammars, however, do not acknowledge phrasal categories (at least not in the traditional sense). Word classes considered as syntactic categories may be called ''lexical categories'', as distinct from phrasal categories. The terminology is somewhat inconsistent between the theoretical models of different linguists. However, many grammars also draw a distinction between ''lexical categories'' (which tend to consist of content words, or phrases headed by them) and ''functional categories'' (which tend to consist of function words or abstract functional elements, or phrases headed by them). The term ''lexical category'' therefore ...
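
The split between word-level (lexical, in the first sense) and phrasal categories can be encoded as a simple inventory; the sketch below uses a hypothetical, deliberately small category set:

# Sketch of a syntactic category inventory (hypothetical, illustration only):
# word-level categories and the phrasal categories they project to.

WORD_CATEGORIES = {"N": "noun", "V": "verb", "P": "preposition", "A": "adjective"}
PHRASAL_CATEGORIES = {"NP": "noun phrase", "VP": "verb phrase",
                      "PP": "prepositional phrase", "AP": "adjective phrase"}

def phrasal_projection(word_category):
    """Map a word-level category to the phrase it heads (phrase structure view)."""
    projection = word_category + "P"
    if projection not in PHRASAL_CATEGORIES:
        raise ValueError(f"no phrasal category for {word_category!r}")
    return projection

print(phrasal_projection("V"))  # -> VP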


Semantic
Semantics (from Ancient Greek σημαντικός ''sēmantikós'', "significant") is the study of reference, meaning, or truth. The term can be used to refer to subfields of several distinct disciplines, including philosophy, linguistics and computer science.

History
In English, the study of meaning in language has been known by many names that involve the Ancient Greek word σῆμα (''sema'', "sign, mark, token"). In 1690, a Greek rendering of the term ''semiotics'', the interpretation of signs and symbols, finds an early allusion in John Locke's ''An Essay Concerning Human Understanding'': The third Branch may be called Σημειωτική [''simeiotikí'', "semiotics"], or the Doctrine of Signs, the most usual whereof being words, it is aptly enough termed also Λογική, Logick. In 1831, the term is suggested for the third branch of division of knowledge akin to Locke; the "signs of our knowledge". In 1857, the term ''semasiology'' (borrowed from German ''Semasiologie'') is attested in Josiah W. Gibbs' '' ...