Conduit Metaphor
In linguistics, the conduit metaphor is a dominant class of figurative expressions invoked when people discuss communication itself (metalanguage). It operates whenever people speak or write as if they "insert" their mental contents (feelings, meanings, thoughts, concepts, etc.) into "containers" (words, phrases, sentences, etc.) whose contents are then "extracted" by listeners and readers. Thus, in this model, language is viewed as a "conduit" conveying mental content between people. The conduit metaphor was first defined and described by linguist Michael J. Reddy in 1979. Reddy's proposal of this conceptual metaphor refocused debate within and outside the linguistic community on the importance of metaphorical language. Fellow linguist George Lakoff stated: "The contemporary theory that metaphor is primarily conceptual, conventional, and part of the ordinary system of thought and language can be traced to Michael Reddy's now classic essay... With a single, thoroughly analyzed ...

Linguistics
Linguistics is the scientific study of language. The areas of linguistic analysis are syntax (rules governing the structure of sentences), semantics (meaning), morphology (structure of words), phonetics (speech sounds and equivalent gestures in sign languages), phonology (the abstract sound system of a particular language, and analogous systems of sign languages), and pragmatics (how the context of use contributes to meaning). Subdisciplines such as biolinguistics (the study of the biological variables and evolution of language) and psycholinguistics (the study of psychological factors in human language) bridge many of these divisions. Linguistics encompasses many branches and subfields that span both theoretical and practical applications. Theoretical linguistics is concerned with understanding the universal and fundamental nature of language and developing a general ...

Benjamin Lee Whorf
Benjamin Atwood Lee Whorf (April 24, 1897 – July 26, 1941) was an American linguist and fire prevention engineer best known for proposing the Sapir–Whorf hypothesis. He believed that the structures of different languages shape how their speakers perceive and conceptualize the world. Whorf saw this idea, named after him and his mentor Edward Sapir, as having implications similar to those of Einstein's principle of physical relativity. However, the concept originated in 19th-century philosophy with thinkers such as Wilhelm von Humboldt and Wilhelm Wundt. Whorf initially pursued chemical engineering but developed an interest in linguistics, particularly Biblical Hebrew and indigenous Mesoamerican languages. His groundbreaking work on the Nahuatl language earned him recognition, and he received a grant to study it further in Mexico. He presented influential papers on Nahuatl upon his return. Whorf later studied linguistics with Edward Sapir at Yale University while ...

Internet Archive
The Internet Archive is an American 501(c)(3) non-profit organization founded in 1996 by Brewster Kahle that runs a digital library website, archive.org. It provides free access to collections of digitized media including websites, software applications, music, audiovisual, and print materials. The Archive also advocates a free and open Internet, and its stated mission is to provide "universal access to all knowledge". The Internet Archive allows the public to upload and download digital material to its data cluster, but the bulk of its data is collected automatically by its web crawlers, which work to preserve as much of the public web as possible. Its web archive, the Wayback Machine, contains hundreds of billions of web captures. The Archive also oversees numerous book digitization projects, collectively one of the world's largest book digitization efforts. ...
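
For readers who want to query the web archive mentioned above programmatically, the Wayback Machine exposes a public "availability" endpoint at https://archive.org/wayback/available that returns the archived capture closest to a requested timestamp. The sketch below is a minimal illustration using only the Python standard library; the helper name closest_capture is ours, not part of any official client.

    # Minimal sketch: ask the Wayback Machine for the capture of a URL
    # closest to a given timestamp (YYYYMMDD). Standard library only.
    import json
    import urllib.parse
    import urllib.request

    def closest_capture(url, timestamp="20060101"):
        """Return the closest archived snapshot record for `url`, or None."""
        query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
        with urllib.request.urlopen(
            f"https://archive.org/wayback/available?{query}"
        ) as resp:
            data = json.load(resp)
        # The best match, if any, is nested under archived_snapshots -> closest.
        return data.get("archived_snapshots", {}).get("closest")

    if __name__ == "__main__":
        snapshot = closest_capture("example.com")
        if snapshot:
            print(snapshot["timestamp"], snapshot["url"])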

Warren Weaver
Warren Weaver (July 17, 1894 – November 24, 1978) was an American scientist, mathematician, and science administrator. He is widely recognized as one of the pioneers of machine translation and as an important figure in creating support for science in the United States. Weaver received three degrees from the University of Wisconsin–Madison: a Bachelor of Science in 1916, a civil engineering degree in 1917, and a Ph.D. in 1921. He became an assistant professor of mathematics at Throop College (now California Institute of Technology). He served as a second lieutenant in the Air Service during World War I. After the war, he returned to teach mathematics at Wisconsin (1920–32). Weaver was also given an honorary LLD degree from the University of Wisconsin–Madison and a Doctor of Science degree from the University of São Paulo. Weaver was director of the Division of Natural Sciences at the Rockefeller Foundation (1932–55), and was science consultant (1947–51) ...

Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist, cryptographer and inventor known as the "father of information theory" and the man who laid the foundations of the Information Age. Shannon was the first to describe the use of Boolean algebra, essential to all digital electronic circuits, and helped found artificial intelligence (AI). Roboticist Rodney Brooks declared Shannon the 20th-century engineer who contributed the most to 21st-century technologies, and mathematician Solomon W. Golomb described his intellectual achievement as "one of the greatest of the twentieth century". At the University of Michigan, Shannon earned dual degrees, graduating in 1936 with a Bachelor of Science in electrical engineering and another in mathematics. As a 21-year-old master's student in electrical engineering at MIT, he wrote the thesis "A Symbolic Analysis of Relay and Switching Circuits", which demonstrated that electrical ...
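
The core idea of that thesis, that networks of switches can evaluate Boolean expressions, can be illustrated with a short sketch: switches wired in series behave like AND, switches wired in parallel behave like OR. This is an illustrative reconstruction of the principle, not code from Shannon's work, and the function names are ours.

    # Illustrative sketch: series/parallel switch networks as Boolean operators.
    # A "switch" is modeled as a bool: True = closed (conducts), False = open.

    def series(*switches):
        """A series chain conducts only if every switch is closed (Boolean AND)."""
        return all(switches)

    def parallel(*switches):
        """Parallel branches conduct if any one switch is closed (Boolean OR)."""
        return any(switches)

    # Example circuit: conducts when (a AND b) OR ((NOT a) AND c),
    # i.e. a relay network acting as a simple two-way selector.
    def selector(a, b, c):
        return parallel(series(a, b), series(not a, c))

    if __name__ == "__main__":
        for a in (False, True):
            for b in (False, True):
                for c in (False, True):
                    assert selector(a, b, c) == (b if a else c)
        print("switch network matches the Boolean expression")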

Colin Cherry
Edward Colin Cherry (23 June 1914 – 23 November 1979) was a British cognitive scientist whose main contributions were in focused auditory attention, specifically the cocktail party problem: the capacity to follow one conversation while many other conversations are going on in a noisy room. Cherry used shadowing tasks to study this problem, which involve playing two different auditory messages to a participant's left and right ears and instructing them to attend to only one. The participant must then shadow (repeat aloud) the attended message. Cherry found that his participants obtained very little information about the unattended message: physical characteristics were detected but semantic characteristics were not. Cherry therefore concluded that unattended auditory information receives very little processing and that we use physical differences between messages to select which one we attend to. He was born in St Albans in 1914 and educated at St Albans School and North ...
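
As a concrete illustration of the stimulus used in such shadowing experiments, the sketch below writes a stereo WAV file whose left and right channels carry two different signals (simple tone sequences standing in for the two spoken messages). It is a minimal sketch using only the Python standard library, not a description of Cherry's actual apparatus.

    # Minimal dichotic-listening stimulus: left and right channels differ.
    import math
    import struct
    import wave

    RATE = 44100  # samples per second

    def tone_sequence(freqs, seconds_each=0.5):
        """Concatenate pure tones at the given frequencies (Hz) into one channel."""
        samples = []
        for f in freqs:
            n = int(RATE * seconds_each)
            samples += [0.4 * math.sin(2 * math.pi * f * i / RATE) for i in range(n)]
        return samples

    left = tone_sequence([440, 554, 659])   # "message" for the attended ear
    right = tone_sequence([220, 330, 262])  # competing "message" for the other ear

    with wave.open("dichotic_stimulus.wav", "wb") as wav:
        wav.setnchannels(2)   # stereo: each frame is (left sample, right sample)
        wav.setsampwidth(2)   # 16-bit signed samples
        wav.setframerate(RATE)
        frames = bytearray()
        for l, r in zip(left, right):
            frames += struct.pack("<hh", int(l * 32767), int(r * 32767))
        wav.writeframes(bytes(frames))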

Information Theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome of a roll of a die (which has six equally likely outcomes). Some other important measures ...
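
The coin-versus-die comparison can be checked directly from Shannon's entropy formula, H = -sum(p * log2 p) over the possible outcomes: a fair coin gives exactly 1 bit, while a fair die gives log2 6, about 2.585 bits. The short sketch below computes both.

    # Shannon entropy H(X) = -sum(p * log2(p)) of a discrete distribution, in bits.
    import math

    def entropy(probabilities):
        """Entropy in bits; zero-probability outcomes contribute nothing."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    fair_coin = [1/2, 1/2]
    fair_die = [1/6] * 6

    print(f"fair coin: {entropy(fair_coin):.3f} bits")  # 1.000
    print(f"fair die:  {entropy(fair_die):.3f} bits")   # 2.585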

Metonymy
Metonymy is a figure of speech in which a concept is referred to by the name of something associated with that thing or concept. For example, the word "suit" may refer to a person from groups commonly wearing business attire, such as salespeople or attorneys. The words "metonymy" and "metonym" come from Ancient Greek metōnymía, "a change of name", from metá, "after, beyond", and -ōnymía, a suffix that names figures of speech, from ónyma, "name". Metonymy and related figures of speech are common in everyday speech and writing. Synecdoche and metalepsis are considered specific types of metonymy. Polysemy, the capacity for a word or phrase to have multiple meanings, sometimes results from relations of metonymy. Both metonymy and metaphor involve the substitution of one term for another. In metaphor, this substitution is based on some specific analogy between two things, whereas in metonymy the substitution is based on some understood association or contiguity. American literary theorist Kenneth Burke considers metonymy one of four "master tropes" ...

Polysemy
Polysemy is the capacity for a sign (e.g. a symbol, morpheme, word, or phrase) to have multiple related meanings. For example, a word can have several word senses. Polysemy is distinct from monosemy, where a word has a single meaning. Polysemy is also distinct from homonymy (or homophony), which is an accidental similarity between two or more words (such as "bear" the animal and the verb "bear"); whereas homonymy is a mere linguistic coincidence, polysemy is not. In discerning whether a given set of meanings represents polysemy or homonymy, it is often necessary to look at the history of the word to see whether the two meanings are historically related. Dictionary writers often list polysemes (words or phrases with different, but related, senses) in the same entry (that is, under the same headword) and enter homonyms as separate headwords (usually with a numbering convention such ...
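
To make that headword convention concrete, here is a small illustrative sketch of a lexicon layout (the data and layout are invented for illustration, not taken from any real dictionary): related senses of a polyseme share one headword, while homonyms are entered as separately numbered headwords.

    # Illustrative lexicon layout: polysemes share a headword, homonyms do not.
    lexicon = {
        # Polysemy: one headword listing several historically related senses.
        "mouth": [
            "opening through which an animal takes in food",
            "place where a river enters the sea",
        ],
        # Homonymy: unrelated words that merely coincide in form get
        # separate headwords, conventionally distinguished by numbering.
        "bear (1)": ["large omnivorous mammal"],
        "bear (2)": ["to carry; to endure"],
    }

    for headword, senses in lexicon.items():
        print(headword)
        for i, sense in enumerate(senses, 1):
            print(f"  {i}. {sense}")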

Dead Metaphor
A dead metaphor is a figure of speech which has lost the original imagery of its meaning through extensive, repetitive, and popular usage, or because it refers to an obsolete technology or forgotten custom. Because dead metaphors have a conventional meaning that differs from the original, they can be understood without knowing their earlier connotation. Dead metaphors are generally the result of a semantic shift in the evolution of a language, a process called the literalization of a metaphor. A distinction is often made between those dead metaphors whose origins are entirely unknown to the majority of people using them (such as the expression "to kick the bucket") and those whose source is widely known or whose symbolism is easily understood but not often thought about (the idea of "falling in love"). The long-standing metaphorical application of a term can similarly lose its metaphorical quality, coming simply to denote a larger application of the term. The wings of a plane ...