Winograd Schema Challenge
Winograd Schema Challenge
The Winograd schema challenge (WSC) is a test of machine intelligence proposed by Hector Levesque, a computer scientist at the University of Toronto. Designed to be an improvement on the Turing test, it is a multiple-choice test that employs questions of a very specific structure: they are instances of what are called Winograd schemas, named after Terry Winograd, professor of computer science at Stanford University. On the surface, Winograd schema questions simply require the resolution of anaphora: the machine must identify the antecedent of an ambiguous pronoun in a statement. This makes it a task of natural language processing, but Levesque argues that for Winograd schemas, the task requires the use of knowledge and commonsense reasoning. Nuance Communications announced in July 2014 that it would sponsor an annual WSC competition, with a prize of $25,000 for the best system that could match human performance. However, the prize is no longer offered.

Background
The Winograd ...
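The "very specific structure" of a Winograd schema can be made concrete with a small sketch. The field names below are illustrative, not from any official dataset format; the sentence is the classic example from Winograd's 1972 thesis, in which swapping a single "special" word flips the correct antecedent of the pronoun.

```python
# A minimal sketch of how one Winograd schema instance might be represented.
# Field names are invented for illustration, not an official schema format.
from dataclasses import dataclass

@dataclass
class WinogradSchema:
    sentence: str        # the statement containing the ambiguous pronoun
    pronoun: str         # the pronoun the system must resolve
    candidates: tuple    # the two possible antecedents
    answer: str          # the correct antecedent for this sentence
    special_word: str    # the word whose replacement flips the answer
    alternate_word: str  # substituting this word changes the correct antecedent

# The classic councilmen/demonstrators example:
schema = WinogradSchema(
    sentence=("The city councilmen refused the demonstrators a permit "
              "because they feared violence."),
    pronoun="they",
    candidates=("the city councilmen", "the demonstrators"),
    answer="the city councilmen",
    special_word="feared",
    alternate_word="advocated",  # "...they advocated violence." -> the demonstrators
)

assert schema.answer in schema.candidates
```

The pairing of `special_word` and `alternate_word` is what makes simple statistical cues unreliable: both variants are nearly identical strings, yet the correct answer differs.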


Hector Levesque
Hector Joseph Levesque (born 1951) is a Canadian academic and researcher in artificial intelligence. His research concerns incorporating commonsense reasoning in intelligent systems, and he initiated the Winograd Schema Challenge.

Education
He received his BSc, MSc, and PhD from the University of Toronto in 1975, 1977, and 1981, respectively. His PhD advisor was John Mylopoulos. After graduation, he accepted a position at the Fairchild Laboratory for Artificial Intelligence Research in Palo Alto, and then joined the faculty at the University of Toronto, where he has remained since 1984.

Career
His research is in the area of knowledge representation and reasoning in artificial intelligence. On the representation side, he has worked on the formalization of a number of concepts pertaining to artificial and natural agents, including belief, goals, intentions, ability, and the interaction between knowledge, perception, and action. On the reasoning side, his research mainly concerns how ...


Selection (linguistics)
In linguistics, selection denotes the ability of predicates to determine the semantic content of their arguments. Predicates select their arguments: they limit the semantic content of their arguments. A distinction is sometimes drawn between types of selection, acknowledging both s(emantic)-selection and c(ategory)-selection. Selection in general stands in contrast to subcategorization: predicates both select and subcategorize for their complement arguments, whereas they only select their subject arguments. Selection is a semantic concept, whereas subcategorization is a syntactic one. Selection is closely related to valency, a term used in grammars other than Chomskyan generative grammar for a similar phenomenon.

Examples
The following pair of sentences illustrates the concept of selection:

a. The plant is wilting.
b. #The building is wilting.

The argument "the building" violates the selectional restrictions of the predicate "is wilting". ...
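One way to picture selectional restrictions is as a feature check: a predicate demands certain semantic features of an argument, and an argument satisfies the predicate only if it carries all of them. The toy lexicon and feature sets below are invented for illustration.

```python
# Toy sketch of selectional restrictions as a semantic-feature check.
# The lexicon and the features are invented for illustration only.
LEXICON = {
    "the plant": {"concrete", "living"},
    "the building": {"concrete"},
}

# Hypothetically, "is wilting" s-selects a living subject.
SELECTS = {"is wilting": {"living"}}

def satisfies_selection(subject: str, predicate: str) -> bool:
    """True if the subject's semantic features include everything
    the predicate selects for."""
    return SELECTS[predicate] <= LEXICON[subject]

print(satisfies_selection("the plant", "is wilting"))     # True
print(satisfies_selection("the building", "is wilting"))  # False: selection violated
```

The subset test (`<=`) mirrors the intuition that a violation occurs as soon as any required feature is missing, which is exactly what the `#` marker flags in the example sentences.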



Turing Tests
Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. He is widely considered to be the father of theoretical computer science and artificial intelligence. Born in Maida Vale, London, Turing was raised in southern England. He graduated from King's College, Cambridge, with a degree in mathematics. Whilst he was a fellow at Cambridge, he published a proof demonstrating that some purely mathematical yes–no questions can never be answered by computation and defined a Turing machine, and went on to prove that the halting problem for Turing machines is undecidable. In 1938, he obtained his PhD from the Prince ...


Natural-language Understanding
Natural-language understanding (NLU) or natural-language interpretation (NLI) is a subtopic of natural-language processing in artificial intelligence that deals with machine reading comprehension. Natural-language understanding is considered an AI-hard problem. There is considerable commercial interest in the field because of its application to automated reasoning, machine translation, question answering, news-gathering, text categorization, voice activation, archiving, and large-scale content analysis.

History
The program STUDENT, written in 1964 by Daniel Bobrow for his PhD dissertation at MIT, is one of the earliest known attempts at natural-language understanding by a computer. Eight years after John McCarthy coined the term "artificial intelligence", Bobrow's dissertation (titled Natural Language Input for a Computer Problem Solving System) showed how a computer could understand simple natural-language input to solve algebra word problems. A year later, in 1965, Joseph ...




General Language Understanding Evaluation
These datasets are applied in machine-learning research and have been cited in peer-reviewed academic journals. Datasets are an integral part of the field of machine learning. Major advances in this field can result from advances in learning algorithms (such as deep learning), computer hardware, and, less intuitively, the availability of high-quality training datasets. High-quality labeled training datasets for supervised and semi-supervised machine-learning algorithms are usually difficult and expensive to produce because of the large amount of time needed to label the data. Although they do not need to be labeled, high-quality datasets for unsupervised learning can also be difficult and costly to produce.

Image data
These datasets consist primarily of images or videos for tasks such as object detection, facial recognition, and multi-label classification.

Facial recognition
In computer vision, face images have been used extensively to develop facial recognition syst ...


GPT-3
Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce human-like text. Given an initial text as a prompt, it will produce text that continues the prompt. The architecture is a standard transformer network (with a few engineering tweaks) with the unprecedented size of a 2048-token-long context and 175 billion parameters (requiring 800 GB of storage). The training method is "generative pretraining", meaning that it is trained to predict what the next token is. The model demonstrated strong few-shot learning on many text-based tasks. It is the third-generation language prediction model in the GPT-n series (and the successor to GPT-2) created by OpenAI, a San Francisco-based artificial intelligence research laboratory. GPT-3, which was introduced in May 2020 and was in beta testing as of July 2020, is part of a trend in natural language processing (NLP) systems of pre-trained language representations. The quality of the t ...
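The autoregressive loop described above, predicting the next token and appending it to the context until done, can be sketched in a few lines. The `toy_next_token` function below is a hypothetical stand-in (a hard-coded bigram table) for the actual 175-billion-parameter network; only the loop structure reflects how such models generate text.

```python
# Sketch of autoregressive generation: repeatedly predict the next token
# from the context and append it. toy_next_token is an invented stand-in
# for a real language model.
def toy_next_token(context):
    # Hypothetical bigram table instead of a transformer.
    table = {"the": "cat", "cat": "sat", "sat": "down"}
    return table.get(context[-1], "<eos>")

def generate(prompt, max_new_tokens=5):
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        nxt = toy_next_token(tokens)
        if nxt == "<eos>":          # stop when the model emits end-of-sequence
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("the"))  # prints "the cat sat down"
```

In a real system the next token is sampled from a probability distribution rather than looked up deterministically, but the prompt-then-continue control flow is the same.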


BERT (language Model)
Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine-learning technique for natural language processing (NLP) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google. In 2019, Google announced that it had begun leveraging BERT in its search engine, and by late 2020 it was using BERT in almost every English-language query. A 2020 literature survey concluded that "in a little over a year, BERT has become a ubiquitous baseline in NLP experiments", counting over 150 research publications analyzing and improving the model. The original English-language BERT has two models: (1) BERT-Base, 12 encoders with 12 bidirectional self-attention heads, and (2) BERT-Large, 24 encoders with 16 bidirectional self-attention heads. Both models are pre-trained on unlabeled data extracted from the BooksCorpus, with 800M words, and English Wikipedia, with 2,500M words.

Architecture
BERT is a ...
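BERT's pre-training on unlabeled text works via masked language modeling: a fraction of input tokens is hidden and the model must recover them from bidirectional context. The sketch below shows only the input-preparation step; the 15% masking rate matches the BERT paper, while the tokenization and the single replacement strategy are simplifications (the real recipe sometimes substitutes random tokens or leaves tokens unchanged).

```python
# Sketch of masked-language-model input preparation, BERT-style.
# Simplified: whitespace tokenization, and every selected token becomes
# [MASK] (the actual BERT recipe is slightly more involved).
import random

def mask_tokens(tokens, mask_rate=0.15, seed=1):
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok          # the model must predict this original token
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
# Every position recorded in `targets` now reads [MASK] in the input.
```

Because the loss is computed only at the masked positions, the model can attend to tokens on both sides of each gap, which is what "bidirectional" refers to in the name.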



University Of Illinois At Chicago
The University of Illinois Chicago (UIC) is a public research university in Chicago, Illinois. Its campus is in the Near West Side community area, adjacent to the Chicago Loop. The second campus established under the University of Illinois system, UIC is also the largest university in the Chicago metropolitan area, with more than 33,000 students enrolled in 16 colleges. It is classified among "R1: Doctoral Universities – Very high research activity". The roots of UIC can be traced to the establishment of the Chicago College of Pharmacy in 1859, which was joined in the 1800s by additional medical-related schools. It began an undergraduate program toward the end of World War II and developed its West Side campus in the 1960s. In 1982, it consolidated the University of Illinois at Chicago Circle and the University of Illinois at the Medical Center into the present universi ...



Leidos
Leidos, formerly known as Science Applications International Corporation (SAIC), is an American defense, aviation, information technology, and biomedical research company headquartered in Reston, Virginia, that provides scientific, engineering, systems integration, and technical services. Leidos merged with Lockheed Martin's IT sector, Information Systems & Global Solutions (IS&GS), in August 2016 to create the defense industry's largest IT services provider. The Leidos-Lockheed Martin merger is one of the biggest transactions thus far in the consolidation of the defense sector. Leidos works extensively with the United States Department of Defense, the United States Department of Homeland Security, and the United States Intelligence Community, including the NSA, as well as other U.S. government civil agencies and selected commercial markets.

History

As SAIC
The company was founded by J. Robert "Bob" Beyster in 1969 in the La Jolla neighborhood of San Diego, Cali ...


AAAI
The Association for the Advancement of Artificial Intelligence (AAAI) is an international scientific society devoted to promoting research in, and responsible use of, artificial intelligence. AAAI also aims to increase public understanding of artificial intelligence (AI), improve the teaching and training of AI practitioners, and provide guidance for research planners and funders concerning the importance and potential of current AI developments and future directions.

History
The organization was founded in 1979 under the name "American Association for Artificial Intelligence" and changed its name in 2007 to "Association for the Advancement of Artificial Intelligence". It has more than 4,000 members worldwide. In its early history, the organization was presided over by notable figures in computer science such as Allen Newell, Edward Feigenbaum, Marvin Minsky, and John McCarthy. The current president is Yolanda Gil, and the president-elect is Bart Selman.

Conferences and public ...



Binary Decision
A binary decision is a choice between two alternatives, for instance between taking some specific action or not taking it. Binary decisions are basic to many fields. Examples include:

* Truth values in mathematical logic, and the corresponding Boolean data type in computer science, representing a value that may be chosen to be either true or false.
* Conditional statements (if-then or if-then-else) in computer science: binary decisions about which piece of code to execute next.
* Decision trees and binary decision diagrams, representations for sequences of binary decisions.
* Binary choice, a statistical model for the outcome of a binary decision.

Binary decision diagrams
A binary decision diagram (BDD) is a way to visually represent a Boolean function. One application of BDDs is in CAD software and digital circuit analysis, where they are an efficient way to represent and manipulate Boolean functions. The value of a Boolean function can be determined by following a path in its BDD ...
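The path-following evaluation described above is easy to sketch: each internal node of a BDD tests one variable and branches to a "low" child (variable false) or a "high" child (variable true), and the leaf reached is the function's value. The class and function names below are invented for illustration.

```python
# Minimal sketch of a binary decision diagram and its evaluation.
# Internal nodes test one variable; leaves are the Boolean result.
class Node:
    def __init__(self, var, low, high):
        self.var = var    # variable tested at this node
        self.low = low    # child followed when the variable is False
        self.high = high  # child followed when the variable is True

def evaluate(node, assignment):
    """Follow one root-to-leaf path according to the assignment."""
    while isinstance(node, Node):
        node = node.high if assignment[node.var] else node.low
    return node  # a terminal: True or False

# BDD for f(x, y) = x AND y: if x is False we short-circuit to False.
bdd = Node("x", low=False, high=Node("y", low=False, high=True))

print(evaluate(bdd, {"x": True, "y": True}))   # True
print(evaluate(bdd, {"x": True, "y": False}))  # False
```

Note that evaluation visits at most one node per variable, which is why BDDs allow a Boolean function's value to be read off in time linear in the number of variables.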




Semantic Class
A semantic class contains words that share a semantic feature. For example, within nouns there are two subclasses: concrete nouns and abstract nouns. Concrete nouns include people, plants, animals, materials, and objects, while abstract nouns refer to concepts such as qualities, actions, and processes. According to its nature, a noun is categorized into a semantic class. Semantic classes may intersect: the intersection of female and young can be girl.

See also
* Semantic property
* Categorization
* Semantic field
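Treating semantic classes as sets makes the intersection example concrete. The word lists below are invented for illustration.

```python
# Toy illustration of intersecting semantic classes: the words that are
# both "female" and "young". The lexicon is invented for illustration.
FEMALE = {"girl", "woman", "mare", "hen"}
YOUNG = {"girl", "boy", "foal", "chick"}

print(FEMALE & YOUNG)  # prints {'girl'}
```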