IBM Watson is a computer system capable of answering questions posed in natural language.
It was developed as a part of IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM's founder and first CEO, industrialist Thomas J. Watson.
The computer system was initially developed to answer questions on the quiz show ''Jeopardy!'' and, in 2011, it competed on the show against champions Brad Rutter and Ken Jennings, winning the first-place prize of US$1 million.
In February 2013, IBM announced that Watson's first commercial application would be for utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center, New York City, in conjunction with WellPoint (now Elevance Health).
Description
Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open-domain question answering. The system is named DeepQA (though it did not involve the use of deep neural networks).
IBM stated that Watson uses "more than 100 different techniques to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."
In recent years, Watson's capabilities have been extended and the way in which Watson works has been changed to take advantage of new deployment models (Watson on IBM Cloud), evolved machine learning capabilities, and optimized hardware available to developers and researchers.
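As a rough illustration of the generate, score, merge, and rank flow IBM describes, the sketch below runs a question through a toy candidate-ranking pipeline. The corpus, scorer functions, and weights are all invented for the example; this is not IBM's code.

```python
# Toy sketch of a DeepQA-style flow: generate candidate answers,
# gather supporting evidence, score it with independent analyzers,
# then merge and rank. The corpus, scorers, and weights are invented
# for illustration; this is not IBM's code.

CORPUS = [
    "Thomas J. Watson was the founder and first CEO of IBM.",
    "IBM is headquartered in Armonk, New York.",
    "The punched card dominated data processing for decades.",
]

def keywords(text: str) -> set[str]:
    stop = {"who", "was", "the", "and", "of", "a", "is", "in", "for"}
    return {w.strip("?.,'").lower() for w in text.split()} - stop

def evidence_score(question: str, candidate: str) -> float:
    """Score corpus passages that mention the candidate together with
    question keywords (a crude 'find and score evidence' step)."""
    q = keywords(question)
    hits = 0.0
    for passage in CORPUS:
        if candidate.lower() in passage.lower():
            hits += len(q & keywords(passage)) / max(len(q), 1)
    return hits

def type_score(question: str, candidate: str) -> float:
    """Crude lexical-answer-type check: 'who' questions favor
    name-like (capitalized) candidates."""
    if question.lower().startswith("who"):
        return 1.0 if candidate[:1].isupper() else 0.2
    return 0.5

SCORERS = [(evidence_score, 0.7), (type_score, 0.3)]

def rank(question: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Merge the weighted scorer outputs and rank the hypotheses."""
    scored = [(c, sum(w * f(question, c) for f, w in SCORERS))
              for c in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)

if __name__ == "__main__":
    q = "Who was the founder and first CEO of IBM?"
    print(rank(q, ["Thomas J. Watson", "Armonk", "punched card"]))
    # -> Thomas J. Watson ranks first: more analyzers agree on it
```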
Software
Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework implementation. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system, using the Apache Hadoop framework to provide distributed computing.
Other than the DeepQA system, Watson contained several strategy modules. For example, one module calculated the amount to bet for ''Final Jeopardy!'' according to the confidence score for getting the answer right and the current scores of all contestants. Another module used Bayes' rule to calculate the probability that each unrevealed clue might be the Daily Double, using historical data from the J! Archive as the prior. When a Daily Double was found, the amount to wager was computed by a two-layer neural network of the same kind as those used by TD-Gammon, a backgammon-playing neural network developed by Gerald Tesauro in the 1990s. The parameters in the strategy modules were tuned by benchmarking candidate settings against a statistical model of human contestants fitted to data from the J! Archive and selecting the best-performing configuration.
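As a hedged illustration of the Daily Double module, the sketch below applies Bayes' rule with an invented prior over board rows; the real module fitted its prior to J! Archive data, and the renormalization step here is a simplification.

```python
# Toy Bayes'-rule estimate of where the Daily Double hides, in the
# spirit of Watson's strategy module. The prior over board rows is
# invented for illustration; the real module fitted its prior to
# historical J! Archive data.

# Hypothetical prior: Daily Doubles appear more often in the lower,
# higher-value rows of the board.
ROW_PRIOR = {1: 0.02, 2: 0.12, 3: 0.28, 4: 0.33, 5: 0.25}

def daily_double_posterior(cleared_rows: set[int]) -> dict[int, float]:
    """P(Daily Double is in row r | the cleared rows held no DD).

    With an indicator likelihood (cleared rows cannot hold the DD),
    Bayes' rule reduces to zeroing those rows and renormalizing.
    """
    unnorm = {r: (0.0 if r in cleared_rows else p)
              for r, p in ROW_PRIOR.items()}
    z = sum(unnorm.values())
    return {r: p / z for r, p in unnorm.items()}

if __name__ == "__main__":
    # Rows 1 and 2 were cleared without finding the Daily Double:
    print(daily_double_posterior({1, 2}))
    # -> row 4 is now the most likely location (~0.384)
```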
Hardware
The system is workload-optimized, integrating massively parallel POWER7 processors, and built on IBM's ''DeepQA'' technology, which it uses to generate hypotheses, gather massive evidence, and analyze data. Watson employs a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor with four threads per core. In total, the system uses 2,880 POWER7 processor threads and 16 terabytes of RAM.
According to John Rennie, Watson can process 500 gigabytes (the equivalent of a million books) per second. IBM master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about three million dollars. Its Linpack performance stands at 80 teraflops, which is about half as fast as the cut-off line for the TOP500 supercomputer list. According to Rennie, all content was stored in Watson's RAM for the ''Jeopardy!'' game because data stored on hard drives would be too slow to compete with human ''Jeopardy!'' champions.
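The headline figures are internally consistent, as a quick check shows; the numbers below are simply the ones quoted above, not independent measurements.

```python
# Sanity-check the hardware figures quoted above (taken from the
# text, not independently measured).
servers = 90          # IBM Power 750 servers in the cluster
cores = 8             # one eight-core POWER7 per server
threads_per_core = 4  # four SMT threads per core

print(servers * cores * threads_per_core)  # 2880 POWER7 threads

# 500 GB/s described as "a million books per second" implies an
# average book size of roughly 500 kB of text:
print(500e9 / 1e6)    # 500000.0 bytes per book
```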
Data
The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies, including DBpedia, WordNet, and YAGO. The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias, and other reference material, that it could use to build its knowledge.
Operation
Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases. Watson's main innovation was not the creation of a new algorithm for this operation, but rather its ability to quickly execute hundreds of proven language-analysis algorithms simultaneously. The more algorithms that find the same answer independently, the more likely Watson is to be correct. Once Watson has a small number of potential solutions, it checks them against its database to ascertain whether they make sense.
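The agreement principle (more independent algorithms converging on the same answer means higher confidence) can be sketched in a few lines; the proposals below stand in for the outputs of Watson's hundreds of real language-analysis algorithms.

```python
# Toy sketch of ensemble agreement: several independent analyzers
# each propose an answer, and agreement raises confidence. The
# proposals here are invented stand-ins for the outputs of Watson's
# hundreds of real language-analysis algorithms.
from collections import Counter

def ensemble_answer(proposals: list[str]) -> tuple[str, float]:
    """Return the most-proposed answer and the fraction of analyzers
    that independently agreed on it (a crude confidence measure)."""
    counts = Counter(proposals)
    answer, votes = counts.most_common(1)[0]
    return answer, votes / len(proposals)

if __name__ == "__main__":
    # Four hypothetical analyzers ran on the same clue:
    print(ensemble_answer(["Chicago", "Chicago", "Toronto", "Chicago"]))
    # -> ('Chicago', 0.75)
```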
Comparison with human players
Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human ''Jeopardy!'' players. Watson can read, analyze, and learn from natural language, which gives it the ability to make human-like decisions, but it has deficiencies in understanding the context of clues. As a result, human players usually generate responses faster than Watson, especially to short clues. Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response. However, Watson has consistently better reaction time on the buzzer once it has generated a response, and it is immune to human players' psychological tactics, such as jumping between categories on every clue.
In a sequence of 20 mock games of ''Jeopardy!'', human participants were able to use the six to seven seconds that Watson needed to hear the clue and decide whether to signal for responding. During that time, Watson also had to evaluate the response and determine whether it was sufficiently confident in the result to signal. Part of the system used to win the ''Jeopardy!'' contest was the electronic circuitry that receives the "ready" signal and then examines whether Watson's confidence level is great enough to activate the buzzer. Given the speed of this circuitry compared to human reaction times, Watson's reaction time was faster than that of the human contestants except when the human anticipated (instead of reacted to) the ready signal. After signaling, Watson speaks with an electronic voice and gives the responses in ''Jeopardy!'' question format. Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.
The ''Jeopardy!'' staff used different means to notify Watson and the human players when to buzz, which was critical in many rounds. The humans were notified by a light, which took them tenths of a second to perceive. Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds. The humans tried to compensate for the perception delay by anticipating the light, but the variation in the anticipation time was generally too great to fall within Watson's response time. Watson did not attempt to anticipate the notification signal.
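The timing gap is straightforward to simulate. In the toy Monte Carlo below, only Watson's roughly eight-millisecond trigger comes from the text; the human reaction and anticipation distributions, and the lockout rule for buzzing early, are invented assumptions for illustration.

```python
# Toy Monte Carlo of the buzzer race described above. Only Watson's
# ~8 ms trigger time comes from the text; the human reaction and
# anticipation distributions, and the early-buzz lockout rule, are
# invented assumptions for illustration.
import random

WATSON_MS = 8.0  # electronic "ready" signal to buzzer activation

def human_buzz_ms(anticipating: bool) -> float:
    """Time from the ready signal to the human's buzz (hypothetical)."""
    if anticipating:
        # Aiming for the light's onset, biased slightly late to avoid
        # a lockout, with large trial-to-trial variation.
        return random.gauss(30.0, 60.0)
    # Reacting to the light: a few hundred ms of perception delay.
    return random.gauss(250.0, 30.0)

def human_win_rate(anticipating: bool, trials: int = 100_000) -> float:
    wins = 0
    for _ in range(trials):
        t = human_buzz_ms(anticipating)
        # Buzzing before the signal (t < 0) counts as a lockout loss.
        if 0.0 <= t < WATSON_MS:
            wins += 1
    return wins / trials

if __name__ == "__main__":
    random.seed(0)
    print(f"reacting:     {human_win_rate(False):.3f}")  # ~0.000
    print(f"anticipating: {human_win_rate(True):.3f}")   # ~0.05
```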
History
Development
Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause: Ken Jennings, who was then in the middle of his successful 74-game run on ''Jeopardy!''. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the show. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel, pushing for someone in his department to take up the challenge of playing ''Jeopardy!'' with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer. In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond. To compete successfully on ''Jeopardy!'', Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed impossible to solve.
In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past ''Jeopardy!'' programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems. John E. Kelly III succeeded Paul Horn as head of IBM Research in 2007. ''InformationWeek'' described Kelly as "the father of Watson" and credited him for encouraging the system to compete against humans on ''Jeopardy!''. By 2008, the developers had advanced Watson such that it could compete with ''Jeopardy!'' champions. By February 2010, Watson could beat human ''Jeopardy!'' contestants on a regular basis.
During the game, Watson had access to 200 million pages of structured and unstructured content, consuming four terabytes of disk storage, including the full text of the 2011 edition of Wikipedia, but was not connected to the Internet. For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble in a few categories, notably those having short clues containing only a few words.
Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, the University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento, as well as students from New York Medical College. Among the team of IBM programmers who worked on Watson was 2001 ''Who Wants to Be a Millionaire?'' top prize winner Ed Toutant, who himself had appeared on ''Jeopardy!'' in 1989 (winning one game).
Jeopardy!
Preparation
In 2008, IBM representatives communicated with ''Jeopardy!'' executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed. Watson's differences with human players had generated conflicts between IBM and ''Jeopardy!'' staff during the planning of the competition. IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To address that concern, a third party randomly picked the clues from previously written shows that were never broadcast. ''Jeopardy!'' staff also showed concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would. Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all", and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard." Stephen Baker, a journalist who recorded Watson's development in his book ''Final Jeopardy'', reported that the conflict between IBM and ''Jeopardy!'' became so serious in May 2010 that the competition was almost cancelled. As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on ''Jeopardy!''. Human players, including former ''Jeopardy!'' contestants, also participated in mock games against Watson, with Todd Alan Crain of ''The Onion'' playing host. About 100 test matches were conducted, with Watson winning 65% of the games.
To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball crisscrossed by 'threads' of thought—42 threads, to be precise", and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' ''The Hitchhiker's Guide to the Galaxy''. Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there were 36 triggerable states that Watson could use throughout the game to show its confidence in responding to a clue correctly; he had hoped to find 42, to add another level to the ''Hitchhiker's Guide'' reference, but he was unable to pinpoint enough game states.
A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.
Practice match
In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.
First match
The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter. Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible. Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.
Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings. (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is a leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (the correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response. Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246. Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.
Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.
However, during the Final Jeopardy! round, Watson was the only contestant to miss the clue in the category U.S. Cities ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????", with five question marks appended indicating a lack of confidence. Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired; the phrase "U.S. city" did not appear in the question; there are cities named Toronto in the U.S.; and Toronto in Ontario has an American League baseball team. Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite following a semicolon, and required context to understand that it referred to a second-largest ''airport''). Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge to discard that potential response as not viable. Although not displayed to the audience as with non-Final Jeopardy! questions, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively. Watson wagered only $947 on the question.
The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.
Second match
During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.
In the first round, Jennings was finally able to choose a Daily Double clue, while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round. After the first round, Watson placed second for the first time in the competition, after Rutter and Jennings were briefly successful in increasing their dollar values before Watson could respond. Nonetheless, the match ended in a victory for Watson, with a final score of $77,147, besting Jennings, who scored $24,000, and Rutter, who scored $21,600.
Final outcome
The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid. Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.
In acknowledgement of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", paraphrasing a joke from ''The Simpsons''. Jennings later wrote an article for ''Slate'', in which he stated: "IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last."
Philosophy
Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think. Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.
Match against members of the United States Congress
On February 28, 2011, Watson played an untelevised exhibition match of ''Jeopardy!'' against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ, a former ''Jeopardy!'' contestant), who was challenging the computer with Bill Cassidy (R-LA, later Senator from Louisiana), led with Watson in second place. However, combining the scores across all matches, the final score was $40,300 for Watson and $30,000 for the congressional players combined.
IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."
Applications
After the national press attention gained by the 2011 ''Jeopardy!'' appearance, IBM sought out partnerships in fields ranging from education and weather forecasting to cancer treatment and retail chatbots, in order to convince businesses of Watson's purported capabilities. Ultimately, Watson failed to yield a profit-making product for the company.
In 2011, IBM's general counsel wrote in ''The National Law Review'' arguing that Watson would make the legal profession more efficient. After the national attention that ''Jeopardy!'' afforded them, IBM began an ultimately unsuccessful and expensive project in which Memorial Sloan Kettering Cancer Center tried to use Watson to help doctors diagnose and treat cancer patients. The division ultimately cost $4 billion to develop but was sold in 2022 for a quarter of that, $1 billion. By 2023, the Watson effort had contributed to IBM losing 10% of its stock value, costing four times more than what it brought to the company and resulting in mass layoffs.
From 2012 through the late 2010s, Watson's technology was used to create applications, most since discontinued, to help people make decisions in a variety of areas, among them:
* diagnosing cancer and creating treatment plans,
* retail shopping,
* medical equipment purchasing,
* cooking and recipes,
* water conservation,
* hospitality management,
* human genetic sequencing,
* music development and identification,
* weather forecasting,
* selling advertisements alongside weather forecasts,
* tutoring students,
* and tax preparation.
In 2021, Steve Lohr, a technology reporter for ''The New York Times'', explained:
Writing in ''The Atlantic'' in 2023, Mac Schwerin argued that IBM's leadership fundamentally did not understand the technology, which led to the project's hardship and strain:
In the end, IBM's initial vision for Watson as a transformative technology capable of revolutionizing industries did not materialize as anticipated. Watson's capabilities were primarily suited to specific tasks, like natural language processing for trivia games, rather than generalized commercial problem-solving. The mismatch between Watson's capabilities and IBM's marketing contributed significantly to its commercial struggles and eventual decline. The overstated claims about Watson's abilities also turned public sentiment against Watson and artificial intelligence more broadly.
Between 2019 and 2023, IBM shifted focus to a separate initiative, IBM Watsonx, distinct from Watson and aimed at narrower, industry-targeted technology within IBM's cloud computing and platform-based strategies.
Healthcare
IBM's Watson was used to analyze medical datasets to provide physicians with guidance on diagnoses and cancer treatment decisions. When a physician submitted a query to Watson, the system started a multi-step process: parsing the input to identify key information, examining patient data to uncover relevant medical and hereditary history, and finally comparing various data sources to form and test hypotheses.
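A schematic of that multi-step flow might look like the sketch below, where every function, guideline table, and treatment name is a hypothetical stand-in rather than IBM's actual pipeline.

```python
# Schematic of the multi-step clinical query flow described above.
# Every function, guideline table, and treatment name here is a
# hypothetical stand-in; this is not IBM's API or pipeline.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    treatment: str
    confidence: float

def parse_query(query: str) -> list[str]:
    # Stage 1: pull key terms out of the physician's free-text query.
    return [w.strip("?.,").lower() for w in query.split() if len(w) > 3]

def relevant_history(record: dict, terms: list[str]) -> list[str]:
    # Stage 2: surface patient-record entries mentioning the terms.
    return [entry for entry in record.get("history", [])
            if any(t in entry.lower() for t in terms)]

def score_hypotheses(evidence: list[str]) -> list[Hypothesis]:
    # Stage 3: compare data sources to form and test hypotheses.
    # The guideline table mapping findings to treatments is invented.
    guidelines = {"cough": "treatment A", "fatigue": "treatment B"}
    tallies: dict[str, int] = {}
    for entry in evidence:
        for finding, treatment in guidelines.items():
            if finding in entry.lower():
                tallies[treatment] = tallies.get(treatment, 0) + 1
    total = sum(tallies.values()) or 1
    return sorted((Hypothesis(t, n / total) for t, n in tallies.items()),
                  key=lambda h: h.confidence, reverse=True)

if __name__ == "__main__":
    record = {"history": ["Persistent cough for 3 weeks", "Mild fatigue"]}
    terms = parse_query("What are treatment options for persistent cough?")
    print(score_hypotheses(relevant_history(record, terms)))
```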
IBM claimed that Watson could draw from a wide range of sources, including treatment guidelines, electronic medical records, and research materials, although company executives would later blame the project's ultimate failure on a lack of data.
Notably, Watson was not involved in the actual diagnosis process; rather, it assisted doctors in identifying suitable treatment options for patients who had already been diagnosed. A study of 1,000 challenging patient cases found that Watson's recommendations matched those of human doctors in 99% of cases.
IBM established partnerships with the Cleveland Clinic, the MD Anderson Cancer Center, and Memorial Sloan-Kettering Cancer Center to further its mission in healthcare. In 2011, IBM entered into a research partnership with Nuance Communications and physicians at the University of Maryland and Harvard to develop a commercial product using Watson's clinical decision support capabilities. IBM partnered with WellPoint (now Anthem) in 2011 to utilize Watson in suggesting treatment options to physicians, and in 2013, Watson was deployed in its first commercial application for utilization management decisions in lung cancer treatment at Memorial Sloan-Kettering Cancer Center. The Cleveland Clinic collaboration aimed to enhance Watson's health expertise and support medical professionals in treating patients more effectively. However, the MD Anderson Cancer Center pilot program, initiated in 2013, ultimately failed to meet its goals and was discontinued after $65 million in investment.
In 2016, IBM launched "IBM Watson for Oncology," a product designed to provide personalized, evidence-based cancer care options to physicians and patients. This initiative marked a significant milestone in the adoption of Watson's technology in the healthcare industry. Additionally, IBM partnered with Manipal Hospitals in India to offer Watson's expertise to patients online.
The company ultimately faced challenges in the healthcare market, with no profit and increased competition. In 2022, IBM announced the sell-off of its Watson Health unit to Francisco Partners, marking a significant shift in the company's approach to the healthcare industry.
IBM Watson Group
On January 9, 2014, IBM announced it was creating a business unit around Watson. The IBM Watson Group would be headquartered in New York City's Silicon Alley and would employ 2,000 people; IBM invested $1 billion to get the division going. Watson Group planned three new cloud-delivered services: Watson Discovery Advisor, Watson Engagement Advisor, and Watson Explorer. Watson Discovery Advisor would focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Engagement Advisor would focus on self-service applications using insights derived from natural language questions posed by business users; and Watson Explorer would help enterprise users uncover and share data-driven insights based on federated search more easily. The company also launched a $100 million venture fund to spur application development for "cognitive" applications. According to IBM, the cloud-delivered, enterprise-ready Watson had seen its speed increase 24 times over (a 2,300 percent improvement in performance), and its physical size had shrunk by 90 percent, from the size of a master bedroom to three stacked pizza boxes. IBM CEO Virginia Rometty said she wanted Watson to generate $10 billion in annual revenue within ten years. In 2017, IBM and MIT established a new joint research venture in artificial intelligence: IBM invested $240 million to create the MIT–IBM Watson AI Lab in partnership with MIT, which brings together researchers in academia and industry to advance AI research, with projects ranging from computer vision and natural language processing to devising new ways to ensure that AI systems are fair, reliable, and secure. In March 2018, IBM CEO Ginni Rometty proposed "Watson's Law", the use of and application of AI in business, smart cities, consumer applications, and life in general ("IBM CEO Rometty Proposes 'Watson's Law': AI In Everything", Adrian Bridgewater, ''Forbes'', March 20, 2018).
See also
* Artificial intelligence
* Blue Gene
* IBM Watsonx
* Commonsense knowledge (artificial intelligence)
* Glossary of artificial intelligence
* Artificial general intelligence
* Tech companies in the New York metropolitan area
* Wolfram Alpha
Further reading
* Baker, Stephen (2012). ''Final Jeopardy: The Story of Watson, the Computer That Will Transform Our World''. Mariner Books.
* Jackson, Joab (2014). "IBM bets big on Watson-branded cognitive computing". ''PCWorld'', January 9, 2014.
* Greenemeier, Larry (2013). "Will IBM's Watson Usher in a New Era of Cognitive Computing?" ''Scientific American'', November 13, 2013.
* Kelly, J. E., and Hamm, S. (2013). ''Smart Machines: IBM's Watson and the Era of Cognitive Computing''. Columbia Business School Publishing.
External links
Watson homepage
DeepQA homepage
About Watson on Jeopardy.com
Smartest Machine on Earth (PBS ''NOVA'' documentary about the making of Watson)
Power Systems
''The New York Times''. June 16, 2010.
This is Watson – IBM Journal of Research and Development (published by the IEEE)
J! Archive
''Jeopardy!'' Show #6086 – Game 1, Part 1
''Jeopardy!'' Show #6087 – Game 1, Part 2
''Jeopardy!'' Show #6088 – Game 2
Videos
PBS NOVA documentary on the making of Watson
* (21:42), IBMLabs
* – November 15, 2011, David Ferrucci at Computer History Museum (alternate)
* – 2012
* – IBM at EDGE 2012
* – Martin Kohn, 2013
IBM Watson playlist
IBMLabs Watson playlist