
The Fifth Generation Computer Systems (FGCS) project was a 10-year initiative launched in 1982 by Japan's Ministry of International Trade and Industry (MITI) to develop computers based on massively parallel computing and logic programming. The project aimed to create an "epoch-making computer" with supercomputer-like performance and to establish a platform for future advancements in artificial intelligence. Although FGCS was ahead of its time, its ambitious goals ultimately led to commercial failure. However, on a theoretical level, the project contributed significantly to the development of concurrent logic programming. The term "fifth generation" was chosen to emphasize the system's advanced nature. In the history of computing hardware, there had been four prior "generations" of computers: the first generation used vacuum tubes; the second, transistors and diodes; the third, integrated circuits; and the fourth, microprocessors. While earlier generations focused on increasing the number of logic elements within a single CPU, it was widely believed at the time that the fifth generation would achieve enhanced performance through the use of massive numbers of CPUs.


Background

From the late 1960s to the early 1970s, there was much talk about "generations" of computer hardware, then usually organized into three generations:

1. First generation: thermionic vacuum tubes (mid-1940s). IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
2. Second generation: transistors (1956). The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
3. Third generation: integrated circuits, i.e. silicon chips containing multiple transistors (1964). A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unprecedented logic densities. The IBM 360/91 was a hybrid second- and third-generation computer.

Omitted from this taxonomy are "zeroth-generation" computers based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and post-third-generation computers based on very-large-scale integration (VLSI) circuits.

There was also a parallel set of generations for software:

1. First generation: machine language.
2. Second generation: low-level programming languages such as assembly language.
3. Third generation: structured high-level programming languages such as C, COBOL, and FORTRAN.
4. Fourth generation: "non-procedural" high-level programming languages (such as object-oriented languages).

Throughout these generations up to the 1970s, Japan built computers following U.S. and British leads. In the mid-1970s, the Ministry of International Trade and Industry stopped following western leads and started looking into the future of computing on a small scale. It asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies along with industry and academia. It was during this period that the term "fifth-generation computer" came into use.

Before the 1970s, MITI guidance had produced successes such as an improved steel industry, the creation of the oil supertanker, the automotive industry, consumer electronics, and computer memory. MITI decided that the future was going to be information technology. However, the Japanese language, particularly in its written form, presented (and still presents) obstacles for computers. As a result of these hurdles, MITI held a conference to seek assistance from experts. The primary fields for investigation from this initial project were:

* Inference computer technologies for knowledge processing
* Computer technologies to process large-scale databases and knowledge bases
* High-performance workstations
* Distributed functional computer technologies
* Supercomputers for scientific calculation


Project launch

The aim was to build parallel computers for artificial intelligence applications using concurrent logic programming. The project imagined an "epoch-making" computer with supercomputer-like performance running on top of large databases (as opposed to a traditional filesystem), using a logic programming language to define and access the data via massively parallel computing. The project envisioned a prototype machine with performance between 100M and 1G LIPS, where one LIPS is one logical inference per second. At the time, typical workstation machines were capable of about 100k LIPS. The plan was to build this machine over a ten-year period: 3 years for initial R&D, 4 years for building various subsystems, and a final 3 years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies. After the project ended, MITI would consider an investment in a new "sixth generation" project. Ehud Shapiro captured the rationale and motivations driving this project:
"As part of Japan's effort to become a leader in the computer industry, the Institute for New Generation Computer Technology has launched a revolutionary ten-year plan for the development of large computer systems which will be applicable to knowledge information processing systems. These Fifth Generation computers will be built around the concepts of logic programming. In order to refute the accusation that Japan exploits knowledge from abroad without contributing any of its own, this project will stimulate original research and will make its results available to the international research community."


Logic programming

The target defined by the FGCS project was to develop "Knowledge Information Processing systems" (roughly meaning applied artificial intelligence). The chosen tool to implement this goal was logic programming. The logic programming approach was characterized by Maarten Van Emden, one of its founders, as:

* The use of logic to express information in a computer.
* The use of logic to present problems to a computer.
* The use of logical inference to solve these problems.

More technically, it can be summed up in two equations:

* Program = Set of axioms.
* Computation = Proof of a statement from the axioms.

The axioms typically used are universal axioms of a restricted form, called Horn clauses or definite clauses. The statement proved in a computation is an existential statement. The proof is constructive and provides values for the existentially quantified variables: these values constitute the output of the computation. Logic programming was seen as something that unified various strands of computer science (software engineering, databases, computer architecture, and artificial intelligence). It seemed that logic programming was a key missing connection between knowledge engineering and parallel computer architectures.
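The "program = axioms, computation = proof" view can be made concrete with a minimal propositional Horn-clause prover. This is an illustrative sketch in Python, not FGCS code; the fact and rule names are invented for the example:

```python
# Minimal forward-chaining prover for propositional Horn clauses.
# A clause is (head, [body atoms]); a fact is a clause with an empty body.
def solve(clauses, goal):
    known = set()
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in known and all(b in known for b in body):
                known.add(head)   # the body is proved, so the head is derivable
                changed = True
    return goal in known

# "Program = set of axioms": two facts and one rule.
program = [
    ("parent_ab", []),                               # fact
    ("parent_bc", []),                               # fact
    ("grandparent_ac", ["parent_ab", "parent_bc"]),  # rule
]

# "Computation = proof of a statement from the axioms".
print(solve(program, "grandparent_ac"))  # True: the goal is provable
print(solve(program, "parent_ca"))       # False: not derivable
```

Real logic programming languages such as Prolog go further, using first-order Horn clauses with variables and unification, so that the proof also yields output values; this propositional sketch only shows the proof-as-computation idea.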


Results

After having influenced the consumer electronics field during the 1970s and the automotive world during the 1980s, the Japanese had developed a strong reputation. The launch of the FGCS project spread the belief that parallel computing was the future of all performance gains, producing a wave of apprehension in the computer field. Soon parallel projects were set up: in the US, the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC); in the UK, Alvey; and in Europe, the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer-Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.

The project ran from 1982 to 1994, spending a little less than ¥57 billion (about US$320 million) in total. After the FGCS project, MITI stopped funding large-scale computer research projects, and the research momentum the project had built dissipated. However, MITI/ICOT embarked on a neural-net project in the 1990s, which some called the Sixth Generation Project, with a similar level of funding. Per-year spending was less than 1% of the entire R&D expenditure of the electronics and communications equipment industry: the project's highest-expenditure year was ¥7.2 billion in 1991, whereas IBM alone spent US$1.5 billion (¥370 billion) in 1982, and the industry as a whole spent ¥2,150 billion in 1990.


Concurrent logic programming

In 1982, during a visit to ICOT, Ehud Shapiro invented Concurrent Prolog, a novel programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a process-oriented language that embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in ICOT Technical Report 003, which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog shifted the direction of the FGCS project from a parallel implementation of Prolog to concurrent logic programming as the software foundation of the project. It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Ueda, which was the basis of KL1, the programming language that was finally designed and implemented by the FGCS project as its core language. The FGCS project and its findings contributed greatly to the development of the concurrent logic programming field, and the project produced a new generation of promising Japanese researchers.
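The two control mechanisms named above can be sketched outside of Prolog. The following Python sketch (names invented for illustration) mimics dataflow synchronization (a consumer process suspends until its input is available) and guarded-command indeterminacy (the first guard to succeed commits, and the alternatives are discarded):

```python
import queue
import threading

# Dataflow synchronization: the consumer blocks until the producer
# "binds" the next element of the shared stream (modelled as a queue).
stream = queue.Queue()

def producer():
    for x in [3, 1, 4]:
        stream.put(x)          # bind the next element of the stream
    stream.put(None)           # end-of-stream marker

results = []

def consumer():
    while True:
        x = stream.get()       # suspends until a value is available
        if x is None:
            break
        # Guarded-command indeterminacy: evaluate guards and *commit*
        # to the first one that succeeds; the rest are never explored.
        guards = [
            (x % 2 == 0, "even"),
            (x % 2 == 1, "odd"),
        ]
        for ok, label in guards:
            if ok:
                results.append((x, label))
                break          # committed

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)   # [(3, 'odd'), (1, 'odd'), (4, 'even')]
```

In Concurrent Prolog and GHC, the stream would be a shared logical variable incrementally bound by the producer, and the guards would be clause guards; threads and queues are only a rough stand-in for those mechanisms.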


Commercial failure

Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, and PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as bioinformatics applications.

The FGCS project did not meet with commercial success, for reasons similar to those of the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines).

A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem-solving language for AI applications. This never happened cleanly: a number of languages were developed, all with their own limitations. In particular, the committed-choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages (Carl Hewitt, "Inconsistency Robustness in Logic Programming", arXiv, 2009). The project found that the benefits of logic programming were largely negated by committed choice.

Another problem was that existing CPU performance quickly overcame the barriers that experts had anticipated in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.

The project also failed to incorporate outside innovations. During its lifespan, GUIs became mainstream in computers; the internet enabled locally stored databases to become distributed; and even simple research projects produced better real-world results in data mining. The FGCS workstations had no appeal in a market where general-purpose systems could replace and outperform them. This parallels the fate of the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.
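The tension between committed choice and logical semantics can be illustrated with a toy search (an illustrative Python sketch; the clause names are invented). A classical backtracking prover enumerates every provable answer, while a committed-choice evaluator stops at the first guard that succeeds, losing solutions and hence logical completeness:

```python
# Toy "database": alternative clauses that could prove the goal,
# each represented as (answer name, does its guard succeed?).
alternatives = [
    ("via_a", True),    # guard succeeds
    ("via_b", True),    # guard also succeeds
    ("via_c", False),   # guard fails
]

def backtracking_solutions(alts):
    # Classical logic programming: enumerate *all* provable answers.
    return [name for name, guard in alts if guard]

def committed_choice_solution(alts):
    # Committed choice: the first guard to succeed commits;
    # the remaining alternatives are never explored.
    for name, guard in alts:
        if guard:
            return [name]
    return []

print(backtracking_solutions(alternatives))     # ['via_a', 'via_b']
print(committed_choice_solution(alternatives))  # ['via_a'] - 'via_b' is lost
```

Committed choice buys the predictable, don't-care nondeterminism that parallel execution needs, but at the cost of the completeness that makes a logic program read as a set of axioms.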


Ahead of its time

In summary, the Fifth Generation project was revolutionary and accomplished basic research that anticipated future research directions. Many papers and patents were published. MITI established a committee that assessed the FGCS project as having made major contributions to computing, in particular eliminating bottlenecks in parallel processing software and realizing intelligent interactive processing based on large knowledge bases. However, the committee was strongly biased toward justifying the project, so this assessment overstates the actual results.

Many of the themes seen in the Fifth Generation project are now being reinterpreted in current technologies, as the hardware limitations foreseen in the 1980s were finally reached in the 2000s. When the clock speeds of CPUs began to move into the 3–5 GHz range, CPU power dissipation and other problems became more important, and the industry's ability to produce ever-faster single-CPU systems (linked to Moore's law about the periodic doubling of transistor counts) began to be threatened. In the early 21st century, many flavors of parallel computing began to proliferate, including multi-core architectures at the low end and massively parallel processing at the high end. Ordinary consumer machines and game consoles began to ship with parallel processors such as the Intel Core, AMD K10, and Cell. Graphics card companies like Nvidia and AMD began introducing large parallel programming systems such as CUDA and OpenCL.

It appears, however, that these newer technologies do not cite FGCS research, and it is not clear whether FGCS facilitated these developments in any significant way. No significant impact of FGCS on the computing industry has been demonstrated.


References

* 第五世代コンピュータ・プロジェクト 最終評価報告書 [Fifth Generation Computer Project Final Evaluation Report] (March 30, 1993)


External links


* FGCS Museum - contains a large archive of nearly all of the output of the FGCS project, including technical reports, technical memoranda, hardware specifications, and software.
* Details about 5th Generation Computer - how the computer system evolved.