Von Neumann architecture

The von Neumann architecture — also known as the von Neumann model or Princeton architecture — is a computer architecture based on a 1945 description by John von Neumann, and by others, in the ''First Draft of a Report on the EDVAC''. The document describes a design architecture for an electronic digital computer with these components:

* A processing unit with both an arithmetic logic unit and processor registers
* A control unit that includes an instruction register and a program counter
* Memory that stores data and instructions
* External mass storage
* Input and output mechanisms

The term "von Neumann architecture" has evolved to refer to any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time (since they share a common bus). This is referred to as the von Neumann bottleneck, which often limits the performance of the corresponding system. The design of a von Neumann architecture machine is simpler than that of a Harvard architecture machine, which is also a stored-program system but has one dedicated set of address and data buses for reading and writing to memory and a separate set of address and data buses for fetching instructions.

A stored-program digital computer keeps both program instructions and data in read–write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch cables to route data and control signals between various functional units. The vast majority of modern computers use the same memory for both data and program instructions, but place caches between the CPU and memory; for the caches closest to the CPU, separate caches are provided for instructions and data, so that most instruction and data fetches use separate buses (split cache architecture).
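
As a rough illustration of how a single store serves both program and data, the following minimal sketch in Python simulates a toy von Neumann machine. The instruction format and mnemonics (LOAD, ADD, STORE, HALT) are invented for this example and do not correspond to any real instruction set.

    # A toy von Neumann machine: one memory list holds both the program and its
    # data, and every instruction fetch and every data access reads or writes
    # that single shared store (the "common bus" described above).
    def run(memory):
        pc = 0                        # program counter
        acc = 0                       # single accumulator register
        while True:
            op, operand = memory[pc]          # instruction fetch
            pc += 1
            if op == "LOAD":                  # data read from the same memory
                acc = memory[operand]
            elif op == "ADD":
                acc += memory[operand]
            elif op == "STORE":               # data write to the same memory
                memory[operand] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold instructions, cells 4-6 hold data, in one address space.
    memory = [
        ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),   # program
        2, 3, 0,                                              # data
    ]
    print(run(memory)[6])    # -> 5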


History

The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculator is (in principle) a fixed-program computer: it can do basic mathematics, but it cannot run a word processor or games. Changing the program of a fixed-program machine requires rewiring, restructuring, or redesigning the machine. The earliest computers were not so much "programmed" as "designed" for a particular task. "Reprogramming", when possible at all, was a laborious process that started with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically rewiring and rebuilding the machine. It could take three weeks to set up and debug a program on ENIAC.

With the proposal of the stored-program computer, this changed. A stored-program computer includes, by design, an instruction set, and can store in memory a set of instructions (a program) that details the computation.

A stored-program design also allows for self-modifying code. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which operators had to do manually in early designs. This became less important when index registers and indirect addressing became usual features of machine architecture. Another use was to embed frequently used data in the instruction stream using immediate addressing. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug, as well as being inefficient under modern processor pipelining and caching schemes.
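
As a small illustration of why modifying the address portion of an instruction was so natural on these machines, the Python sketch below uses a made-up 16-bit instruction format (a 4-bit opcode and a 12-bit address field): an instruction stored in read-write memory is just a number, so stepping it through an array is ordinary arithmetic.

    # Hypothetical 16-bit instruction word: 4-bit opcode, 12-bit address field.
    OPCODE_LOAD = 0x1

    def encode(opcode, address):
        return (opcode << 12) | (address & 0xFFF)

    def decode(word):
        return word >> 12, word & 0xFFF

    instr = encode(OPCODE_LOAD, 100)    # the word for "LOAD 100"
    # Because instructions sit in the same read-write memory as data, a program
    # can point this LOAD at the next array element simply by adding 1:
    instr += 1                          # now the word for "LOAD 101"
    print(decode(instr))                # -> (1, 101)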


Capabilities

On a large scale, the ability to treat instructions as data is what makes assemblers, compilers, linkers, loaders, and other automated programming tools possible. It makes "programs that write programs" possible, and has allowed a sophisticated self-hosting computing ecosystem to flourish around von Neumann architecture machines. Some high-level languages leverage the von Neumann architecture by providing an abstract, machine-independent way to manipulate executable code at runtime (e.g., LISP), or by using runtime information to tune just-in-time compilation (e.g., languages hosted on the Java virtual machine, or languages embedded in web browsers). On a smaller scale, some repetitive operations such as BITBLT or pixel and vertex shaders can be accelerated on general-purpose processors with just-in-time compilation techniques; this is one use of self-modifying code that has remained popular.
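
A minimal sketch of the "programs that write programs" idea, in Python: a function builds source text at run time, compiles it, and returns the resulting function, loosely analogous to what a just-in-time compiler does. The function names and the polynomial example are invented for illustration.

    # Code treated as data: generate source text, compile it, execute it.
    def make_poly(coeffs):
        # Build a function that evaluates sum(c_i * x**i) with the
        # coefficients baked in as constants.
        terms = " + ".join(f"{c} * x**{i}" for i, c in enumerate(coeffs))
        src = f"def poly(x):\n    return {terms}\n"
        namespace = {}
        exec(compile(src, "<generated>", "exec"), namespace)
        return namespace["poly"]

    p = make_poly([1, 0, 3])    # generates: def poly(x): return 1*x**0 + 0*x**1 + 3*x**2
    print(p(2))                 # -> 13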


Development of the stored-program concept

The mathematician Alan Turing, who had been alerted to a problem of mathematical logic by the lectures of Max Newman at the University of Cambridge, wrote a paper in 1936 entitled ''On Computable Numbers, with an Application to the Entscheidungsproblem'', which was published in the ''Proceedings of the London Mathematical Society''. In it he described a hypothetical machine he called a ''universal computing machine'', now known as the "universal Turing machine". The hypothetical machine had an infinite store (memory in today's terminology) that contained both instructions and data. John von Neumann became acquainted with Turing while he was a visiting professor at Cambridge in 1935, and also during Turing's PhD year at the Institute for Advanced Study in Princeton, New Jersey during 1936–1937. Whether he knew of Turing's 1936 paper at that time is not clear. In 1936, Konrad Zuse also anticipated, in two patent applications, that machine instructions could be stored in the same storage used for data.

Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay-line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.

Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory, which required huge amounts of calculation and thus drew him to the ENIAC project during the summer of 1944. There he joined the ongoing discussions on the design of this stored-program computer, the EDVAC. As part of that group, he wrote up a description titled ''First Draft of a Report on the EDVAC'' based on the work of Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it, and it bore only von Neumann's name (to the consternation of Eckert and Mauchly). The paper was read by dozens of von Neumann's colleagues in America and Europe, and influenced the next round of computer designs. Jack Copeland considers that it is "historically inappropriate to refer to electronic stored-program digital computers as 'von Neumann machines'". His Los Alamos colleague Stan Frankel attested to von Neumann's high regard for Turing's ideas.

At the time that the "First Draft" report was circulated, Turing was producing a report entitled ''Proposed Electronic Calculator''. It described in engineering and programming detail his idea of a machine he called the ''Automatic Computing Engine (ACE)''. He presented this to the executive committee of the British National Physical Laboratory on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.

Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation, and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication ''Faster than Thought: A Symposium on Digital Computing Machines'' (edited by B. V. Bowden), a section in the chapter on ''Computers in America'' reads as follows:
The Machine of the Institute For Advanced Studies, Princeton

In 1945, Professor J. von Neumann, who was then working at the Moore School of Engineering in Philadelphia, where the E.N.I.A.C. had been built, issued on behalf of a group of his co-workers, a report on the logical design of digital computers. The report contained a detailed proposal for the design of the machine that has since become known as the E.D.V.A.C. (electronic discrete variable automatic computer). This machine has only recently been completed in America, but the von Neumann report inspired the construction of the E.D.S.A.C. (electronic delay-storage automatic calculator) in Cambridge (see page 130).

In 1947, Burks, Goldstine and von Neumann published another report that outlined the design of another type of machine (a parallel machine this time) that would be exceedingly fast, capable perhaps of 20,000 operations per second. They pointed out that the outstanding problem in constructing such a machine was the development of suitable memory with instantaneously accessible contents. At first they suggested using a special vacuum tube—called the "Selectron"—which the Princeton Laboratories of RCA had invented. These tubes were expensive and difficult to make, so von Neumann subsequently decided to build a machine based on the Williams memory. This machine—completed in June, 1952 in Princeton—has become popularly known as the Maniac. The design of this machine inspired at least half a dozen machines now being built in America, all known affectionately as "Johniacs".
In the same book, the first two paragraphs of a chapter on ACE read as follows:
Automatic Computation at the National Physical Laboratory

One of the most modern digital computers which embodies developments and improvements in the technique of automatic electronic computing was recently demonstrated at the National Physical Laboratory, Teddington, where it has been designed and built by a small team of mathematicians and electronics research engineers on the staff of the Laboratory, assisted by a number of production engineers from the English Electric Company, Limited. The equipment so far erected at the Laboratory is only the pilot model of a much larger installation which will be known as the Automatic Computing Engine, but although comparatively small in bulk and containing only about 800 thermionic valves, as can be judged from Plates XII, XIII and XIV, it is an extremely rapid and versatile calculating machine.

The basic concepts and abstract principles of computation by a machine were formulated by Dr. A. M. Turing, F.R.S., in a paper read before the London Mathematical Society in 1936, but work on such machines in Britain was delayed by the war. In 1945, however, an examination of the problems was made at the National Physical Laboratory by Mr. J. R. Womersley, then superintendent of the Mathematics Division of the Laboratory. He was joined by Dr. Turing and a small staff of specialists, and, by 1947, the preliminary planning was sufficiently advanced to warrant the establishment of the special group already mentioned. In April, 1948, the latter became the Electronics Section of the Laboratory, under the charge of Mr. F. M. Colebrook.


Early von Neumann-architecture computers

The ''First Draft'' described a design that was used by many universities and corporations to construct their computers. Among these various computers, only ILLIAC and ORDVAC had compatible instruction sets.

* ARC2 (Birkbeck, University of London) officially came online on May 12, 1948
* Manchester Baby (Victoria University of Manchester, England) made its first successful run of a stored program on June 21, 1948
* EDSAC (University of Cambridge, England), the first practical stored-program electronic computer (May 1949)
* Manchester Mark 1 (University of Manchester, England), developed from the Baby (June 1949)
* CSIRAC (Council for Scientific and Industrial Research), Australia (November 1949)
* MESM in Kyiv, Ukraine (November 1950)
* EDVAC (Ballistic Research Laboratory, Computing Laboratory at Aberdeen Proving Ground, 1951)
* ORDVAC (University of Illinois) at Aberdeen Proving Ground, Maryland (completed November 1951)
* IAS machine at Princeton University (January 1952)
* MANIAC I at Los Alamos Scientific Laboratory (March 1952)
* ILLIAC at the University of Illinois (September 1952)
* BESM-1 in Moscow (1952)
* AVIDAC at Argonne National Laboratory (1953)
* ORACLE at Oak Ridge National Laboratory (June 1953)
* BESK in Stockholm (1953)
* JOHNNIAC at RAND Corporation (January 1954)
* DASK in Denmark (1955)
* WEIZAC at the Weizmann Institute of Science in Rehovot, Israel (1955)
* PERM in Munich (1956)
* SILLIAC in Sydney (1956)


Early stored-program computers

The date information in the following chronology is difficult to put into proper order. Some dates are for first running a test program, some dates are the first time the computer was demonstrated or completed, and some dates are for the first delivery or installation.

* The IBM SSEC had the ability to treat instructions as data, and was publicly demonstrated on January 27, 1948. This ability was claimed in a US patent. However, it was partially electromechanical, not fully electronic. In practice, instructions were read from paper tape due to its limited memory.
* The ARC2, developed by Andrew Booth and Kathleen Booth at Birkbeck, University of London, officially came online on May 12, 1948. It featured the first rotating drum storage device.
* The Manchester Baby was the first fully electronic computer to run a stored program. It ran a factoring program for 52 minutes on June 21, 1948, after running a simple division program and a program to show that two numbers were relatively prime.
* The ENIAC was modified to run as a primitive read-only stored-program computer (using the Function Tables for program ROM) and was demonstrated as such on September 16, 1948, running a program by Adele Goldstine for von Neumann.
* The BINAC ran some test programs in February, March, and April 1949, although it was not completed until September 1949.
* The Manchester Mark 1 developed from the Baby project. An intermediate version of the Mark 1 was available to run programs in April 1949, but it was not completed until October 1949.
* The EDSAC ran its first program on May 6, 1949.
* The EDVAC was delivered in August 1949, but it had problems that kept it from being put into regular operation until 1951.
* The CSIR Mk I ran its first program in November 1949.
* The SEAC was demonstrated in April 1950.
* The Pilot ACE ran its first program on May 10, 1950, and was demonstrated in December 1950.
* The SWAC was completed in July 1950.
* The Whirlwind was completed in December 1950 and was in actual use in April 1951.
* The first ERA Atlas (later the commercial ERA 1101/UNIVAC 1101) was installed in December 1950.


Evolution

Through the decades of the 1960s and 1970s, computers generally became both smaller and faster, which led to evolutions in their architecture. For example, memory-mapped I/O lets input and output devices be treated the same as memory. A single system bus could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture. In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size, while larger computers added features for higher performance.
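
A minimal sketch of the memory-mapped I/O idea in Python: a store to one reserved address in the ordinary address space is routed to an output device instead of to RAM. The address 0xFF00 and the device are invented for this example.

    UART_TX_ADDR = 0xFF00                 # hypothetical output-device register

    class Bus:
        def __init__(self, ram_size=0x10000):
            self.ram = [0] * ram_size

        def store(self, addr, value):
            if addr == UART_TX_ADDR:      # intercepted: goes to the device
                print(chr(value), end="")
            else:                         # ordinary write: goes to RAM
                self.ram[addr] = value

    bus = Bus()
    for byte in b"hi\n":
        bus.store(UART_TX_ADDR, byte)     # output performed by ordinary stores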


Design limitations


Von Neumann bottleneck

The shared bus between the program memory and data memory leads to the ''von Neumann bottleneck'', the limited throughput (data transfer rate) between the central processing unit (CPU) and memory compared to the amount of memory. Because the single bus can only access one of the two classes of memory at a time, throughput is lower than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continually forced to wait for needed data to move to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, a problem whose severity increases with every new generation of CPU. The von Neumann bottleneck was described by John Backus in his 1977 ACM Turing Award lecture. According to Backus:
Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it.
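
The throughput side of the bottleneck can be made concrete with a back-of-the-envelope model in Python; all numbers below are hypothetical and chosen only to show that the shared bus, not the processing unit, sets the ceiling.

    # Rough model: effective rate = min(what the CPU could do, what the bus can feed).
    cpu_ops_per_s   = 100e9   # hypothetical peak operation rate of the CPU
    bus_words_per_s = 10e9    # hypothetical words per second over the shared bus
    words_per_op    = 2       # e.g. one operand fetched and one result stored

    memory_limited = bus_words_per_s / words_per_op
    print(min(cpu_ops_per_s, memory_limited))   # -> 5000000000.0: the machine is bus-bound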


Mitigations

There are several known methods for mitigating the von Neumann performance bottleneck. For example, the following all can improve performance:

* Providing a cache between the CPU and the main memory (a minimal cache sketch follows this list)
* Providing separate caches or separate access paths for data and instructions (the so-called modified Harvard architecture)
* Using branch predictor algorithms and logic
* Providing a limited CPU stack or other on-chip scratchpad memory to reduce memory accesses
* Implementing the CPU and the memory hierarchy as a system on a chip, providing greater locality of reference and thus reducing latency and increasing throughput between processor registers and main memory
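
A minimal sketch of the first mitigation in the list above, in Python: a direct-mapped cache sits between the CPU and memory, so repeated accesses to a small working set are served without touching the shared bus. The cache geometry and the access trace are made up for illustration.

    class DirectMappedCache:
        def __init__(self, lines=8, line_words=4):
            self.lines, self.line_words = lines, line_words
            self.tags = [None] * lines        # one tag per cache line
            self.hits = self.misses = 0

        def access(self, addr):
            block = addr // self.line_words
            index = block % self.lines
            tag = block // self.lines
            if self.tags[index] == tag:
                self.hits += 1                # served by the cache: no bus traffic
            else:
                self.misses += 1              # line fetched over the shared bus
                self.tags[index] = tag

    cache = DirectMappedCache()
    for _ in range(100):                      # a loop re-reading the same 16 words
        for addr in range(16):
            cache.access(addr)
    print(cache.hits, cache.misses)           # -> 1596 4: almost no bus traffic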
The problem can also be sidestepped somewhat by using parallel computing, for example with the non-uniform memory access (NUMA) architecture; this approach is commonly employed by supercomputers. It is less clear whether the ''intellectual bottleneck'' that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence. Modern functional programming and object-oriented programming are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like FORTRAN were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers. As of 1996, a database benchmark study found that three out of four CPU cycles were spent waiting for memory. Researchers expect that increasing the number of simultaneous instruction streams with multithreading or single-chip multiprocessing will make this bottleneck even worse (Sites, Richard L.; Patt, Yale. "Architects Look to Processors of Future". Microprocessor Report, 1996). In the context of multi-core processors, additional overhead is required to maintain cache coherence between processors and threads.


Self-modifying code

Aside from the von Neumann bottleneck, program modifications can be quite harmful, either by accident or design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system, possibly leading to a computer crash. Memory protection and other forms of access control can usually protect against both accidental and malicious program changes.
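
A minimal sketch of the memory-protection idea in Python: the region holding instructions is marked read-only, so an errant store cannot overwrite the program. The region boundary and the trap behaviour are invented for this example.

    class ProtectedMemory:
        def __init__(self, cells, code_end):
            self.cells = cells
            self.code_end = code_end          # cells [0, code_end) hold the program

        def store(self, addr, value):
            if addr < self.code_end:          # write into the code region: trap
                raise PermissionError(f"write to read-only code at address {addr}")
            self.cells[addr] = value          # ordinary data write

    mem = ProtectedMemory([0] * 16, code_end=8)
    mem.store(12, 42)                         # data write: allowed
    try:
        mem.store(3, 99)                      # attempt to overwrite an instruction
    except PermissionError as err:
        print("trapped:", err)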


See also

* CARDboard Illustrative Aid to Computation (CARDIAC)
* Interconnect bottleneck
* Little man computer
* Random-access machine
* Harvard architecture
* Turing machine
* Eckert architecture


References


Further reading

* Backus, John. ''Can Programming be Liberated from the von Neumann Style?''. 1977 ACM Turing Award Lecture. Communications of the ACM, August 1978, Volume 21. See details at https://www.cs.tufts.edu/~nr/backus-lecture.html
* Bell, C. Gordon; Newell, Allen (1971). ''Computer Structures: Readings and Examples''. McGraw-Hill Book Company, New York. Massive (668 pages).


External links


* Harvard vs von Neumann
* A tool that emulates the behavior of a von Neumann machine
* JOHNNY: A simple Open Source simulator of a von Neumann machine for educational purposes