Distributed data processing (DDP) was the term that IBM used for the IBM 3790 (1975) and its successor, the IBM 8100 (1979). ''Datamation'' described the 3790 in March 1979 as "less than successful."

Distributed data processing was used by IBM to refer to two environments:
* IMS DB/DC
* CICS/DL/I
Each pair included a telecommunications monitor and a database system. The layering involved a message, containing the information to form a transaction, which was then processed by an application program. IBM released development tools, such as program validation services, to facilitate expansion.

Use of "a number of small computers linked to a central computer" permitted local and central processing, each optimized for what it could do best. Terminals, including those described as ''intelligent'', typically were attached locally to a "satellite processor." Central systems, sometimes multiprocessors, grew to handle the load; some of this extra capacity, of necessity, was used to enhance data security. Years before open systems made their presence felt, the goal of some hardware suppliers was "to replace the big, central mainframe computer with an array of smaller computers that are tied together."
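The layering described above — a telecommunications monitor receiving messages, each message forming a transaction that is routed to an application program — can be sketched in modern terms. The following Python sketch is purely illustrative; the names (`Monitor`, `register`, `dispatch`, `account_inquiry`) are invented for this example and do not correspond to any IBM product interface.

```python
# Hypothetical sketch of the telecommunications-monitor / application-program
# layering, loosely modeled on the IMS DB/DC and CICS/DL/I pattern described
# above. All names here are invented for illustration.

class Monitor:
    """Receives messages and routes each resulting transaction
    to the application program registered for its transaction code."""

    def __init__(self):
        self.programs = {}  # transaction code -> application program

    def register(self, code, program):
        self.programs[code] = program

    def dispatch(self, message):
        # A message carries a transaction code plus the input data
        # that forms the transaction.
        code, data = message["code"], message["data"]
        program = self.programs[code]
        return program(data)  # the application program processes the transaction


# Example "application program": a trivial account inquiry.
def account_inquiry(data):
    balances = {"1001": 250.0, "1002": 75.5}  # stand-in for a database system
    return {"account": data["account"], "balance": balances[data["account"]]}


monitor = Monitor()
monitor.register("INQY", account_inquiry)
result = monitor.dispatch({"code": "INQY", "data": {"account": "1001"}})
print(result)
```

In the real products, the monitor also handled terminal attachment, scheduling, and recovery; the sketch shows only the message-to-transaction routing layer.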


Lower case distributed data processing

Hadoop adds another term to the mix: the distributed file system. Tools added for this use of distributed data processing include new programming languages.
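The computation model that Hadoop popularized can be illustrated without a cluster: input data is split, a map function is applied to each split, and the intermediate results are shuffled by key and reduced. The sketch below is a single-process Python illustration of that pattern, not Hadoop's actual API.

```python
# Single-process sketch of the map/shuffle/reduce pattern that Hadoop
# distributes across many machines. This illustrates the model only;
# it is not the Hadoop API.
from collections import defaultdict

def map_phase(line):
    # Emit (word, 1) pairs for each word in one input split.
    return [(word, 1) for word in line.split()]

def reduce_phase(word, counts):
    # Combine all intermediate counts for one key.
    return word, sum(counts)

def word_count(lines):
    # "Shuffle": group intermediate pairs by key, as the framework
    # would do across the network between mappers and reducers.
    grouped = defaultdict(list)
    for line in lines:
        for word, count in map_phase(line):
            grouped[word].append(count)
    return dict(reduce_phase(w, c) for w, c in grouped.items())

totals = word_count(["to be or not to be"])
print(totals)  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

In Hadoop proper, the splits live on the distributed file system and the map and reduce phases run on different machines; the program structure, however, is the same.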


TSI/DPF Flexicom

In 1976 ''Turnkey Systems Inc.'' (TSI)/DPF Inc. introduced Flexicom, a hardware/software telecommunications front-end that off-loaded some of the processing involved in ''distributed data processing''. Its CPU was IBM-manufactured, and it ran (mainframe) DOS Release 26 with ''Flexicom'''s additions. Of the four models available, the smallest had the CPU of a 360/30.


See also

* HPCC

