DYNAMO (programming language)
DYNAMO (DYNAmic MOdels) is a simulation language and accompanying graphical notation developed within the system dynamics analytical framework. It was originally intended for industrial dynamics but was soon extended to other applications, including population and resource studies and urban planning. DYNAMO was initially developed under the direction of Jay Wright Forrester in the late 1950s by Dr. Phyllis Fox, Alexander L. Pugh III, Grace Duren, and others at the M.I.T. Computation Center. DYNAMO was used for the system dynamics simulations of global resource depletion reported in the Club of Rome's Limits to Growth, but has since fallen into disuse.

Beginnings

In 1958, Forrester unwittingly instigated DYNAMO's development when he asked an MIT staff programmer to compute needed solutions to some equations for a Harvard Business Review paper he was writing about industrial dynamics. The programmer, Richard Bennett, chose to implement a system (SIMPLE - "Simulation of Industrial Ma ...
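
The excerpt stops before any actual DYNAMO code, but the computation the language performs is easy to sketch: "level" (stock) equations are advanced from their previous values by "rate" (flow) equations over a fixed time step DT. The following Python sketch only illustrates that stock-and-flow idea; the population model, its variable names, and its numbers are hypothetical and do not come from any DYNAMO program.

    # Hypothetical stock-flow model in the spirit of DYNAMO: one population
    # "level" changed by birth and death "rates", advanced with a fixed step DT.
    DT = 0.25              # time step in years (illustrative value)
    BIRTH_FRACTION = 0.03  # illustrative constants
    DEATH_FRACTION = 0.01

    population = 1000.0    # the level (stock)
    for step in range(int(20 / DT)):                # simulate 20 years
        birth_rate = BIRTH_FRACTION * population    # rate (flow) equations
        death_rate = DEATH_FRACTION * population
        # level equation: new level = old level + DT * (inflows - outflows)
        population += DT * (birth_rate - death_rate)

    print(f"population after 20 years: {population:.0f}")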


Simulation Language
A computer simulation language is used to describe the operation of a simulation on a computer. (Fritzson, Peter, and Vadim Engelson, "Modelica—A unified object-oriented language for system modeling and simulation," European Conference on Object-Oriented Programming, Springer, Berlin, Heidelberg, 1998.) There are two major types of simulation, continuous and discrete event, though more modern languages can handle more complex combinations. Most languages also have a graphical interface and at least a simple statistics-gathering capability for the analysis of the results. An important part of discrete-event languages is the ability to generate pseudo-random numbers and variates from different probability distributions.

See also
* Discrete event simulation
* List of computer simulation software

The following is a list of notable computer simulation software.

Free or open-source
* Advanced Simulation Library - open-source hardware accelerated multiphysics simulation software.
* ASCE ...
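
As a concrete illustration of the point above about pseudo-random numbers and variates in discrete-event simulation, here is a minimal Python sketch (my own, not tied to any particular simulation language) that draws exponentially distributed inter-arrival and service times and advances a single-server queue customer by customer; the rates are assumed values chosen for the example.

    import random

    random.seed(1)        # pseudo-random numbers: reproducible runs
    ARRIVAL_RATE = 1.0    # assumed arrival rate (customers per time unit)
    SERVICE_RATE = 1.5    # assumed service rate

    clock = 0.0
    server_free_at = 0.0
    total_wait = 0.0
    n_customers = 1000

    for _ in range(n_customers):
        clock += random.expovariate(ARRIVAL_RATE)   # exponential inter-arrival variate
        start = max(clock, server_free_at)          # wait if the server is still busy
        total_wait += start - clock
        server_free_at = start + random.expovariate(SERVICE_RATE)  # exponential service variate

    print(f"mean wait: {total_wait / n_customers:.3f}")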



IBM 709
The IBM 709 was a computer system, initially announced by IBM in January 1957 and first installed during August 1958. The 709 was an improved version of its predecessor, the IBM 704, and was the third of the IBM 700/7000 series of scientific computers. The improvements included overlapped input/output, indirect addressing, and three "convert" instructions which provided support for decimal arithmetic, leading zero suppression, and several other operations. The 709 had 32,768 words of 36-bit magnetic core memory and could execute 42,000 add or subtract instructions per second. It could multiply two 36-bit integers at a rate of 5000 per second. An optional hardware emulator executed old IBM 704 programs on the IBM 709. This was the first commercially available emulator. Registers and most 704 instructions were emulated in 709 hardware. Complex 704 instructions such as floating point trap and input-output routines were emulated in 709 software. The FORTRAN Assembly Program was firs ...



Complex Systems Theory
A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grid, transportation or communication systems, complex software and electronic systems, social and economic organizations (like cities), an ecosystem, a living cell, and ultimately the entire universe. Complex systems are systems whose behavior is intrinsically difficult to model due to the dependencies, competitions, relationships, or other types of interactions between their parts or between a given system and its environment. Systems that are "complex" have distinct properties that arise from these relationships, such as nonlinearity, emergence, spontaneous order, adaptation, and feedback loops, among others. Because such systems appear in a wide variety of fields, the commonalities among them have become the topic of their independent area of research. In many cases, it ...



Simulation Programming Languages
A simulation is the imitation of the operation of a real-world process or system over time. Simulations require the use of models; the model represents the key characteristics or behaviors of the selected system or process, whereas the simulation represents the evolution of the model over time. Often, computers are used to execute the simulation. Simulation is used in many contexts, such as simulation of technology for performance tuning or optimizing, safety engineering, testing, training, education, and video games. Simulation is also used with scientific modelling of natural systems or human systems to gain insight into their functioning, as in economics. Simulation can be used to show the eventual real effects of alternative conditions and courses of action. Simulation is also used when the real system cannot be engaged, because it may not be accessible, or it may be dangerous or unacceptable to engage, or it is being designed bu ...


Domain-specific Programming Languages
Domain specificity is a theoretical position in cognitive science (especially modern cognitive development) that argues that many aspects of cognition are supported by specialized, presumably evolutionarily specified, learning devices. The position is a close relative of modularity of mind, but is considered more general in that it does not necessarily entail all the assumptions of Fodorian modularity (e.g., informational encapsulation). Instead, it is properly described as a variant of psychological nativism. Other cognitive scientists also hold the mind to be modular, without the modules necessarily possessing the characteristics of Fodorian modularity. Domain specificity emerged in the aftermath of the cognitive revolution as a theoretical alternative to empiricist theories that believed all learning can be driven by the operation of a few such general learning devices. Prominent examples of such domain-general views include Jean Piaget’s theory of cognitive development, a ...


Euler Integration
In mathematics and computational science, the Euler method (also called the forward Euler method) is a first-order numerical procedure for solving ordinary differential equations (ODEs) with a given initial value. It is the most basic explicit method for the numerical integration of ordinary differential equations and is the simplest Runge–Kutta method. The Euler method is named after Leonhard Euler, who treated it in his book ''Institutionum calculi integralis'' (published 1768–1770). The Euler method is a first-order method, which means that the local error (error per step) is proportional to the square of the step size, and the global error (error at a given time) is proportional to the step size. The Euler method often serves as the basis to construct more complex methods, e.g., the predictor–corrector method.

Informal geometrical description

Consider the problem ...
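
A minimal Python sketch of the forward Euler step described above, applied to the assumed test problem y' = -y with y(0) = 1 (exact solution e^(-t)), shows the first-order behaviour: halving the step size roughly halves the global error at t = 1.

    import math

    def euler(f, t0, y0, h, n_steps):
        """Forward Euler: advance y' = f(t, y) from (t0, y0) with fixed step h."""
        t, y = t0, y0
        for _ in range(n_steps):
            y = y + h * f(t, y)   # follow the tangent line over one step
            t = t + h
        return y

    f = lambda t, y: -y           # assumed test problem
    exact = math.exp(-1.0)        # exact value of y(1)

    for h in (0.1, 0.05, 0.025):
        approx = euler(f, 0.0, 1.0, h, int(round(1.0 / h)))
        print(f"h={h:<6} error={abs(approx - exact):.6f}")
    # The error shrinks roughly in proportion to h: first-order global accuracy.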




Donella Meadows
Donella Hager "Dana" Meadows (March 13, 1941 – February 20, 2001) was an American environmental scientist, educator, and writer. She is best known as lead author of the books ''The Limits to Growth'' and ''Thinking in Systems: A Primer''.

Early life and education

Born in Elgin, Illinois, Meadows was educated in science, receiving a B.A. in chemistry from Carleton College in 1963 and a PhD in biophysics from Harvard in 1968. After a yearlong trip from England to Sri Lanka and back, she became a research fellow at the Massachusetts Institute of Technology as a member of a team in the department created by Jay Forrester, the inventor of system dynamics as well as the principle of magnetic data storage for computers.

Career

Meadows taught at Dartmouth College for 29 years, beginning in 1972. (Meadows, Donella H., 2008, ''Thinking in Systems: A Primer'', Chelsea Green Publishing, Vermont, p. 213, About the Author.) Meadows was honored both as a Pew Scholar in Conservation ...


Difference Equations
In mathematics, a recurrence relation is an equation according to which the nth term of a sequence of numbers is equal to some combination of the previous terms. Often, only k previous terms of the sequence appear in the equation, for a parameter k that is independent of n; this number k is called the ''order'' of the relation. If the values of the first k numbers in the sequence have been given, the rest of the sequence can be calculated by repeatedly applying the equation. In ''linear recurrences'', the nth term is equated to a linear function of the k previous terms. A famous example is the recurrence for the Fibonacci numbers, F_n = F_{n-1} + F_{n-2}, where the order k is two and the linear function merely adds the two previous terms. This example is a linear recurrence with constant coefficients, because the coefficients of the linear function (1 and 1) are constants that do not depend on n. For these recurrences, one can express the general term of the sequence as a closed-form expression ...
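
A short Python sketch of the procedure described above: given the first k terms and the k constant coefficients, the rest of the sequence is produced by repeatedly applying the recurrence. The function name and the Fibonacci test values are illustrative.

    def linear_recurrence(initial, coefficients, n):
        """Return the n-th term (0-indexed) of the linear recurrence
        a_n = c_1*a_{n-1} + ... + c_k*a_{n-k}, given the first k terms."""
        k = len(coefficients)
        terms = list(initial)             # a_0 .. a_{k-1}
        for i in range(k, n + 1):
            terms.append(sum(c * terms[i - j]
                             for j, c in enumerate(coefficients, start=1)))
        return terms[n]

    # Fibonacci numbers: order k = 2, both coefficients equal to 1,
    # with F_0 = 0 and F_1 = 1.
    print([linear_recurrence([0, 1], [1, 1], n) for n in range(10)])
    # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]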


Computerworld
''Computerworld'' (abbreviated as CW) is a decades-old professional publication which in 2014 "went digital." Its audience is information technology (IT) and business technology professionals, and it is available via a publication website and as a digital magazine. As a printed weekly during the 1970s and into the 1980s, ''Computerworld'' was the leading trade publication in the data processing industry. Indeed, based on circulation and revenue it was one of the most successful trade publications in any industry. Later in the 1980s it began to lose its dominant position. It is published in many countries around the world under the same or similar names. Each country's version of ''Computerworld'' includes original content and is managed independently. The parent company of Computerworld US is IDG Communications.

History

The first issue was published in 1967.

Going international

The company IDG offers the brand "Computerworld" in 47 countries worldwide, the name and fre ...



Minicomputers
A minicomputer, or colloquially mini, is a class of smaller general purpose computers that developed in the mid-1960s and sold at a much lower price than mainframe and mid-size computers from IBM and its direct competitors. In a 1970 survey, ''The New York Times'' suggested a consensus definition of a minicomputer as a machine costing less than US$25,000, with an input-output device such as a teleprinter and at least four thousand words of memory, that is capable of running programs in a higher level language, such as Fortran or BASIC. The class formed a distinct group with its own software architectures and operating systems. Minis were designed for control, instrumentation, human interaction, and communication switching as distinct from calculation and record keeping. Many were sold indirectly to original equipment manufacturers (OEMs) for final end use application. During the two-decade lifetime of the minicomputer class (1965–1985), almost 100 companies formed and only a half ...



Batch Processing
Computerized batch processing is a method of running software programs called jobs in batches automatically. While users are required to submit the jobs, no other interaction by the user is required to process the batch. Batches may automatically be run at scheduled times as well as being run contingent on the availability of computer resources.

History

The term "batch processing" originates in the traditional classification of methods of production as job production (one-off production), batch production (production of a "batch" of multiple items at once, one stage at a time), and flow production (mass production, all stages in process at once).

Early history

Early computers were capable of running only one program at a time. Each user had sole control of the machine for a scheduled period of time. They would arrive at the computer with program and data, often on punched paper cards and magnetic or paper tape, and would load their program, run and debug it, and carry off their ou ...
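
As a toy illustration of the idea (not a sketch of any particular batch system), the Python snippet below runs a queue of previously submitted jobs one after another with no further user interaction; the job names and commands are invented for the example.

    import subprocess
    import sys

    # Jobs submitted ahead of time; names and commands are purely illustrative.
    jobs = [
        ("payroll",   [sys.executable, "-c", "print('running payroll')"]),
        ("inventory", [sys.executable, "-c", "print('updating inventory')"]),
        ("report",    [sys.executable, "-c", "print('printing report')"]),
    ]

    # Process the whole batch sequentially, with no operator interaction between jobs.
    for name, command in jobs:
        result = subprocess.run(command, capture_output=True, text=True)
        print(f"job {name!r} finished with exit code {result.returncode}: "
              f"{result.stdout.strip()}")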