Trace-based Simulator
In computer science, trace-based simulation refers to system simulation performed by looking at traces of program execution or system component access with the purpose of performance prediction (''Software Technologies for Embedded and Ubiquitous Systems'', ed. Sunggu Lee and Priya Narasimhan, 2009, p. 28). Trace-based simulation may be used in a variety of applications, from the analysis of solid state disks to the message passing performance on very large computer clusters (''Languages and Compilers for Parallel Computing'', ed. Keith Cooper, John Mellor-Crummey and Vivek Sarkar, 2011, pp. 202-203). Trace-based simulators usually have two components: one that executes actions and stores the results (i.e. traces), and another that reads the log files of traces and interpolates them to new (and often more complex) scenarios. For instance, in the case of large computer cluster design, the execution takes place on a small number of nodes and leaves traces in log files; the simulator then reads those log files and projects the recorded behavior onto a much larger configuration.
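A minimal Python sketch of this two-component structure, assuming a hypothetical trace format of one JSON record per event; the function names and the what-if "speedup" model are illustrative, not taken from any particular tool:

```python
import json
from collections import defaultdict

def record_trace(events, path):
    """Execution component: run actions on a few nodes and log the
    resulting trace, one (node, event, duration) record per line."""
    with open(path, "w") as f:
        for node, event, duration in events:
            f.write(json.dumps({"node": node, "event": event,
                                "duration": duration}) + "\n")

def predict_makespan(path, speedup=None):
    """Simulation component: replay the logged trace under a new
    scenario, scaling each event type by an optional speedup factor
    (e.g. {'comm': 2.0} to model a network twice as fast)."""
    speedup = speedup or {}
    busy = defaultdict(float)
    with open(path) as f:
        for line in f:
            rec = json.loads(line)
            busy[rec["node"]] += rec["duration"] / speedup.get(rec["event"], 1.0)
    # Deliberately naive model: makespan = longest per-node busy time.
    return max(busy.values())

record_trace([(0, "compute", 1.2), (0, "comm", 0.4),
              (1, "compute", 1.5), (1, "comm", 0.6)], "trace.log")
print(predict_makespan("trace.log"))                 # baseline replay
print(predict_makespan("trace.log", {"comm": 2.0}))  # faster-network scenario
```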


Computer Science
Computer science is the study of computation, automation, and information. Computer science spans theoretical disciplines (such as algorithms, theory of computation, information theory, and automation) to practical disciplines (including the design and implementation of hardware and software). Computer science is generally considered an area of academic research, distinct from computer programming. Algorithms and data structures are central to computer science. The theory of computation concerns abstract models of computation and general classes of problems that can be solved using them. The fields of cryptography and computer security involve studying the means for secure communication and for preventing security vulnerabilities. Computer graphics and computational geometry address the generation of images. ...


Computer Simulation
Computer simulation is the process of mathematical modelling, performed on a computer, which is designed to predict the behaviour of, or the outcome of, a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modelling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions. Computer simulations are realized by running computer programs that can be either small, running almost instantly on small devices, or large ...
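As a toy illustration of running a system's model (not drawn from the text above): a short Python simulation of exponential decay, checked against the known analytic solution, mirroring how a model's reliability is judged by comparing its output to the outcome it is meant to predict:

```python
import math

def simulate_decay(x0, k, dt, steps):
    """Run the model: forward-Euler integration of dx/dt = -k * x."""
    x = x0
    for _ in range(steps):
        x += -k * x * dt
    return x

x0, k, t, dt = 1.0, 0.5, 4.0, 0.001
simulated = simulate_decay(x0, k, dt, steps=int(t / dt))
exact = x0 * math.exp(-k * t)  # analytic outcome the model should predict
print(f"simulated={simulated:.6f} exact={exact:.6f}")
```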


Performance Prediction
In computer science, performance prediction means estimating the execution time or other performance factors (such as cache misses) of a program on a given computer. It is widely used by computer architects to evaluate new computer designs, by compiler writers to explore new optimizations, and by advanced developers to tune their programs. There are many approaches to predicting a program's performance on computers. They can be roughly divided into three major categories:
* simulation-based prediction
* profile-based prediction
* analytical modeling
Simulation-based prediction: performance data can be obtained directly from computer simulators, within which each instruction of the target program is actually dynamically executed given a particular input data set. Simulators can predict a program's performance very accurately, but take considerable time to handle large programs. Examples include the PACE and Wisconsin Wind Tunnel simulators, as well as the more recent WARPP simulat ...
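A minimal sketch of simulation-based prediction: every instruction of a target program is executed dynamically while predicted cycles accumulate. The three-opcode machine and its per-opcode cycle costs are invented for illustration and do not describe any real simulator:

```python
# Hypothetical per-opcode cycle costs for a made-up machine.
CYCLE_COST = {"load": 3, "add": 1, "store": 2}

def simulate(program, memory):
    """Execute each instruction dynamically, accumulating predicted cycles."""
    regs, cycles = {}, 0
    for op, dst, src in program:
        if op == "load":       # dst <- memory[src]
            regs[dst] = memory[src]
        elif op == "add":      # dst <- dst + regs[src]
            regs[dst] = regs.get(dst, 0) + regs[src]
        elif op == "store":    # memory[dst] <- regs[src]
            memory[dst] = regs[src]
        cycles += CYCLE_COST[op]
    return cycles

program = [("load", "r1", 0), ("load", "r2", 1),
           ("add", "r1", "r2"), ("store", 2, "r1")]
memory = {0: 10, 1: 32, 2: 0}
print(simulate(program, memory), "cycles; memory[2] =", memory[2])  # 9 cycles
```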


Solid State Disk
A solid-state drive (SSD) is a solid-state storage device that uses integrated circuit assemblies to store data persistently, typically using flash memory, and functioning as secondary storage in the hierarchy of computer storage. It is also sometimes called a semiconductor storage device, a solid-state device or a solid-state disk, even though SSDs lack the physical spinning disks and movable read-write heads used in hard disk drives (HDDs) and floppy disks. SSDs also have rich internal parallelism for data processing. In comparison to hard disk drives and similar electromechanical media, which use moving parts, SSDs are typically more resistant to physical shock, run silently, and have higher input/output rates and lower latency. SSDs store data in semiconductor cells. Cells can contain between 1 and 4 bits ...


Message Passing
In computer science, message passing is a technique for invoking behavior (i.e., running a program) on a computer. The invoking program sends a message to a process (which may be an actor or object) and relies on that process and its supporting infrastructure to then select and run some appropriate code. Message passing differs from conventional programming where a process, subroutine, or function is directly invoked by name. Message passing is key to some models of concurrency and object-oriented programming. Message passing is ubiquitous in modern computer software. It is used as a way for the objects that make up a program to work with each other and as a means for objects and systems running on different computers (e.g., the Internet) to interact. Message passing may be implemented by various mechanisms, including channels. ...
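A minimal Python sketch of the idea (an illustrative toy, not any particular message-passing system): the sender posts a message to a process's mailbox, and the receiving process, rather than the caller, selects which code to run:

```python
import queue

class Counter:
    """A toy 'process' that owns its state and a mailbox; callers never
    invoke its methods by name, they only send it messages."""
    def __init__(self):
        self.value = 0
        self.mailbox = queue.Queue()

    def run_pending(self):
        # The receiver selects the appropriate code for each message.
        while not self.mailbox.empty():
            msg, arg = self.mailbox.get()
            if msg == "increment":
                self.value += arg
            elif msg == "reset":
                self.value = 0

counter = Counter()
counter.mailbox.put(("increment", 5))  # send a message, not a direct call
counter.mailbox.put(("increment", 2))
counter.run_pending()
print(counter.value)  # 7
```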


Computer Cluster
A computer cluster is a set of computers that work together so that they can be viewed as a single system. Unlike grid computers, computer clusters have each node set to perform the same task, controlled and scheduled by software. The components of a cluster are usually connected to each other through fast local area networks, with each node (computer used as a server) running its own instance of an operating system. In most circumstances, all of the nodes use the same hardware and the same operating system, although in some setups (e.g. using Open Source Cluster Application Resources (OSCAR)), different operating systems can be used on each computer, or different hardware. Clusters are usually deployed to improve performance and availability over that of a single computer, while typically being much more cost-effective than single computers of comparable speed or availability. Computer clusters emerged as a result of convergence of a number of computing trends including t ...


BIGSIM
BIGSIM is a computer simulation and performance modeling system for parallel computing, typically used for very large computer clusters (''Petascale Computing: Algorithms and Applications'' by David A. Bader, 2007, p. 435; ''Languages and Compilers for Parallel Computing'', ed. Keith Cooper, John Mellor-Crummey and Vivek Sarkar, 2011, pp. 202-203). BIGSIM was developed at the University of Illinois (''Advances in Computer Science, Environment, Ecoinformatics, and Education'', ed. Song Lin and Xiong Huang, 2011, p. 16). When a large-scale, often supercomputer-level, parallel system is being developed, it is essential to be able to experiment with multiple configurations and simulate performance. BIGSIM provides these facilities by allowing the simulation of performance on various node topologies, message passing and scheduling strategies. BIGSIM includes an emulator and a trace-based simulator. The emulator executes applications on a small number of nodes and stores the resulting traces, which the trace-based simulator then uses to predict performance on much larger configurations.
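A hedged sketch of that emulate-then-simulate workflow in Python, here focused on comparing node topologies; the trace format and the latency model are invented for illustration and do not reflect BIGSIM's actual file formats or APIs:

```python
import math

# Phase 1 (emulator): suppose a run on a few nodes produced message
# records of the form (src_node, dst_node, bytes). In BIGSIM these would
# be logged to trace files; they are inlined here for brevity.
trace = [(0, 1, 4096), (1, 2, 1024), (2, 0, 65536)]

def link_latency_us(nbytes, hops, per_hop_us=0.5, bandwidth_gbps=10.0):
    """Toy network model: per-hop latency plus serialization time."""
    return hops * per_hop_us + (nbytes * 8) / (bandwidth_gbps * 1e3)

def replay(trace, hops_fn):
    """Phase 2 (trace-based simulator): replay the logged messages under
    a chosen topology, represented by a hop-count function."""
    return sum(link_latency_us(b, hops_fn(s, d)) for s, d, b in trace)

# Compare two candidate topologies without rerunning the application.
ring = lambda s, d: abs(s - d)                                  # simple ring
tree = lambda s, d: 2 * math.ceil(math.log2(max(abs(s - d) + 1, 2)))
print("ring total comm us:", replay(trace, ring))
print("tree total comm us:", replay(trace, tree))
```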

