Big memory computers are machines with a large amount of random-access memory (RAM). They are needed for databases, graph analytics and, more generally, for high-performance computing, data science and big data. Some database systems, called in-memory databases, are designed to run mostly in memory, rarely if ever retrieving data from disk or flash memory; see the list of in-memory databases.
Details
The performance of big memory systems depends on how the central processing units (CPUs) access the memory: via a conventional memory controller or via non-uniform memory access (NUMA), where access time depends on the memory location relative to the processor.
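On a NUMA machine, placing data on the node local to the threads that use it avoids slower remote accesses. The following is a minimal sketch, assuming Linux with libnuma installed (link with -lnuma); the buffer size and node number are arbitrary examples:

/* NUMA-aware allocation: back a buffer with memory on a chosen node
 * and run the calling thread on that same node for local access. */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return EXIT_FAILURE;
    }
    printf("highest NUMA node: %d\n", numa_max_node());

    size_t size = 1UL << 30;  /* example: 1 GiB working buffer */
    int node = 0;             /* example: place it on node 0 */

    char *buf = numa_alloc_onnode(size, node);  /* memory backed by that node */
    if (buf == NULL) {
        perror("numa_alloc_onnode");
        return EXIT_FAILURE;
    }
    if (numa_run_on_node(node) != 0)  /* keep this thread on the same node */
        perror("numa_run_on_node");

    memset(buf, 0, size);  /* touch the pages so they are actually faulted in */
    numa_free(buf, size);
    return EXIT_SUCCESS;
}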
Performance also depends on the size and design of the CPU cache, the smaller, faster memory close to the processor core that reduces the average cost of accessing main memory.
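A generic way to see this effect (an illustrative sketch, not tied to any particular system) is to walk a large array with different strides: a sequential walk reuses each cache line for several elements, while a large stride touches a new line on almost every access and runs noticeably slower per element.

/* Measure the effect of access stride on cache behavior. */
#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (64 * 1024 * 1024)  /* 64 Mi ints (256 MiB), well beyond cache sizes */

static double walk(volatile int *a, size_t stride) {
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < N; i += stride)
        a[i]++;
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void) {
    int *a = calloc(N, sizeof *a);
    if (a == NULL) return EXIT_FAILURE;

    /* Compare time per touched element across strides. */
    for (size_t stride = 1; stride <= 64; stride *= 8) {
        size_t touched = (N + stride - 1) / stride;
        double secs = walk(a, stride);
        printf("stride %2zu: %.1f ns/element\n", stride, secs / touched * 1e9);
    }
    free(a);
    return EXIT_SUCCESS;
}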
Performance also depends on operating system (OS) design. The huge pages feature in Linux and other OSes can improve the efficiency of virtual memory, since mapping a large working set with larger pages requires far fewer page-table entries and translation lookaside buffer (TLB) slots.
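As a minimal sketch of explicit huge pages on Linux (assuming the administrator has reserved hugetlb pages, e.g. via the vm.nr_hugepages setting), a process can request hugetlb-backed memory from mmap with the MAP_HUGETLB flag:

/* Explicit huge-page allocation on Linux via mmap(MAP_HUGETLB).
 * Requires reserved huge pages; fails with ENOMEM otherwise. */
#define _GNU_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    size_t len = 64UL << 20;  /* 64 MiB; must be a multiple of the huge page size */

    void *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS | MAP_HUGETLB, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap(MAP_HUGETLB)");  /* commonly ENOMEM if no pages reserved */
        return 1;
    }

    memset(p, 0, len);  /* each huge page (typically 2 MiB on x86-64) faults in on first touch */
    munmap(p, len);
    return 0;
}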
The transparent huge pages feature in Linux can offer better performance for some big-memory workloads. Similarly, large-page support in Microsoft Windows enables server applications to establish large-page memory regions, typically 512 times larger than the native page size (2 MB versus 4 KB on x86-64).
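Transparent huge pages require no special allocation API; under the common "madvise" policy, a process hints that a region should be backed by huge pages. A minimal sketch, with an arbitrary example region size:

/* Requesting transparent huge pages (THP) for a region on Linux.
 * With THP in "madvise" mode, MADV_HUGEPAGE marks the region as
 * eligible; the kernel may then back it with huge pages. */
#define _GNU_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    size_t len = 256UL << 20;  /* example: 256 MiB working region */

    /* Plain anonymous mapping; no huge-page reservation is needed for THP. */
    void *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }

    if (madvise(p, len, MADV_HUGEPAGE) != 0)
        perror("madvise(MADV_HUGEPAGE)");  /* kernel may lack THP support */

    memset(p, 0, len);  /* first touch lets the kernel install huge pages */
    munmap(p, len);
    return 0;
}

On Windows, the counterpart is VirtualAlloc with the MEM_LARGE_PAGES flag, with the region size rounded to a multiple of GetLargePageMinimum(); the caller must additionally hold the SeLockMemoryPrivilege ("Lock pages in memory") right.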