Big memory

Big memory computers are machines with a large amount of random-access memory (RAM). They are used for workloads that must hold very large datasets in memory, such as databases, graph analytics, and, more generally, high-performance computing, data science, and big data. Some database systems, called in-memory databases, are designed to run mostly in memory, rarely if ever retrieving data from disk or flash memory; see the list of in-memory databases.


Details

The performance of big memory systems depends on how the central processing units (CPUs) access memory: through a conventional memory controller or through non-uniform memory access (NUMA). Performance also depends on the size and design of the CPU caches and on the design of the operating system (OS).

The huge pages feature in Linux and other OSes can improve the efficiency of virtual memory by reducing the number of page-table entries and translation lookaside buffer (TLB) misses needed to cover a given amount of memory. The transparent huge pages (THP) feature in Linux applies huge pages automatically and can offer better performance for some big-memory workloads. The "Large-Page Support" feature in Microsoft Windows lets server applications establish large-page memory regions, whose pages are typically hundreds of times larger than the native page size (for example, 2 MB rather than 4 KB on x86-64).
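As an illustration of NUMA-aware placement, the following minimal C sketch uses libnuma to allocate a buffer on a chosen NUMA node, so that the threads running on that node's CPUs access local rather than remote memory. The node number and buffer size are arbitrary examples, and the program assumes Linux with libnuma installed (link with -lnuma):

    /* Sketch: NUMA-aware allocation with libnuma (gcc numa.c -lnuma).
     * Placing a working set on the node local to the threads that use it
     * avoids remote-memory access penalties on big-memory NUMA machines. */
    #include <numa.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "NUMA is not available on this system\n");
            return 1;
        }
        size_t size = 256UL << 20;   /* 256 MiB working set (example value) */
        int node = 0;                /* target NUMA node (example value) */
        /* Allocate memory physically backed by pages on `node`. */
        void *buf = numa_alloc_onnode(size, node);
        if (buf == NULL) {
            perror("numa_alloc_onnode");
            return 1;
        }
        memset(buf, 0, size);        /* touch the pages to commit them */
        printf("allocated %zu bytes on node %d (system has %d nodes)\n",
               size, node, numa_max_node() + 1);
        numa_free(buf, size);
        return 0;
    }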
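The Linux huge-page mechanisms can also be exercised directly by an application. The following is a minimal sketch, assuming a kernel with explicit huge pages reserved (for example via /proc/sys/vm/nr_hugepages) and THP enabled: it first tries an explicit huge-page mapping with mmap(MAP_HUGETLB) and, if that fails, falls back to normal pages and requests transparent huge pages with madvise(MADV_HUGEPAGE):

    /* Sketch: explicit huge pages with a transparent-huge-page fallback. */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        size_t len = 64UL << 20;   /* 64 MiB, a multiple of the 2 MiB huge-page size */

        /* Explicit huge pages: fails (e.g. ENOMEM) if none are reserved. */
        void *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS | MAP_HUGETLB, -1, 0);
        if (p == MAP_FAILED) {
            perror("mmap(MAP_HUGETLB)");
            /* Fallback: normal pages, then ask the kernel to back the
             * region with transparent huge pages where possible. */
            p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
            if (p == MAP_FAILED) { perror("mmap"); return 1; }
            if (madvise(p, len, MADV_HUGEPAGE) != 0)
                perror("madvise(MADV_HUGEPAGE)");
        }
        memset(p, 0, len);   /* touching the region incurs fewer TLB misses
                              * when it is backed by huge pages */
        munmap(p, len);
        return 0;
    }

The fallback path reflects a common design choice: explicit huge pages give predictable backing but require administrator configuration, while MADV_HUGEPAGE merely hints to the kernel and degrades gracefully.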
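On Windows, large-page regions are requested through VirtualAlloc with the MEM_LARGE_PAGES flag. A minimal sketch, assuming the process has already been granted and enabled the SeLockMemoryPrivilege ("Lock pages in memory") right that large-page allocation requires (that privilege setup is elided here):

    /* Sketch: Windows Large-Page Support via VirtualAlloc(MEM_LARGE_PAGES). */
    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        SIZE_T large = GetLargePageMinimum();   /* minimum large-page size, e.g. 2 MiB */
        if (large == 0) {
            fprintf(stderr, "large pages are not supported on this system\n");
            return 1;
        }
        SIZE_T len = 16 * large;   /* requests must be a multiple of the large-page size */
        void *p = VirtualAlloc(NULL, len,
                               MEM_RESERVE | MEM_COMMIT | MEM_LARGE_PAGES,
                               PAGE_READWRITE);
        if (p == NULL) {
            fprintf(stderr, "VirtualAlloc failed: %lu\n", GetLastError());
            return 1;
        }
        printf("allocated %zu bytes backed by %zu-byte pages\n",
               (size_t)len, (size_t)large);
        VirtualFree(p, 0, MEM_RELEASE);
        return 0;
    }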

