Memory Forensics
Memory forensics is forensic analysis of a computer's memory dump. Its primary application is the investigation of advanced computer attacks that are stealthy enough to avoid leaving data on the computer's hard drive. Consequently, the memory (RAM) must be analyzed for forensic information.

History
Zeroth generation tools
Prior to 2004, memory forensics was done on an ''ad hoc'' basis, using generic data analysis tools like strings and grep. These tools were not created specifically for memory forensics and are therefore difficult to use; they also provide limited information. In general, their primary use is to extract text from the memory dump. Many operating systems provide features that let kernel developers and end-users create a snapshot of physical memory, either for debugging purposes (a core dump or Blue Screen of Death) or to improve the user experience (hibernation). In the case of Microsoft Windows, crash dumps and hibernation had been present since Mic ...
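The ''ad hoc'' approach described above can be approximated in a few lines of Python. The following is a minimal sketch, not a forensic tool: the dump filename and search pattern are hypothetical, the scan covers only printable ASCII (real investigations would also look for the UTF-16 strings common on Windows), and the whole image is read into memory at once.

    # Minimal sketch of "zeroth generation" memory analysis: pull printable
    # ASCII strings out of a raw memory dump and filter them, much like piping
    # the output of strings into grep. Filename and pattern are examples only.
    import re

    DUMP_PATH = "memory.raw"           # hypothetical raw memory image
    PATTERN = re.compile(rb"http://")  # hypothetical indicator to search for
    MIN_LEN = 4                        # minimum run length, as in strings

    # Runs of printable ASCII bytes at least MIN_LEN long.
    ascii_run = re.compile(rb"[\x20-\x7e]{%d,}" % MIN_LEN)

    with open(DUMP_PATH, "rb") as dump:
        data = dump.read()             # fine for small dumps; chunk large ones

    for match in ascii_run.finditer(data):
        if PATTERN.search(match.group()):
            # Print the byte offset and the recovered string.
            print(f"{match.start():#010x}  {match.group().decode('ascii')}")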


Forensic
Forensic science, also known as criminalistics, is the application of science to criminal and civil laws, mainly (on the criminal side) during criminal investigation, as governed by the legal standards of admissible evidence and criminal procedure. Forensic science is a broad field that includes DNA analysis, fingerprint analysis, bloodstain pattern analysis, firearms examination and ballistics, tool mark analysis, serology, toxicology, hair and fiber analysis, entomology, questioned documents, anthropology, odontology, pathology, epidemiology, footwear and tire tread analysis, drug chemistry, paint and glass analysis, and digital audio, video, and photo analysis. Forensic scientists collect, preserve, and analyze scientific evidence during the course of an investigation. While some forensic scientists travel to the scene of the crime to collect the evidence themselves, others occupy a laboratory role, performing analysis on objects brough ...


WinDbg
WinDbg is a multipurpose debugger for the Microsoft Windows computer operating system, distributed by Microsoft. Debugging is the process of finding and resolving errors in a system; in computing it also includes exploring the internal operation of software as an aid to development. WinDbg can be used to debug user-mode applications, device drivers, and the operating system itself in kernel mode.

Overview
Like the better-known Visual Studio Debugger, WinDbg has a graphical user interface (GUI), but it is more powerful and has little else in common with it. WinDbg can automatically load debugging symbol files (e.g., PDB files) from a server by matching various criteria (e.g., timestamp, CRC, single or multiprocessor version) via SymSrv (SymSrv.dll), instead of the more time-consuming task of creating a symbol tree for a debugging target environment. If a private symbol server is configured, the symbols can be correlated with the source code for the binary. This eases the burden of debugging pr ...


Mac OS X
macOS (previously OS X and originally Mac OS X) is a Unix operating system developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple's Mac computers. Within the market of desktop and laptop computers it is the second most widely used desktop OS, after Microsoft Windows and ahead of ChromeOS. macOS succeeded the classic Mac OS, a Mac operating system with nine releases from 1984 to 1999. During this time, Apple cofounder Steve Jobs had left Apple and started another company, NeXT, developing the NeXTSTEP platform that would later be acquired by Apple to form the basis of macOS. The first desktop version, Mac OS X 10.0, was released in March 2001, with its first update, 10.1, arriving later that year. All releases from Mac OS X 10.5 Leopard and after are UNIX 03 certified, with an exception for OS X 10. ...


Academic Research
Research is "creative and systematic work undertaken to increase the stock of knowledge". It involves the collection, organization, and analysis of evidence to increase understanding of a topic, characterized by a particular attentiveness to accounting for and controlling sources of bias and error. A research project may be an expansion on past work in the field. To test the validity of instruments, procedures, or experiments, research may replicate elements of prior projects or the project as a whole. The primary purposes of basic research (as opposed to applied research) are documentation, discovery, interpretation, and the research and development (R&D) of methods and systems for the advancement of human knowledge. Approaches to research depend on epistemologies, which vary considerably both within and between humanities and sciences. There are several forms of research: scientific, humanities, artistic, econom ...


Volatility (memory forensics)
Volatility is an open-source memory forensics framework for incident response and malware analysis. It is written in Python and supports Microsoft Windows, Mac OS X, and Linux (as of version 2.5). Volatility was created by Aaron Walters, drawing on academic research he did in memory forensics.

Operating system support
Volatility supports investigations of the following memory images:
Windows:
* 32-bit Windows XP (Service Pack 2 and 3)
* 32-bit Windows 2003 Server (Service Pack 0, 1, 2)
* 32-bit Windows Vista (Service Pack 0, 1, 2)
* 32-bit Windows 2008 Server (Service Pack 1, 2)
* 32-bit Windows 7 (Service Pack 0, 1)
* 32-bit Windows 8, 8.1, and 8.1 Update 1
* 32-bit Windows 10 (initial support)
* 64-bit Windows XP (Service ...
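In day-to-day use, Volatility's analyses are run as command-line plugins against a memory image. The snippet below is a minimal sketch of driving the Volatility 2.x command line from Python; it assumes vol.py is installed and on the PATH, and the image filename and the Win7SP1x86 profile string are placeholders that would have to match the dump actually being examined.

    # Minimal sketch: run a Volatility 2.x plugin from Python and capture its
    # output. Assumes vol.py is on the PATH; the image path and profile are
    # placeholders for the dump being analyzed.
    import subprocess

    IMAGE = "win7_memory.raw"   # hypothetical memory image
    PROFILE = "Win7SP1x86"      # must match the OS/service pack of the image
    PLUGIN = "pslist"           # list the processes recorded in the image

    result = subprocess.run(
        ["vol.py", "-f", IMAGE, "--profile=" + PROFILE, PLUGIN],
        capture_output=True,
        text=True,
        check=True,
    )

    # Each line of pslist output describes one process found in the dump.
    for line in result.stdout.splitlines():
        print(line)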


Open-source Model
Open source is source code that is made freely available for possible modification and redistribution. Products include permission to use the source code, design documents, or content of the product. The open-source model is a decentralized software development model that encourages open collaboration. A main principle of open-source software development is peer production, with products such as source code, blueprints, and documentation freely available to the public. The open-source movement in software began as a response to the limitations of proprietary code. The model is used for projects in areas such as open-source appropriate technology and open-source drug discovery. Open source promotes universal access via an open-source or free license to a product's design or blueprint, and universal redistribution of that design or blueprint. Before the phrase ''open source'' became widely adopted, developers and producers used a variety of other terms. ''Open source'' gained ...


Process (computing)
In computing, a process is the instance of a computer program that is being executed by one or many threads. There are many different process models, some of which are lightweight, but almost all processes (even entire virtual machines) are rooted in an operating system (OS) process, which comprises the program code, assigned system resources, physical and logical access permissions, and the data structures needed to initiate, control, and coordinate execution activity. Depending on the OS, a process may be made up of multiple threads of execution that execute instructions concurrently. While a computer program is a passive collection of instructions typically stored in a file on disk, a process is the execution of those instructions after being loaded from the disk into memory. Several processes may be associated with the same program; for example, opening several instances of the same program often results in more than one process being executed. Multitasking is a method to allow ...
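The point that several processes may be associated with the same program can be illustrated with a short Python sketch: every worker below executes the same program code, yet each runs as a separate operating-system process with its own process ID.

    # Minimal sketch: the same program code running as several distinct
    # operating-system processes, each with its own PID.
    import multiprocessing
    import os

    def worker(label):
        # Every process executes the same instructions but has its own
        # identity and address space.
        print(f"{label}: pid={os.getpid()} parent={os.getppid()}")

    if __name__ == "__main__":
        print(f"main program: pid={os.getpid()}")
        children = [
            multiprocessing.Process(target=worker, args=(f"worker-{i}",))
            for i in range(3)
        ]
        for child in children:
            child.start()
        for child in children:
            child.join()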


Data Structures
In computer science, a data structure is a data organization, management, and storage format that is usually chosen for efficient access to data. More precisely, a data structure is a collection of data values, the relationships among them, and the functions or operations that can be applied to the data, i.e., it is an algebraic structure about data.

Usage
Data structures serve as the basis for abstract data types (ADT). The ADT defines the logical form of the data type. The data structure implements the physical form of the data type. Different types of data structures are suited to different kinds of applications, and some are highly specialized to specific tasks. For example, relational databases commonly use B-tree indexes for data retrieval, while compiler implementations usually use hash tables to look up identifiers. Data structures provide a means to manage large amounts of data efficiently for uses such as large databases and internet indexing services. Usually, e ...
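The compiler example in the excerpt can be made concrete with a small sketch: a hash table (Python's built-in dict) gives expected constant-time insertion and lookup of identifiers, which is why it is the usual choice for a symbol table. The identifier names and attributes below are invented for illustration.

    # Minimal sketch of a hash-table-backed symbol table, the structure the
    # excerpt notes compilers typically use to look up identifiers.
    symbol_table = {}

    def declare(name, type_name, scope):
        # Insertion into a hash table is expected O(1).
        symbol_table[name] = {"type": type_name, "scope": scope}

    def lookup(name):
        # Lookup is also expected O(1), independent of table size.
        return symbol_table.get(name)

    declare("count", "int", "global")
    declare("buffer", "char[256]", "main")

    print(lookup("count"))    # {'type': 'int', 'scope': 'global'}
    print(lookup("missing"))  # None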