Parchive
Parchive (a portmanteau of parity archive, and formally known as Parity Volume Set Specification) is an erasure code system that produces par files for checksum verification of data integrity, with the capability to perform data recovery operations that can repair or regenerate corrupted or missing data. Parchive was originally written to solve the problem of reliable file sharing on Usenet, but it can be used for protecting any kind of data from data corruption, disc rot, bit rot, and accidental or malicious damage. Despite the name, Parchive uses more advanced techniques (specifically error correction codes) than simplistic parity methods of error detection. As of 2014, PAR1 is obsolete, PAR2 is mature for widespread use, and PAR3 is a discontinued experimental version developed by MultiPar author Yutaka Sawada. The original SourceForge Parchive project has been inactive since April 30, 2015. A new PAR3 specification has been worked on since April 28, 2019 by PAR2 specifi ...
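
As a minimal sketch of the typical workflow, the Python snippet below drives the par2cmdline tool from a script; it assumes the "par2" binary is installed and on the PATH, uses its create/verify/repair subcommands and the -r (redundancy percentage) option as documented for that tool, and the file names are placeholders.

    # Sketch: protecting and repairing files with par2cmdline, driven from Python.
    # Assumes the "par2" binary (par2cmdline) is installed and on the PATH.
    import subprocess

    def protect(files, parfile="archive.par2", redundancy=10):
        """Create PAR2 recovery data covering `files` with ~`redundancy`% overhead."""
        subprocess.run(["par2", "create", f"-r{redundancy}", parfile, *files], check=True)

    def verify(parfile="archive.par2"):
        """Return True if all protected files are intact."""
        return subprocess.run(["par2", "verify", parfile]).returncode == 0

    def repair(parfile="archive.par2"):
        """Attempt to reconstruct damaged or missing files from the PAR2 volumes."""
        return subprocess.run(["par2", "repair", parfile]).returncode == 0

    if __name__ == "__main__":
        protect(["data1.bin", "data2.bin"])
        if not verify():
            repair()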


Erasure Code
In coding theory, an erasure code is a forward error correction (FEC) code under the assumption of bit erasures (rather than bit errors), which transforms a message of ''k'' symbols into a longer message (code word) with ''n'' symbols such that the original message can be recovered from a subset of the ''n'' symbols. The fraction ''r'' = ''k''/''n'' is called the code rate. The fraction ''k′''/''k'', where ''k′'' denotes the number of symbols required for recovery, is called reception efficiency. The recovery algorithm expects that it is known which of the ''n'' symbols are lost. History: Erasure coding was invented by Irving Reed and Gustave Solomon in 1960. There are many different erasure coding schemes. The most popular erasure codes are Reed–Solomon coding, low-density parity-check codes (LDPC codes), and turbo codes. As of 2023, modern data storage systems can be designed to tolerate the complete failure of a few disks without data loss, using one of 3 ...
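
As a small concrete illustration (the simplest possible erasure code, not the Reed–Solomon or LDPC schemes named above), the Python sketch below extends k data blocks with one XOR parity block, giving n = k + 1 symbols and code rate r = k/(k + 1); any single block whose index is known to be lost can then be rebuilt.

    # Minimal erasure-code sketch: k data blocks + 1 XOR parity block (n = k + 1).
    # Any single erased block can be rebuilt, provided we know which index was lost.
    from functools import reduce

    def encode(blocks):
        """Append an XOR parity block to k equal-length data blocks."""
        parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)
        return blocks + [parity]                     # n = k + 1 symbols

    def recover(codeword, lost_index):
        """Rebuild the single erased block at `lost_index` from the survivors."""
        survivors = [b for i, b in enumerate(codeword) if i != lost_index]
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), survivors)

    data = [b"AAAA", b"BBBB", b"CCCC"]               # k = 3
    codeword = encode(data)                          # n = 4, code rate r = 3/4
    codeword[1] = None                               # erase one symbol at a known position
    assert recover(codeword, 1) == b"BBBB"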


QuickPar
QuickPar is a computer program that creates parchives used as verification and recovery information for a file or group of files, and uses the recovery information, if available, to attempt to reconstruct the originals from the damaged files and the PAR volumes. Designed for the Microsoft Windows operating system, in the past it was often used to recover damaged or missing files that had been downloaded through Usenet. QuickPar may also be used under Linux via Wine. There are two main versions of PAR files, PAR and PAR2; the PAR2 format lifts many of the restrictions of the original format. QuickPar is freeware but not open-source. It uses the Reed–Solomon error correction algorithm internally to create the error-correcting information. Replacement: Since QuickPar has not been updated in years, it is considered abandonware, a term for software, typically video games, that is no longer for sale by conventional means and is distributed by warez websites for free ...




Checksum
A checksum is a small-sized block of data derived from another block of digital data for the purpose of detecting errors that may have been introduced during its transmission or storage. By themselves, checksums are often used to verify data integrity but are not relied upon to verify data authenticity. The procedure which generates this checksum is called a checksum function or checksum algorithm. Depending on its design goals, a good checksum algorithm usually outputs a significantly different value, even for small changes made to the input. This is especially true of cryptographic hash functions, which may be used to detect many data corruption errors and verify overall data integrity; if the computed checksum for the current data input matches the stored value of a previously computed checksum, there is a very high probability the data has not been accidentally altered or corrupted. Checksum functions are related to hash functions, fingerprints, randomization functio ...
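
As a small illustration, the Python sketch below computes a CRC-32 checksum (via the standard-library zlib module) for a block of data and later re-checks it; a mismatch indicates accidental corruption, though CRC-32, unlike a cryptographic hash, offers no protection against deliberate tampering.

    # Checksum sketch: detect accidental corruption with CRC-32 from the standard library.
    import zlib

    def checksum(data: bytes) -> int:
        return zlib.crc32(data)

    original = b"The quick brown fox jumps over the lazy dog"
    stored = checksum(original)                  # saved alongside the data

    # Later: verify integrity by recomputing and comparing.
    received = bytearray(original)
    received[3] ^= 0x01                          # simulate a single flipped bit in transit
    assert checksum(bytes(received)) != stored   # corruption is detected
    assert checksum(original) == stored          # intact data still matches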


Newsgroup
A Usenet newsgroup is a repository, usually within the Usenet system, for messages posted by users in different locations over the Internet. Newsgroups are not only discussion groups or conversations; they have also served as a venue for publishing articles, coordinating development efforts (such as the creation of Linux), sustaining mailing lists, and uploading files. This is possible because the protocol itself imposes no article size limit; any limits are left to the providers. In the late 1980s, Usenet articles were often limited by providers to 60,000 characters, but over time newsgroups have been split into two types: ''text'' groups, mainly for discussions, conversations, and articles, limited by most providers to about 32,000 characters, and ''binary'' groups, for file transfer, with providers setting limits ranging from less than 1 MB to about 4 MB. Newsgroups are technically distinct from, but functionally similar to, discussion forums on the World Wide Web. Newsreader software is used to read the content of newsgroups. Before the adoption ...


Usenet
Usenet, a portmanteau of User's Network, is a worldwide distributed discussion system available on computers. It was developed from the general-purpose Unix-to-Unix Copy (UUCP) dial-up network architecture. Tom Truscott and Jim Ellis conceived the idea in 1979, and it was established in 1980 (''From Usenet to CoWebs: Interacting with Social Information Spaces'', Christopher Lueg and Danyel Fisher, Springer, 2003). Users read and post messages (called ''articles'' or ''posts'', and collectively termed ''news'') to one or more topic categories, known as newsgroups. Usenet resembles a bulletin board system (BBS) in many respects and is the precursor to the Internet forums that have become widely used. Discussions are threaded, as with web forums and BBSes, though posts are stored on the server sequentially.


Data Corruption
Data corruption refers to errors in computer data that occur during writing, reading, storage, transmission, or processing, which introduce unintended changes to the original data. Computer, transmission, and storage systems use a number of measures to provide end-to-end data integrity, or lack of errors. In general, when data corruption occurs, a file containing that data will produce unexpected results when accessed by the system or the related application. Results could range from a minor loss of data to a system crash. For example, if a document file is corrupted, when a person tries to open that file with a document editor they may get an error message; the file might not open at all, or might open with some of the data corrupted (or, in some cases, completely corrupted, leaving the document unintelligible). Some types of malware may intentional ...


Data Degradation
Data degradation is the gradual corruption of computer data due to an accumulation of non-critical failures in a data storage device. It is also referred to as data decay, data rot, or bit rot. This results in a decline in data quality over time, even when the data is not being utilized. The concept of data degradation involves progressively minimizing data in interconnected processes, where data is used for multiple purposes at different levels of detail. At specific points in the process chain, data is irreversibly reduced to a level that remains sufficient for the successful completion of the following steps. Manifestations: In primary storage, data degradation in dynamic random-access memory (DRAM) can occur when the electric charge of a bit in DRAM disperses, possibly altering program code or stored data. DRAM may be altered by cosmic rays or other high-energy particles. Such data degradation is known as a soft error. ECC memory can be used to ...
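
To make the ECC idea concrete, here is a minimal Hamming(7,4) sketch in Python; it is a simpler relative of the SECDED codes actually used in ECC memory, chosen only for illustration. Four data bits gain three parity bits, and a single flipped bit is located by its syndrome and corrected.

    # Hamming(7,4) sketch: 4 data bits are protected by 3 parity bits; a single
    # flipped bit is located by the syndrome and corrected.
    def encode(d):                      # d = [d1, d2, d3, d4]
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # codeword positions 1..7

    def correct(c):
        """Return the decoded data bits, fixing at most one flipped bit."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
        syndrome = s1 + 2 * s2 + 4 * s3  # 0 = no error, else 1-based error position
        if syndrome:
            c[syndrome - 1] ^= 1         # flip the erroneous bit back
        return [c[2], c[4], c[5], c[6]]

    data = [1, 0, 1, 1]
    word = encode(data)
    word[4] ^= 1                         # simulate a soft error (one bit flip)
    assert correct(word) == data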


Hash Function
A hash function is any function that can be used to map data of arbitrary size to fixed-size values, though there are some hash functions that support variable-length output. The values returned by a hash function are called ''hash values'', ''hash codes'', (''hash/message'') ''digests'', or simply ''hashes''. The values are usually used to index a fixed-size table called a ''hash table''. Use of a hash function to index a hash table is called ''hashing'' or ''scatter-storage addressing''. Hash functions and their associated hash tables are used in data storage and retrieval applications to access data in a small and nearly constant time per retrieval. They require an amount of storage space only fractionally greater than the total space required for the data or records themselves. Hashing is a computationally- and storage-space-efficient form of data access that avoids the non-constant access time of ordered and unordered lists and s ...
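
As an illustration of scatter-storage addressing, the Python sketch below hashes arbitrary-length keys to a fixed-size digest (SHA-256 here, chosen only for the example) and reduces it modulo the table size to pick a bucket; colliding keys simply share a bucket via separate chaining, and the table size of 8 is arbitrary.

    # Hash-table sketch: a hash function maps arbitrary-size keys to fixed-size
    # values, which are reduced modulo the table size to index a bucket.
    import hashlib

    TABLE_SIZE = 8
    table = [[] for _ in range(TABLE_SIZE)]      # separate chaining for collisions

    def bucket_index(key: str) -> int:
        digest = hashlib.sha256(key.encode()).digest()      # fixed-size hash value
        return int.from_bytes(digest, "big") % TABLE_SIZE   # scatter into the table

    def put(key: str, value):
        table[bucket_index(key)].append((key, value))

    def get(key: str):
        for k, v in table[bucket_index(key)]:
            if k == key:
                return v
        return None

    put("parchive", "PAR2")
    put("usenet", 1980)
    assert get("parchive") == "PAR2" and get("usenet") == 1980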


Reed–Solomon Error Correction
In information theory and coding theory, Reed–Solomon codes are a group of error-correcting codes that were introduced by Irving S. Reed and Gustave Solomon in 1960. They have many applications, including consumer technologies such as MiniDiscs, CDs, DVDs, Blu-ray discs, QR codes, and Data Matrix; data transmission technologies such as DSL and WiMAX; broadcast systems such as satellite communications, DVB, and ATSC; and storage systems such as RAID 6. Reed–Solomon codes operate on a block of data treated as a set of finite-field elements called symbols. Reed–Solomon codes are able to detect and correct multiple symbol errors. By adding ''t'' check symbols to the data, a Reed–Solomon code can detect (but not correct) any combination of up to ''t'' erroneous symbols, ''or'' locate and correct up to ⌊''t''/2⌋ erroneous symbols at unknown locations. As an erasure code, it can correct up to ''t'' erasures at locations that are known and ...
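
As a hedged sketch of the erasure-decoding case only (not a full error-locating decoder, and over the prime field GF(257) rather than the GF(256) used in most real systems), the Python code below encodes k data symbols as evaluations of a degree-(k-1) polynomial at n distinct points, and recovers the data by Lagrange interpolation from any k surviving points.

    # Reed-Solomon erasure sketch over GF(257): k data symbols are the coefficients
    # of a polynomial, the codeword is that polynomial evaluated at n points, and
    # any k surviving (point, value) pairs suffice to interpolate it back.
    P = 257  # small prime field for illustration; real systems typically use GF(2^8)

    def encode(data, n):
        """Evaluate the polynomial with coefficients `data` (k symbols) at x = 0..n-1."""
        return [sum(c * pow(x, i, P) for i, c in enumerate(data)) % P for x in range(n)]

    def recover(points, k):
        """Lagrange-interpolate from any k surviving (x, y) pairs; return k coefficients."""
        def poly_mul(a, b):
            out = [0] * (len(a) + len(b) - 1)
            for i, ai in enumerate(a):
                for j, bj in enumerate(b):
                    out[i + j] = (out[i + j] + ai * bj) % P
            return out

        coeffs = [0] * k
        for xi, yi in points[:k]:
            basis = [1]                      # Lagrange basis polynomial for xi
            denom = 1
            for xj, _ in points[:k]:
                if xj != xi:
                    basis = poly_mul(basis, [-xj % P, 1])
                    denom = denom * (xi - xj) % P
            scale = yi * pow(denom, -1, P) % P
            for i, b in enumerate(basis):
                coeffs[i] = (coeffs[i] + scale * b) % P
        return coeffs

    data = [72, 105, 33, 7]                                # k = 4 data symbols
    codeword = encode(data, n=7)                           # n = 7, tolerates 3 erasures
    survivors = [(x, codeword[x]) for x in (0, 2, 5, 6)]   # any 4 known positions
    assert recover(survivors, k=4) == data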


Download
In computer networks, download means to ''receive'' data from a remote system, typically a server such as a web server, an FTP server, an email server, or other similar systems. This contrasts with uploading, where data is ''sent to'' a remote server. A ''download'' is a file offered for downloading or that has been downloaded, or the process of receiving such a file. Definition: Downloading generally transfers entire files for local storage and later use, as contrasted with streaming, where the data is used nearly immediately while the transmission is still in progress and may not be stored long-term. Websites that offer streaming media or media displayed in-browser, such as YouTube, increasingly place restrictions on the ability of users to save these materials to their computers after they have been received. Downloading on computer networks involves retrieving data from a remote system, like a web server, FTP server, or email server, unlike uploading, where data is sent ...


No Starch Press
No Starch Press is an American publishing company, specializing in technical literature often geared towards the geek, hacker, and DIY subcultures. Popular titles include ''Hacking: The Art of Exploitation'', Andrew Huang's ''Hacking the Xbox'', and ''How Wikipedia Works''. Topics: No Starch Press publishes books with a focus on networking, computer security, hacking, Linux, programming, technology for kids, Lego, math, and science. The publisher also releases educational comics like ''Super Scratch Programming Adventure'' and ''The Manga Guide to Science'' series. History: No Starch Press was founded in 1994 by Bill Pollock. It is headquartered in San Francisco. The company has published titles that have received recognition in the Communication Arts Design Annual and STEP inside 100 competition, and have been awarded the Independent Publisher Book Award (the IPPYs) from ''Independent Publisher'' magazine. Availability: No Starch Press titles are available online and in ...