Systematic Code
In coding theory, a systematic code is any error-correcting code in which the input data is embedded in the encoded output. Conversely, in a non-systematic code the output does not contain the input symbols. Systematic codes have the advantage that the parity data can simply be appended to the source block, and receivers do not need to recover the original source symbols if they are received correctly – this is useful, for example, if error-correction coding is combined with a hash function for quickly determining the correctness of the received source symbols, or in cases where errors occur in the form of erasures and a received symbol is thus always correct. Furthermore, for engineering purposes such as synchronization and monitoring, it is desirable to obtain reasonably good estimates of the received source symbols without going through the lengthy decoding process, which may be carried out at a remote site at a later time.
Properties
Every non-systematic linear code can be transformed into a syst ...
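
As a minimal sketch of the systematic layout described above, the toy encoder below appends a single XOR parity byte to the source block, so the data symbols remain readable in place. The one-byte parity scheme and the function names are illustrative assumptions, not part of any particular standard.

def encode_systematic(data: bytes) -> bytes:
    parity = 0
    for b in data:
        parity ^= b                      # one XOR parity byte over the whole block
    return data + bytes([parity])        # source symbols first, parity appended

def check(codeword: bytes) -> bool:
    data, parity = codeword[:-1], codeword[-1]
    expected = 0
    for b in data:
        expected ^= b
    return expected == parity            # recompute and compare on receipt

cw = encode_systematic(b"hello")
print(cw[:-1])                           # b'hello': the input is embedded verbatim
print(check(cw))                         # True; a corrupted byte would give False

Because the code is systematic, a receiver that already trusts the block (for instance after a hash check) can use the leading bytes directly, without any decoding step.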

Coding Theory
Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data. There are four types of coding:
# Data compression (or ''source coding'')
# Error control (or ''channel coding'')
# Cryptographic coding
# Line coding
Data compression attempts to remove unwanted redundancy from the data of a source in order to transmit it more efficiently. For example, ZIP data compression makes data files smaller, for purposes such as reducing Internet traffic. Data compression and er ...
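
To make the source-coding case concrete, here is a small sketch using Python's zlib to remove redundancy from a repetitive byte string and then recover it losslessly; the sample data is an illustrative assumption.

import zlib

message = b"abc" * 50                           # a highly redundant source block
compressed = zlib.compress(message)
print(len(message), len(compressed))            # the compressed form is much shorter
assert zlib.decompress(compressed) == message   # lossless: fully recoverable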

Hash Function
A hash function is any function that can be used to map data of arbitrary size to fixed-size values. The values returned by a hash function are called ''hash values'', ''hash codes'', ''digests'', or simply ''hashes''. The values are usually used to index a fixed-size table called a ''hash table''. Use of a hash function to index a hash table is called ''hashing'' or ''scatter storage addressing''. Hash functions and their associated hash tables are used in data storage and retrieval applications to access data in a small and nearly constant time per retrieval. They require an amount of storage space only fractionally greater than the total space required for the data or records themselves. Hashing is a computationally and storage space-efficient form of data access that avoids the non-constant access time of ordered and unordered lists and structured trees, and the often exponential storage requirements of direct access of state spaces of large or variable-length keys. Use o ...
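
As a small sketch of the idea described above, the snippet below uses the 64-bit FNV-1a hash (a real, widely documented non-cryptographic hash) to map byte strings of arbitrary length to fixed-size values, and then uses those values modulo the table size to index a tiny hash table. The table size and keys are illustrative assumptions.

def fnv1a(data: bytes) -> int:
    h = 0xcbf29ce484222325                 # 64-bit FNV offset basis
    for byte in data:
        h = ((h ^ byte) * 0x100000001b3) & 0xFFFFFFFFFFFFFFFF
    return h                               # fixed-size (64-bit) hash value

TABLE_SIZE = 8
table = [[] for _ in range(TABLE_SIZE)]    # small hash table with chaining

def put(key: bytes, value):
    table[fnv1a(key) % TABLE_SIZE].append((key, value))

def get(key: bytes):
    for k, v in table[fnv1a(key) % TABLE_SIZE]:
        if k == key:
            return v

put(b"alice", 1)
put(b"bob", 2)
print(get(b"bob"))                         # 2, found in (near) constant time per retrieval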

Fountain Code
In coding theory, fountain codes (also known as rateless erasure codes) are a class of erasure codes with the property that a potentially limitless sequence of encoding symbols can be generated from a given set of source symbols such that the original source symbols can ideally be recovered from any subset of the encoding symbols of size equal to or only slightly larger than the number of source symbols. The term ''fountain'' or ''rateless'' refers to the fact that these codes do not exhibit a fixed code rate. A fountain code is optimal if the original ''k'' source symbols can be recovered from any ''k'' successfully received encoding symbols (i.e., excluding those that were erased). Fountain codes are known that have efficient encoding and decoding algorithms and that allow the recovery of the original ''k'' source symbols from any ''k’'' of the encoding symbols with high probability, where ''k’'' is just slightly larger than ''k''. LT codes were the first practical realizati ...
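
The generator below is a toy sketch of the rateless idea: from ''k'' source symbols it keeps producing encoding symbols as XORs of randomly chosen subsets. A practical LT code draws the subset sizes from a (robust) soliton distribution and decodes by belief propagation; the uniform random choices and names here are simplifying assumptions.

import random

def fountain(source_symbols, seed=0):
    k = len(source_symbols)
    rng = random.Random(seed)
    while True:                                    # potentially limitless symbol stream
        degree = rng.randint(1, k)
        indices = rng.sample(range(k), degree)     # which source symbols to mix
        symbol = 0
        for i in indices:
            symbol ^= source_symbols[i]
        yield indices, symbol                      # the receiver also needs the index set

src = [0x12, 0x34, 0x56, 0x78]                     # k = 4 source symbols
gen = fountain(src)
for _ in range(6):                                 # any k' >= k of these may suffice to decode
    print(next(gen))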

Cyclic Redundancy Check
A cyclic redundancy check (CRC) is an error-detecting code commonly used in digital networks and storage devices to detect accidental changes to digital data. Blocks of data entering these systems get a short ''check value'' attached, based on the remainder of a polynomial division of their contents. On retrieval, the calculation is repeated and, in the event the check values do not match, corrective action can be taken against data corruption. CRCs can be used for error correction (see bitfilters). CRCs are so called because the ''check'' (data verification) value is a ''redundancy'' (it expands the message without adding information) and the algorithm is based on ''cyclic'' codes. CRCs are popular because they are simple to implement in binary hardware, easy to analyze mathematically, and particularly good at detecting common errors caused by noise in transmission channels. Because the check value has a fixed length, the function that generates it is occasionally used ...
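
A small sketch of the attach-and-recheck procedure described above, using the CRC-32 polynomial available through Python's zlib; the framing convention (a 4-byte big-endian check value appended to the block) is an illustrative assumption.

import zlib

def attach_crc(block: bytes) -> bytes:
    return block + zlib.crc32(block).to_bytes(4, "big")

def verify(frame: bytes) -> bool:
    block, received = frame[:-4], int.from_bytes(frame[-4:], "big")
    return zlib.crc32(block) == received           # recompute and compare on retrieval

frame = attach_crc(b"some payload")
print(verify(frame))                               # True
print(verify(frame[:-5] + b"\x00" + frame[-4:]))   # corrupted payload byte -> False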

Burst Transmission
In telecommunication, a burst transmission or data burst is the broadcast of a relatively high-bandwidth transmission over a short period. Burst transmission can be intentional, broadcasting a compressed message at a very high data signaling rate within a very short transmission time. In the 1980s, the term "data burst" (and "info burst") was used for a technique used by some United Kingdom and South African TV programmes to transmit large amounts of primarily textual information. They would display multiple pages of text in rapid succession, usually at the end of the programme; viewers would videotape it and then read it later by playing it back using the pause button after each page. Data bursts can occur naturally, such as when the download of data from the internet briefly experiences higher speeds. They can also occur in a computer network where data transmission is interrupted at intervals. Burst transmission enables communications between data terminal equipment (DTEs) and a ...

Reed–Solomon Error Correction
Reed–Solomon codes are a group of error-correcting codes that were introduced by Irving S. Reed and Gustave Solomon in 1960. They have many applications, the most prominent of which include consumer technologies such as MiniDiscs, CDs, DVDs, Blu-ray discs, QR codes, data transmission technologies such as DSL and WiMAX, broadcast systems such as satellite communications, DVB and ATSC, and storage systems such as RAID 6. Reed–Solomon codes operate on a block of data treated as a set of finite-field elements called symbols. Reed–Solomon codes are able to detect and correct multiple symbol errors. By adding ''t'' = ''n'' − ''k'' check symbols to the data, a Reed–Solomon code can detect (but not correct) any combination of up to ''t'' erroneous symbols, ''or'' locate and correct up to ⌊''t''/2⌋ erroneous symbols at unknown locations. As an erasure code, it can correct up to ''t'' erasures at locations that are known and provided to the algorithm, or it can detect and correct combinations of e ...
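
As a quick numeric illustration of the check-symbol arithmetic above, using the widely cited RS(255, 223) parameters as an example:

n, k = 255, 223        # codeword length and number of data symbols
t = n - k              # 32 check symbols are appended to the data
print(t)               # up to t = 32 erroneous symbols can be detected
print(t // 2)          # up to floor(t/2) = 16 symbol errors can be corrected
print(t)               # or up to t = 32 erasures at known locations can be corrected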

DVB-H
DVB-H (Digital Video Broadcasting - Handheld) is one of three prevalent mobile TV formats. It is a technical specification for bringing broadcast services to mobile handsets. DVB-H was formally adopted as ETSI standard EN 302 304 in November 2004. The DVB-H specification (EN 302 304) can be downloaded from the official DVB-H website. Since March 2008, DVB-H has been officially endorsed by the European Union as the "preferred technology for terrestrial mobile broadcasting". The major competitors of this technology are Qualcomm's MediaFLO system, the 3G cellular-system-based MBMS mobile-TV standard, and the ATSC-M/H format in the U.S. DVB-SH (Satellite to Handhelds), available now, and DVB-NGH (Next Generation Handheld), planned for the future, are possible enhancements to DVB-H, providing improved spectral efficiency and better modulation flexibility. DVB-H has been a commercial failure, and the service is no longer on-air. Ukraine was the last country with a nationwide broadcast in DVB-H, which began tr ...

Viterbi Decoder
A Viterbi decoder uses the Viterbi algorithm for decoding a bitstream that has been encoded using a convolutional code or trellis code. There are other algorithms for decoding a convolutionally encoded stream (for example, the Fano algorithm). The Viterbi algorithm is the most resource-consuming, but it performs maximum likelihood decoding. It is most often used for decoding convolutional codes with constraint lengths k≤3, but values up to k=15 are used in practice. Viterbi decoding was developed by Andrew J. Viterbi. There are both hardware (in modems) and software implementations of a Viterbi decoder. Viterbi decoding is used in the iterative Viterbi decoding algorithm.
Hardware implementation
A hardware Viterbi decoder for basic (not punctured) code usually consists of the following major blocks:
*Branch metric unit (BMU)
*Path metric unit (PMU)
*Traceback unit (TBU)
Branch metric unit (BMU)
A branch metric unit's function is to cal ...
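
The sketch below is a minimal hard-decision Viterbi decoder for a small rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal), with the branch metric, path metric and traceback steps marked in comments. The particular code, message and helper names are illustrative assumptions rather than anything prescribed here.

G1, G2 = 0b111, 0b101        # generator polynomials (7 and 5 octal)
K = 3                        # constraint length
N_STATES = 1 << (K - 1)      # 4 encoder states

def parity(x):
    return bin(x).count("1") & 1

def encode(bits):
    # Convolutionally encode `bits`, flushing with K-1 zero tail bits.
    state, out = 0, []
    for u in bits + [0] * (K - 1):
        reg = (u << (K - 1)) | state
        out += [parity(reg & G1), parity(reg & G2)]
        state = (u << (K - 2)) | (state >> 1)
    return out

def viterbi_decode(received):
    # Hard-decision Viterbi decoding of a terminated stream.
    INF = float("inf")
    metrics = [0] + [INF] * (N_STATES - 1)   # path metric unit (PMU) state
    history = []                             # survivor memory for the traceback unit (TBU)
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics = [INF] * N_STATES
        survivors = [None] * N_STATES
        for s in range(N_STATES):
            if metrics[s] == INF:
                continue
            for u in (0, 1):
                reg = (u << (K - 1)) | s
                expected = [parity(reg & G1), parity(reg & G2)]
                branch = sum(a != b for a, b in zip(expected, r))   # branch metric unit (BMU)
                nxt = (u << (K - 2)) | (s >> 1)
                if metrics[s] + branch < new_metrics[nxt]:
                    new_metrics[nxt] = metrics[s] + branch
                    survivors[nxt] = (s, u)
        metrics = new_metrics
        history.append(survivors)
    state, decoded = 0, []                   # trace back from the all-zero (flushed) state
    for survivors in reversed(history):
        prev, u = survivors[state]
        decoded.append(u)
        state = prev
    decoded.reverse()
    return decoded[:-(K - 1)]                # drop the flushing tail bits

msg = [1, 0, 1, 1, 0, 0, 1]
coded = encode(msg)
coded[3] ^= 1                                # inject a single channel bit error
assert viterbi_decode(coded) == msg          # the most likely path recovers the message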

Convolutional Code
In telecommunication, a convolutional code is a type of error-correcting code that generates parity symbols via the sliding application of a boolean polynomial function to a data stream. The sliding application represents the 'convolution' of the encoder over the data, which gives rise to the term 'convolutional coding'. The sliding nature of the convolutional codes facilitates trellis decoding using a time-invariant trellis. Time invariant trellis decoding allows convolutional codes to be maximum-likelihood soft-decision decoded with reasonable complexity. The ability to perform economical maximum likelihood soft decision decoding is one of the major benefits of convolutional codes. This is in contrast to classic block codes, which are generally represented by a time-variant trellis and therefore are typically hard-decision decoded. Convolutional codes are often characterized by the base code rate and the depth (or memory) of the encoder [''n'', ''k'', ''K'']. The base code rate is t ...
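
As a minimal sketch of the "sliding application of a boolean polynomial function" described above: each output symbol below is a mod-2 inner product of a sliding window of the input stream with one generator polynomial. The rate-1/2 generators (111 and 101) and the unterminated framing are illustrative assumptions.

def conv_encode(bits, generators=((1, 1, 1), (1, 0, 1))):
    K = len(generators[0])              # constraint length (window size)
    padded = [0] * (K - 1) + bits       # the window starts filled with zeros
    out = []
    for i in range(len(bits)):
        window = padded[i:i + K][::-1]  # newest input bit first
        for g in generators:
            # mod-2 inner product of the window with one generator polynomial:
            # the GF(2) convolution that gives the code its name
            out.append(sum(w & c for w, c in zip(window, g)) % 2)
    return out

# one input bit produces two coded bits, so the base code rate is 1/2
print(conv_encode([1, 0, 1, 1]))        # [1, 1, 1, 0, 0, 0, 0, 1]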

Compact Disc
The compact disc (CD) is a digital optical disc data storage format that was co-developed by Philips and Sony to store and play digital audio recordings. In August 1982, the first compact disc was manufactured. It was then released in October 1982 in Japan and branded as ''Digital Audio Compact Disc''. The format was later adapted (as CD-ROM) for general-purpose data storage. Several other formats were further derived, including write-once audio and data storage (CD-R), rewritable media (CD-RW), Video CD (VCD), Super Video CD (SVCD), Photo CD, Picture CD, Compact Disc-Interactive (CD-i) and Enhanced Music CD. Standard CDs have a diameter of 120 mm and are designed to hold up to 74 minutes of uncompressed stereo digital audio or about 650 MiB of data. Capacity is routinely extended to 80 minutes and 700 MiB by arranging data more closely on the same-sized disc. The Mini CD has various diameters ranging from 60 to 80 mm; they are sometimes used for CD singles, stori ...

Checksum
A checksum is a small-sized block of data derived from another block of digital data for the purpose of detecting errors that may have been introduced during its transmission or storage. By themselves, checksums are often used to verify data integrity but are not relied upon to verify data authenticity. The procedure which generates this checksum is called a checksum function or checksum algorithm. Depending on its design goals, a good checksum algorithm usually outputs a significantly different value, even for small changes made to the input. This is especially true of cryptographic hash functions, which may be used to detect many data corruption errors and verify overall data integrity; if the computed checksum for the current data input matches the stored value of a previously computed checksum, there is a very high probability the data has not been accidentally altered or corrupted. Checksum functions are related to hash functions, fingerprints, randomization functi ...
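
A short sketch of the property described above: when a cryptographic hash such as SHA-256 is used as a checksum, even a one-character change to the input produces a completely different digest. The sample strings are illustrative assumptions.

import hashlib

original = b"important record"
altered = b"important recorc"                # single-character change

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())   # differs in roughly half the bits
assert hashlib.sha256(original).digest() != hashlib.sha256(altered).digest()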

Error-correcting Code
In computing, telecommunication, information theory, and coding theory, an error correction code, sometimes error-correcting code (ECC), is used for controlling errors in data over unreliable or noisy communication channels. The central idea is that the sender encodes the message with redundant information in the form of an ECC. The redundancy allows the receiver to detect a limited number of errors that may occur anywhere in the message, and often to correct these errors without retransmission. The American mathematician Richard Hamming pioneered this field in the 1940s and invented the first error-correcting code in 1950: the Hamming (7,4) code. ECC contrasts with error detection in that errors that are encountered can be corrected, not simply detected. The advantage is that a system using ECC does not require a reverse channel to request retransmission of data when an error occurs. The downside is that there is a fixed overhead that is added to the message, thereby requiring a ...
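
The snippet below is a minimal sketch of the Hamming (7,4) code mentioned above: four data bits are protected by three parity bits, and the syndrome computed at the receiver names the position of any single bit error so it can be corrected without retransmission. The bit ordering follows the usual positional convention; the variable names are illustrative.

def hamming74_encode(d):                    # d = [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]     # codeword positions 1..7

def hamming74_correct(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3              # the syndrome names the error position
    if pos:
        c[pos - 1] ^= 1                     # flip the erroneous bit
    return [c[2], c[4], c[5], c[6]]         # recover d1..d4

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                                # single transmission error
assert hamming74_correct(word) == [1, 0, 1, 1]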