Erasure Channel
In coding theory and information theory, a binary erasure channel (BEC) is a communications channel model. A transmitter sends a bit (a zero or a one), and the receiver either receives the bit correctly or, with some probability P_e, receives a message that the bit was not received ("erased").
Definition
A binary erasure channel with erasure probability P_e is a channel with binary input, ternary output, and probability of erasure P_e. That is, let X be the transmitted random variable with alphabet \{0, 1\}. Let Y be the received variable with alphabet \{0, 1, \text{e}\}, where \text{e} is the erasure symbol. Then, the channel is characterized by the conditional probabilities:
:\begin{align}
\operatorname{Pr}[Y = 0 \mid X = 0] &= 1 - P_e \\
\operatorname{Pr}[Y = 0 \mid X = 1] &= 0 \\
\operatorname{Pr}[Y = 1 \mid X = 0] &= 0 \\
\operatorname{Pr}[Y = 1 \mid X = 1] &= 1 - P_e \\
\operatorname{Pr}[Y = \text{e} \mid X = 0] &= P_e \\
\operatorname{Pr}[Y = \text{e} \mid X = 1] &= P_e
\end{align}
Capacity
The channel capacity of a BEC is 1 - P_e, attained with a uniform distribution for X (i.e. half of the inputs ...
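To make the definition concrete, here is a minimal Python sketch (an illustration added here, not part of the original article; the function names are invented) that simulates a BEC and checks that the observed erasure rate approaches P_e:

    import random

    ERASURE = 'e'  # erasure symbol: the receiver knows this bit was lost

    def bec_transmit(bits, p_e, rng=random):
        """Pass each bit through a binary erasure channel: the bit arrives
        intact with probability 1 - p_e, otherwise the erasure symbol 'e'
        is delivered in its place."""
        return [ERASURE if rng.random() < p_e else b for b in bits]

    random.seed(0)
    p_e = 0.2
    sent = [random.randint(0, 1) for _ in range(100_000)]
    received = bec_transmit(sent, p_e)
    erased = sum(1 for y in received if y == ERASURE)
    # Every non-erased symbol equals the transmitted bit, by definition.
    assert all(y == x for x, y in zip(sent, received) if y != ERASURE)
    print(f"observed erasure rate: {erased / len(sent):.3f}")  # ~0.200
    print(f"capacity 1 - P_e = {1 - p_e:.3f} bits per channel use")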


Coding Theory
Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines (such as information theory, electrical engineering, mathematics, linguistics, and computer science) for the purpose of designing efficient and reliable data transmission methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data. There are four types of coding:
1. Data compression (or ''source coding'')
2. Error control (or ''channel coding'')
3. Cryptographic coding
4. Line coding
Data compression attempts to remove unwanted redundancy from the data from a source in order to transmit it more efficiently. For example, ZIP data compression makes data files smaller, for purposes such as reducing Internet traffic. Data compression a ...
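As a small illustration of source coding, the following sketch (added here, using Python's standard zlib module) removes redundancy from a highly repetitive byte string; the exact compressed size depends on the data:

    import zlib

    # Highly redundant source data compresses well; random data would not.
    data = b"abcabcabc" * 1000
    compressed = zlib.compress(data)
    restored = zlib.decompress(compressed)

    assert restored == data  # this form of source coding is lossless
    print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")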


Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual informat ...
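The coin-versus-die comparison can be checked directly; a short sketch (added here, not part of the quoted text) computing the entropy of a uniform distribution:

    import math

    def entropy(probs):
        """Shannon entropy in bits, with the convention 0 * log2(0) = 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    coin = [1/2, 1/2]   # fair coin: two equally likely outcomes
    die = [1/6] * 6     # fair die: six equally likely outcomes

    print(f"H(coin) = {entropy(coin):.3f} bits")  # 1.000
    print(f"H(die)  = {entropy(die):.3f} bits")   # ~2.585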


Communications Channel
A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. A channel is used for information transfer of, for example, a digital bit stream, from one or several ''senders'' to one or several ''receivers''. A channel has a certain capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second. Communicating an information signal across distance requires some form of pathway or medium. These pathways, called communication channels, use two types of media: transmission lines (e.g. twisted-pair, coaxial, and fiber-optic cable) and broadcast (e.g. microwave, satellite, radio, and infrared). In information theory, a channel refers to a theoretical ''channel model'' with certain error characteristics. In this more general view, a storage device is also a communication channel, wh ...


Random Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set \{H, T\}) to a measurable space, often the real numbers (e.g., \{-1, 1\}, in which 1 corresponds to H and -1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random var ...
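In code, the coin example is literally a function from the sample space {H, T} to the reals; a minimal sketch (illustrative names, added here):

    import random

    SAMPLE_SPACE = ["H", "T"]

    def X(outcome):
        """The random variable as a mapping: H -> 1, T -> -1."""
        return 1 if outcome == "H" else -1

    random.seed(0)
    outcome = random.choice(SAMPLE_SPACE)  # perform the random experiment
    print(f"outcome {outcome} maps to X = {X(outcome)}")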


Conditional Probability
In probability theory, conditional probability is a measure of the probability of an event occurring, given that another event (by assumption, presumption, assertion or evidence) has already occurred. This particular method relies on event B occurring with some sort of relationship with another event A. In this event, the event B can be analyzed by a conditional probability with respect to A. If the event of interest is A and the event B is known or assumed to have occurred, "the conditional probability of A given B", or "the probability of A under the condition B", is usually written as P(A \mid B) or occasionally P_B(A). This can also be understood as the fraction of probability B that intersects with A: P(A \mid B) = \frac{P(A \cap B)}{P(B)}. For example, the probability that any given person has a cough on any given day may be only 5%. But if we know or assume that the person is sick, then they are much more likely to be coughing. For example, the conditional probability that someone unwell (sick) is coughing might be ...
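The cough example can be worked through numerically. The sketch below invents plausible numbers (5% of people cough on a given day, 10% are sick, 4% are both) purely to illustrate P(A \mid B) = P(A \cap B) / P(B); they are assumptions, not figures from the text:

    # Hypothetical daily probabilities (assumed for illustration only).
    p_cough = 0.05            # P(A): person is coughing
    p_sick = 0.10             # P(B): person is sick
    p_cough_and_sick = 0.04   # P(A and B): person is sick AND coughing

    # Conditional probability: P(A | B) = P(A and B) / P(B).
    p_cough_given_sick = p_cough_and_sick / p_sick
    print(f"P(cough)        = {p_cough:.0%}")
    print(f"P(cough | sick) = {p_cough_given_sick:.0%}")  # 40%, much higher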




Binary Entropy Function
In information theory, the binary entropy function, denoted \operatorname H(p) or \operatorname H_\text{b}(p), is defined as the entropy of a Bernoulli process with probability p of one of two values. It is a special case of \operatorname H(X), the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable X that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive. If \operatorname{Pr}(X=1) = p, then \operatorname{Pr}(X=0) = 1-p and the entropy of X (in shannons) is given by
:\operatorname H(X) = \operatorname H_\text{b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p),
where 0 \log_2 0 is taken to be 0. The logarithms in this formula are usually taken to base 2. See ''binary logarithm''. When p = \tfrac{1}{2}, the binary entropy function attains its maximum value. This is the case of an unbiased coin flip. \operatorname H(p) is distinguished from the entropy function \operatorname H(X) in that the former takes a single real number ...
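A direct transcription of the formula, with the 0 \log_2 0 = 0 convention and a check that the maximum sits at p = 1/2 (a sketch added here, not from the article):

    import math

    def binary_entropy(p):
        """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with 0*log2(0) taken as 0."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))   # 1.0, the maximum (unbiased coin flip)
    print(binary_entropy(0.11))  # ~0.4999
    print(binary_entropy(0.0))   # 0.0, no uncertainty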


Noisy-channel Coding Theorem
In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley. The Shannon limit or Shannon capacity of a communication channel refers to the maximum rate of error-free data that can theoretically be transferred over the channel if the link is subject to random data transmission errors, for a particular noise level. It was first described by Shannon (1948), and shortly after published in a book by Shannon and Warren Weaver entitled ''The Mathematical Theory of Communication'' (1949). This founded the modern discipline of information theory.
Overview
Stated by Claude Shannon ...
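As a numeric illustration of the theorem's "computable maximum rate" (an example added here, using the binary symmetric channel treated below): for a crossover probability of 0.11 the capacity works out to about 0.5 bits per channel use, so rates below 0.5 are achievable with vanishing error and rates above it are not.

    import math

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p = 0.11                          # crossover probability of a BSC
    capacity = 1 - binary_entropy(p)  # Shannon capacity, bits per use
    print(f"capacity ~= {capacity:.3f} bits/use")
    # The theorem: any rate R < capacity is achievable with arbitrarily
    # small error probability; any R > capacity is not.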


Binary Symmetric Channel
A binary symmetric channel (or BSC_p) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver will receive a bit. The bit will be "flipped" with a "crossover probability" of ''p'', and otherwise is received correctly. This model can be applied to varied communication channels such as telephone lines or disk drive storage. The noisy-channel coding theorem applies to BSC_p, saying that information can be transmitted at any rate up to the channel capacity with arbitrarily low error. The channel capacity is 1 - \operatorname H_\text{b}(p) bits, where \operatorname H_\text{b} is the binary entropy function. Codes including Forney's code have been designed to transmit information efficiently across the channel.
Definition
A binary symmetric channel with crossover probability p, denoted by BSC_p, is a channel with binary input and binary output and probability of error p. That i ...
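A minimal simulation of BSC_p, parallel to the erasure-channel sketch above (added here; the function names are invented):

    import random

    def bsc_transmit(bits, p, rng=random):
        """Flip each bit with crossover probability p; deliver it intact
        otherwise. Unlike the BEC, the receiver gets no flag on errors."""
        return [b ^ 1 if rng.random() < p else b for b in bits]

    random.seed(0)
    p = 0.1
    sent = [random.randint(0, 1) for _ in range(100_000)]
    received = bsc_transmit(sent, p)
    flips = sum(x != y for x, y in zip(sent, received))
    print(f"observed crossover rate: {flips / len(sent):.3f}")  # ~0.100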


Deletion Channel
A deletion channel is a communications channel model used in coding theory and information theory. In this model, a transmitter sends a bit (a zero or a one), and the receiver either receives the bit (with probability 1-p) or does not receive anything, without being notified that the bit was dropped (with probability p). Determining the capacity of the deletion channel is an open problem. The deletion channel should not be confused with the binary erasure channel, which is much simpler to analyze.
Formal description
Let p be the deletion probability, 0 < p < 1. The binary deletion channel is defined as follows: given a sequence of n bits (X_i) as input, each bit X_i can be deleted with probability p. The deletion positions are unknow ...
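The contrast with the erasure channel is easy to see in code: a deleted bit leaves no placeholder, so the output is shorter and the deletion positions are lost. A sketch (illustrative only, added here):

    import random

    def deletion_channel(bits, p, rng=random):
        """Drop each bit independently with probability p; the receiver
        sees only the surviving bits, with no marker at deleted positions."""
        return [b for b in bits if rng.random() >= p]

    random.seed(0)
    sent = [random.randint(0, 1) for _ in range(20)]
    received = deletion_channel(sent, p=0.2)
    # The output is shorter than the input and carries no position
    # information, which is what makes the deletion channel hard to analyze.
    print(f"sent     ({len(sent)} bits): {sent}")
    print(f"received ({len(received)} bits): {received}")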




Peter Elias
Peter Elias (November 23, 1923 – December 7, 2001) was a pioneer in the field of information theory. Born in New Brunswick, New Jersey, he was a member of the Massachusetts Institute of Technology faculty from 1953 to 1991. In 1955, Elias introduced convolutional codes as an alternative to block codes. He also established the binary erasure channel and proposed list decoding of error-correcting codes as an alternative to unique decoding.
Career
Peter Elias was a member of the Massachusetts Institute of Technology faculty from 1953 to 1991. From 1957 until 1966, he served as one of three founding editors of ''Information and Control''.
Awards
Elias received the Claude E. Shannon Award of the IEEE Information Theory Society (1977); the Golden Jubilee Award for Technological Innovation of the IEEE Information Theory Society (1998); and the IEEE Richard W. Hamming Medal (2002).
Family background
Peter Elias was born on November 23, 1923, in New Brunswick, New Jersey. His mother ...