Z-channel (information theory)

In coding theory and information theory, a Z-channel or binary asymmetric channel is a communications channel used to model the behaviour of some data storage systems.


Definition

A Z-channel is a channel with binary input and binary output, where each 0 bit is transmitted correctly, but each 1 bit has probability ''p'' of being transmitted incorrectly as a 0, and probability 1−''p'' of being transmitted correctly as a 1. In other words, if ''X'' and ''Y'' are the random variables describing the probability distributions of the input and the output of the channel, respectively, then the crossovers of the channel are characterized by the conditional probabilities:
:\begin{align}
\operatorname{Pr}[Y = 0 \mid X = 0] &= 1 \\
\operatorname{Pr}[Y = 0 \mid X = 1] &= p \\
\operatorname{Pr}[Y = 1 \mid X = 0] &= 0 \\
\operatorname{Pr}[Y = 1 \mid X = 1] &= 1 - p
\end{align}
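The asymmetry of the definition can be illustrated with a minimal simulation sketch (the function name `z_channel` is illustrative, not from any library):

```python
import random

def z_channel(bits, p):
    """Pass bits through a Z-channel: a 0 is always received as 0,
    while a 1 is received as 0 with probability p (and as 1 otherwise)."""
    return [0 if b == 1 and random.random() < p else b for b in bits]

# 0s are never corrupted; with p = 1 every 1 is erased to 0.
assert z_channel([0, 0, 0], p=1.0) == [0, 0, 0]
assert z_channel([1, 1, 1], p=1.0) == [0, 0, 0]
assert z_channel([1, 0, 1], p=0.0) == [1, 0, 1]
```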


Capacity

The channel capacity \mathsf{cap}(\mathbb{Z}) of the Z-channel \mathbb{Z} with the crossover 1 → 0 probability ''p'', when the input random variable ''X'' is distributed according to the Bernoulli distribution with probability ''\alpha'' for the occurrence of 0, is given by the following equation:
:\mathsf{cap}(\mathbb{Z}) = \mathsf{H}\left(\frac{1}{1+2^{\mathsf{s}(p)}}\right) - \frac{\mathsf{s}(p)}{1+2^{\mathsf{s}(p)}} = \log_2\left(1+2^{-\mathsf{s}(p)}\right) = \log_2\left(1+(1-p)\, p^{p/(1-p)}\right)
where \mathsf{s}(p) = \frac{\mathsf{H}(p)}{1-p} for the binary entropy function \mathsf{H}(\cdot). This capacity is obtained when the input variable ''X'' has Bernoulli distribution with probability \alpha of having value 0 and 1-\alpha of value 1, where:
:\alpha = 1 - \frac{1}{(1-p)\left(1+2^{\mathsf{H}(p)/(1-p)}\right)}.
For small ''p'', the capacity is approximated by
: \mathsf{cap}(\mathbb{Z}) \approx 1 - 0.5 \mathsf{H}(p)
as compared to the capacity 1-\mathsf{H}(p) of the binary symmetric channel with crossover probability ''p''.
: For any p > 0, \alpha > 0.5 (i.e. more 0s should be transmitted than 1s), because transmitting a 1 introduces noise. As p \rightarrow 1, the limiting value of \alpha is 1-\frac{1}{e}.
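The closed-form capacity and the comparison with the binary symmetric channel can be checked numerically; the following sketch uses only the formulas above (function names are illustrative):

```python
from math import log2

def H(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def z_capacity(p):
    """Closed-form Z-channel capacity log2(1 + (1-p) * p**(p/(1-p)))."""
    if p == 0.0:
        return 1.0
    if p == 1.0:
        return 0.0
    return log2(1 + (1 - p) * p ** (p / (1 - p)))

def bsc_capacity(p):
    """Capacity 1 - H(p) of the binary symmetric channel, for comparison."""
    return 1 - H(p)

# The Z-channel retains more capacity than the BSC at the same p,
# and for small p the approximation 1 - 0.5*H(p) is close.
assert z_capacity(0.1) > bsc_capacity(0.1)
assert abs(z_capacity(0.01) - (1 - 0.5 * H(0.01))) < 0.01
```

At ''p'' = 0.5, for instance, the Z-channel still has capacity log₂(1.25) ≈ 0.32 bits per use, whereas the binary symmetric channel's capacity is zero.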


Bounds on the size of an asymmetric-error-correcting code

Define the following distance function \mathsf{d}_A(\mathbf{x}, \mathbf{y}) on the words \mathbf{x}, \mathbf{y} \in \{0,1\}^n of length ''n'' transmitted via a Z-channel:
:\mathsf{d}_A(\mathbf{x}, \mathbf{y}) \stackrel{\vartriangle}{=} \max\left\{ |\{i : x_i = 0, y_i = 1\}|,\; |\{i : x_i = 1, y_i = 0\}| \right\}.
Define the sphere V_t(\mathbf{x}) of radius ''t'' around a word \mathbf{x} \in \{0,1\}^n of length ''n'' as the set of all the words at distance ''t'' or less from \mathbf{x}, in other words,
:V_t(\mathbf{x}) = \{\mathbf{y} \in \{0,1\}^n : \mathsf{d}_A(\mathbf{x}, \mathbf{y}) \leq t\}.
A code \mathcal{C} of length ''n'' is said to be ''t''-asymmetric-error-correcting if for any two codewords \mathbf{c} \ne \mathbf{c}' \in \{0,1\}^n, one has V_t(\mathbf{c}) \cap V_t(\mathbf{c}') = \emptyset. Denote by M(n,t) the maximum number of codewords in a ''t''-asymmetric-error-correcting code of length ''n''.

The Varshamov bound. For ''n'' ≥ 1 and ''t'' ≥ 1,
:M(n,t) \leq \frac{2^{n+1}}{\sum_{j=0}^{t} \left( \binom{\lfloor n/2 \rfloor}{j} + \binom{\lceil n/2 \rceil}{j} \right)}.

The constant-weight code bound. For ''n'' > 2''t'' ≥ 2, let the sequence B_0, B_1, \ldots, B_{n-2t-1} be defined as
:B_0 = 2, \quad B_i = \min_{0 \leq j < i} \left\{ B_j + A(n+t+i-j-1,\, 2t+2,\, t+i) \right\} \text{ for } i > 0,
where A(n,d,w) denotes the maximum size of a binary code of length ''n'', minimum distance ''d'', and constant weight ''w''. Then M(n,t) \leq B_{n-2t-1}.
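For small ''n'', the asymmetric distance, spheres, and the ''t''-asymmetric-error-correcting condition can be checked by brute force; a sketch (the names `d_A`, `sphere`, and `is_t_aec` are illustrative):

```python
from itertools import product

def d_A(x, y):
    """Asymmetric distance: max of the number of 0->1 and 1->0 positions."""
    n01 = sum(1 for a, b in zip(x, y) if a == 0 and b == 1)
    n10 = sum(1 for a, b in zip(x, y) if a == 1 and b == 0)
    return max(n01, n10)

def sphere(x, t):
    """All words of length len(x) within asymmetric distance t of x."""
    return {y for y in product((0, 1), repeat=len(x)) if d_A(x, y) <= t}

def is_t_aec(code, t):
    """t-asymmetric-error-correcting: spheres of radius t are disjoint."""
    words = list(code)
    for i in range(len(words)):
        for j in range(i + 1, len(words)):
            if sphere(words[i], t) & sphere(words[j], t):
                return False
    return True

# The length-3 repetition code corrects one asymmetric error, but not two.
assert is_t_aec([(0, 0, 0), (1, 1, 1)], t=1)
assert not is_t_aec([(0, 0, 0), (1, 1, 1)], t=2)
```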

