Z-channel (information theory)

In coding theory and information theory, a Z-channel (binary asymmetric channel) is a communications channel used to model the behaviour of some data storage systems.


Definition

A Z-channel is a channel with binary input and binary output, where each 0 bit is transmitted correctly, but each 1 bit has probability ''p'' of being transmitted incorrectly as a 0, and probability 1 − ''p'' of being transmitted correctly as a 1. In other words, if ''X'' and ''Y'' are the random variables describing the input and the output of the channel, respectively, then the crossovers of the channel are characterized by the conditional probabilities:

:\begin{align} \operatorname{Pr}[Y = 0 \mid X = 0] &= 1 \\ \operatorname{Pr}[Y = 0 \mid X = 1] &= p \\ \operatorname{Pr}[Y = 1 \mid X = 0] &= 0 \\ \operatorname{Pr}[Y = 1 \mid X = 1] &= 1 - p \end{align}
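This crossover behaviour is straightforward to simulate. The following Python sketch (an illustration, not part of the original article; the function name is my own choosing) passes bit sequences through a Z-channel and empirically confirms that only 1 → 0 errors occur:

```python
import random

def z_channel(bits, p, rng=None):
    """Pass bits through a Z-channel: a 0 is always received as 0,
    while a 1 is received as 0 with probability p (and as 1 otherwise)."""
    rng = rng or random.Random(0)
    return [0 if b == 1 and rng.random() < p else b for b in bits]

n, p = 100_000, 0.3
received_ones = z_channel([1] * n, p)
received_zeros = z_channel([0] * n, p)

print(sum(received_zeros))           # 0: zeros are never corrupted
print((n - sum(received_ones)) / n)  # empirical fraction of 1 -> 0 flips, close to p
```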


Capacity

The channel capacity \mathsf{cap}(\mathbb{Z}) of the Z-channel \mathbb{Z} with the crossover 1 → 0 probability ''p'' is given by the following equation:

:\mathsf{cap}(\mathbb{Z}) = \mathsf{H}\left(\frac{1}{1+2^{\mathsf{s}(p)}}\right) - \frac{\mathsf{s}(p)}{1+2^{\mathsf{s}(p)}} = \log_2\left(1+2^{-\mathsf{s}(p)}\right) = \log_2\left(1+(1-p)\, p^{p/(1-p)}\right)

where \mathsf{s}(p) = \frac{\mathsf{H}(p)}{1-p} for the binary entropy function \mathsf{H}(\cdot).

This capacity is obtained when the input variable ''X'' has Bernoulli distribution with probability \alpha of having value 0 and 1-\alpha of value 1, where:

:\alpha = 1 - \frac{1}{(1-p)\left(1+2^{\mathsf{H}(p)/(1-p)}\right)}.

For small ''p'', the capacity is approximated by

:\mathsf{cap}(\mathbb{Z}) \approx 1 - 0.5 \mathsf{H}(p)

as compared to the capacity 1 - \mathsf{H}(p) of the binary symmetric channel with crossover probability ''p''.

For any ''p'' > 0, \alpha > 0.5 (i.e. more 0s should be transmitted than 1s) because transmitting a 1 introduces noise. As p \rightarrow 1, the limiting value of \alpha is 1 - \frac{1}{e}.
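As a numerical check on these formulas, here is a short Python sketch (illustrative only; the function names are my own) that evaluates the closed-form capacity, the capacity-achieving input distribution, and the comparison with the binary symmetric channel:

```python
import math

def H(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def z_capacity(p):
    """Closed-form Z-channel capacity: log2(1 + (1 - p) * p**(p / (1 - p)))."""
    if p >= 1.0:
        return 0.0
    return math.log2(1 + (1 - p) * p ** (p / (1 - p)))

def optimal_alpha(p):
    """Capacity-achieving input probability of the symbol 0."""
    s = H(p) / (1 - p)
    return 1 - 1 / ((1 - p) * (1 + 2 ** s))

p = 0.1
print(z_capacity(p))     # ~0.763 bits
print(1 - H(p))          # ~0.531 bits: BSC capacity at the same p, noticeably lower
print(optimal_alpha(p))  # ~0.544: slightly more 0s than 1s at the optimum
```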


Bounds on the size of an asymmetric-error-correcting code

Define the following distance function \mathsf{d}_A(\mathbf{x}, \mathbf{y}) on the words \mathbf{x}, \mathbf{y} \in \{0,1\}^n of length ''n'' transmitted via a Z-channel:

:\mathsf{d}_A(\mathbf{x}, \mathbf{y}) \stackrel{\vartriangle}{=} \max\left\{ \left|\{i \mid x_i = 0, y_i = 1\}\right|,\; \left|\{i \mid x_i = 1, y_i = 0\}\right| \right\}.

Define the sphere V_t(\mathbf{x}) of radius ''t'' around a word \mathbf{x} \in \{0,1\}^n of length ''n'' as the set of all the words at distance ''t'' or less from \mathbf{x}, in other words,

:V_t(\mathbf{x}) = \{\mathbf{y} \in \{0,1\}^n \mid \mathsf{d}_A(\mathbf{x}, \mathbf{y}) \leq t\}.

A code \mathcal{C} of length ''n'' is said to be ''t''-asymmetric-error-correcting if for any two codewords \mathbf{c} \ne \mathbf{c}' \in \mathcal{C}, one has V_t(\mathbf{c}) \cap V_t(\mathbf{c}') = \emptyset. Denote by M(n,t) the maximum number of codewords in a ''t''-asymmetric-error-correcting code of length ''n''.

The Varshamov bound. For ''n'' ≥ 1 and ''t'' ≥ 1,

:M(n,t) \leq \frac{2^{n+1}}{\sum_{j=0}^t \left( \binom{\lfloor n/2 \rfloor}{j} + \binom{\lceil n/2 \rceil}{j} \right)}.

The constant-weight code bound. For ''n'' > 2''t'' ≥ 2, let the sequence B_0, B_1, \ldots, B_{n-2t-1} be defined as

:B_0 = 2, \quad B_i = \min_{0 \leq j < i}\left\{ B_j + A(n+t+i-j-1,\, 2t+2,\, t+i) \right\} \text{ for } i > 0,

where A(n,d,w) denotes the maximum size of a binary constant-weight code of length ''n'', minimum Hamming distance ''d'', and constant weight ''w''. Then M(n,t) \leq B_{n-2t-1}.
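For small parameters these definitions can be verified by brute force. The sketch below (illustrative Python with helper names of my own choosing; the Varshamov bound is implemented in its standard form for asymmetric codes) computes the asymmetric distance \mathsf{d}_A, the spheres V_t, and an exhaustive value of M(n,t), then compares it with the bound:

```python
from itertools import combinations, product
from math import comb

def d_A(x, y):
    """Asymmetric distance: max of the two one-way disagreement counts."""
    n01 = sum(a == 0 and b == 1 for a, b in zip(x, y))
    n10 = sum(a == 1 and b == 0 for a, b in zip(x, y))
    return max(n01, n10)

def sphere(x, t, n):
    """V_t(x): all length-n binary words within asymmetric distance t of x."""
    return {y for y in product((0, 1), repeat=n) if d_A(x, y) <= t}

def is_t_aec(code, t, n):
    """A code is t-asymmetric-error-correcting iff its spheres are pairwise disjoint."""
    return all(sphere(x, t, n).isdisjoint(sphere(y, t, n))
               for x, y in combinations(code, 2))

def brute_force_M(n, t):
    """Exhaustive M(n, t) for tiny n: size of a largest t-AEC code of length n."""
    words = list(product((0, 1), repeat=n))
    for size in range(len(words), 0, -1):
        for code in combinations(words, size):
            if is_t_aec(code, t, n):
                return size
    return 0

def varshamov_bound(n, t):
    """Varshamov upper bound on M(n, t), floored since M(n, t) is an integer."""
    denom = sum(comb(n // 2, j) + comb((n + 1) // 2, j) for j in range(t + 1))
    return 2 ** (n + 1) // denom

n, t = 3, 1
print(brute_force_M(n, t), "<=", varshamov_bound(n, t))  # prints: 2 <= 3
```

For ''n'' = 3, ''t'' = 1 the only optimal code found is {000, 111}, matching the intuition that asymmetric errors are best resisted by well-separated constant-weight-like words.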



References

* {{cite conference |last1=Tallini |first1=L.G. |last2=Al-Bassam |first2=S. |last3=Bose |first3=B. |title=On the capacity and codes for the Z-channel |conference=Proceedings of the IEEE International Symposium on Information Theory |location=Lausanne, Switzerland |year=2002 |page=422}}