Arbitrarily varying channel

An arbitrarily varying channel (AVC) is a communication channel model used in coding theory, first introduced by Blackwell, Breiman, and Thomasian. This channel has unknown parameters that can change over time, and these changes may not follow a uniform pattern during the transmission of a codeword. \textstyle n uses of this channel can be described by a stochastic matrix \textstyle W^n: X^n \times S^n \rightarrow Y^n, where \textstyle X is the input alphabet, \textstyle Y is the output alphabet, and \textstyle W^n(y|x,s) is the probability, over a given set of states \textstyle S, that the transmitted input \textstyle x = (x_1, \ldots, x_n) leads to the received output \textstyle y = (y_1, \ldots, y_n). The state \textstyle s_i in the set \textstyle S can vary arbitrarily at each time unit \textstyle i. The AVC was developed as an alternative to Shannon's binary symmetric channel (BSC), in which the entire nature of the channel is known, in order to model actual network channel situations more realistically.
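If the channel is memoryless given the state sequence, the n-use probability factors into per-letter terms, \textstyle W^n(y|x,s) = \prod_{i=1}^n W(y_i|x_i,s_i). A minimal sketch of this computation, using a made-up binary channel matrix (the alphabets and probabilities are illustrative assumptions, not from any reference):

```python
import numpy as np

# Hypothetical AVC with binary input, output, and state.
# W[s][x][y] = probability of output y given input x and state s.
W = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # state s = 0
    [[0.6, 0.4], [0.4, 0.6]],   # state s = 1
])

def avc_prob(y, x, s, W):
    """n-use probability W^n(y|x,s) = prod_i W(y_i|x_i,s_i),
    assuming the channel is memoryless given the state sequence."""
    p = 1.0
    for yi, xi, si in zip(y, x, s):
        p *= W[si][xi][yi]
    return p

x = [0, 1, 0]   # transmitted codeword
s = [0, 1, 1]   # state sequence, varying arbitrarily per time unit
y = [0, 1, 0]   # received word
print(avc_prob(y, x, s, W))
```

The key point the sketch illustrates is that the state sequence \textstyle s enters the channel law symbol by symbol, so an adversary or environment choosing \textstyle s differently at each time unit changes the effective channel mid-codeword.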


Capacities and associated proofs


Capacity of deterministic AVCs

An AVC's capacity can vary depending on certain parameters.

\textstyle R is an achievable rate for a deterministic AVC code if it is larger than \textstyle 0, and if for every positive \textstyle \varepsilon and \textstyle \delta, and sufficiently large \textstyle n, there exist length-\textstyle n block codes that satisfy the following equations: \textstyle \frac{1}{n}\log N > R - \delta and \displaystyle \max_{s \in S^n} \bar{e}(s) \leq \varepsilon, where \textstyle N is the number of codewords and \textstyle \bar{e}(s) is the average probability of error for a state sequence \textstyle s. The largest achievable rate \textstyle R is the capacity of the AVC, denoted by \textstyle c.

The only useful situations arise when the capacity of the AVC is greater than \textstyle 0, because only then can the channel transmit a guaranteed amount of data \textstyle \leq c without errors. So we start with a theorem that shows when \textstyle c is positive in an AVC; the theorems discussed afterward narrow down the range of \textstyle c for different circumstances. Before stating Theorem 1, a few definitions are needed:

* An AVC is ''symmetric'' if \displaystyle \sum_{s \in S} W(y|x, s)U(s|x') = \sum_{s \in S} W(y|x', s)U(s|x) for every \textstyle (x, x', y), where \textstyle x, x' \in X, \textstyle y \in Y, and \textstyle U(s|x) is a channel function \textstyle U: X \rightarrow S.
* \textstyle X_r, \textstyle S_r, and \textstyle Y_r are random variables in sets \textstyle X, \textstyle S, and \textstyle Y respectively.
* \textstyle P_{X_r}(x) is the probability that the random variable \textstyle X_r is equal to \textstyle x.
* \textstyle P_{S_r}(s) is the probability that the random variable \textstyle S_r is equal to \textstyle s.
* \textstyle P_{X_r S_r Y_r} is the joint probability mass function (pmf) determined by \textstyle P_{X_r}(x), \textstyle P_{S_r}(s), and \textstyle W(y|x,s), defined formally as \textstyle P_{X_r S_r Y_r}(x,s,y) = P_{X_r}(x)P_{S_r}(s)W(y|x,s).
* \textstyle H(X_r) is the entropy of \textstyle X_r.
* \textstyle H(X_r|Y_r) is the conditional entropy of \textstyle X_r given \textstyle Y_r: the average uncertainty remaining about \textstyle X_r over all the values \textstyle Y_r could take.
* \textstyle I(X_r \land Y_r) is the mutual information of \textstyle X_r and \textstyle Y_r, equal to \textstyle H(X_r) - H(X_r|Y_r).
* \displaystyle I(P) = \min_{Y_r} I(X_r \land Y_r), where the minimum is over all random variables \textstyle Y_r such that \textstyle X_r, \textstyle S_r, and \textstyle Y_r are distributed in the form of \textstyle P_{X_r S_r Y_r}.

Theorem 1: \textstyle c > 0 if and only if the AVC is not symmetric. If \textstyle c > 0, then \displaystyle c = \max_P I(P).

''Proof of first part (symmetry):'' If we can prove that \textstyle I(P) is positive when the AVC is not symmetric, and then prove that \textstyle c = \max_P I(P), we will have proved Theorem 1. Assume \textstyle I(P) were equal to \textstyle 0. From the definition of \textstyle I(P), this would make \textstyle X_r and \textstyle Y_r independent random variables for some \textstyle S_r, because then neither random variable's entropy would depend on the other's value. Using the equation for \textstyle P_{X_r S_r Y_r} (and remembering \textstyle P_{X_r} = P), we can write:

:\displaystyle P_{Y_r}(y) = \sum_{x \in X} \sum_{s \in S} P(x)P_{S_r}(s)W(y|x,s)

:\textstyle \equiv (since \textstyle X_r and \textstyle Y_r are independent random variables, \textstyle W(y|x, s) = W'(y|s) for some \textstyle W')

:\displaystyle P_{Y_r}(y) = \sum_{x \in X} \sum_{s \in S} P(x)P_{S_r}(s)W'(y|s)

:\textstyle \equiv (because only \textstyle P(x) depends on \textstyle x now)

:\displaystyle P_{Y_r}(y) = \sum_{s \in S} P_{S_r}(s)W'(y|s) \left[\sum_{x \in X} P(x)\right]

:\textstyle \equiv (because \displaystyle \sum_{x \in X} P(x) = 1)

:\displaystyle P_{Y_r}(y) = \sum_{s \in S} P_{S_r}(s)W'(y|s)

So now we have a probability distribution on \textstyle Y_r that is independent of \textstyle X_r. The definition of a symmetric AVC can therefore be rewritten as \displaystyle \sum_{s \in S} W'(y|s)P_{S_r}(s) = \sum_{s \in S} W'(y|s)P_{S_r}(s): since \textstyle U(s|x) and \textstyle W(y|x, s) no longer depend on \textstyle x, they have been replaced with functions of \textstyle s and \textstyle y only. Both sides are now equal to the \textstyle P_{Y_r}(y) calculated above, so the AVC is indeed symmetric when \textstyle I(P) equals \textstyle 0. Therefore, \textstyle I(P) can only be positive if the AVC is not symmetric.

''Proof of second part (capacity):'' See the paper "The capacity of the arbitrarily varying channel revisited: positivity, constraints," referenced below, for the full proof.
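For small alphabets, \textstyle I(P) can be approximated numerically: fix the input distribution \textstyle P, average the channel over a candidate state distribution, compute the mutual information of the resulting ordinary channel, and minimize over state distributions. A rough grid-search sketch, using a made-up binary AVC (the channel matrix is an illustrative assumption):

```python
import numpy as np

# Hypothetical AVC: W[s][x][y], binary input/output/state.
W = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # state s = 0
    [[0.6, 0.4], [0.4, 0.6]],   # state s = 1
])

def mutual_information(P, Wq):
    """I(X;Y) in bits for input pmf P and channel matrix Wq[x][y]."""
    joint = P[:, None] * Wq               # joint pmf of (X, Y)
    Py = joint.sum(axis=0)                # output marginal
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] /
                 (P[:, None] * Py[None, :])[mask])).sum())

def I_of_P(P, W, grid=1001):
    """Approximate I(P): minimize I(X;Y) over state pmfs q, where the
    output is produced by the state-averaged channel
    W_q(y|x) = sum_s q(s) W(y|x,s)."""
    best = np.inf
    for t in np.linspace(0.0, 1.0, grid):    # q = (t, 1-t), |S| = 2
        Wq = t * W[0] + (1.0 - t) * W[1]
        best = min(best, mutual_information(np.asarray(P), Wq))
    return best

# Theorem 1 flavor: a strictly positive value indicates this AVC
# is not symmetric and has positive capacity c = max_P I(P).
print(I_of_P([0.5, 0.5], W))
```

For this example channel the minimum is attained at the state distribution concentrated on state 1, where the two channel rows are closest together; the resulting value is small but strictly positive.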


Capacity of AVCs with input and state constraints

The next theorem deals with the capacity of AVCs with input and/or state constraints. These constraints help to narrow the very large range of possibilities for transmission and error on an AVC, making it easier to see how the AVC behaves.

Before moving on to Theorem 2, a few definitions and a lemma are needed. For such AVCs there exist:

:- An input constraint \textstyle \Gamma based on the equation \displaystyle g(x) = \frac{1}{n}\sum_{i=1}^n g(x_i), where \textstyle x = (x_1,\dots,x_n) \in X^n.
:- A state constraint \textstyle \Lambda based on the equation \displaystyle l(s) = \frac{1}{n}\sum_{i=1}^n l(s_i), where \textstyle s = (s_1,\dots,s_n) \in S^n.
:- \displaystyle \Lambda_0(P) = \min_U \sum_{x \in X,\, s \in S} P(x)U(s|x)l(s), where the minimum is over channels \textstyle U(s|x) satisfying the symmetry condition stated earlier.
:- \textstyle I(P, \Lambda) is very similar to the \textstyle I(P) equation mentioned previously, \displaystyle I(P, \Lambda) = \min_{Y_r} I(X_r \land Y_r), but now any state \textstyle s or \textstyle S_r in the equation must obey the state restriction \textstyle l(s) \leq \Lambda.

Assume \textstyle g(x) is a given non-negative-valued function on \textstyle X and \textstyle l(s) is a given non-negative-valued function on \textstyle S, and that the minimum value of each is \textstyle 0. The literature on this subject never describes the exact single-letter definitions of \textstyle g and \textstyle l formally; the usefulness of the input constraint \textstyle \Gamma and the state constraint \textstyle \Lambda rests on the averaging equations above.

For AVCs with input and/or state constraints, the rate \textstyle R is now limited to codewords \textstyle x_1,\dots,x_N that satisfy \textstyle g(x_i) \leq \Gamma, and the state \textstyle s is limited to all states that satisfy \textstyle l(s) \leq \Lambda. The largest achievable rate is still the capacity of the AVC, now denoted \textstyle c(\Gamma, \Lambda).

''Lemma 1:'' Any code with \textstyle \Lambda greater than \textstyle \Lambda_0(P) cannot be considered a "good" code, because such codes have a maximum average probability of error greater than or equal to \displaystyle \frac{1}{2} - \frac{1}{n}\,\frac{l_{max}^2}{(\Lambda - \Lambda_0(P))^2}, where \textstyle l_{max} is the maximum value of \textstyle l(s). This is a poor maximum average error probability because it is close to \textstyle \frac{1}{2}: the subtracted term is very small, since \textstyle (\Lambda - \Lambda_0(P)) is squared in the denominator and \textstyle \Lambda is set to be larger than \textstyle \Lambda_0(P). It would therefore be very unlikely to receive a codeword without error. This is why the \textstyle \Lambda_0(P) condition appears in Theorem 2.

Theorem 2: Given a positive \textstyle \Lambda and arbitrarily small \textstyle \alpha > 0, \textstyle \beta > 0, \textstyle \delta > 0, for any block length \textstyle n \geq n_0 and for any type \textstyle P with conditions \textstyle \Lambda_0(P) \geq \Lambda + \alpha and \displaystyle \min_{x} P(x) \geq \beta, and where \textstyle P_{X_r} = P, there exists a code with codewords \textstyle x_1,\dots,x_N, each of type \textstyle P, that satisfies the following equations: \textstyle \frac{1}{n}\log N > I(P,\Lambda) - \delta and \displaystyle \max_{l(s) \leq \Lambda} \bar{e}(s) \leq \exp(-n\gamma), where the positive constants \textstyle n_0 and \textstyle \gamma depend only on \textstyle \alpha, \textstyle \beta, \textstyle \delta, and the given AVC.

''Proof of Theorem 2'': See the paper "The capacity of the arbitrarily varying channel revisited: positivity, constraints," referenced below, for the full proof.
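The effect of a state constraint on \textstyle I(P, \Lambda) can be sketched numerically: the minimization now runs only over state distributions whose expected cost stays within \textstyle \Lambda. A rough grid-search sketch, with a made-up binary AVC and a hypothetical state cost function \textstyle l (both are illustrative assumptions):

```python
import numpy as np

# Hypothetical AVC: W[s][x][y], binary input/output/state.
W = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # state s = 0
    [[0.6, 0.4], [0.4, 0.6]],   # state s = 1
])
l = np.array([0.0, 1.0])        # hypothetical state cost: state 1 costs 1

def mutual_information(P, Wq):
    """I(X;Y) in bits for input pmf P and channel matrix Wq[x][y]."""
    joint = P[:, None] * Wq
    Py = joint.sum(axis=0)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] /
                 (P[:, None] * Py[None, :])[mask])).sum())

def I_P_Lambda(P, W, l, Lam, grid=1001):
    """Approximate I(P, Lambda): minimize I(X;Y) over state pmfs q
    subject to the state constraint  sum_s q(s) l(s) <= Lambda."""
    P = np.asarray(P)
    best = np.inf
    for t in np.linspace(0.0, 1.0, grid):
        q = np.array([t, 1.0 - t])
        if q @ l > Lam:                    # state constraint violated
            continue
        Wq = q[0] * W[0] + q[1] * W[1]
        best = min(best, mutual_information(P, Wq))
    return best

# Tightening the state constraint shrinks the adversary's options,
# so the constrained minimum can only stay the same or increase:
print(I_P_Lambda([0.5, 0.5], W, l, Lam=1.0))   # effectively unconstrained
print(I_P_Lambda([0.5, 0.5], W, l, Lam=0.2))   # tight constraint
```

The two printed values illustrate why constraints matter: with a loose budget the minimizing state distribution can pick the most harmful mixture, while a tight budget forces it away from that mixture and leaves a larger guaranteed mutual information.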


Capacity of randomized AVCs

The next theorem concerns AVCs with randomized codes. For such AVCs the code is a random variable with values from a family of length-\textstyle n block codes, and these codes are not allowed to depend on the actual value of the codeword. Because of their random nature, these codes have the same maximum and average error probability for any channel. Randomized codes also help to make certain properties of the AVC more clear. Before moving on to Theorem 3, a couple of important terms need to be defined:

:\displaystyle W_\zeta(y|x) = \sum_{s \in S} W(y|x, s)P_{S_r}(s)

\textstyle I(P, \zeta) is very similar to the \textstyle I(P) equation mentioned previously, \displaystyle I(P, \zeta) = \min_{Y_r} I(X_r \land Y_r), but now the pmf \textstyle P_{S_r}(s) is added to the equation, making the minimum of \textstyle I(P, \zeta) based on a new form of \textstyle P_{X_r S_r Y_r}, where \textstyle W_\zeta(y|x) replaces \textstyle W(y|x, s).

Theorem 3: The capacity for randomized codes of the AVC is \displaystyle c = \max_P I(P, \zeta).

''Proof of Theorem 3'': See the paper "The Capacities of Certain Channel Classes Under Random Coding," referenced below, for the full proof.
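For small alphabets the Theorem 3 expression can be evaluated by brute force: maximize over input pmfs \textstyle P the minimum over state pmfs \textstyle \zeta of the mutual information induced by the averaged channel \textstyle W_\zeta. A coarse grid sketch on a made-up binary AVC (the channel matrix and grid resolution are illustrative assumptions):

```python
import numpy as np

# Hypothetical AVC: W[s][x][y], binary input/output/state.
W = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # state s = 0
    [[0.6, 0.4], [0.4, 0.6]],   # state s = 1
])

def mutual_information(P, Wq):
    """I(X;Y) in bits for input pmf P and channel matrix Wq[x][y]."""
    joint = P[:, None] * Wq
    Py = joint.sum(axis=0)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] /
                 (P[:, None] * Py[None, :])[mask])).sum())

def randomized_capacity(W, grid=201):
    """Grid-search approximation of max over P of min over zeta of
    I(X;Y) under W_zeta(y|x) = sum_s W(y|x,s) zeta(s)."""
    best = 0.0
    for p in np.linspace(0.0, 1.0, grid):        # input pmf P = (p, 1-p)
        P = np.array([p, 1.0 - p])
        worst = min(
            mutual_information(P, z * W[0] + (1.0 - z) * W[1])
            for z in np.linspace(0.0, 1.0, grid)  # state pmf zeta = (z, 1-z)
        )
        best = max(best, worst)
    return best

print(randomized_capacity(W))
```

A grid search like this only sketches the idea; for larger alphabets the inner minimum and outer maximum would be computed with convex optimization rather than enumeration.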


See also

* Binary symmetric channel
* Binary erasure channel
* Z-channel (information theory)
* Channel model
* Information theory
* Coding theory


References

* Ahlswede, Rudolf and Blinovsky, Vladimir, "Classical Capacity of Classical-Quantum Arbitrarily Varying Channels," https://ieeexplore.ieee.org/document/4069128
* Blackwell, David, Breiman, Leo, and Thomasian, A. J., "The Capacities of Certain Channel Classes Under Random Coding," https://www.jstor.org/stable/2237566
* Csiszar, I. and Narayan, P., "Arbitrarily varying channels with constrained inputs and states," http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=2598&isnumber=154
* Csiszar, I. and Narayan, P., "Capacity and Decoding Rules for Classes of Arbitrarily Varying Channels," http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=32153&isnumber=139
* Csiszar, I. and Narayan, P., "The capacity of the arbitrarily varying channel revisited: positivity, constraints," http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=2627&isnumber=155
* Lapidoth, A. and Narayan, P., "Reliable communication under channel uncertainty," http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=720535&isnumber=15554