The Elias–Bassalygo bound is a mathematical limit used in coding theory for error correction during data transmission or communications.


Definition

Let C be a ''q''-ary code of length n, i.e. a subset of [q]^n. (Each ''q''-ary block code of length n is a subset of the strings of \Sigma_q^n, where the alphabet set \Sigma_q has q elements.) Let R be the ''rate'' of C, \delta the ''relative distance'' and

:B_q(y, \rho n) = \left\{ x \in [q]^n : \Delta(x, y) \leqslant \rho n \right\}

be the ''Hamming ball'' of radius \rho n centered at y, where \Delta(x, y) denotes the Hamming distance between x and y. Let \text{Vol}_q(y, \rho n) = |B_q(y, \rho n)| be the ''volume'' of the Hamming ball of radius \rho n. The volume of a Hamming ball is translation-invariant, i.e. independent of y; in particular, |B_q(y, \rho n)| = |B_q(0, \rho n)|.

With large enough n, the ''rate'' R and the ''relative distance'' \delta satisfy the Elias–Bassalygo bound:

:R \leqslant 1 - H_q(J_q(\delta)) + o(1),

where

:H_q(x) \equiv -x\log_q \left( \frac{x}{q-1} \right) - (1-x)\log_q (1-x)

is the ''q''-ary entropy function and

:J_q(\delta) \equiv \left(1-\frac{1}{q}\right)\left(1-\sqrt{1-\frac{q\delta}{q-1}}\right)

is a function related to the Johnson bound.
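For concreteness, the following Python sketch evaluates these quantities numerically; the function names and sample parameters are illustrative choices of ours, not part of the original statement.

```python
import math

def entropy_q(x: float, q: int) -> float:
    """q-ary entropy H_q(x) = -x log_q(x/(q-1)) - (1-x) log_q(1-x)."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return math.log(q - 1, q)  # limit value at x = 1
    return -x * math.log(x / (q - 1), q) - (1 - x) * math.log(1 - x, q)

def johnson_radius(delta: float, q: int) -> float:
    """J_q(delta) = (1 - 1/q)(1 - sqrt(1 - q*delta/(q-1))), valid for delta <= 1 - 1/q."""
    return (1 - 1 / q) * (1 - math.sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo_rate(delta: float, q: int) -> float:
    """Asymptotic Elias-Bassalygo upper bound R <= 1 - H_q(J_q(delta)), o(1) term dropped."""
    return 1 - entropy_q(johnson_radius(delta, q), q)

if __name__ == "__main__":
    for delta in (0.1, 0.25, 0.4):  # relative distances in (0, 1 - 1/q)
        print(f"q=2, delta={delta:.2f}: R <= {elias_bassalygo_rate(delta, 2):.4f}")
```

For instance, at q = 2 and \delta = 0.25 this evaluates to roughly R \leqslant 0.399.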


Proof

To prove the Elias–Bassalygo bound, start with the following lemma:

:Lemma. For C \subseteq [q]^n and 0 \leqslant e \leqslant n, there exists a Hamming ball of radius e with at least

::\frac{|C| \cdot \text{Vol}_q(0, e)}{q^n}

:codewords in it.

:Proof of Lemma. Randomly pick a received word y \in [q]^n and let B_q(y, e) be the Hamming ball centered at y with radius e. Since y is selected uniformly at random, the expected size of the overlap |B_q(y, e) \cap C| is

::\frac{|C| \cdot \text{Vol}_q(0, e)}{q^n}.

:Since this is the expected value of the size, there must exist at least one y such that

::|B_q(y, e) \cap C| \geqslant \frac{|C| \cdot \text{Vol}_q(0, e)}{q^n},

:otherwise the expectation would be smaller than this value.

Now we prove the Elias–Bassalygo bound. Define e = nJ_q(\delta) - 1. By the lemma, there exists a Hamming ball with B codewords such that

:B \geqslant \frac{|C| \cdot \text{Vol}_q(0, e)}{q^n}.

By the Johnson bound, we have B \leqslant qdn. Thus,

:|C| \leqslant qdn \cdot \frac{q^n}{\text{Vol}_q(0, e)} \leqslant q^{n(1 - H_q(J_q(\delta)) + o(1))}.

The second inequality follows from the lower bound on the volume of a Hamming ball:

:\text{Vol}_q\left(0, \left\lfloor \frac{d-1}{2} \right\rfloor\right) \geqslant q^{H_q\left(\frac{\delta}{2}\right)n - o(n)}.

Putting in d = 2e + 1, so that \lfloor (d-1)/2 \rfloor = e and \tfrac{d}{2n} approaches J_q(\delta), gives the second inequality. Therefore we have

:R = \frac{\log_q |C|}{n} \leqslant 1 - H_q(J_q(\delta)) + o(1).
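The averaging step in the lemma can also be checked directly on a toy example. The sketch below (our own illustration; the code C and all parameters are arbitrary) enumerates every center y in [q]^n and confirms that the average of |B_q(y, e) \cap C| equals |C| \cdot \text{Vol}_q(0, e) / q^n.

```python
import itertools
import math
import random

def hamming(a, b):
    """Hamming distance between two equal-length tuples."""
    return sum(u != v for u, v in zip(a, b))

def ball_volume(n: int, e: int, q: int) -> int:
    """Exact volume of a q-ary Hamming ball of radius e in [q]^n."""
    return sum(math.comb(n, i) * (q - 1) ** i for i in range(e + 1))

n, e, q = 8, 2, 2                      # small enough to enumerate all q^n centers
space = list(itertools.product(range(q), repeat=n))
code = random.sample(space, 12)        # an arbitrary 12-word binary code C

counts = [sum(hamming(y, c) <= e for c in code) for y in space]
average = sum(counts) / len(space)
formula = len(code) * ball_volume(n, e, q) / q ** n

print(f"average overlap = {average:.4f}, formula = {formula:.4f}")
print(f"best center has {max(counts)} codewords >= average")
```

Because the average equals the formula exactly rather than approximately, the maximum over centers is always at least the formula value, which is precisely the existence claim of the lemma.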


See also

* Singleton bound
* Hamming bound
* Plotkin bound
* Gilbert–Varshamov bound
* Johnson bound


References

{{Citation | last1 = Shannon | first1 = Claude E. | last2 = Gallager | first2 = Robert G. | last3 = Berlekamp | first3 = Elwyn R. | title = Lower bounds to error probability for coding on discrete memoryless channels. Part II. | journal = Information and Control | volume = 10 | pages = 522–552 | year = 1967 | doi = 10.1016/s0019-9958(67)91200-4 | doi-access = free }}