The Elias–Bassalygo bound is a mathematical limit used in coding theory for error correction during data transmission or communications.
Definition
Let <math>C</math> be a <math>q</math>-ary code of length <math>n</math>, i.e. a subset of <math>[q]^n</math>. (Each <math>q</math>-ary block code of length <math>n</math> is a subset of the strings of <math>[q]^n</math>, where the alphabet set <math>[q]</math> has <math>q</math> elements.) Let <math>R = \frac{\log_q |C|}{n}</math> be the ''rate'' of <math>C</math>, <math>\delta = \frac{d}{n}</math> the ''relative distance'' (with <math>d</math> the minimum distance of <math>C</math>), and
:<math>B_q(y, \rho n) = \left\{ x \in [q]^n \mid \Delta(x, y) \le \rho n \right\}</math>
be the ''Hamming ball'' of radius <math>\rho n</math> centered at <math>y</math>, where <math>\Delta(x, y)</math> denotes the Hamming distance between <math>x</math> and <math>y</math>. Let <math>\mathrm{Vol}_q(y, \rho n) = |B_q(y, \rho n)|</math> be the ''volume'' of the Hamming ball of radius <math>\rho n</math>. The volume of a Hamming ball is translation-invariant, i.e. indifferent to the choice of center <math>y</math>. In particular, <math>|B_q(y, \rho n)| = |B_q(0, \rho n)|</math>.
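The volume has a simple closed form: summing over the number of coordinates in which a string differs from the center gives <math>\mathrm{Vol}_q(0, r) = \sum_{i=0}^{r} \binom{n}{i}(q-1)^i</math>. A minimal Python sketch of this count (the function name is illustrative, not taken from any source) is:
<syntaxhighlight lang="python">
from math import comb

def hamming_ball_volume(q: int, n: int, radius: int) -> int:
    """Number of strings in [q]^n within Hamming distance `radius` of a fixed center."""
    # Choose which i coordinates differ from the center, then one of (q - 1)
    # alternative symbols in each of those coordinates.
    return sum(comb(n, i) * (q - 1) ** i for i in range(radius + 1))

# Example: the radius-2 ball in {0,1}^7 contains 1 + 7 + 21 = 29 strings.
assert hamming_ball_volume(2, 7, 2) == 29
</syntaxhighlight>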
With large enough <math>n</math>, the ''rate'' <math>R</math> and the ''relative distance'' <math>\delta</math> satisfy the Elias–Bassalygo bound:
:<math>R \le 1 - H_q\bigl(J_q(\delta)\bigr) + o(1),</math>
where
:<math>H_q(x) \equiv -x\log_q\frac{x}{q-1} - (1 - x)\log_q(1 - x)</math>
is the ''q''-ary entropy function and
:<math>J_q(\delta) \equiv \left(1 - \frac{1}{q}\right)\left(1 - \sqrt{1 - \frac{q\delta}{q-1}}\right)</math>
is a function related to the Johnson bound.
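For a concrete sense of the bound, the right-hand side can be evaluated numerically. The following Python sketch (the helper names are the editor's own, not from the source) computes <math>H_q</math>, <math>J_q(\delta)</math>, and the resulting asymptotic upper bound on the rate, with the <math>o(1)</math> term dropped:
<syntaxhighlight lang="python">
from math import log, sqrt

def entropy_q(x: float, q: int = 2) -> float:
    """q-ary entropy function H_q(x), using the convention 0*log(0) = 0."""
    total = 0.0
    if x > 0:
        total += x * log(q - 1, q) - x * log(x, q)
    if x < 1:
        total -= (1 - x) * log(1 - x, q)
    return total

def johnson_radius(delta: float, q: int = 2) -> float:
    """Johnson radius J_q(delta) = (1 - 1/q) * (1 - sqrt(1 - q*delta/(q - 1)))."""
    return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo_rate_bound(delta: float, q: int = 2) -> float:
    """Asymptotic upper bound 1 - H_q(J_q(delta)) on the rate (o(1) term dropped)."""
    return 1 - entropy_q(johnson_radius(delta, q), q)

# Example: binary codes with relative distance 0.1 have rate at most about 0.702.
print(elias_bassalygo_rate_bound(0.1, 2))
</syntaxhighlight>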
Proof
To prove the Elias–Bassalygo bound, start with the following Lemma:
:Lemma. For <math>C \subseteq [q]^n</math> and <math>0 \le e \le n</math>, there exists a Hamming ball of radius <math>e</math> with at least
::<math>\frac{|C| \, \mathrm{Vol}_q(0, e)}{q^n}</math>
:codewords in it.
:Proof of Lemma. Randomly pick a received word <math>y \in [q]^n</math> and let <math>B_q(y, e)</math> be the Hamming ball centered at <math>y</math> with radius <math>e</math>. Since <math>y</math> is chosen uniformly at random, each fixed codeword lies in <math>B_q(y, e)</math> with probability <math>\mathrm{Vol}_q(0, e)/q^n</math>, so by linearity of expectation the expected size of the overlap <math>|B_q(y, e) \cap C|</math> is
::<math>\frac{|C| \, \mathrm{Vol}_q(0, e)}{q^n}.</math>
:Since this is the expected value of the size, there must exist at least one <math>y</math> such that
::<math>|B_q(y, e) \cap C| \ge \frac{|C| \, \mathrm{Vol}_q(0, e)}{q^n},</math>
:otherwise the expectation would be smaller than this value.
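The lemma is a standard averaging argument, and it can be checked exhaustively on a toy example. The sketch below (an illustrative random binary code, not part of the original proof) enumerates every center <math>y</math> and confirms that the densest ball of radius <math>e</math> contains at least <math>|C| \, \mathrm{Vol}_q(0, e)/q^n</math> codewords:
<syntaxhighlight lang="python">
import random
from itertools import product
from math import comb

def hamming(u, v):
    """Hamming distance between two equal-length tuples."""
    return sum(a != b for a, b in zip(u, v))

q, n, e = 2, 8, 2
space = list(product(range(q), repeat=n))       # all of [q]^n (here 2^8 = 256 words)
code = random.sample(space, 40)                 # an arbitrary toy code C with |C| = 40

vol = sum(comb(n, i) * (q - 1) ** i for i in range(e + 1))
guarantee = len(code) * vol / q ** n            # |C| * Vol_q(0, e) / q^n

# Number of codewords in the densest Hamming ball of radius e over all centers y.
densest = max(sum(hamming(y, c) <= e for c in code) for y in space)
assert densest >= guarantee
print(densest, guarantee)
</syntaxhighlight>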
Now we prove the Elias–Bassalygo bound. Define <math>e = nJ_q(\delta) - 1</math>; this radius is chosen just below the Johnson radius <math>nJ_q(\delta)</math> so that the Johnson bound applies to the codewords inside the ball. By the Lemma, there exists a Hamming ball with <math>B</math> codewords such that
:<math>B \ge \frac{|C| \, \mathrm{Vol}_q(0, e)}{q^n}.</math>
By the Johnson bound, we have <math>B \le qdn</math>. Thus,
:<math>|C| \le qnd \cdot \frac{q^n}{\mathrm{Vol}_q(0, e)} \le qnd \cdot q^{n\left(1 - H_q(J_q(\delta)) + o(1)\right)}.</math>
The second inequality follows from the lower bound on the volume of a Hamming ball:
:<math>\mathrm{Vol}_q(0, e) \ge q^{nH_q\left(\frac{e}{n}\right) - o(n)}.</math>
Putting in <math>e = nJ_q(\delta) - 1</math> and this volume bound gives the second inequality.
Therefore we have
:<math>R = \frac{\log_q |C|}{n} \le 1 - H_q(J_q(\delta)) + o(1).</math>
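The volume lower bound <math>\mathrm{Vol}_q(0, pn) \ge q^{n(H_q(p) - o(1))}</math> invoked above can also be observed numerically: the normalized <math>q</math>-ary logarithm of the ball volume approaches <math>H_q(p)</math> as <math>n</math> grows. A short self-contained sketch (again with illustrative names) is:
<syntaxhighlight lang="python">
from math import comb, log

def entropy_q(x: float, q: int = 2) -> float:
    """q-ary entropy function H_q(x) for 0 < x < 1."""
    return x * log(q - 1, q) - x * log(x, q) - (1 - x) * log(1 - x, q)

q, p = 2, 0.2
for n in (50, 200, 1000):
    r = int(p * n)
    vol = sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))
    # The normalized exponent log_q(Vol) / n tends to H_2(0.2) ~ 0.722 from below.
    print(n, log(vol, q) / n)
</syntaxhighlight>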
See also
* Singleton bound
* Hamming bound
* Plotkin bound
* Gilbert–Varshamov bound
* Johnson bound