A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.
* 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ ''p''<sub>''i''</sub> log ''p''<sub>''i''</sub> for the entropy of a single gas particle
* 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the ''whole'' system
* 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system
* 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics
* 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning)
* 1929 – Leó Szilárd analyses Maxwell's Demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work
* 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process
* 1944 – Claude Shannon's theory of information is substantially complete
* 1947 – Richard W. Hamming invents Hamming codes for error detection and correction (to protect patent rights, the result is not published until 1950)
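The idea behind Hamming's construction can be sketched in a few lines: parity bits at power-of-two positions let the decoder compute a syndrome that directly names the position of a single flipped bit. The following Hamming(7,4) encoder and corrector is an illustrative reconstruction, not Hamming's original presentation:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into 7 bits with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    # Standard ordering: p1 p2 d1 p3 d2 d3 d4 (positions 1..7)
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Fix at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based error position, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

Flipping any single bit of a codeword and running `hamming74_correct` recovers the original data bits.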
* 1948 – Claude E. Shannon publishes ''A Mathematical Theory of Communication''
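The paper's central quantity, the entropy H = −Σ ''p''<sub>''i''</sub> log<sub>2</sub> ''p''<sub>''i''</sub> of a source, can be computed directly; a small illustrative example (not from the paper itself):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete source with the given symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```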
* 1949 – Claude E. Shannon publishes ''Communication in the Presence of Noise'' – Nyquist–Shannon sampling theorem and Shannon–Hartley law
* 1949 – Claude E. Shannon's ''Communication Theory of Secrecy Systems'' is declassified
* 1949 – Robert M. Fano publishes ''Transmission of Information'' (M.I.T. Press, Cambridge, Massachusetts) – Shannon–Fano coding
* 1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes
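Kraft's inequality states that a binary prefix code with codeword lengths ''l''<sub>''i''</sub> exists if and only if Σ 2<sup>−''l''<sub>''i''</sub></sup> ≤ 1. A quick illustrative check (a sketch, assuming binary codes):

```python
def kraft_sum(lengths):
    """Kraft sum of a set of binary codeword lengths; <= 1 means a prefix code exists."""
    return sum(2.0 ** -l for l in lengths)

# Lengths 1, 2, 3, 3 (e.g. codewords 0, 10, 110, 111) exactly saturate the bound.
print(kraft_sum([1, 2, 3, 3]))   # 1.0
# Four distinct codewords of length 1 cannot form a binary prefix code.
print(kraft_sum([1, 1, 1, 1]))   # 2.0
```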
* 1949 – Marcel J. E. Golay introduces Golay codes for forward error correction
* 1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence
* 1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression
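Huffman's construction is simple enough to sketch: repeatedly merge the two least probable symbols into one node until a single code tree remains. A minimal illustrative Python version (not Huffman's original presentation):

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Map each symbol to a prefix-free codeword, given symbol -> frequency."""
    tiebreak = count()  # avoids comparing dicts when frequencies are equal
    heap = [(f, next(tiebreak), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]
```

On skewed frequencies the more frequent symbols get shorter codewords, e.g. `huffman_code({"a": 5, "b": 2, "c": 1, "d": 1})` assigns `a` a 1-bit codeword.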
* 1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable
* 1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes
* 1955 – Peter Elias introduces convolutional codes
* 1957 – Eugene Prange first discusses cyclic codes
* 1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes
* 1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes
* 1962 – Robert G. Gallager proposes low-density parity-check codes; they are unused for 30 years due to technical limitations
* 1965 – Dave Forney discusses concatenated codes
* 1966 – Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding
* 1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable
* 1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year
* 1968 – Chris Wallace and David M. Boulton publish the first of many papers on Minimum Message Length (MML) statistical and inductive inference
* 1970 – Valerii Denisovich Goppa introduces Goppa codes
* 1972 – Jørn Justesen proposes Justesen codes, an improvement of Reed–Solomon codes
* 1972 – Nasir Ahmed proposes the discrete cosine transform (DCT), which he develops with T. Natarajan and K. R. Rao in 1973; the DCT later became the most widely used lossy compression algorithm, the basis for multimedia formats such as JPEG, MPEG and MP3
* 1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding
* 1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s
* 1976 – Richard Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques
* 1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77)
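The core LZ77 idea, replacing repeated substrings with back-references into previously seen data, can be sketched naively. This toy quadratic version (an illustration of the principle, not the published algorithm's exact window-and-pointer format) emits (offset, length, next character) triples:

```python
def lz77_compress(data):
    """Greedy factorisation into (back-offset, match length, next char) triples."""
    i, out = 0, []
    while i < len(data):
        best_off, best_len = 0, 0
        for j in range(i):                     # search all earlier start points
            k = 0
            while i + k < len(data) - 1 and data[j + k] == data[i + k]:
                k += 1                         # overlapping matches are allowed
            if k > best_len:
                best_off, best_len = i - j, k
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

def lz77_decompress(triples):
    """Rebuild the string by copying back-referenced characters one at a time."""
    s = ""
    for off, length, ch in triples:
        for _ in range(length):
            s += s[-off]
        s += ch
    return s
```

Copying one character at a time in the decompressor is what makes overlapping references (e.g. a run of `a`s encoded against itself) work.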
* 1989 – Phil Katz publishes the .zip format including DEFLATE (LZ77 + Huffman coding); later to become the most widely used archive container
* 1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce Turbo codes
* 1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2
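The forward transform fits in a few lines: sort all rotations of the input and read off the last column, which tends to group similar characters into runs that compress well. A naive quadratic sketch (real implementations use suffix arrays; the `$` end-of-string sentinel is an assumption of this illustration):

```python
def bwt(s):
    """Burrows–Wheeler transform via explicit sorted rotations (O(n^2 log n))."""
    s += "$"  # sentinel, assumed lexicographically smallest and absent from s
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

print(bwt("banana"))   # "annb$aa" — the three a's and two n's cluster together
```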
* 1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem
* 2003 – David J. C. MacKay shows the connection between information theory, inference and machine learning in his book ''Information Theory, Inference, and Learning Algorithms''
* 2006 – Jarosław Duda introduces the first asymmetric numeral systems (ANS) entropy coding; since 2014 a popular replacement for Huffman and arithmetic coding in compressors such as Facebook Zstandard, Apple LZFSE, CRAM and JPEG XL
* 2008 – Erdal Arıkan introduces polar codes, the first practical construction of codes that achieves capacity for a wide array of channels
References
{{DEFAULTSORT:Timeline Of Information Theory}}
Information theory