Hamming Scheme
The Hamming scheme, named after Richard Hamming, is also known as the hyper-cubic association scheme, and it is the most important example for coding theory (F. J. MacWilliams and N. J. A. Sloane, ''The Theory of Error-Correcting Codes'', Elsevier, New York, 1978). In this scheme X = \mathcal{F}^n, the set of binary vectors of length n, and two vectors x, y \in \mathcal{F}^n are i-th associates if they are Hamming distance i apart.
Recall that an association scheme is visualized as a complete graph with labeled edges. The graph has v vertices, one for each point of X, and the edge joining vertices x and y is labeled i if x and y are i-th associates. Each edge has a unique label, and the number of triangles with a fixed base labeled k having the other edges labeled i and j is a constant c_{ijk}, depending on i, j, k but not on the choice of the base. In particular, each vertex is incident with exactly c_{ii0} = v_i edges labeled i; v_i is the valency of the relation R_i. The c_{ijk} in a Hamming scheme are given by ...
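This defining property can be verified directly for small n. The following is a minimal brute-force sketch in Python (the function names are illustrative, not taken from the literature cited above): it enumerates every base pair of the length-4 binary scheme, asserts that each c_{ijk} is independent of the chosen base, and reads off the valencies c_{ii0} = v_i.
```python
from itertools import product

def hamming(x, y):
    """Hamming distance between two equal-length binary tuples."""
    return sum(a != b for a, b in zip(x, y))

def intersection_numbers(n):
    """Brute-force the constants c_ijk of the binary Hamming scheme of length n,
    asserting along the way that they do not depend on the chosen base pair."""
    X = list(product((0, 1), repeat=n))
    c = {}
    for x, y in product(X, repeat=2):
        k = hamming(x, y)                      # label of the base edge
        for i in range(n + 1):
            for j in range(n + 1):
                count = sum(1 for z in X
                            if hamming(x, z) == i and hamming(z, y) == j)
                assert c.setdefault((i, j, k), count) == count, "not constant"
    return c

c = intersection_numbers(4)
# Each vertex is incident with c_{ii0} = v_i edges labelled i; for n = 4
# this prints the binomial coefficients [1, 4, 6, 4, 1].
print([c[(i, i, 0)] for i in range(5)])
```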
Richard Hamming
Richard Wesley Hamming (February 11, 1915 – January 7, 1998) was an American mathematician whose work had many implications for computer engineering and telecommunications. His contributions include the Hamming code (which makes use of a Hamming matrix), the Hamming window, Hamming numbers, sphere-packing (or Hamming bound), Hamming graph concepts, and the Hamming distance. Born in Chicago, Hamming attended University of Chicago, University of Nebraska and the University of Illinois at Urbana–Champaign, where he wrote his doctoral thesis in mathematics under the supervision of Waldemar Trjitzinsky (1901–1973). In April 1945 he joined the Manhattan Project at the Los Alamos Laboratory, where he programmed the IBM calculating machines that computed the solution to equations provided by the project's physicists. He left to join the Bell Telephone Laboratories in 1946. Over the next fifteen years he was involved in nearly all of the Laboratories' most prominent achievements ...
Coding Theory
Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods. This typically involves the removal of redundancy and the correction or detection of errors in the transmitted data. There are four types of coding:
# Data compression (or ''source coding'')
# Error control (or ''channel coding'')
# Cryptographic coding
# Line coding
Data compression attempts to remove unwanted redundancy from source data in order to transmit it more efficiently. For example, ZIP data compression makes data files smaller, for purposes such as reducing Internet traffic. Data compression a ...
Hamming Distance
In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different. In other words, it measures the minimum number of ''substitutions'' required to change one string into the other, or the minimum number of ''errors'' that could have transformed one string into the other. In a more general context, the Hamming distance is one of several string metrics for measuring the edit distance between two sequences. It is named after the American mathematician Richard Hamming. A major application is in coding theory, more specifically to block codes, in which the equal-length strings are vectors over a finite field.
Definition
The Hamming distance between two equal-length strings of symbols is the number of positions at which the corresponding symbols are different.
Examples
The symbols may be letters, bits, or decimal digits, among other possibilities. For example, the Hamming distance between: ...
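As a quick illustration of the definition, here is a short Python sketch (the example strings are arbitrary illustrative values):
```python
def hamming_distance(s, t):
    """Number of positions at which two equal-length strings differ."""
    if len(s) != len(t):
        raise ValueError("strings must have equal length")
    return sum(a != b for a, b in zip(s, t))

print(hamming_distance("karolin", "kathrin"))   # 3 (letters)
print(hamming_distance("1011101", "1001001"))   # 2 (bits)
print(hamming_distance("2173896", "2233796"))   # 3 (decimal digits)
```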
Association Scheme
The theory of association schemes arose in statistics, in the theory of experimental design for the analysis of variance. In mathematics, association schemes belong to both algebra and combinatorics. In algebraic combinatorics, association schemes provide a unified approach to many topics, for example combinatorial designs and coding theory. In algebra, association schemes generalize groups, and the theory of association schemes generalizes the character theory of linear representations of groups.
Definition
An ''n''-class association scheme consists of a set ''X'' together with a partition ''S'' of ''X'' × ''X'' into ''n'' + 1 binary relations, ''R''0, ''R''1, ..., ''R''''n'' which satisfy:
*R_0 = \{(x, x) : x \in X\} and is called the identity relation.
*Defining R^* := \{(x, y) : (y, x) \in R\}, if ''R'' in ''S'', then ''R''* in ''S''.
*If (x, y) \in R_k, the number of z \in X such that (x, z) \in R_i and (z, y) \in R_j is a constant p^k_{ij} depending on i, j, k but not on the particular choice of x and y. ...
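The three axioms can be checked mechanically on small examples. The sketch below (plain Python, illustrative names) does this for the 2-class scheme given by the distance classes of the pentagon, chosen here only as a convenient small symmetric example:
```python
from itertools import product

X = range(5)                       # vertices of the pentagon (5-cycle)

def rel(x, y):
    """Index of the relation containing (x, y): the graph distance 0, 1 or 2."""
    d = abs(x - y) % 5
    return min(d, 5 - d)

def check_association_scheme(X, n_classes, rel):
    X = list(X)
    # R_0 is the identity relation.
    assert all((rel(x, y) == 0) == (x == y) for x, y in product(X, repeat=2))
    # Every relation here is symmetric, so R* = R lies in S again.
    assert all(rel(x, y) == rel(y, x) for x, y in product(X, repeat=2))
    # The numbers p^k_ij depend only on i, j, k, not on the chosen pair (x, y).
    p = {}
    for x, y in product(X, repeat=2):
        k = rel(x, y)
        for i in range(n_classes + 1):
            for j in range(n_classes + 1):
                count = sum(1 for z in X if rel(x, z) == i and rel(z, y) == j)
                assert p.setdefault((i, j, k), count) == count
    return p

p = check_association_scheme(X, 2, rel)
print(p[(1, 1, 2)])   # 1: non-adjacent vertices of the pentagon share one neighbour
```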
Complete Graph
In the mathematical field of graph theory, a complete graph is a simple undirected graph in which every pair of distinct vertices is connected by a unique edge. A complete digraph is a directed graph in which every pair of distinct vertices is connected by a pair of unique edges (one in each direction). Graph theory itself is typically dated as beginning with Leonhard Euler's 1736 work on the Seven Bridges of Königsberg. However, drawings of complete graphs, with their vertices placed on the points of a regular polygon, had already appeared in the 13th century, in the work of Ramon Llull. Such a drawing is sometimes referred to as a mystic rose.
Properties
The complete graph on ''n'' vertices is denoted by ''K''''n''. Some sources claim that the letter ''K'' in this notation stands for the German word ''komplett'', but the German name for a complete graph, ''vollständiger Graph'', does not contain the letter ''K'', and other sources state that the notation honors the contributions of Kazimierz Kuratowski to graph theory. ''K''''n'' has n(n − 1)/2 edges (a ...
Adjacency Relation
In discrete mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are in some sense "related". The objects correspond to mathematical abstractions called ''vertices'' (also called ''nodes'' or ''points'') and each of the related pairs of vertices is called an ''edge'' (also called ''link'' or ''line''). Typically, a graph is depicted in diagrammatic form as a set of dots or circles for the vertices, joined by lines or curves for the edges. Graphs are one of the objects of study in discrete mathematics. The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an edge between two people if they shake hands, then this graph is undirected because any person ''A'' can shake hands with a person ''B'' only if ''B'' also shakes hands with ''A''. In contrast, if an edge from a person ''A'' to a person ''B'' means that ''A'' owes money to ''B'', th ...
Relation (mathematics)
In mathematics, a relation on a set may, or may not, hold between two given set members. For example, ''"is less than"'' is a relation on the set of natural numbers; it holds e.g. between 1 and 3 (denoted as 1 < 3), but not between 3 and 1. ... For example, > is an asymmetric relation, but ≥ is not. Again, the previous 3 alternatives are far from being exhaustive; as an example over the natural numbers, the relation ''xRy'' defined by ''x'' > 2 is neither symmetric nor antisymmetric, let alone asymmetric.
; Transitive : for all ''x'', ''y'', ''z'', if ''xRy'' and ''yRz'' then ''xRz''. A transitive relation is irreflexive if and only if it is asymmetric. For example, "is ancestor of" is a transitive relation, while "is parent of" is not.
; Connected : for all ''x'', ''y'', if ''x'' ≠ ''y'' then ''xRy'' or ''yRx''. This property is sometimes called "total", which is distinct from other definitions of "total" used elsewhere.
; Strongly connected : for all ''x'', ''y'', ''xRy'' or ''yRx''. This property is sometimes called "total", which is distinct from other definitions of "total" used elsewhere.
; Well-founded : every nonempty subset of ''X'' contains a minimal element with respect to ...
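On a small finite carrier set such properties can be tested exhaustively. A short Python sketch (the helper names are illustrative), using ''is less than'' and ''is less than or equal to'' on {0, ..., 4} as examples:
```python
from itertools import product

X = range(5)
less_than  = {(x, y) for x, y in product(X, repeat=2) if x < y}
less_equal = {(x, y) for x, y in product(X, repeat=2) if x <= y}

def is_asymmetric(R):
    return all((y, x) not in R for (x, y) in R)

def is_transitive(R):
    return all((x, w) in R for (x, y) in R for (z, w) in R if y == z)

def is_connected(R, X):
    return all((x, y) in R or (y, x) in R
               for x, y in product(X, repeat=2) if x != y)

print(is_asymmetric(less_than), is_asymmetric(less_equal))    # True False
print(is_transitive(less_than), is_connected(less_than, X))   # True True
```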
Matrix (mathematics)
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, \begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix} is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices, and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, unle ...
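As a small illustration of the matrix above acting as a linear map, and of matrix multiplication as composition of maps, here is a NumPy sketch (the second matrix ''B'' is an arbitrary illustrative choice):
```python
import numpy as np

# The 2-by-3 example matrix, viewed as a linear map from R^3 to R^2.
A = np.array([[1, 9, -13],
              [20, 5, -6]])
print(A.shape)        # (2, 3): two rows, three columns

x = np.array([1, 0, 2])
print(A @ x)          # [-25   8], the image of x under the map

# Matrix multiplication corresponds to composition of linear maps:
B = np.array([[1, 0], [0, 1], [1, 1]])   # a map from R^2 to R^3
print((A @ B).shape)  # (2, 2): R^2 -> R^3 followed by R^3 -> R^2
```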
Bose–Mesner Algebra
In mathematics, a Bose–Mesner algebra is a special set of matrices which arise from a combinatorial structure known as an association scheme, together with the usual set of rules for combining (forming the products of) those matrices, such that they form an associative algebra, or, more precisely, a unitary commutative algebra. Among these rules are:
:*the result of a product is also within the set of matrices,
:*there is an identity matrix in the set, and
:*taking products is commutative.
Bose–Mesner algebras have applications in physics to spin models, and in statistics to the design of experiments. They are named for R. C. Bose and Dale Marsh Mesner.
Definition
Let ''X'' be a set of ''v'' elements. Consider a partition of the 2-element subsets of ''X'' into ''n'' non-empty subsets, ''R''1, ..., ''R''''n'' such that:
* given an x \in X, the number of y \in X such that \{x, y\} \in R_i depends only on i (and not on ''x''). This number will be denoted by v_i, and
* given x, y \in X wit ...
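A minimal NumPy sketch of these rules, using the binary Hamming scheme of length 3 purely as a small example: the matrices ''A''''i'' below have a 1 in position (''x'', ''y'') exactly when ''x'' and ''y'' are ''i''-th associates.
```python
import numpy as np
from itertools import product

n = 3
X = list(product((0, 1), repeat=n))
v = len(X)

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

# Adjacency matrix of the i-th relation of the scheme.
A = [np.array([[1 if hamming(x, y) == i else 0 for y in X] for x in X])
     for i in range(n + 1)]

assert np.array_equal(sum(A), np.ones((v, v), dtype=int))   # relations partition X x X
assert np.array_equal(A[0], np.eye(v, dtype=int))           # identity matrix in the set

for i, j in product(range(n + 1), repeat=2):
    P = A[i] @ A[j]
    assert np.array_equal(P, A[j] @ A[i])                    # products commute
    # Closure: P is an integer combination of A_0, ..., A_n; the coefficient of
    # A_k can be read off any entry where A_k equals 1.
    coeffs = [P[np.nonzero(A[k])][0] for k in range(n + 1)]
    assert np.array_equal(P, sum(c * A[k] for k, c in enumerate(coeffs)))

print("Bose-Mesner algebra rules hold for the length-3 Hamming scheme")
```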