In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by Körner in the 1970s, has since also proven useful in other settings, including combinatorics.


Definition

Let G = (V, E) be an undirected graph. The graph entropy of G, denoted H(G), is defined as

::H(G) = \min_{X, Y} I(X ; Y)

where X is chosen uniformly from V, Y ranges over the independent sets of G, the joint distribution of X and Y is such that X \in Y with probability one, and I(X ; Y) is the mutual information of X and Y (Simonyi 2001, Definition 2).

That is, if we let \mathcal{I} denote the family of independent vertex sets in G, we wish to find a joint distribution (X, Y) on V \times \mathcal{I} with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of X and Y is then called the entropy of G.
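For a fixed uniform marginal on X, the mutual information is a convex function of the conditional distribution p(Y \mid X), so the definition can be evaluated numerically on small graphs. The following is a minimal brute-force sketch (the function names and the choice of scipy.optimize with a softmax parametrization are our illustrative assumptions, not a standard graph-entropy library): it enumerates all independent sets, restricts each vertex to the sets containing it, and minimizes the resulting mutual information.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def independent_sets(n, edges):
    """Enumerate all non-empty independent vertex sets of the graph."""
    edges = {frozenset(e) for e in edges}
    out = []
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            if all(frozenset(p) not in edges
                   for p in itertools.combinations(subset, 2)):
                out.append(set(subset))
    return out

def graph_entropy(n, edges):
    """Minimize I(X;Y) over joint distributions with X uniform on the
    vertices and X in Y almost surely; the problem is convex in the
    conditionals, so a local minimizer suffices on small instances."""
    ind = independent_sets(n, edges)
    m = len(ind)
    # feasible[x] = indices of independent sets containing vertex x
    feasible = [[j for j, s in enumerate(ind) if x in s] for x in range(n)]
    slices, pos = [], 0
    for f in feasible:
        slices.append(slice(pos, pos + len(f)))
        pos += len(f)

    def mutual_info(theta):
        cond = np.zeros((n, m))
        for x in range(n):
            w = np.exp(theta[slices[x]] - theta[slices[x]].max())
            cond[x, feasible[x]] = w / w.sum()   # p(y | x) via softmax
        py = cond.mean(axis=0)                   # p(y), since p(x) = 1/n
        mask = cond > 0                          # avoid log of zero terms
        ratio = cond[mask] / np.broadcast_to(py, cond.shape)[mask]
        return float((cond[mask] * np.log2(ratio)).sum() / n)

    res = minimize(mutual_info, np.zeros(pos), method="Powell")
    return res.fun

# complete graph K_3: only singleton independent sets, so H = log2 3
print(graph_entropy(3, [(0, 1), (0, 2), (1, 2)]))  # ~1.585
```

For a complete graph the conditionals are forced (Y must be the singleton {X}), so the optimizer has nothing to do; for sparser graphs it trades off larger independent sets against a more concentrated distribution on Y.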


Properties

* Monotonicity. If G_1 is a subgraph of G_2 on the same vertex set, then H(G_1) \leq H(G_2).
* Subadditivity. Given two graphs G_1 = (V, E_1) and G_2 = (V, E_2) on the same set of vertices, the graph union G_1 \cup G_2 = (V, E_1 \cup E_2) satisfies H(G_1 \cup G_2) \leq H(G_1) + H(G_2).
* Arithmetic mean of disjoint unions. Let G_1, G_2, \ldots, G_k be a sequence of graphs on disjoint sets of vertices, with n_1, n_2, \ldots, n_k vertices, respectively. Then H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k} n_i} \sum_{i=1}^{k} n_i H(G_i).

Additionally, simple formulas exist for certain natural families of graphs (a numerical check of these values appears after this list):

* Edge-less graphs have entropy 0.
* Complete graphs on n vertices have entropy \log_2 n.
* Complete balanced k-partite graphs have entropy \log_2 k. In particular, complete balanced bipartite graphs have entropy 1.
* Complete bipartite graphs with n vertices in one partition and m in the other have entropy H\left(\frac{n}{m+n}\right), where H is the binary entropy function.
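Assuming the graph_entropy sketch from the Definition section is in scope, the closed forms above can be sanity-checked on small instances; away from the forced cases the optimizer returns the listed values only approximately.

```python
import math

# complete graph K_4: expect log2 4 = 2
print(graph_entropy(4, [(i, j) for i in range(4) for j in range(i + 1, 4)]))

# complete bipartite K_{1,2} (parts {0} and {1, 2}): expect H(1/3) ~ 0.918
h = lambda p: -p * math.log2(p) - (1 - p) * math.log2(1 - p)
print(graph_entropy(3, [(0, 1), (0, 2)]), h(1 / 3))

# edge-less graph on 3 vertices: expect 0 (always report Y = {0, 1, 2})
print(graph_entropy(3, []))
```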


Example

Here, we use properties of graph entropy to provide a simple proof that a complete graph G on n vertices cannot be expressed as the union of fewer than \log_2 n bipartite graphs.

''Proof.'' By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by 1. Thus, by subadditivity, the union of k bipartite graphs cannot have entropy greater than k. Now let G = (V, E) be a complete graph on n vertices. By the properties listed above, H(G) = \log_2 n. Therefore, the union of fewer than \log_2 n bipartite graphs cannot have the same entropy as G, so G cannot be expressed as such a union. \blacksquare
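This bound is achieved: splitting the vertices by each bit of their binary labels yields \lceil \log_2 n \rceil bipartite graphs whose union covers K_n, since any two distinct labels differ in some bit. A small sketch of this standard construction (the helper name bipartite_cover is ours):

```python
import itertools

def bipartite_cover(n):
    """K_n as a union of ceil(log2 n) bipartite graphs: the i-th graph
    joins every vertex whose label has bit i clear to every vertex
    whose label has bit i set."""
    bits = max(1, (n - 1).bit_length())
    cover = []
    for i in range(bits):
        part0 = [v for v in range(n) if not (v >> i) & 1]
        part1 = [v for v in range(n) if (v >> i) & 1]
        cover.append({(u, v) for u in part0 for v in part1})
    return cover

# any two distinct labels differ in some bit, so the union is all of K_6
union = set().union(*bipartite_cover(6))
assert all((u, v) in union or (v, u) in union
           for u, v in itertools.combinations(range(6), 2))
```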



Notes

* G. Simonyi, "Perfect graphs and graph entropy. An updated survey", Perfect Graphs, John Wiley and Sons (2001), pp. 293–328.