In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by Körner in the 1970s, has since also proven itself useful in other settings, including combinatorics.
Definition
Let $G = (V, E)$ be an undirected graph. The graph entropy of $G$, denoted $H(G)$, is defined as
$$H(G) = \min_{X,\,Y} I(X;Y)$$
where $X$ is chosen uniformly at random from the vertex set $V$, $Y$ ranges over the independent sets of $G$, the joint distribution of $X$ and $Y$ is such that $X \in Y$ with probability one, and $I(X;Y)$ is the mutual information of $X$ and $Y$. [G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001), pp. 293–328, Definition 2.]
That is, if we let $\mathcal{I}(G)$ denote the independent vertex sets in $G$, we wish to find the joint distribution on $V \times \mathcal{I}(G)$ with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of $X$ and $Y$ is then called the entropy of $G$.
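As a numerical illustration of this definition (not part of the cited survey), the minimization can be carried out directly for small graphs: enumerate the independent sets, then alternately re-optimize the conditional distribution of $Y$ given $X$ and the marginal distribution of $Y$, in the spirit of Blahut–Arimoto-type alternating minimization. The sketch below assumes vertices labelled $0,\dots,n-1$; the helper names `independent_sets` and `graph_entropy` and the iteration count are illustrative choices.

```python
from itertools import combinations
from math import log2

def independent_sets(n, edges):
    """Enumerate every non-empty independent set of the graph on vertices 0..n-1."""
    forbidden = {frozenset(e) for e in edges}
    sets = []
    for r in range(1, n + 1):
        for vs in combinations(range(n), r):
            if all(frozenset(p) not in forbidden for p in combinations(vs, 2)):
                sets.append(set(vs))
    return sets

def graph_entropy(n, edges, iters=3000):
    """Approximate H(G) = min I(X;Y), with X uniform on the vertices and Y an
    independent set containing X, by alternating minimization (result in bits)."""
    ind = independent_sets(n, edges)
    m = len(ind)
    px = 1.0 / n  # X is uniform on the vertex set
    # Start from the uniform conditional distribution over independent sets containing x.
    Q = []
    for x in range(n):
        row = [1.0 if x in ind[j] else 0.0 for j in range(m)]
        s = sum(row)
        Q.append([v / s for v in row])
    for _ in range(iters):
        # Best marginal of Y for the current conditional distribution.
        R = [sum(px * Q[x][j] for x in range(n)) for j in range(m)]
        # Best conditional for this marginal: proportional to R on sets containing x.
        for x in range(n):
            row = [R[j] if x in ind[j] else 0.0 for j in range(m)]
            s = sum(row)
            Q[x] = [v / s for v in row]
    R = [sum(px * Q[x][j] for x in range(n)) for j in range(m)]
    # Mutual information I(X;Y) of the resulting joint distribution.
    return sum(px * Q[x][j] * log2(Q[x][j] / R[j])
               for x in range(n) for j in range(m) if Q[x][j] > 0)

print(graph_entropy(3, [(0, 1), (1, 2), (0, 2)]))         # K_3: log2(3) ≈ 1.585
print(graph_entropy(4, []))                                # edge-less graph: 0
print(graph_entropy(4, [(0, 2), (0, 3), (1, 2), (1, 3)]))  # K_{2,2}: 1
```

On the three test graphs in the last lines, the printed values should be close to $\log_2 3 \approx 1.585$, $0$, and $1$, matching the closed-form entries listed under Properties below.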
Properties
* Monotonicity. If $G_1$ is a subgraph of $G_2$ on the same vertex set, then $H(G_1) \leq H(G_2)$.
* Subadditivity. Given two graphs $G_1 = (V, E_1)$ and $G_2 = (V, E_2)$ on the same set of vertices, the graph union $G_1 \cup G_2 = (V, E_1 \cup E_2)$ satisfies $H(G_1 \cup G_2) \leq H(G_1) + H(G_2)$.
* Arithmetic mean of disjoint unions. Let $G_1, G_2, \ldots, G_k$ be a sequence of graphs on disjoint sets of vertices, with $n_1, n_2, \ldots, n_k$ vertices, respectively. Then $H(G_1 \cup G_2 \cup \cdots \cup G_k) = \frac{\sum_{i=1}^{k} n_i H(G_i)}{\sum_{i=1}^{k} n_i}$ (see the numerical sketch after this list).
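As a quick, non-authoritative check, the three properties above can be observed numerically on small graphs using the `graph_entropy` sketch from the Definition section; the particular graphs below are arbitrary choices.

```python
# Assumes graph_entropy(n, edges) from the sketch in the Definition section.
path = [(0, 1), (1, 2)]          # P_3, a subgraph of the triangle
tri  = [(0, 1), (1, 2), (0, 2)]  # K_3

# Monotonicity: H(P_3) <= H(K_3)  (about 0.918 vs 1.585)
print(graph_entropy(3, path), graph_entropy(3, tri))

# Subadditivity: the union of two perfect matchings on 4 vertices is C_4 = K_{2,2},
# so H(union) = 1 <= H(g1) + H(g2) = 1 + 1.
g1, g2 = [(0, 1), (2, 3)], [(1, 2), (0, 3)]
print(graph_entropy(4, g1 + g2), graph_entropy(4, g1) + graph_entropy(4, g2))

# Arithmetic mean of disjoint unions: H(K_3 ∪ K_2) = (3·H(K_3) + 2·H(K_2)) / 5 ≈ 1.351
disjoint = tri + [(3, 4)]
print(graph_entropy(5, disjoint),
      (3 * graph_entropy(3, tri) + 2 * graph_entropy(2, [(0, 1)])) / 5)
```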
Additionally, simple formulas exist for certain families of graphs; see also the numerical check after the following list.
* Edge-less graphs have entropy $0$.
* Complete graphs on $n$ vertices have entropy $\log_2 n$.
* Complete balanced $k$-partite graphs have entropy $\log_2 k$. In particular, complete balanced bipartite graphs have entropy $1$.
* Complete bipartite graphs with $n$ vertices in one partition and $m$ in the other have entropy $H\!\left(\frac{n}{m+n}\right)$, where $H$ is the binary entropy function.
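These closed-form values can be compared against the numerical sketch from the Definition section. The informal snippet below does this for $K_4$, a complete balanced tripartite graph, and $K_{2,3}$; the helpers `h2` and `complete_multipartite` are ad-hoc names introduced here, not standard API.

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def complete_multipartite(parts):
    """Edge list of the complete multipartite graph on the given vertex parts."""
    edges = []
    for i, a in enumerate(parts):
        for b in parts[i + 1:]:
            edges += [(u, v) for u in a for v in b]
    return edges

# Assumes graph_entropy(n, edges) from the sketch in the Definition section.

# Complete graph K_4: expect log2(4) = 2
print(graph_entropy(4, [(i, j) for i in range(4) for j in range(i + 1, 4)]), log2(4))

# Complete balanced tripartite graph with parts of size 2: expect log2(3) ≈ 1.585
print(graph_entropy(6, complete_multipartite([[0, 1], [2, 3], [4, 5]])), log2(3))

# Complete bipartite K_{2,3}: expect h2(2/5) ≈ 0.971
print(graph_entropy(5, complete_multipartite([[0, 1], [2, 3, 4]])), h2(2 / 5))
```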
Example
Here, we use properties of graph entropy to provide a simple proof that a complete graph $G$ on $n$ vertices cannot be expressed as the union of fewer than $\log_2 n$ bipartite graphs.
''Proof'' By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by $1$. Thus, by sub-additivity, the union of $k$ bipartite graphs cannot have entropy greater than $k$. Now let $G$ be a complete graph on $n$ vertices. By the properties listed above, $H(G) = \log_2 n$. Therefore, the union of fewer than $\log_2 n$ bipartite graphs cannot have the same entropy as $G$, so $G$ cannot be expressed as such a union.
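For context (not part of the article's argument), the bound in this example is tight: labelling the vertices of $K_n$ in binary and, for each bit position, taking the bipartite graph that joins vertices whose labels differ in that bit covers every edge with $\lceil \log_2 n \rceil$ bipartite graphs. A small sketch of this standard construction, with an arbitrary choice of $n$:

```python
from itertools import combinations
from math import ceil, log2

def bipartite_cover(n):
    """Cover the edges of K_n with ceil(log2 n) bipartite graphs:
    graph i joins vertices whose binary labels differ in bit i."""
    k = max(1, ceil(log2(n)))
    graphs = []
    for bit in range(k):
        left = [v for v in range(n) if (v >> bit) & 1 == 0]
        right = [v for v in range(n) if (v >> bit) & 1 == 1]
        graphs.append([(u, v) for u in left for v in right])
    return graphs

n = 12  # illustrative choice
graphs = bipartite_cover(n)
covered = {frozenset(e) for g in graphs for e in g}
all_edges = {frozenset(e) for e in combinations(range(n), 2)}
print(len(graphs), covered == all_edges)  # 4 True: K_12 is a union of 4 bipartite graphs
```

Together with the entropy argument above, this shows that the minimum number of bipartite graphs whose union is $K_n$ is exactly $\lceil \log_2 n \rceil$.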
Notes
{{reflist}}