Graph Entropy
In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by Körner in the 1970s, has since also proven itself useful in other settings, including combinatorics.

Definition

Let G = (V, E) be an undirected graph. The graph entropy of G, denoted H(G), is defined as

    H(G) = \min_{X, Y} I(X ; Y)

where X is chosen uniformly from V, Y ranges over independent sets of G, the joint distribution of X and Y is such that X \in Y with probability one, and I(X ; Y) is the mutual information of X and Y. (G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001), pp. 293–328, Definition 2.) That is, if we let \mathcal{I} denote the family of independent vertex sets in G, we wish to find the joint distribution X, Y on V \times \mathcal{I} with the lowest mutual information such that (i) the marginal distribution of the first term is uniform on V and (ii) in samples from the distribution, the second term contains the first term with probability one.
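
As a concrete check on the definition, the minimal Python sketch below (helper names such as entropy_upper_bound and choose_y are illustrative, not from the source) evaluates I(X ; Y) for one feasible joint distribution, which by the minimization above gives an upper bound on H(G). For the complete graph K_3 every independent set containing a vertex is that singleton, so the bound equals the known value log2 3; for the edgeless graph on three vertices, taking Y = V gives 0.

    from itertools import combinations
    from math import log2

    def mutual_information(joint):
        """I(X;Y) in bits for a dict {(x, y): p} joint distribution."""
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        return sum(p * log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    def entropy_upper_bound(vertices, edges, choose_y):
        """Upper bound on H(G): X uniform on V, Y = choose_y(x) an
        independent set containing x, so the pair (X, Y) is feasible."""
        n = len(vertices)
        joint = {}
        for x in vertices:
            y = frozenset(choose_y(x))
            # feasibility check: x lies in y and y is an independent set
            assert x in y and not any({u, v} <= y for u, v in edges)
            joint[(x, y)] = joint.get((x, y), 0.0) + 1.0 / n
        return mutual_information(joint)

    V = [0, 1, 2]
    K3 = [(0, 1), (0, 2), (1, 2)]   # complete graph: only singletons are independent
    E3 = []                          # edgeless graph: V itself is independent

    print(entropy_upper_bound(V, K3, lambda x: {x}))      # about 1.585 = log2(3), the exact H(K3)
    print(entropy_upper_bound(V, E3, lambda x: set(V)))   # 0.0, the exact value for the edgeless graph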



Information Theory
Information theory is the scientific study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information ...
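
The coin-versus-die comparison can be made concrete with a few lines of Python; the values below follow directly from the definition of Shannon entropy for a uniform distribution (the function name shannon_entropy is just illustrative).

    from math import log2

    def shannon_entropy(probs):
        """Entropy in bits of a discrete distribution given as probabilities."""
        return -sum(p * log2(p) for p in probs if p > 0)

    coin = [1/2, 1/2]       # fair coin: two equally likely outcomes
    die = [1/6] * 6         # fair die: six equally likely outcomes

    print(shannon_entropy(coin))   # 1.0 bit
    print(shannon_entropy(die))    # about 2.585 bits = log2(6)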



Undirected Graph
In discrete mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are in some sense "related". The objects correspond to mathematical abstractions called ''vertices'' (also called ''nodes'' or ''points'') and each of the related pairs of vertices is called an ''edge'' (also called ''link'' or ''line''). Typically, a graph is depicted in diagrammatic form as a set of dots or circles for the vertices, joined by lines or curves for the edges. Graphs are one of the objects of study in discrete mathematics. The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an edge between two people if they shake hands, then this graph is undirected because any person ''A'' can shake hands with a person ''B'' only if ''B'' also shakes hands with ''A''. In contrast, if an edge from a person ''A'' to a person ''B'' means that ''A'' owes money to ''B'', th ...
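
As a small illustration of the handshake example, the sketch below (an illustrative adjacency-list representation, not taken from the source) stores each undirected edge symmetrically, so the relation reads the same from either endpoint.

    from collections import defaultdict

    def add_undirected_edge(adj, a, b):
        """Store the edge in both directions: a handshake between a and b
        is the same relation as one between b and a."""
        adj[a].add(b)
        adj[b].add(a)

    party = defaultdict(set)            # adjacency list keyed by person
    add_undirected_edge(party, "A", "B")
    add_undirected_edge(party, "B", "C")

    print("A" in party["B"])   # True: the relation is symmetric
    print(dict(party))         # e.g. {'A': {'B'}, 'B': {'A', 'C'}, 'C': {'B'}}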


Discrete Uniform Distribution
In probability theory and statistics, the discrete uniform distribution is a symmetric probability distribution wherein a finite number of values are equally likely to be observed; every one of ''n'' values has equal probability 1/''n''. Another way of saying "discrete uniform distribution" would be "a known, finite number of outcomes equally likely to happen". A simple example of the discrete uniform distribution is throwing a fair die. The possible values are 1, 2, 3, 4, 5, 6, and each time the die is thrown the probability of a given score is 1/6. If two dice are thrown and their values added, the resulting distribution is no longer uniform because not all sums have equal probability. Although it is convenient to describe discrete uniform distributions over integers, such as this, one can also consider discrete uniform distributions over any finite set. For instance, a random permutation is a permutation generated uniformly from the permutations of a given length, and a unif ...
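
A short Python sketch makes the die examples concrete: one fair die is uniform with probability 1/6 per face, while the sum of two dice is not (the variable names are illustrative).

    from collections import Counter
    from fractions import Fraction

    faces = range(1, 7)

    # One fair die: each face has probability 1/6.
    single = {v: Fraction(1, 6) for v in faces}

    # Sum of two fair dice: count the ways each total can occur out of 36.
    ways = Counter(a + b for a in faces for b in faces)
    two_dice = {total: Fraction(n, 36) for total, n in sorted(ways.items())}

    print(single[3], single[6])       # 1/6 1/6  -> uniform
    print(two_dice[2], two_dice[7])   # 1/36 1/6 -> not uniform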



Independent Set (graph Theory)
In graph theory, an independent set, stable set, coclique or anticlique is a set of vertices in a graph, no two of which are adjacent. That is, it is a set S of vertices such that for every two vertices in S, there is no edge connecting the two. Equivalently, each edge in the graph has at most one endpoint in S. A set is independent if and only if it is a clique in the graph's complement. The size of an independent set is the number of vertices it contains. Independent sets have also been called "internally stable sets", of which "stable set" is a shortening. A maximal independent set is an independent set that is not a proper subset of any other independent set. A maximum independent set is an independent set of largest possible size for a given graph G. This size is called the independence number of ''G'' and is usually denoted by \alpha(G). The optimization problem of finding such a set is called the maximum independent set problem. It is a strongly NP-hard problem. As such ...
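
To make the definitions concrete, the sketch below (illustrative helper names, not from the source) tests independence directly from the edge list and finds the independence number \alpha(G) by brute force; the exponential search is consistent with the maximum independent set problem being NP-hard.

    from itertools import combinations

    def is_independent(subset, edges):
        """True if no edge has both endpoints inside the subset."""
        s = set(subset)
        return not any(u in s and v in s for u, v in edges)

    def independence_number(vertices, edges):
        """Brute-force alpha(G): size of a largest independent set."""
        for size in range(len(vertices), 0, -1):
            for subset in combinations(vertices, size):
                if is_independent(subset, edges):
                    return size
        return 0

    # 4-cycle 0-1-2-3-0: {0, 2} and {1, 3} are maximum independent sets.
    V = [0, 1, 2, 3]
    E = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(independence_number(V, E))   # 2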



Mutual Information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable. Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X,Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI). The quantity was defined and analyzed by Claude Shannon in his landmark paper "A Mathematical Theory of Communication" ...
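
The link to entropy can be illustrated with the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the sketch below (illustrative function names, not from the source) applies it to perfectly correlated and to independent fair bits.

    from math import log2

    def entropy(probs):
        """Shannon entropy in bits of an iterable of probabilities."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def mutual_information(joint):
        """I(X;Y) = H(X) + H(Y) - H(X,Y), for a dict {(x, y): p}."""
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

    # Perfectly correlated fair bits share one full bit of information ...
    print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))                       # 1.0
    # ... while independent fair bits share none: the joint already equals
    # the product of the marginals.
    print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))   # 0.0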


Graph Operations
In the mathematical field of graph theory, graph operations are operations which produce new graphs from initial ones. They include both unary (one input) and binary (two input) operations.

Unary operations

Unary operations create a new graph from a single initial graph.

Elementary operations

Elementary operations or editing operations, which are also known as graph edit operations, create a new graph from an initial one by a simple local change, such as addition or deletion of a vertex or of an edge, merging and splitting of vertices, edge contraction, etc. The graph edit distance between a pair of graphs is the minimum number of elementary operations required to transform one graph into the other.

Advanced operations

Advanced operations create a new graph from an initial one by complex changes, such as:
* transpose graph;
* complement graph;
* line graph;
* graph minor;
* graph rewriting;
* power of graph;
* dual graph;
* medial graph;
* quotient graph;
* Y-Δ transform;
* My ...
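
As one small example of such an operation, the sketch below (an illustrative helper, not from the source) builds the complement graph from a vertex list and an edge list.

    from itertools import combinations

    def complement(vertices, edges):
        """Complement graph: same vertices, an edge exactly where the
        original graph has none (one of the unary operations listed above)."""
        present = {frozenset(e) for e in edges}
        return [tuple(pair) for pair in combinations(vertices, 2)
                if frozenset(pair) not in present]

    # Path 0-1-2: its complement has the single edge (0, 2).
    print(complement([0, 1, 2], [(0, 1), (1, 2)]))   # [(0, 2)]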



Complete Graph
In the mathematical field of graph theory, a complete graph is a simple undirected graph in which every pair of distinct vertices is connected by a unique edge. A complete digraph is a directed graph in which every pair of distinct vertices is connected by a pair of unique edges (one in each direction). Graph theory itself is typically dated as beginning with Leonhard Euler's 1736 work on the Seven Bridges of Königsberg. However, drawings of complete graphs, with their vertices placed on the points of a regular polygon, had already appeared in the 13th century, in the work of Ramon Llull. Such a drawing is sometimes referred to as a mystic rose.

Properties

The complete graph on n vertices is denoted by K_n. Some sources claim that the letter K in this notation stands for the German word komplett, but the German name for a complete graph, vollständiger Graph, does not contain the letter K, and other sources state that the notation honors the contributions of Kazimierz Kuratowski to graph theory. K_n has n(n-1)/2 edges (a ...
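
A few lines of Python can confirm the edge count: generating every pair of distinct vertices yields exactly n(n-1)/2 edges for K_n (the helper name complete_graph is illustrative).

    from itertools import combinations

    def complete_graph(n):
        """Edge list of K_n: every pair of distinct vertices joined once."""
        return list(combinations(range(n), 2))

    for n in range(1, 6):
        edges = complete_graph(n)
        assert len(edges) == n * (n - 1) // 2   # K_n has n(n-1)/2 edges
        print(n, len(edges))                    # 1 0 / 2 1 / 3 3 / 4 6 / 5 10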



Multipartite Graph
In graph theory, a part of mathematics, a k-partite graph is a graph whose vertices are (or can be) partitioned into k different independent sets. Equivalently, it is a graph that can be colored with k colors, so that no two endpoints of an edge have the same color. When k = 2 these are the bipartite graphs, and when k = 3 they are called the tripartite graphs. Bipartite graphs may be recognized in polynomial time but, for any k ≥ 3, it is NP-complete, given an uncolored graph, to test whether it is k-partite. However, in some applications of graph theory, a k-partite graph may be given as input to a computation with its coloring already determined; this can happen when the sets of vertices in the graph represent different types of objects. For instance, folksonomies have been modeled mathematically by tripartite graphs in which the three sets of vertices in the graph represent users of a system, resources that the users are tagging, and tags that the users have applied to the resources. A comple ...
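
To illustrate the recognition problem, the brute-force sketch below (illustrative names, not from the source) tries every assignment of vertices to k parts; its exponential running time reflects that the test is NP-complete for k ≥ 3, whereas bipartiteness (k = 2) can be decided in polynomial time by other means.

    from itertools import product

    def is_k_partite(vertices, edges, k):
        """Brute-force test: accept if some assignment of vertices to k parts
        leaves no edge with both endpoints in the same part."""
        for colouring in product(range(k), repeat=len(vertices)):
            colour = dict(zip(vertices, colouring))
            if all(colour[u] != colour[v] for u, v in edges):
                return True
        return False

    # A triangle needs three parts: it is tripartite but not bipartite.
    V = ["a", "b", "c"]
    E = [("a", "b"), ("b", "c"), ("a", "c")]
    print(is_k_partite(V, E, 2))   # False
    print(is_k_partite(V, E, 3))   # True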



Bipartite Graphs
In the mathematical field of graph theory, a bipartite graph (or bigraph) is a graph whose vertices can be divided into two disjoint and independent sets U and V, that is, every edge connects a vertex in U to one in V. Vertex sets U and V are usually called the ''parts'' of the graph. Equivalently, a bipartite graph is a graph that does not contain any odd-length cycles. The two sets U and V may be thought of as a coloring of the graph with two colors: if one colors all nodes in U blue, and all nodes in V red, each edge has endpoints of differing colors, as is required in the graph coloring problem. In contrast, such a coloring is impossible in the case of a non-bipartite graph, such as a triangle: after one node is colored blue and another red, the third vertex of the triangle is connected to vertices of both colors, preventing it from being assigned either color. One often writes G=(U,V,E) to denote a bipartite graph whose partition has the parts U and V, with E denoting ...
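
The two-coloring view suggests a direct polynomial-time test: attempt to 2-color the graph by breadth-first search and report failure when an edge joins two vertices of the same color, which happens exactly when an odd cycle is present. The sketch below is an illustrative version of this standard check, not taken from the source.

    from collections import deque

    def two_colouring(adjacency):
        """Try to colour the graph with two colours via breadth-first search.
        Returns a colour map if the graph is bipartite, otherwise None."""
        colour = {}
        for start in adjacency:
            if start in colour:
                continue
            colour[start] = 0
            queue = deque([start])
            while queue:
                u = queue.popleft()
                for v in adjacency[u]:
                    if v not in colour:
                        colour[v] = 1 - colour[u]
                        queue.append(v)
                    elif colour[v] == colour[u]:
                        return None          # odd cycle found: not bipartite
        return colour

    path = {0: [1], 1: [0, 2], 2: [1]}               # no odd cycle, bipartite
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}     # odd cycle, not bipartite
    print(two_colouring(path))        # {0: 0, 1: 1, 2: 0}
    print(two_colouring(triangle))    # None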


Binary Entropy Function
In information theory, the binary entropy function, denoted \operatorname H(p) or \operatorname H_\text{b}(p), is defined as the entropy of a Bernoulli process with probability p of one of two values. It is a special case of \Eta(X), the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable X that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive. If \operatorname{Pr}(X=1) = p, then \operatorname{Pr}(X=0) = 1-p and the entropy of X (in shannons) is given by

    \operatorname H(X) = \operatorname H_\text{b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p),

where 0 \log_2 0 is taken to be 0. The logarithms in this formula are usually taken (as shown in the graph) to the base 2. See ''binary logarithm''. When p=\tfrac 1 2, the binary entropy function attains its maximum value. This is the case of an unbiased coin flip. \operatorname H(p) is distinguished from the entropy function \Eta(X) in that the former takes a single real number ...
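
The formula, including the 0 log 0 = 0 convention and the maximum at p = 1/2, can be checked with a short Python sketch (the function name binary_entropy is illustrative).

    from math import log2

    def binary_entropy(p):
        """H_b(p) = -p log2 p - (1 - p) log2 (1 - p), with 0 log 0 taken as 0."""
        return sum(-q * log2(q) for q in (p, 1 - p) if q > 0)

    print(binary_entropy(0.5))   # 1.0 -> maximum, the unbiased coin
    print(binary_entropy(0.1))   # about 0.469
    print(binary_entropy(0.0))   # 0.0 -> a certain outcome carries no information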