Lindström–Gessel–Viennot Lemma
In mathematics, the Lindström–Gessel–Viennot lemma provides a way to count the number of tuples of non-intersecting lattice paths, or, more generally, paths on a directed graph. It was proved by Gessel and Viennot in 1985, based on previous work of Lindström published in 1973.

Statement

Let ''G'' be a locally finite directed acyclic graph. This means that each vertex has finite degree, and that ''G'' contains no directed cycles. Consider base vertices A = \{a_1, a_2, \ldots, a_n\} and destination vertices B = \{b_1, b_2, \ldots, b_n\}, and also assign a weight \omega_e to each directed edge ''e''. These edge weights are assumed to belong to some commutative ring. For each directed path ''P'' between two vertices, let \omega(P) be the product of the weights of the edges of the path. For any two vertices ''a'' and ''b'', write ''e''(''a'',''b'') for the sum e(a,b) = \sum_{P: a \to b} \omega(P) over all paths from ''a'' to ''b''. This is well-defined if between any two points there are only finitely many paths; but even in the general ...
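
To make the quantities above concrete, here is a minimal sketch, assuming a small hypothetical DAG (a 4×4 directed grid with unit edge weights and East/North edges): it computes the path sums e(a, b) by memoized traversal and then the determinant of the matrix (e(a_i, b_j)), which the lemma interprets as a signed count of non-intersecting path systems; the endpoints below are chosen so that only the identity pairing can be non-crossing, so the determinant is a plain count.

```python
# Sketch of the quantities in the statement, on a small hypothetical DAG:
# the directed grid {0,...,3} x {0,...,3} with unit-weight East and North edges.
# e(a, b) is then simply the number of lattice paths from a to b, and
# det(e(a_i, b_j)) counts pairs of non-intersecting paths a_1 -> b_1, a_2 -> b_2
# (the endpoints are chosen so that only the identity pairing can be non-crossing).
from functools import lru_cache

N = 3  # grid goes from (0, 0) to (N, N)

def edges_from(v):
    """Outgoing unit-weight edges of the grid DAG."""
    x, y = v
    if x < N:
        yield (x + 1, y)
    if y < N:
        yield (x, y + 1)

@lru_cache(maxsize=None)
def e(a, b):
    """Sum of weights over all directed paths from a to b (here: a path count)."""
    if a == b:
        return 1
    return sum(e(w, b) for w in edges_from(a))

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [(0, 1), (1, 0)]       # base vertices a_1, a_2
B = [(2, 3), (3, 2)]       # destination vertices b_1, b_2

M = [[e(a, b) for b in B] for a in A]
print(M)                   # matrix of path sums e(a_i, b_j)
print(det2(M))             # by the lemma: number of non-intersecting pairs
```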

Lattice Path
In combinatorics, a lattice path ''L'' in the ''d''-dimensional integer lattice \mathbb{Z}^d of length ''k'' with steps in the set ''S'' is a sequence of vectors v_0, v_1, \ldots, v_k such that each consecutive difference v_i - v_{i-1} lies in ''S''. A lattice path may lie in any lattice in \mathbb{R}^d, but the integer lattice is most commonly used. An example of a lattice path in \mathbb{Z}^2 of length 5 with steps in S = \lbrace (2,0), (1,1), (0,-1) \rbrace is L = \lbrace (-1,-2), (0,-1), (2,-1), (2,-2), (2,-3), (4,-3) \rbrace .

North-East lattice paths

A North-East (NE) lattice path is a lattice path in \mathbb{Z}^2 with steps in S = \lbrace (0,1), (1,0) \rbrace . The (0,1) steps are called North steps and denoted by N 's; the (1,0) steps are called East steps and denoted by E 's. NE lattice paths most commonly begin at the origin. This convention allows us to encode all the information about a NE lattice path L in a single permutation word. The length of the word gives us the number of steps of the lattice path, k . The order of the N 's an ...
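
As a small illustration of the word encoding (the particular path and endpoint below are assumptions chosen for the example), the following sketch reads off the N/E word of a path from its consecutive differences and counts all NE paths from the origin to a given point, which is just a binomial coefficient.

```python
# Sketch: encode a North-East lattice path as a word of N's and E's, and count
# all NE paths from (0, 0) to (m, n).  The particular path below is a made-up
# example; any sequence of points with consecutive differences in
# {(0, 1), (1, 0)} would do.
from math import comb

STEP_LETTER = {(0, 1): "N", (1, 0): "E"}

def word(path):
    """Word of N's and E's read off from consecutive differences of the path."""
    return "".join(
        STEP_LETTER[(b[0] - a[0], b[1] - a[1])]
        for a, b in zip(path, path[1:])
    )

path = [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2), (2, 3)]
print(word(path))            # -> "ENENN"

def count_ne_paths(m, n):
    """Number of NE lattice paths from (0, 0) to (m, n): choose where the E's go."""
    return comb(m + n, m)

print(count_ne_paths(2, 3))  # 10 paths to (2, 3)
```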

Graph (discrete Mathematics)
In discrete mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are in some sense "related". The objects correspond to mathematical abstractions called ''vertices'' (also called ''nodes'' or ''points'') and each of the related pairs of vertices is called an ''edge'' (also called ''link'' or ''line''). Typically, a graph is depicted in diagrammatic form as a set of dots or circles for the vertices, joined by lines or curves for the edges. Graphs are one of the objects of study in discrete mathematics. The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an edge between two people if they shake hands, then this graph is undirected because any person ''A'' can shake hands with a person ''B'' only if ''B'' also shakes hands with ''A''. In contrast, if an edge from a person ''A'' to a person ''B'' m ...
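
A minimal sketch of the two conventions, with made-up names standing in for the party example: an undirected graph stores its edges as unordered pairs, while a directed one stores ordered pairs.

```python
# Sketch: the handshake example as an undirected graph (edges are unordered
# pairs, stored as frozensets) versus a directed "sent a message to" relation
# (edges are ordered pairs).  Names are illustrative only.
people = {"A", "B", "C"}

handshakes = {frozenset({"A", "B"}), frozenset({"B", "C"})}   # undirected
messages = {("A", "B"), ("B", "A"), ("C", "A")}               # directed

print(frozenset({"B", "A"}) in handshakes)   # True: {A, B} == {B, A}
print(("B", "C") in messages)                # False: direction matters
```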

Degree (graph Theory)
In graph theory, the degree (or valency) of a vertex of a graph is the number of edges that are incident to the vertex; in a multigraph, a loop contributes 2 to a vertex's degree, for the two ends of the edge. The degree of a vertex v is denoted \deg(v) or \deg v. The maximum degree of a graph G, denoted by \Delta(G), and the minimum degree of a graph, denoted by \delta(G), are the maximum and minimum of its vertices' degrees. In the multigraph shown on the right, the maximum degree is 5 and the minimum degree is 0. In a regular graph, every vertex has the same degree, and so we can speak of ''the'' degree of the graph. A complete graph (denoted K_n, where n is the number of vertices in the graph) is a special kind of regular graph where all vertices have the maximum possible degree, n-1. In a signed graph, the number of positive edges connected to the vertex v is called positive deg(v) and the number of connected negative edges is called negative deg(v). Handshaking lemma ...
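
The handshaking lemma mentioned at the truncation point is easy to spot-check; the sketch below (with a made-up five-vertex simple graph) computes each degree, \Delta(G) and \delta(G), and verifies that the degrees sum to twice the number of edges.

```python
# Sketch: degrees, maximum/minimum degree, and the handshaking lemma
# (sum of degrees = 2 * number of edges) on a small made-up simple graph.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]
vertices = {"a", "b", "c", "d", "e"}          # "e" is isolated: degree 0

deg = {v: 0 for v in vertices}
for u, w in edges:
    deg[u] += 1
    deg[w] += 1

print(deg)                                    # per-vertex degrees
print(max(deg.values()), min(deg.values()))   # Delta(G) = 3, delta(G) = 0
assert sum(deg.values()) == 2 * len(edges)    # handshaking lemma
```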

Commutative Ring
In mathematics, a commutative ring is a ring in which the multiplication operation is commutative. The study of commutative rings is called commutative algebra. Complementarily, noncommutative algebra is the study of ring properties that are not specific to commutative rings. This distinction results from the high number of fundamental properties of commutative rings that do not extend to noncommutative rings.

Definition and first examples

Definition

A ''ring'' is a set R equipped with two binary operations, i.e. operations combining any two elements of the ring to a third. They are called ''addition'' and ''multiplication'' and commonly denoted by "+" and "\cdot"; e.g. a+b and a \cdot b. To form a ring these two operations have to satisfy a number of properties: the ring has to be an abelian group under addition as well as a monoid under multiplication, where multiplication distributes over addition; i.e., a \cdot \left(b + c\right) = \left(a \cdot b\right) + \left(a \cdot c\right) ...
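
As a spot-check rather than a proof, the following sketch verifies commutativity of addition and multiplication and the distributive law in the integers modulo 6, and contrasts this with 2×2 integer matrices, whose multiplication is not commutative; the modulus and the matrices are arbitrary choices for the example.

```python
# Sketch: a spot-check (not a proof) that the integers modulo 6 form a
# commutative ring -- addition and multiplication are commutative and
# multiplication distributes over addition -- and that 2x2 integer matrices
# give a ring whose multiplication is *not* commutative.
from itertools import product

n = 6
Zn = range(n)
for a, b, c in product(Zn, repeat=3):
    assert (a + b) % n == (b + a) % n                      # commutative addition
    assert (a * b) % n == (b * a) % n                      # commutative multiplication
    assert (a * (b + c)) % n == (a * b + a * c) % n        # distributivity

def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
print(matmul(A, B) == matmul(B, A))   # False: matrix multiplication is not commutative
```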

Symmetric Group
In abstract algebra, the symmetric group defined over any set is the group whose elements are all the bijections from the set to itself, and whose group operation is the composition of functions. In particular, the finite symmetric group \mathrm{S}_n defined over a finite set of n symbols consists of the permutations that can be performed on the n symbols. Since there are n! (n factorial) such permutation operations, the order (number of elements) of the symmetric group \mathrm{S}_n is n!. Although symmetric groups can be defined on infinite sets, this article focuses on the finite symmetric groups: their applications, their elements, their conjugacy classes, a finite presentation, their subgroups, their automorphism groups, and their representation theory. For the remainder of this article, "symmetric group" will mean a symmetric group on a finite set. The symmetric group is important to diverse areas of mathematics such as Galois theory, invariant theory, the representatio ...
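
A small sketch of these facts, with permutations of \{0, 1, 2\} represented as tuples (a representation chosen for the example): it checks that \mathrm{S}_3 has 3! = 6 elements and that composing two of its elements again gives an element of the group.

```python
# Sketch: the finite symmetric group S_n with permutations stored as tuples
# (the image of i is p[i]) and the group operation given by composition.
# A small check that |S_3| = 3! = 6 and that composing two bijections of
# {0, 1, 2} again gives an element of S_3.
from itertools import permutations
from math import factorial

def compose(p, q):
    """(p o q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(len(p)))

S3 = set(permutations(range(3)))
assert len(S3) == factorial(3)                      # order of S_3 is 3! = 6

for p in S3:
    for q in S3:
        assert compose(p, q) in S3                  # closure under composition

identity = (0, 1, 2)
print(compose((1, 0, 2), (1, 0, 2)) == identity)    # a transposition squares to the identity
```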

Involution (mathematics)
In mathematics, an involution, involutory function, or self-inverse function is a function ''f'' that is its own inverse: f(f(x)) = x for all ''x'' in the domain of ''f''. Equivalently, applying ''f'' twice produces the original value.

General properties

Any involution is a bijection. The identity map is a trivial example of an involution. Examples of nontrivial involutions include negation (x \mapsto -x), reciprocation (x \mapsto 1/x), and complex conjugation (z \mapsto \bar z) in arithmetic; reflection, half-turn rotation, and circle inversion in geometry; complementation in set theory; and reciprocal ciphers such as the ROT13 transformation and the Beaufort polyalphabetic cipher. The composition of two involutions ''f'' and ''g'' is an involution if and only if they commute: f \circ g = g \circ f.

Involutions on finite sets

The number of involutions, including the identity involution, on a set with n elements is given by a recurrence relation found by Heinrich August Rothe in 1800:

:a_0 = a_1 = 1 and a_n = a_{n-1} + (n-1) a_{n-2} for n > 1 ...
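
The recurrence can be checked directly against a brute-force count over permutations for small n; the sketch below is one way to do this (the cut-off n ≤ 6 is an arbitrary choice).

```python
# Sketch: the number of involutions on an n-element set via Rothe's recurrence
# a_0 = a_1 = 1, a_n = a_{n-1} + (n-1) * a_{n-2}, cross-checked against a
# brute-force count over all permutations for small n.
from itertools import permutations

def involutions_recurrence(n):
    a = [1, 1]
    for k in range(2, n + 1):
        a.append(a[k - 1] + (k - 1) * a[k - 2])
    return a[n]

def involutions_brute_force(n):
    def is_involution(p):
        return all(p[p[i]] == i for i in range(n))
    return sum(1 for p in permutations(range(n)) if is_involution(p))

for n in range(7):
    assert involutions_recurrence(n) == involutions_brute_force(n)
print([involutions_recurrence(n) for n in range(7)])   # 1, 1, 2, 4, 10, 26, 76
```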

Schur Polynomial
In mathematics, Schur polynomials, named after Issai Schur, are certain symmetric polynomials in ''n'' variables, indexed by partitions, that generalize the elementary symmetric polynomials and the complete homogeneous symmetric polynomials. In representation theory they are the characters of polynomial irreducible representations of the general linear groups. The Schur polynomials form a linear basis for the space of all symmetric polynomials. Any product of Schur polynomials can be written as a linear combination of Schur polynomials with non-negative integral coefficients; the values of these coefficients are given combinatorially by the Littlewood–Richardson rule. More generally, skew Schur polynomials are associated with pairs of partitions and have similar properties to Schur polynomials.

Definition (Jacobi's bialternant formula)

Schur polynomials are indexed by integer partitions. Given a partition \lambda = (\lambda_1, \lambda_2, \dots, \lambda_n), where \lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n, and each \lambda_j is a non-negative integer, the functions a_{(\lambda_1 + n - 1,\ \lambda_2 + n - 2,\ \dots,\ \lambda_n)}(x_1, ...
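
A small worked instance of the bialternant formula, here computed with sympy (an assumption of this sketch, not part of the definition): for \lambda = (2, 1, 0) and n = 3 it forms the alternant a_{\lambda+\delta}, divides by the Vandermonde determinant a_\delta, and expands the resulting Schur polynomial s_{(2,1)}.

```python
# Sketch of Jacobi's bialternant formula for a small case, using sympy:
#   s_lambda(x_1, ..., x_n) = det(x_i^(lambda_j + n - j)) / det(x_i^(n - j)),
# the denominator being the Vandermonde determinant.
# Here lambda = (2, 1, 0) and n = 3.
import sympy as sp

x = sp.symbols("x1 x2 x3")
lam = (2, 1, 0)
n = len(x)

numerator = sp.Matrix([[xi ** (lam[j] + n - 1 - j) for j in range(n)] for xi in x]).det()
vandermonde = sp.Matrix([[xi ** (n - 1 - j) for j in range(n)] for xi in x]).det()

schur = sp.cancel(numerator / vandermonde)
# The result is s_(2,1) = x1**2*x2 + x1**2*x3 + x1*x2**2 + x2**2*x3
#                         + x1*x3**2 + x2*x3**2 + 2*x1*x2*x3
print(sp.expand(schur))
```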

Complete Homogeneous Symmetric Polynomial
In mathematics, specifically in algebraic combinatorics and commutative algebra, the complete homogeneous symmetric polynomials are a specific kind of symmetric polynomials. Every symmetric polynomial can be expressed as a polynomial expression in complete homogeneous symmetric polynomials.

Definition

The complete homogeneous symmetric polynomial of degree k in n variables X_1, \dots, X_n, written h_k for k = 0, 1, 2, \dots, is the sum of all monomials of total degree k in the variables. Formally,

:h_k (X_1, X_2, \dots, X_n) = \sum_{1 \leq i_1 \leq i_2 \leq \cdots \leq i_k \leq n} X_{i_1} X_{i_2} \cdots X_{i_k}.

The formula can also be written as:

:h_k (X_1, X_2, \dots, X_n) = \sum_{l_1 + l_2 + \cdots + l_n = k,\; l_i \geq 0} X_1^{l_1} X_2^{l_2} \cdots X_n^{l_n}.

Indeed, l_p is just the multiplicity of p in the sequence i_1, i_2, \dots, i_k. The first few of these polynomials are

:\begin{align} h_0 (X_1, X_2, \dots, X_n) &= 1, \\ h_1 (X_1, X_2, \dots, X_n) &= \sum_{1 \leq j \leq n} X_j, \\ h_2 (X_1, X_2, \dots, X_n) &= \sum_{1 \leq j \leq k \leq n} X_j X_k, \\ h_3 (X_1, X_2, \dots, X_n) &= \sum_{1 \leq j \leq k \leq l \leq n} X_j X_k X_l. \end{align}

Thus, for each nonnegative integer k, there exists exactly one complete homogeneous symmetric polynomi ...
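
The definition translates almost literally into code: choosing k indices with repetition in weakly increasing order enumerates exactly the monomials of h_k. The sketch below lists the exponent vectors for h_2 in three variables (the sizes k = 2, n = 3 are arbitrary choices for the example).

```python
# Sketch: building h_k(X_1, ..., X_n) directly from the definition -- the sum
# of all monomials of total degree k -- by choosing k indices with repetition
# allowed and in weakly increasing order (itertools does exactly that).
from itertools import combinations_with_replacement
from math import comb

def h_monomials(k, n):
    """Exponent vectors (l_1, ..., l_n) of the monomials of h_k in n variables."""
    monomials = []
    for idx in combinations_with_replacement(range(n), k):
        exponents = [0] * n
        for i in idx:
            exponents[i] += 1      # l_i = multiplicity of i among i_1 <= ... <= i_k
        monomials.append(tuple(exponents))
    return monomials

print(h_monomials(2, 3))
# [(2, 0, 0), (1, 1, 0), (1, 0, 1), (0, 2, 0), (0, 1, 1), (0, 0, 2)]
# i.e. h_2 = X1^2 + X1*X2 + X1*X3 + X2^2 + X2*X3 + X3^2

# The number of monomials in h_k is C(n + k - 1, k):
assert len(h_monomials(2, 3)) == comb(3 + 2 - 1, 2)   # 6
```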

Schur (surname)
Schur is a German or Jewish surname. Notable people with the surname include:
* Alexander Schur (born 1971), German footballer
* Dina Feitelson-Schur (1926–1992), Israeli educator
* Friedrich Schur (1856–1932), German mathematician
* Fritz Schur (born 1951), Danish businessman
* Issai Schur (1875–1941), Lithuanian-German-Israeli mathematician
* Max Schur (1897–1969), Austrian physician
* Michael Schur (born 1975), American television producer and writer
* Philipp Johann Ferdinand Schur (1799–1878), German-Austrian botanist
* Täve Schur (born 1939), German cyclist

See also: Schor, a surname which may refer to Ilya Schor (1904–1961), American painter, jeweler, engraver, sculptor, and artist of Judaica; Johann Paul Schor (1615–1674), Austrian painter; Juliet Schor (born 1955), American sociologist; Lynda Schor (born 1950), ...

Cauchy–Binet Formula
In mathematics, specifically linear algebra, the Cauchy–Binet formula, named after Augustin-Louis Cauchy and Jacques Philippe Marie Binet, is an identity for the determinant of the product of two rectangular matrices of transpose shapes (so that the product is well-defined and square). It generalizes the statement that the determinant of a product of square matrices is equal to the product of their determinants. The formula is valid for matrices with entries from any commutative ring.

Statement

Let ''A'' be an ''m''×''n'' matrix and ''B'' an ''n''×''m'' matrix. Write [n] for the set \{1, \dots, n\}, and \tbinom{[n]}{m} for the set of ''m''-combinations of [n] (i.e., subsets of [n] of size ''m''; there are \tbinom{n}{m} of them). For S \in \tbinom{[n]}{m}, write A_{[m],S} for the ''m''×''m'' matrix whose columns are the columns of ''A'' at indices from ''S'', and B_{S,[m]} for the ''m''×''m'' matrix whose rows are the rows of ''B'' at indices from ''S''. The ...
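
A numerical spot-check of the identity for one small integer example (m = 2, n = 3; the matrices are made up, and numpy is an assumption of the sketch rather than part of the statement):

```python
# Sketch: a numerical spot-check of the Cauchy-Binet formula
#   det(A B) = sum over m-subsets S of [n] of det(A_{[m],S}) * det(B_{S,[m]})
# for one small integer example (m = 2, n = 3).
from itertools import combinations
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])            # 2 x 3
B = np.array([[1, 0],
              [2, 1],
              [0, 3]])               # 3 x 2

m, n = A.shape

lhs = np.linalg.det(A @ B)
rhs = sum(
    np.linalg.det(A[:, list(S)]) * np.linalg.det(B[list(S), :])
    for S in combinations(range(n), m)
)
print(round(lhs), round(rhs))        # both sides give the same integer
```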

Kirchhoff's Theorem
In the mathematical field of graph theory, Kirchhoff's theorem or Kirchhoff's matrix tree theorem, named after Gustav Kirchhoff, is a theorem about the number of spanning trees in a graph, showing that this number can be computed in polynomial time from the determinant of a submatrix of the Laplacian matrix of the graph; specifically, the number is equal to ''any'' cofactor of the Laplacian matrix. Kirchhoff's theorem is a generalization of Cayley's formula which provides the number of spanning trees in a complete graph. Kirchhoff's theorem relies on the notion of the Laplacian matrix of a graph, which is equal to the difference between the graph's degree matrix (a diagonal matrix with vertex degrees on the diagonals) and its adjacency matrix (a (0,1)-matrix with 1's at places corresponding to entries where the vertices are adjacent and 0's otherwise). For a given connected graph ''G'' with ''n'' labeled vertices, let \lambda_1, \lambda_2, \ldots, \lambda_{n-1} be the non-zer ...
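
A short sketch of the theorem on the complete graph K_4 (chosen so the answer can be cross-checked against Cayley's formula 4^{4-2} = 16): build the Laplacian as degree matrix minus adjacency matrix, delete one row and column, and take the determinant; numpy is an assumption of this sketch.

```python
# Sketch: Kirchhoff's matrix-tree theorem on the complete graph K_4.  The
# Laplacian is the degree matrix minus the adjacency matrix, and any cofactor
# (here: delete the last row and column and take the determinant) equals the
# number of spanning trees -- which Cayley's formula says is 4^(4-2) = 16.
import numpy as np

n = 4
adjacency = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)   # K_4
degree = np.diag(adjacency.sum(axis=1))
laplacian = degree - adjacency

cofactor = np.linalg.det(laplacian[:-1, :-1])
print(round(cofactor))        # 16 spanning trees, matching Cayley's formula
```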