Induced Subgraph Isomorphism Problem
In complexity theory and graph theory, induced subgraph isomorphism is an NP-complete decision problem that involves finding a given graph as an induced subgraph of a larger graph.
Problem statement
Formally, the problem takes as input two graphs ''G''1 = (''V''1, ''E''1) and ''G''2 = (''V''2, ''E''2), where the number of vertices in ''V''1 can be assumed to be less than or equal to the number of vertices in ''V''2. ''G''1 is isomorphic to an induced subgraph of ''G''2 if there is an injective function ''f'' which maps the vertices of ''G''1 to vertices of ''G''2 such that for all pairs of vertices ''x'', ''y'' in ''V''1, edge (''x'', ''y'') is in ''E''1 if and only if the edge (''f''(''x''), ''f''(''y'')) is in ''E''2. The answer to the decision problem is yes if this function ''f'' exists, and no otherwise. This is different from the subgraph isomorphism problem in that the absence of an edge in ''G''1 implies that the corresponding edge in ''G''2 must also be absent. In subgraph i ...
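The definition translates directly into a brute-force test. Below is a minimal Python sketch (added here for illustration; it is not part of the source article, and the function name is a placeholder) that tries every injective mapping of ''V''1 into ''V''2 and checks the "edge if and only if edge" condition.

    from itertools import permutations

    def is_induced_subgraph(V1, E1, V2, E2):
        """Is G1 = (V1, E1) isomorphic to an induced subgraph of G2 = (V2, E2)?
        Brute force: try every injective map f from V1 into V2 and require that
        x-y is an edge of G1 exactly when f(x)-f(y) is an edge of G2."""
        E1 = {frozenset(e) for e in E1}
        E2 = {frozenset(e) for e in E2}
        pairs = [(x, y) for i, x in enumerate(V1) for y in V1[i + 1:]]
        for image in permutations(V2, len(V1)):        # injective maps V1 -> V2
            f = dict(zip(V1, image))
            if all((frozenset((x, y)) in E1) == (frozenset((f[x], f[y])) in E2)
                   for x, y in pairs):
                return True
        return False

    # The path a-b-c is an induced subgraph of a 4-cycle; the triangle is not.
    C4 = (['1', '2', '3', '4'], [('1', '2'), ('2', '3'), ('3', '4'), ('4', '1')])
    P3 = (['a', 'b', 'c'], [('a', 'b'), ('b', 'c')])
    K3 = (['a', 'b', 'c'], [('a', 'b'), ('b', 'c'), ('a', 'c')])
    print(is_induced_subgraph(*P3, *C4))   # True
    print(is_induced_subgraph(*K3, *C4))   # False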

Computational Complexity Theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer. A computational problem is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of computationa ...

K-vertex-connected Graph
In graph theory, a connected graph is said to be ''k''-vertex-connected (or ''k''-connected) if it has more than ''k'' vertices and remains connected whenever fewer than ''k'' vertices are removed. The vertex-connectivity, or just connectivity, of a graph is the largest ''k'' for which the graph is ''k''-vertex-connected.
Definitions
A graph (other than a complete graph) has connectivity ''k'' if ''k'' is the size of the smallest subset of vertices such that the graph becomes disconnected if they are deleted. Complete graphs are not included in this version of the definition since they cannot be disconnected by deleting vertices. The complete graph with ''n'' vertices has connectivity ''n'' − 1, as implied by the first definition. An equivalent definition is that a graph with at least two vertices is ''k''-connected if, for every pair of its vertices, it is possible to find ''k'' vertex-independent paths connecting these vertices; see Menger's theorem. This definition produces the same ...
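The first definition can be checked naively by deleting every set of fewer than ''k'' vertices and testing whether what remains is still connected. The Python sketch below is an illustration added here, not part of the source article; the graph representation and function names are assumptions.

    from itertools import combinations

    def is_connected(vertices, edges):
        """Depth-first search connectivity test on an undirected graph."""
        vertices = set(vertices)
        if len(vertices) <= 1:
            return True
        adjacency = {v: set() for v in vertices}
        for u, w in edges:
            adjacency[u].add(w)
            adjacency[w].add(u)
        start = next(iter(vertices))
        seen, stack = {start}, [start]
        while stack:
            for w in adjacency[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen == vertices

    def is_k_connected(vertices, edges, k):
        """True when the graph has more than k vertices and stays connected
        after deleting any set of fewer than k vertices."""
        vertices = list(vertices)
        if len(vertices) <= k:
            return False
        for r in range(k):                         # delete 0, 1, ..., k-1 vertices
            for removed in combinations(vertices, r):
                rest = [v for v in vertices if v not in removed]
                kept = [(u, w) for u, w in edges if u in rest and w in rest]
                if not is_connected(rest, kept):
                    return False
        return True

    # A 4-cycle is 2-connected but not 3-connected.
    cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(is_k_connected(range(4), cycle, 2))   # True
    print(is_k_connected(range(4), cycle, 3))   # False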

Clique (graph theory)
In the mathematical area of graph theory, a clique is a subset of vertices of an undirected graph such that every two distinct vertices in the clique are adjacent. That is, a clique of a graph G is an induced subgraph of G that is complete. Cliques are one of the basic concepts of graph theory and are used in many other mathematical problems and constructions on graphs. Cliques have also been studied in computer science: the task of finding whether there is a clique of a given size in a graph (the clique problem) is NP-complete, but despite this hardness result, many algorithms for finding cliques have been studied. Although the study of complete subgraphs goes back at least to the graph-theoretic reformulation of Ramsey theory, the term ''clique'' itself comes from social network analysis, where complete subgraphs were used to model cliques of people; that is, groups of people all of whom know each other. Cliques have many other applications in the sciences and particularly in bioinf ...
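A short illustrative sketch (not from the source article; names are placeholders) of the clique condition and of the decision form of the clique problem mentioned above, by brute force over all vertex subsets of the given size.

    from itertools import combinations

    def is_clique(nodes, edges):
        """Every two distinct vertices of the set must be adjacent."""
        edge_set = {frozenset(e) for e in edges}
        return all(frozenset(p) in edge_set for p in combinations(nodes, 2))

    def has_clique_of_size(vertices, edges, k):
        """Decision version of the clique problem: does a clique on k vertices exist?
        Brute force over all k-subsets, so exponential in k."""
        return any(is_clique(subset, edges) for subset in combinations(vertices, k))

    edges = [(1, 2), (1, 3), (2, 3), (3, 4)]
    print(is_clique([1, 2, 3], edges))                    # True: a triangle
    print(has_clique_of_size([1, 2, 3, 4], edges, 4))     # False: no clique on 4 vertices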

Maximum Clique Problem
In computer science, the clique problem is the computational problem of finding cliques (subsets of vertices, all adjacent to each other, also called complete subgraphs) in a graph. It has several different formulations depending on which cliques, and what information about the cliques, should be found. Common formulations of the clique problem include finding a maximum clique (a clique with the largest possible number of vertices), finding a maximum weight clique in a weighted graph, listing all maximal cliques (cliques that cannot be enlarged), and solving the decision problem of testing whether a graph contains a clique larger than a given size. The clique problem arises in the following real-world setting. Consider a social network, where the graph's vertices represent people, and the graph's edges represent mutual acquaintance. Then a clique represents a subset of people who all know each other, and algorithms for finding cliques can be used to discover these groups of m ...
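As a deliberately naive illustration of the optimization version described above (added here; this is not the source article's algorithm), the sketch below returns one maximum clique by trying subset sizes from largest to smallest. Its exponential running time reflects the hardness of the problem rather than a practical method.

    from itertools import combinations

    def maximum_clique(vertices, edges):
        """Return one maximum clique by exhaustive search, largest sizes first.
        Exponential time, so suitable only for very small graphs."""
        edge_set = {frozenset(e) for e in edges}
        vertices = list(vertices)
        for size in range(len(vertices), 0, -1):
            for subset in combinations(vertices, size):
                if all(frozenset(p) in edge_set for p in combinations(subset, 2)):
                    return set(subset)
        return set()

    # Two triangles sharing the edge 2-3, plus a pendant vertex 5.
    edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5)]
    print(maximum_clique([1, 2, 3, 4, 5], edges))   # a clique of size 3, e.g. {1, 2, 3}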

Independent Set (graph theory)
In graph theory, an independent set, stable set, coclique or anticlique is a set of vertices in a graph, no two of which are adjacent. That is, it is a set S of vertices such that for every two vertices in S, there is no edge connecting the two. Equivalently, each edge in the graph has at most one endpoint in S. A set is independent if and only if it is a clique in the graph's complement. The size of an independent set is the number of vertices it contains. Independent sets have also been called "internally stable sets", of which "stable set" is a shortening. A maximal independent set is an independent set that is not a proper subset of any other independent set. A maximum independent set is an independent set of largest possible size for a given graph G. This size is called the independence number of ''G'' and is usually denoted by \alpha(G). The optimization problem of finding such a set is called the maximum independent set problem. It is a strongly NP-hard problem. As such ...
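A small illustration (added here, not part of the source article; names are placeholders) of the definition and of the stated equivalence between independent sets and cliques in the complement graph.

    from itertools import combinations

    def is_independent_set(nodes, edges):
        """No two of the chosen vertices may be joined by an edge."""
        edge_set = {frozenset(e) for e in edges}
        return all(frozenset(p) not in edge_set for p in combinations(nodes, 2))

    def complement_edges(vertices, edges):
        """Edges of the complement graph: vertex pairs that are NOT edges of the original."""
        edge_set = {frozenset(e) for e in edges}
        return [p for p in combinations(vertices, 2) if frozenset(p) not in edge_set]

    vertices = [1, 2, 3, 4]
    cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]          # a 4-cycle
    print(is_independent_set([1, 3], cycle))           # True: 1 and 3 are not adjacent

    # {1, 3} is independent in the cycle exactly because it induces a clique
    # (here a single edge) in the complement graph.
    comp = {frozenset(e) for e in complement_edges(vertices, cycle)}
    print(all(frozenset(p) in comp for p in combinations([1, 3], 2)))   # True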




Snake-in-the-box
The snake-in-the-box problem in graph theory and computer science deals with finding a certain kind of path along the edges of a hypercube. This path starts at one corner and travels along the edges to as many corners as it can reach. After it gets to a new corner, the previous corner and all of its neighbors must be marked as unusable. The path should never travel to a corner which has been marked unusable. In other words, a ''snake'' is a connected open path in the hypercube where each node on the path, with the exception of the head (start) and the tail (finish), has exactly two neighbors that are also in the snake. The head and the tail each have only one neighbor in the snake. The rule for generating a snake is that a node in the hypercube may be visited if it is connected to the current node and it is not a neighbor of any previously visited node in the snake, other than the current node. In graph theory terminology, this is called finding the longest possible in ...
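The generation rule quoted above lends itself to a simple backtracking search. The sketch below (an illustration added here, not from the source article) grows a snake from corner 0 of the n-dimensional hypercube and records the longest one found; it is exponential and only usable for small dimensions.

    def longest_snake(n):
        """Backtracking search for the longest snake (induced open path) in the
        n-dimensional hypercube, starting from corner 0."""
        def neighbours(v):
            # Corners adjacent to v differ from it in exactly one bit.
            return [v ^ (1 << i) for i in range(n)]

        best = []

        def extend(path, on_path):
            nonlocal best
            if len(path) > len(best):
                best = path[:]
            head = path[-1]
            for nxt in neighbours(head):
                if nxt in on_path:
                    continue
                # Snake rule: the new corner may touch the current head but no
                # other corner already on the snake.
                if any(w in on_path and w != head for w in neighbours(nxt)):
                    continue
                path.append(nxt)
                on_path.add(nxt)
                extend(path, on_path)
                on_path.remove(nxt)
                path.pop()

        extend([0], {0})
        return best

    snake = longest_snake(3)
    print(len(snake) - 1, snake)   # 4 edges: the longest snake in the 3-cube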

Hypercube Graph
In graph theory, the hypercube graph Q_n is the graph formed from the vertices and edges of an ''n''-dimensional hypercube. For instance, the cube graph Q_3 is the graph formed by the 8 vertices and 12 edges of a three-dimensional cube. Q_n has 2^n vertices, n·2^{n-1} edges, and is a regular graph with ''n'' edges touching each vertex. The hypercube graph Q_n may also be constructed by creating a vertex for each subset of an ''n''-element set, with two vertices adjacent when their subsets differ in a single element, or by creating a vertex for each ''n''-digit binary number, with two vertices adjacent when their binary representations differ in a single digit. It is the ''n''-fold Cartesian product of the two-vertex complete graph, and may be decomposed into two copies of Q_{n-1} connected to each other by a perfect matching. Hypercube graphs should not be confused with cubic graphs, which are graphs that have exactly three edges touching each vertex. The only hypercube graph that is a cubic graph is the cubical graph Q_3. Constru ...
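The binary-label construction described above is easy to carry out directly. The following sketch (added for illustration; not from the source article) builds Q_n from ''n''-bit labels and checks the vertex count, edge count, and regularity for n = 3.

    def hypercube_graph(n):
        """Build Q_n: one vertex per n-bit binary label, with an edge whenever
        two labels differ in exactly one bit."""
        vertices = list(range(2 ** n))
        edges = [(v, v ^ (1 << i))
                 for v in vertices for i in range(n)
                 if v < (v ^ (1 << i))]           # keep each undirected edge once
        return vertices, edges

    V, E = hypercube_graph(3)
    print(len(V), len(E))          # 8 vertices and 12 edges: the cube graph
    degree = {v: 0 for v in V}
    for u, w in E:
        degree[u] += 1
        degree[w] += 1
    print(set(degree.values()))    # {3}: every vertex touches exactly n = 3 edges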

Path (graph theory)
In graph theory, a path in a graph is a finite or infinite sequence of edges which joins a sequence of vertices which, by most definitions, are all distinct (and since the vertices are distinct, so are the edges). A directed path (sometimes called a dipath; see ''Graph Structure Theory: Proceedings of the AMS-IMS-SIAM Joint Summer Research Conference on Graph Minors, Held June 22 to July 5, 1991'', p. 205) in a directed graph is a finite or infinite sequence of edges which joins a sequence of distinct vertices, but with the added restriction that the edges be all directed in the same direction. Paths are fundamental concepts of graph theory, described in the introductory sections of most graph theory texts. See e.g. Bondy and Murty (1976), Gibbons (1985), or Diestel (2005). Korte et al. (1990) cover more advanced algorithmic topics concerning paths in graphs.
Definitions
Walk, trail, and path
* A walk is a finite or infinite sequence of edges which joins a sequence of vertices. ...
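To make the walk/trail/path distinction concrete, here is a small illustrative sketch (added here; not part of the source article, and the function name is a placeholder) that classifies a finite vertex sequence against a given edge set.

    def classify_walk(vertex_sequence, edges):
        """Return 'path', 'trail', 'walk', or None for a vertex sequence.
        A walk is any sequence of consecutive edges; a trail repeats no edge;
        a path repeats no vertex (and hence no edge)."""
        edge_set = {frozenset(e) for e in edges}
        steps = [frozenset(p) for p in zip(vertex_sequence, vertex_sequence[1:])]
        if not all(s in edge_set for s in steps):
            return None                           # consecutive vertices must be adjacent
        if len(set(vertex_sequence)) == len(vertex_sequence):
            return 'path'
        if len(set(steps)) == len(steps):
            return 'trail'
        return 'walk'

    edges = [(1, 2), (2, 3), (3, 1), (3, 4)]
    print(classify_walk([1, 2, 3, 4], edges))        # 'path'
    print(classify_walk([4, 3, 1, 2, 3], edges))     # 'trail' (vertex 3 repeats, no edge does)
    print(classify_walk([1, 2, 3, 1, 2], edges))     # 'walk'  (edge 1-2 repeats)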

Polynomial Time
In computer science, the time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor. Since an algorithm's running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time required for inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense because there are only a finite number of possible inputs of a given size). In both cases, the time complexity is generally expresse ...
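A toy illustration of worst-case analysis (added here; not from the source article): for linear search, the elementary operation counted is the comparison inside the loop, and the worst case over all inputs of a given size is one comparison per element, so the running time grows linearly with the input size.

    def linear_search(items, target):
        """Return the index of target in items, or -1 if it is absent.
        Worst case: the loop body runs len(items) times (target last or missing)."""
        for index, value in enumerate(items):     # one comparison per iteration
            if value == target:
                return index
        return -1

    data = list(range(1000))
    print(linear_search(data, 999))   # worst case for a present target: 1000 comparisons
    print(linear_search(data, -5))    # absent target: also 1000 comparisons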

Graph Theory
In mathematics, graph theory is the study of ''graphs'', which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of ''vertices'' (also called ''nodes'' or ''points'') which are connected by ''edges'' (also called ''links'' or ''lines''). A distinction is made between undirected graphs, where edges link two vertices symmetrically, and directed graphs, where edges link two vertices asymmetrically. Graphs are one of the principal objects of study in discrete mathematics.
Definitions
Definitions in graph theory vary. The following are some of the more basic ways of defining graphs and related mathematical structures.
Graph
In one restricted but very common sense of the term, a graph is an ordered pair G=(V,E) comprising:
* V, a set of vertices (also called nodes or points);
* E \subseteq \{ \{x, y\} \mid x, y \in V \text{ and } x \neq y \}, a set of edges (also called links or lines), which are unordered pairs of vertices (that is, an edge is associated with t ...
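A minimal sketch of the ordered-pair definition above (illustrative only; not from the source article): a vertex set V, an edge set E of unordered pairs of distinct vertices, and an adjacency map derived from them.

    # G = (V, E): a vertex set plus a set of unordered pairs of distinct vertices.
    V = {1, 2, 3, 4}
    E = {frozenset(p) for p in [(1, 2), (2, 3), (3, 4)]}

    # The same undirected graph viewed as an adjacency map built from (V, E).
    adjacency = {v: set() for v in V}
    for u, w in (tuple(e) for e in E):
        adjacency[u].add(w)
        adjacency[w].add(u)

    print(adjacency[2])    # {1, 3}: the vertices adjacent to vertex 2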


Series–parallel Graph
In graph theory, series–parallel graphs are graphs with two distinguished vertices called ''terminals'', formed recursively by two simple composition operations. They can be used to model series and parallel electric circuits.
Definition and terminology
In this context, the term graph means multigraph. There are several ways to define series–parallel graphs. The following definition basically follows the one used by David Eppstein. A two-terminal graph (TTG) is a graph with two distinguished vertices, ''s'' and ''t'', called ''source'' and ''sink'', respectively. The parallel composition ''Pc = Pc(X,Y)'' of two TTGs ''X'' and ''Y'' is a TTG created from the disjoint union of graphs ''X'' and ''Y'' by merging the sources of ''X'' and ''Y'' to create the source of ''Pc'' and merging the sinks of ''X'' and ''Y'' to create the sink of ''Pc''. The series composition ''Sc = Sc(X,Y)'' of two TTGs ''X'' and ''Y'' is a TTG created from the disjoint union of graphs ''X'' and ''Y'' ...
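The two composition operations can be sketched in a few lines of Python (an illustration added here; the class and function names are placeholders, not the source article's notation). Terminals are kept explicit, and the disjoint union is simulated by relabelling one operand before merging.

    class TTG:
        """A two-terminal (multi)graph: an edge list plus a designated source s and sink t."""
        def __init__(self, edges, s, t):
            self.edges, self.s, self.t = list(edges), s, t

    def _shift(g, offset):
        """Relabel vertices so two TTGs become disjoint before composing."""
        return TTG([(u + offset, v + offset) for u, v in g.edges], g.s + offset, g.t + offset)

    def _fresh_offset(g):
        return max([g.s, g.t] + [w for e in g.edges for w in e]) + 1

    def parallel(x, y):
        """Pc(X, Y): merge the two sources and merge the two sinks."""
        y2 = _shift(y, _fresh_offset(x))
        rename = {y2.s: x.s, y2.t: x.t}
        edges = x.edges + [(rename.get(u, u), rename.get(v, v)) for u, v in y2.edges]
        return TTG(edges, x.s, x.t)

    def series(x, y):
        """Sc(X, Y): merge the sink of X with the source of Y."""
        y2 = _shift(y, _fresh_offset(x))
        rename = {y2.s: x.t}
        edges = x.edges + [(rename.get(u, u), rename.get(v, v)) for u, v in y2.edges]
        return TTG(edges, x.s, y2.t)

    def single_edge():
        """Base case: one edge whose endpoints are the two terminals."""
        return TTG([(0, 1)], 0, 1)

    # Two edges in series, placed in parallel with a third edge.
    g = parallel(series(single_edge(), single_edge()), single_edge())
    print(g.edges, g.s, g.t)   # [(0, 1), (1, 3), (0, 3)] with source 0 and sink 3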