Immerman–Szelepcsényi Theorem

Immerman–Szelepcsényi Theorem
In computational complexity theory, the Immerman–Szelepcsényi theorem states that nondeterministic space complexity classes are closed under complementation. It was proven independently by Neil Immerman and Róbert Szelepcsényi in 1987, for which they shared the 1995 Gödel Prize. In its general form the theorem states that NSPACE(''s''(''n'')) = co-NSPACE(''s''(''n'')) for any function ''s''(''n'') ≥ log ''n''. In other words, if a nondeterministic machine can solve a problem within some space bound, another machine with the same resource bound can solve the complement problem (the one with the ''yes'' and ''no'' answers reversed) in the same asymptotic amount of space. The result is equivalently stated as NL = co-NL; although this is only the special case ''s''(''n'') = log ''n'', it implies the general theorem by a standard padding argument. The result solved the second LBA problem. No similar result is known for the time complexity classes, and indeed it is conjectured that NP is not equal to co-NP.
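In symbols, the theorem and its logspace special case read as follows (a standard rendering of the statement above, written out here for reference):

```latex
% Immerman–Szelepcsényi theorem, general form:
\mathsf{NSPACE}\bigl(s(n)\bigr) = \mathsf{co\text{-}NSPACE}\bigl(s(n)\bigr)
\qquad \text{for every } s(n) \ge \log n,
% and its logspace special case:
\mathsf{NL} = \mathsf{co\text{-}NL}.
```

The padding argument behind the reduction stretches an input of length ''n'' to length roughly 2^''s''(''n''), so that ''s''(''n'') space measured against the original input becomes logarithmic space measured against the padded input.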

Computational Complexity Theory
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by the mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used. The theory formalizes this intuition by introducing mathematical models of computation to study these problems and by quantifying their computational complexity, i.e., the amount of resources needed to solve them, such as time and storage. Other measures of complexity are also used, such as the amount of communication (used in communication complexity), the number of gates in a circuit (used in circuit complexity) and the number of processors (used in parallel computing). One of the roles of computational complexity theory ...

USTCON
In computational complexity theory, SL (Symmetric Logspace or Sym-L) is the complexity class of problems log-space reducible to USTCON (''undirected s-t connectivity''), the problem of determining whether there exists a path between two given vertices in an undirected graph, or equivalently whether the two vertices lie in the same connected component. This problem is also called the undirected reachability problem. It does not matter whether many-one reducibility or Turing reducibility is used. Although SL was originally described in terms of symmetric Turing machines, that equivalent formulation is rather complex, and the definition via reducibility is the one used in practice. USTCON is a special case of STCON (''directed reachability''), the problem of determining whether a directed path between two vertices in a directed graph exists, which is complete for NL. Because USTCON is SL-complete, most advances that impact USTCON have also impacted SL. ...
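As a concrete illustration of the USTCON problem itself (not of a log-space algorithm for it), a simple breadth-first search decides undirected reachability; the graph encoding below is an assumption chosen for the sketch:

```python
from collections import deque

def ustcon(adj, s, t):
    """Decide undirected s-t connectivity by breadth-first search.

    adj maps each vertex to the set of its neighbours; the adjacency
    relation is assumed to be symmetric (undirected graph).  Note that
    BFS uses space linear in the number of vertices, far more than the
    logarithmic space the class SL is concerned with.
    """
    seen = {s}
    queue = deque([s])
    while queue:
        v = queue.popleft()
        if v == t:
            return True
        for w in adj.get(v, ()):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return False

# Two vertices in the same connected component, and two that are not.
graph = {1: {2}, 2: {1, 3}, 3: {2}, 4: {5}, 5: {4}}
assert ustcon(graph, 1, 3) is True
assert ustcon(graph, 1, 4) is False
```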

Structural Complexity Theory
In the computational complexity theory of computer science, structural complexity theory (or simply structural complexity) is the study of complexity classes, rather than of the computational complexity of individual problems and algorithms. It involves research into both the internal structure of various complexity classes and the relations between different complexity classes. [Juris Hartmanis, "New Developments in Structural Complexity Theory" (invited lecture), Proc. 15th International Colloquium on Automata, Languages and Programming, 1988 (ICALP 88), ''Lecture Notes in Computer Science'', vol. 317 (1988), pp. 271–286.] History The theory emerged from (so far unsuccessful) attempts to resolve the first and still most important question of this kind, the P = NP problem. Most of the research is based on the assumption that P is not equal to NP and on the more far-reaching conjecture that the polynomial-time hierarchy of complexity classes is infinite. Important results ...

Lance Fortnow
Lance Jeremy Fortnow (born August 15, 1963) is a computer scientist known for major results in computational complexity and interactive proof systems. He is currently Dean of the College of Computing at the Illinois Institute of Technology. Biography Lance Fortnow received a doctorate in applied mathematics from MIT in 1989, supervised by Michael Sipser. Since graduation, he has been on the faculty of the University of Chicago (1989–1999, 2003–2007), Northwestern University (2008–2012) and the Georgia Institute of Technology (2012–2019), where he was chair of the School of Computer Science. Fortnow was the founding editor-in-chief of the journal ''ACM Transactions on Computation Theory'' in 2009. He was the chair of ACM SIGACT and was succeeded by Paul Beame. He was the chair of the IEEE Conference on Computational Complexity from 2000 to 2006. In 2002, he began one of the first blogs devoted to theoretical computer science and has written for it since then. Since 2007, he has had ...

Savitch's Theorem
In computational complexity theory, Savitch's theorem, proved by Walter Savitch in 1970, gives a relationship between deterministic and non-deterministic space complexity. It states that for any function f\in\Omega(\log(n)), :\mathsf{NSPACE}\left(f\left(n\right)\right) \subseteq \mathsf{DSPACE}\left(f\left(n\right)^2\right). In other words, if a nondeterministic Turing machine can solve a problem using f(n) space, a deterministic Turing machine can solve the same problem in the square of that space bound. [Arora & Barak (2009), p. 86] Although it seems that nondeterminism may produce exponential gains in time (as formalized in the unproven exponential time hypothesis), Savitch's theorem shows that it has a markedly more limited effect on space requirements. [Arora & Barak (2009), p. 92] Proof The proof relies on an algorithm for STCON, the problem of determining whether there is a path between two vertices in a directed graph, which runs in O\left((\log n)^2\right) space for n vertices. The basic idea ...
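The recursive midpoint idea behind that STCON algorithm can be sketched as follows; the function name and graph encoding are illustrative assumptions, and the recursion depth (logarithmic in the path-length bound) is what yields the O((log n)^2) space bound when each recursion frame stores only O(log n) bits:

```python
def reach(adj, u, v, steps):
    """Savitch-style check: is there a path from u to v of length <= steps?

    Instead of storing a whole path, guess a midpoint w and recurse on the
    two halves.  The recursion depth is O(log steps), and each level needs
    only the triple (u, v, steps), which is the source of the quadratic
    space bound in Savitch's theorem.
    """
    if steps == 0:
        return u == v
    if steps == 1:
        return u == v or v in adj.get(u, ())
    half = steps // 2
    # Try every vertex as the midpoint of a u -> v path.
    return any(
        reach(adj, u, w, half) and reach(adj, w, v, steps - half)
        for w in adj
    )

# Directed graph 0 -> 1 -> 2 -> 3 with no edges back towards 0.
g = {0: {1}, 1: {2}, 2: {3}, 3: set()}
assert reach(g, 0, 3, len(g)) is True
assert reach(g, 3, 0, len(g)) is False
```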

Alternating Turing Machine
In computational complexity theory, an alternating Turing machine (ATM) is a non-deterministic Turing machine (NTM) with a rule for accepting computations that generalizes the rules used in the definition of the complexity classes NP and co-NP. The concept of an ATM was set forth by Ashok K. Chandra and Larry Stockmeyer, and independently by Dexter Kozen, in 1976, with a joint journal publication in 1981. Definitions Informal description The definition of NP uses the ''existential mode'' of computation: if ''any'' choice leads to an accepting state, then the whole computation accepts. The definition of co-NP uses the ''universal mode'' of computation: only if ''all'' choices lead to an accepting state does the whole computation accept. An alternating Turing machine (or, to be more precise, the definition of acceptance for such a machine) alternates between these modes. An alternating Turing machine is a non-deterministic Turing machine ...
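To make the existential/universal distinction concrete, here is a small evaluator over an explicit computation tree; the node encoding is an assumption made for this sketch, not part of any standard ATM formalism:

```python
def accepts(node):
    """Evaluate acceptance of an alternating computation tree.

    A node is either the string 'accept' or 'reject' (a halting
    configuration), or a pair (mode, children) where mode is
    'exists' (accept if SOME child accepts, as in NP) or
    'forall' (accept only if ALL children accept, as in co-NP).
    """
    if node == 'accept':
        return True
    if node == 'reject':
        return False
    mode, children = node
    if mode == 'exists':
        return any(accepts(child) for child in children)
    if mode == 'forall':
        return all(accepts(child) for child in children)
    raise ValueError(f"unknown mode: {mode!r}")

# A computation that alternates: an existential choice whose second
# branch requires every universal sub-branch to accept.
tree = ('exists', [
    'reject',
    ('forall', ['accept', ('exists', ['reject', 'accept'])]),
])
assert accepts(tree) is True
```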

FO (complexity)
Descriptive complexity is a branch of computational complexity theory and of finite model theory that characterizes complexity classes by the type of logic needed to express the languages in them. For example, PH, the union of all complexity classes in the polynomial hierarchy, is precisely the class of languages expressible by statements of second-order logic. This connection between complexity and the logic of finite structures allows results to be transferred easily from one area to the other, facilitating new proof methods and providing additional evidence that the main complexity classes are somehow "natural" and not tied to the specific abstract machines used to define them. Specifically, each logical system produces a set of queries expressible in it. The queries – when restricted to finite structures – correspond to the computational problems of traditional complexity theory. The first main result of descriptive complexity was Fagin's theorem, shown by Ronald Fagin ...
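Two of the correspondences alluded to above, written in the usual notation (the second is the statement of Fagin's theorem, which the excerpt mentions but does not spell out):

```latex
% The polynomial hierarchy is captured by full second-order logic:
\mathsf{PH} = \mathrm{SO}
% Fagin's theorem: NP is captured by existential second-order logic:
\mathsf{NP} = \exists\,\mathrm{SO}
```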

Descriptive Complexity
''Descriptive Complexity'' is a book in mathematical logic and computational complexity theory by Neil Immerman. It concerns descriptive complexity theory, an area in which the expressibility of mathematical properties using different types of logic is shown to be equivalent to their computability in different types of resource-bounded models of computation. It was published in 1999 by Springer-Verlag in their book series Graduate Texts in Computer Science. Topics The book has 15 chapters, roughly grouped into five chapters on first-order logic, three on second-order logic, and seven independent chapters on advanced topics. The first two chapters provide background material in first-order logic (including first-order arithmetic, the BIT predicate, and the notion of a first-order query) and complexity theory (including formal languages, resource-bounded complexity classes, and complete problems). Chapter three begins the connection between logic and complexity, with a proof that ...

Order (graph theory)
In discrete mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are in some sense "related". The objects correspond to mathematical abstractions called ''vertices'' (also called ''nodes'' or ''points''), and each of the related pairs of vertices is called an ''edge'' (also called ''link'' or ''line''). Typically, a graph is depicted in diagrammatic form as a set of dots or circles for the vertices, joined by lines or curves for the edges. Graphs are one of the objects of study in discrete mathematics. The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an edge between two people if they shake hands, then this graph is undirected because any person ''A'' can shake hands with a person ''B'' only if ''B'' also shakes hands with ''A''. In contrast, if an edge from a person ''A'' to a person ''B'' means that ''A'' owes money to ''B'', then this graph is directed ...
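Following the party example above, a minimal sketch of how the undirected "shook hands" relation might be represented, with a symmetry check reflecting that handshakes are mutual (the names and encoding are illustrative assumptions only):

```python
# Undirected "shook hands at the party" graph as a symmetric adjacency map.
shook_hands = {
    "A": {"B", "C"},
    "B": {"A"},
    "C": {"A"},
    "D": set(),          # D shook hands with nobody
}

# Undirectedness: A is a neighbour of B exactly when B is a neighbour of A.
assert all(
    person in shook_hands[other]
    for person, others in shook_hands.items()
    for other in others
)
```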

Implicit Graph
In the study of graph algorithms, an implicit graph representation (or more simply implicit graph) is a graph whose vertices or edges are not represented as explicit objects in a computer's memory, but rather are determined algorithmically from some other input, for example a computable function. Neighborhood representations The notion of an implicit graph is common in various search algorithms which are described in terms of graphs. In this context, an implicit graph may be defined as a set of rules to define all neighbors for any specified vertex. This type of implicit graph representation is analogous to an adjacency list, in that it provides easy access to the neighbors of each vertex. For instance, in searching for a solution to a puzzle such as Rubik's Cube, one may define an implicit graph in which each vertex represents one of the possible states of the cube, and each edge represents a move from one state to another. It is straightforward to generate the neighbors of any ...
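A minimal sketch of the neighborhood representation described above: the graph is never materialized, only a rule that yields the neighbors of a given state. The toy move set here (increment and double) is an assumption chosen to keep the example small; a Rubik's Cube solver would use the same shape of code with cube states and moves:

```python
from collections import deque

def neighbors(state):
    """Rule defining the implicit graph: each state's outgoing edges."""
    return [state + 1, state * 2]

def shortest_path_length(start, goal, limit=100000):
    """Breadth-first search over the implicit graph defined by neighbors()."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        state, dist = queue.popleft()
        if state == goal:
            return dist
        for nxt in neighbors(state):
            if nxt not in seen and nxt <= limit:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# Reach 10 from 1: e.g. 1 -> 2 -> 4 -> 5 -> 10 (four moves).
assert shortest_path_length(1, 10) == 4
```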

St-connectivity
In computer science, st-connectivity or STCON is a decision problem asking, for vertices ''s'' and ''t'' in a directed graph, whether ''t'' is reachable from ''s''. Formally, the decision problem is given by :STCON = {⟨''D'', ''s'', ''t''⟩ | ''D'' is a directed graph with a path from vertex ''s'' to vertex ''t''}. Complexity On a sequential computer, st-connectivity can easily be solved in linear time by either depth-first search or breadth-first search. The interest in this problem in computational complexity concerns its complexity with respect to more limited forms of computation. For instance, the complexity class of problems that can be solved by a non-deterministic Turing machine using only a logarithmic amount of memory is called NL. The st-connectivity problem can be shown to be in NL, as a non-deterministic Turing machine can guess the next node of the path, while the only information that has to be stored is the total length of the path so far and the node currently under consideration. The algorithm terminates when either the target node ''t'' is reached or the length of the path so far exceeds ''n'', the number of nodes in the graph ...
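The nondeterministic logspace algorithm sketched above can be simulated deterministically by exhausting the possible guesses; the sketch below only illustrates the acceptance condition (some guess sequence of length at most ''n'' reaches ''t'') and of course does not itself run in logarithmic space:

```python
def stcon(adj, s, t):
    """Does some sequence of at most n 'guessed' moves lead from s to t?

    The nondeterministic machine stores only the current node and a step
    counter; here we simulate its guesses by trying every successor,
    bounded by n = number of nodes, so cycles cannot cause divergence.
    """
    n = len(adj)

    def run(current, steps):
        if current == t:
            return True          # an accepting guess sequence was found
        if steps == n:
            return False         # path length bound exceeded: reject
        # Nondeterministic step: guess the next node of the path.
        return any(run(nxt, steps + 1) for nxt in adj.get(current, ()))

    return run(s, 0)

# 0 -> 1 -> 2 -> 3, plus a cycle 1 -> 0 that the step bound handles.
g = {0: {1}, 1: {0, 2}, 2: {3}, 3: set()}
assert stcon(g, 0, 3) is True
assert stcon(g, 3, 0) is False
```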

Directed Graph
In mathematics, and more specifically in graph theory, a directed graph (or digraph) is a graph that is made up of a set of vertices connected by directed edges, often called arcs. Definition In formal terms, a directed graph is an ordered pair ''G'' = (''V'', ''A'') where * ''V'' is a set whose elements are called ''vertices'', ''nodes'', or ''points''; * ''A'' is a set of ordered pairs of vertices, called ''arcs'', ''directed edges'' (sometimes simply ''edges'', with the corresponding set named ''E'' instead of ''A''), ''arrows'', or ''directed lines''. It differs from an ordinary or undirected graph in that the latter is defined in terms of unordered pairs of vertices, which are usually called ''edges'', ''links'' or ''lines''. The aforementioned definition does not allow a directed graph to have multiple arrows with the same source and target nodes, but some authors consider a broader definition that allows directed graphs to have such multiple arcs (namely, they allow the arc set to be a multiset ...
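A tiny illustration of the ordered-pair definition, using hypothetical names: the arc (''A'', ''B'') "A owes money to B" is not the same object as (''B'', ''A''), whereas in an undirected graph the corresponding edge would be the unordered pair {''A'', ''B''}:

```python
# Directed graph G = (V, A): arcs are ordered pairs of vertices.
V = {"Alice", "Bob", "Carol"}
A = {("Alice", "Bob"),      # Alice owes money to Bob
     ("Bob", "Carol")}      # Bob owes money to Carol

# Direction matters: the reverse arc is a different object.
assert ("Alice", "Bob") in A
assert ("Bob", "Alice") not in A

# In an undirected graph, edges are unordered pairs, e.g. frozensets,
# so {Alice, Bob} and {Bob, Alice} denote the same edge.
E = {frozenset(("Alice", "Bob"))}
assert frozenset(("Bob", "Alice")) in E
```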