Kleene's Algorithm
In theoretical computer science, in particular in formal language theory, Kleene's algorithm transforms a given nondeterministic finite automaton (NFA) into a regular expression. Together with other conversion algorithms, it establishes the equivalence of several description formats for regular languages. Alternative presentations of the same method include the "elimination method" attributed to Janusz Brzozowski and Edward J. McCluskey, the algorithm of Robert McNaughton and Hisao Yamada, and the use of Arden's lemma.

Algorithm description

According to Gross and Yellen (2004) (sect. 2.1, remark R13, p. 65), the algorithm can be traced back to Kleene (1956). A presentation of the algorithm in the case of deterministic finite automata (DFAs) is given in Hopcroft and Ullman (1979). The presentation of the algorithm for NFAs below follows Gross and Yellen (2004). Given a nondeterministic finite automaton ...
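The excerpt breaks off before the inductive rule, but the standard recurrence of Kleene's algorithm computes R^k_{ij}, the strings leading from state i to state j whose intermediate states are all numbered below k, via R^{k+1}_{ij} = R^k_{ij} | R^k_{ik} (R^k_{kk})^* R^k_{kj}. The following Python sketch implements that recurrence directly; the dictionary encoding of the NFA and the unsimplified string output are assumptions of the sketch, not details from the article.

    # A minimal sketch of the recurrence above; the NFA encoding and the
    # verbose, unsimplified output strings are assumptions of this sketch.
    def kleene(n, delta, start, finals):
        """delta[(i, j)] is the set of symbols labelling transitions i -> j.
        R[i][j] holds a regular expression (a plain string, with 'ε' and '∅'
        as literal symbols) for the paths from i to j found so far."""
        # Base case R^0: direct transitions only, plus epsilon on the diagonal.
        R = [[None] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                syms = sorted(delta.get((i, j), set()))
                if i == j:
                    syms = ["ε"] + syms
                R[i][j] = "|".join(syms) if syms else "∅"
        for k in range(n):  # now additionally allow state k as an intermediate
            S = [[None] * n for _ in range(n)]
            for i in range(n):
                for j in range(n):
                    # R^{k+1}_ij = R^k_ij | R^k_ik (R^k_kk)* R^k_kj
                    S[i][j] = f"({R[i][j]})|(({R[i][k]})({R[k][k]})*({R[k][j]}))"
            R = S
        # The language is the union over all accepting states.
        return "|".join(f"({R[start][f]})" for f in sorted(finals))

    # Example: a two-state NFA over {a, b} accepting the strings ending in b.
    print(kleene(2, {(0, 0): {"a", "b"}, (0, 1): {"b"}}, 0, {1}))

The output is correct but highly redundant; algebraic simplification (dropping ∅ branches, collapsing ε) is a separate post-processing step.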



Theoretical Computer Science
Theoretical computer science is a subfield of computer science and mathematics that focuses on the abstract and mathematical foundations of computation. It is difficult to circumscribe the theoretical areas precisely; the Association for Computing Machinery's (ACM) Special Interest Group on Algorithms and Computation Theory (SIGACT) provides a description of the field.

History

While logical inference and mathematical proof had existed previously, in 1931 Kurt Gödel proved with his incompleteness theorem that there are fundamental limitations on what statements could be proved or disproved. Information theory was added to the field with Claude Shannon's 1948 ''A Mathematical Theory of Communication''. In the same decade, Donald Hebb introduced a mathematical model of learning in the brain (Hebbian learning). With mounting biological data supporting this hypothesis with some modification, the fields of neural networks and para ...



Nondeterministic Finite Automaton
In automata theory, a finite-state machine is called a deterministic finite automaton (DFA) if
* each of its transitions is ''uniquely'' determined by its source state and input symbol, and
* reading an input symbol is required for each state transition.
A nondeterministic finite automaton (NFA), or nondeterministic finite-state machine, does not need to obey these restrictions. In particular, every DFA is also an NFA. Sometimes the term NFA is used in a narrower sense, referring to an NFA that is ''not'' a DFA, but not in this article. Using the subset construction algorithm, each NFA can be translated to an equivalent DFA, i.e., a DFA recognizing the same formal language. Like DFAs, NFAs recognize only regular languages. NFAs were introduced in 1959 by Michael O. Rabin and Dana Scott, who also showed their equivalence to DFAs. NFAs are used in the implementation of regular expressions: Thompson's construction is an algorithm for compiling a regular expression to an NFA that ...
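Since the excerpt mentions the subset construction only in passing, here is a minimal Python sketch of it for NFAs without ε-transitions; the frozenset encoding of DFA states and the example automaton are assumptions of the sketch, not taken from the article.

    # Subset construction sketch: each DFA state is the frozenset of NFA
    # states the NFA could currently be in (assumes no epsilon-transitions).
    from collections import deque

    def subset_construction(alphabet, delta, start, finals):
        """delta[(q, a)] is the set of NFA successor states of q on symbol a."""
        dfa_start = frozenset({start})
        dfa_delta, seen, work = {}, {dfa_start}, deque([dfa_start])
        while work:
            S = work.popleft()
            for a in alphabet:
                # The DFA successor is the union of all NFA successors.
                T = frozenset(t for q in S for t in delta.get((q, a), ()))
                dfa_delta[(S, a)] = T
                if T not in seen:
                    seen.add(T)
                    work.append(T)
        dfa_finals = {S for S in seen if S & finals}
        return dfa_start, dfa_delta, dfa_finals

    # Example: NFA accepting strings over {0, 1} whose second-to-last symbol is 1.
    delta = {(0, "0"): {0}, (0, "1"): {0, 1}, (1, "0"): {2}, (1, "1"): {2}}
    d_start, d_delta, d_finals = subset_construction({"0", "1"}, delta, 0, {2})
    print(len({S for (S, _) in d_delta}))  # 4 reachable DFA states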



Algorithms
In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning). In contrast, a heuristic is an approach to solving problems without well-defined correct or optimal results (David A. Grossman, Ophir Frieder, ''Information Retrieval: Algorithms and Heuristics'', 2nd edition, 2004). For example, although social media recommender systems are commonly called "algorithms", they actually rely on heuristics, as there is no truly "correct" recommendation. As an effective method, an algorithm can be expressed within a finite amount of space and time ...



Thompson's Construction Algorithm
In computer science, Thompson's construction algorithm, also called the McNaughton–Yamada–Thompson algorithm, is a method of transforming a regular expression into an equivalent nondeterministic finite automaton (NFA). This NFA can be used to match strings against the regular expression. The algorithm is credited to Ken Thompson. Regular expressions and nondeterministic finite automata are two representations of formal languages. For instance, text processing utilities use regular expressions to describe advanced search patterns, but NFAs are better suited for execution on a computer. Hence, this algorithm is of practical interest, since it can compile regular expressions into NFAs. From a theoretical point of view, it is part of the proof that regular expressions and NFAs accept exactly the same languages, namely the regular languages. An NFA can be made deterministic by the powerset construction and then be minimized to get an optimal automaton corresponding to the given ...
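The construction builds an NFA fragment with one start and one accept state for each subexpression and glues fragments together with ε-transitions. A minimal Python sketch follows; the tuple representation of fragments and the edge-list encoding are assumptions of the sketch.

    # Thompson's construction sketch: a fragment is (start, accept, edges),
    # where an edge (src, label, dst) with label None is an epsilon move.
    ctr = 0
    def new_state():
        global ctr; ctr += 1; return ctr

    def lit(a):                        # single labelled transition
        s, t = new_state(), new_state()
        return s, t, [(s, a, t)]

    def concat(f, g):                  # accept of f feeds start of g
        s1, t1, e1 = f; s2, t2, e2 = g
        return s1, t2, e1 + e2 + [(t1, None, s2)]

    def alt(f, g):                     # new start/accept branch around f and g
        s1, t1, e1 = f; s2, t2, e2 = g
        s, t = new_state(), new_state()
        return s, t, e1 + e2 + [(s, None, s1), (s, None, s2),
                                (t1, None, t), (t2, None, t)]

    def star(f):                       # loop back, plus a skip edge for zero copies
        s1, t1, e1 = f
        s, t = new_state(), new_state()
        return s, t, e1 + [(s, None, s1), (t1, None, t),
                           (t1, None, s1), (s, None, t)]

    # Fragment for the classic example (a|b)*a.
    frag = concat(star(alt(lit("a"), lit("b"))), lit("a"))
    print(len(frag[2]), "transitions")  # 12 transitions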




Star Height
In theoretical computer science, more precisely in the theory of formal languages, the star height is a measure of the structural complexity of regular expressions and regular languages. The star height of a regular ''expression'' equals the maximum nesting depth of stars appearing in that expression. The star height of a regular ''language'' is the least star height of any regular expression for that language. The concept of star height was first defined and studied by Eggan (1963).

Formal definition

More formally, the star height h(E) of a regular expression ''E'' over a finite alphabet ''A'' is inductively defined as follows:
* h(\emptyset) = 0, h(\varepsilon) = 0, and h(a) = 0 for all alphabet symbols ''a'' in ''A'';
* h(E\,F) = h(E \mid F) = \max(h(E), h(F));
* h(E^*) = h(E) + 1.
Here, \emptyset is the special regular ...
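The inductive definition translates directly into a recursive function. A minimal Python sketch, assuming a tuple-based AST that is not part of the article:

    # Star height from the inductive definition above; the node tags
    # ('empty', 'eps', 'sym', 'cat', 'alt', 'star') are assumptions.
    def star_height(e):
        kind = e[0]
        if kind in ("empty", "eps", "sym"):   # h(∅) = h(ε) = h(a) = 0
            return 0
        if kind in ("cat", "alt"):            # h(EF) = h(E|F) = max(h(E), h(F))
            return max(star_height(e[1]), star_height(e[2]))
        if kind == "star":                    # h(E*) = h(E) + 1
            return 1 + star_height(e[1])
        raise ValueError(f"unknown node: {kind}")

    # (a* b)* has star height 2: an inner star nested inside an outer star.
    e = ("star", ("cat", ("star", ("sym", "a")), ("sym", "b")))
    print(star_height(e))  # 2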


Generalized Star Height Problem
The generalized star-height problem in formal language theory is the open question whether all regular languages can be expressed using generalized regular expressions with a limited nesting depth of Kleene stars. Here, generalized regular expressions are defined like regular expressions, but they have a built-in complement operator. For a regular language, its generalized star height is defined as the minimum nesting depth of Kleene stars needed in order to describe the language by means of a generalized regular expression, hence the name of the problem. More specifically, it is an open question whether a nesting depth of more than 1 is required, and if so, whether there is an algorithm to determine the minimum required star height (Sakarovitch (2009), p. 171). Regular languages of star height 0 are also known as star-free languages. The theorem of Schützenberger provides an algebraic characterization of star-free languages by means of aperiodic syntactic monoids. In particular ...
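Only one clause changes relative to the ordinary star-height definition: a complement node contributes no star, so h(¬E) = h(E). A sketch extending the star_height function from the previous section, again over an assumed tuple AST:

    # Generalized star height: the complement operator is transparent to the
    # count; only Kleene stars contribute to the nesting depth.
    def gen_star_height(e):
        kind = e[0]
        if kind in ("empty", "eps", "sym"):
            return 0
        if kind == "not":                     # h(¬E) = h(E): complement is free
            return gen_star_height(e[1])
        if kind in ("cat", "alt"):
            return max(gen_star_height(e[1]), gen_star_height(e[2]))
        if kind == "star":
            return 1 + gen_star_height(e[1])
        raise ValueError(f"unknown node: {kind}")

    # The expression ¬(a*) still has star height 1: the star inside counts,
    # the complement around it does not.
    print(gen_star_height(("not", ("star", ("sym", "a")))))  # 1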


Star Height Problem
The star height problem in formal language theory is the question whether all regular languages can be expressed using regular expressions of limited star height, i.e., with a limited nesting depth of Kleene stars. Specifically, is a nesting depth of one always sufficient? If not, is there an algorithm to determine how many are required? The problem was first introduced by Eggan in 1963.

Families of regular languages with unbounded star height

The first question was answered in the negative when, in 1963, Eggan gave examples of regular languages of star height ''n'' for every ''n''. Here, the star height ''h''(''L'') of a regular language ''L'' is defined as the minimum star height among all regular expressions representing ''L''. The first few languages found by Eggan are described in the following, by means of giving a regular expression for each language:

\begin{align}
e_1 &= a_1^* \\
e_2 &= \left(a_1^* a_2^* a_3\right)^* \\
e_3 &= \left(\left(a_1^* a_2^* a_3\right)^* \left(a_4^* a_5^* a_6\right)^* a_7\right)^*
\end{align}
...
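The pattern behind the family is a doubling recurrence: e_{n+1} places two disjoint copies of e_n and one fresh symbol under a new star, so e_n has star height n and uses 2^n - 1 symbols. A Python sketch generating the expressions; the recurrence is read off the listed examples, so treat it as an illustration rather than Eggan's original formulation.

    # Generate Eggan's e_n: e_1 = a1*, e_{n+1} = (e_n e_n' a_k)*, where e_n'
    # is a copy of e_n over fresh symbols and a_k is one further fresh symbol.
    def eggan(n, next_sym=1):
        """Return (expression string, next unused symbol index)."""
        if n == 1:
            return f"a{next_sym}*", next_sym + 1
        left, k = eggan(n - 1, next_sym)   # e_{n-1}
        right, k = eggan(n - 1, k)         # disjoint copy e_{n-1}'
        return f"({left}{right}a{k})*", k + 1

    for n in (1, 2, 3):
        print(eggan(n)[0])
    # a1*
    # (a1*a2*a3)*
    # ((a1*a2*a3)*(a4*a5*a6)*a7)*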



Floyd–Warshall Algorithm
In computer science, the Floyd–Warshall algorithm (also known as Floyd's algorithm, the Roy–Warshall algorithm, the Roy–Floyd algorithm, or the WFI algorithm) is an algorithm for finding shortest paths in a directed weighted graph with positive or negative edge weights (but with no negative cycles); see in particular Section 26.2, "The Floyd–Warshall algorithm", pp. 558–565, and Section 26.4, "A general framework for solving path problems in directed graphs", pp. 570–576. A single execution of the algorithm will find the lengths (summed weights) of shortest paths between all pairs of vertices. Although it does not return details of the paths themselves, it is possible to reconstruct the paths with simple modifications to the algorithm. Versions of the algorithm can also be used for finding the transitive closure of a relation R, or (in connection with the Schulze voting system) widest paths between all pairs of vertices in a weighted graph.

History and naming
...
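Floyd–Warshall shares its triple-loop structure with Kleene's algorithm: both successively admit one more intermediate vertex (respectively, NFA state) k, differing only in the underlying operations (min and + for shortest paths, union and concatenation for languages). A minimal Python sketch; the adjacency-matrix encoding with math.inf for missing edges is an assumption of the sketch.

    # Floyd-Warshall sketch; mirrors the k-loop of Kleene's algorithm.
    import math

    def floyd_warshall(w):
        """w[i][j] is the weight of edge i -> j (math.inf if absent).
        Returns all-pairs shortest-path lengths, assuming no negative cycle."""
        n = len(w)
        d = [row[:] for row in w]         # distances using no intermediates yet
        for i in range(n):
            d[i][i] = min(d[i][i], 0)     # the empty path from i to i
        for k in range(n):                # allow vertex k as an intermediate
            for i in range(n):
                for j in range(n):
                    d[i][j] = min(d[i][j], d[i][k] + d[k][j])
        return d

    inf = math.inf
    w = [[inf, 3, 8], [inf, inf, 2], [4, inf, inf]]
    print(floyd_warshall(w))  # shortest 0 -> 2 is 3 + 2 = 5, beating the direct 8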


Kleene Algebra
In mathematics and theoretical computer science, a Kleene algebra (named after Stephen Cole Kleene) is a semiring that generalizes the theory of regular expressions: it consists of a set supporting union (addition), concatenation (multiplication), and Kleene star operations subject to certain algebraic laws. The addition is required to be idempotent (x + x = x for all x), and induces a partial order defined by x \le y if x + y = y. The Kleene star operation, denoted x^*, must satisfy the laws of a closure operator. Kleene algebras have their origins in the theory of regular expressions and regular languages introduced by Kleene in 1951 and studied by others including V. N. Redko and John Horton Conway. The term was introduced by Dexter Kozen in the 1980s, who fully characterized their algebraic properties and, in 1994, gave a finite axiomatization. Kleene algebras have a number of extensions that have been studied, including Kleene algebras with tests (KAT) introduced by Kozen ...
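For concreteness, the star laws of Kozen's 1994 finite axiomatization can be written as follows; this rendering is from memory, not from the article, so treat it as a hedged sketch of one standard presentation:

    % Kozen-style star axioms, on top of the idempotent-semiring laws;
    % here x <= y abbreviates x + y = y.
    \begin{align*}
    & 1 + x\,x^* \le x^*, \qquad 1 + x^*\,x \le x^*  && \text{(star unfolding)} \\
    & b + a\,x \le x \;\Rightarrow\; a^*\,b \le x    && \text{(left star induction)} \\
    & b + x\,a \le x \;\Rightarrow\; b\,a^* \le x    && \text{(right star induction)}
    \end{align*}

Under these laws, x^* is the least solution y of 1 + x\,y \le y, which matches the closure-operator reading in the excerpt.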



Deterministic Finite Automata
In the theory of computation, a branch of theoretical computer science, a deterministic finite automaton (DFA)—also known as deterministic finite acceptor, deterministic finite-state machine (DFSM), or deterministic finite-state automaton (DFSA)—is a finite-state machine that accepts or rejects a given string of symbols by running through a state sequence uniquely determined by the string. ''Deterministic'' refers to the uniqueness of the computation run. In search of the simplest models to capture finite-state machines, Warren McCulloch and Walter Pitts were among the first researchers to introduce a concept similar to finite automata in 1943. The figure illustrates a deterministic finite automaton using a state diagram. In this example automaton, there are three states: S0, S1, and S2 (denoted graphically by circles). The automaton takes a finite sequence of 0s and 1s as input. For each state, there is a transition arrow leading out to a next state for both 0 and ...
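The excerpt's figure is not reproduced here, so the following Python sketch invents a concrete three-state transition table (states S0, S1, S2 over inputs 0 and 1, tracking the number of 1s modulo 3) to show how a DFA consumes input deterministically:

    # DFA simulation sketch; the transition table is illustrative, not the
    # one from the article's figure.
    def run_dfa(transitions, start, accepting, word):
        """Follow the uniquely determined transition for each input symbol."""
        state = start
        for symbol in word:
            state = transitions[(state, symbol)]  # exactly one successor
        return state in accepting

    transitions = {
        ("S0", "0"): "S0", ("S0", "1"): "S1",
        ("S1", "0"): "S1", ("S1", "1"): "S2",
        ("S2", "0"): "S2", ("S2", "1"): "S0",
    }
    print(run_dfa(transitions, "S0", {"S0"}, "10101"))  # True: three 1s
    print(run_dfa(transitions, "S0", {"S0"}, "11"))     # False: two 1s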



Formal Language Theory
In logic, mathematics, computer science, and linguistics, a formal language is a set of strings whose symbols are taken from a set called an "alphabet". The alphabet of a formal language consists of symbols that concatenate into strings (also called "words"). Words that belong to a particular formal language are sometimes called ''well-formed words''. A formal language is often defined by means of a formal grammar such as a regular grammar or context-free grammar. In computer science, formal languages are used, among others, as the basis for defining the grammar of programming languages and formalized versions of subsets of natural languages, in which the words of the language represent concepts that are associated with meanings or semantics. In computational complexity theory, decision problems are typically defined as formal languages, and complexity classes are defined as the sets of the formal languages that ...
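As a toy illustration of these definitions (the particular language chosen here is arbitrary, not from the article): the alphabet {a, b} and the formal language of all words over it of length at most two.

    # "A formal language is a set of strings over an alphabet": enumerate
    # every word of length 0, 1, or 2 over the two-symbol alphabet {a, b}.
    from itertools import product

    alphabet = {"a", "b"}
    language = {"".join(w) for n in range(3)
                for w in product(sorted(alphabet), repeat=n)}
    print(sorted(language))  # ['', 'a', 'aa', 'ab', 'b', 'ba', 'bb']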