Majority Problem (Cellular Automaton)
The majority problem, or density classification task, is the problem of finding one-dimensional cellular automaton rules that accurately perform majority voting. Using local transition rules, cells cannot know the total count of ones in the system. Counting the number of ones (or, by symmetry, the number of zeros) requires a number of bits logarithmic in the size of the system, requires messages to be sent over a distance linear in the size of the system, and requires the system to recognize a non-regular language. This problem is therefore an important test case for measuring the computational power of cellular automaton systems.

Problem statement
Given a configuration of a two-state cellular automaton with ''i'' + ''j'' cells total, ''i'' of which are in the zero state and ''j'' of which are in the one state, a correct solution to the voting problem must eventually set all cells to zero if ''i'' > ''j'' and must eventually set all cells to one if ''j'' > ''i''.
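To make the statement concrete, here is a minimal Python sketch, not any published classifier: it runs a candidate local rule on a ring of cells and checks whether the lattice settles to the uniform state the problem demands. The helper names and the naive three-cell rule are illustrative assumptions.

<syntaxhighlight lang="python">
def correct_answer(config):
    """The target fixed point: all ones iff ones form a strict majority."""
    ones = sum(config)
    return [1] * len(config) if ones > len(config) - ones else [0] * len(config)

def step(config, rule):
    """One synchronous update on a ring; each cell sees (left, self, right)."""
    n = len(config)
    return [rule(config[(k - 1) % n], config[k], config[(k + 1) % n])
            for k in range(n)]

def classifies(config, rule, max_steps=200):
    """Iterate the rule and test whether it reaches the correct uniform state."""
    target = correct_answer(config)  # decided by the *initial* density
    for _ in range(max_steps):
        config = step(config, rule)
    return config == target

# A naive "majority of the three visible cells" rule; it fails on many
# inputs, which is precisely what makes the density task nontrivial.
local_majority = lambda l, c, r: 1 if l + c + r >= 2 else 0
print(classifies([1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1], local_majority))
</syntaxhighlight>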
Cellular Automaton
A cellular automaton (pl. cellular automata, abbrev. CA) is a discrete model of computation studied in automata theory. Cellular automata are also called cellular spaces, tessellation automata, homogeneous structures, cellular structures, tessellation structures, and iterative arrays. Cellular automata have found application in various areas, including physics, theoretical biology, and microstructure modeling. A cellular automaton consists of a regular grid of ''cells'', each in one of a finite number of ''states'', such as ''on'' and ''off'' (in contrast to a coupled map lattice). The grid can be in any finite number of dimensions. For each cell, a set of cells called its ''neighborhood'' is defined relative to the specified cell. An initial state (time ''t'' = 0) is selected by assigning a state for each cell. A new ''generation'' is created (advancing ''t'' by 1) according to some fixed ''rule'' (generally, a mathematical function) that determines the new state of each cell in terms of the current state of the cell and the states of the cells in its neighborhood.
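As a minimal illustration of these definitions, here is a sketch of an elementary (one-dimensional, two-state, radius-1) cellular automaton in Python; the rule-number encoding is the standard Wolfram convention, while the function names are mine.

<syntaxhighlight lang="python">
def make_rule(rule_number):
    """Wolfram encoding: bit k of rule_number is the new state for the
    neighborhood (left, center, right) read as the 3-bit number k."""
    return lambda l, c, r: (rule_number >> (l << 2 | c << 1 | r)) & 1

def generation(cells, rule):
    """Advance t by 1: every cell is updated synchronously on a ring."""
    n = len(cells)
    return [rule(cells[(k - 1) % n], cells[k], cells[(k + 1) % n])
            for k in range(n)]

# Evolve Rule 30 from a single live cell and print a few generations.
cells = [0] * 15
cells[7] = 1
rule30 = make_rule(30)
for _ in range(8):
    print(''.join('#' if c else '.' for c in cells))
    cells = generation(cells, rule30)
</syntaxhighlight>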
Majority Function
In Boolean logic, the majority function (also called the median operator) is the Boolean function that evaluates to false when half or more of its arguments are false and to true otherwise; i.e., the value of the function equals the value of the majority of the inputs. Representing true values as 1 and false values as 0, we may use the (real-valued) formula:

:\langle p_1,\dots,p_n \rangle = \operatorname{Maj} \left ( p_1,\dots,p_n \right ) = \left\lfloor \frac{1}{2} + \frac{\left(\sum_{i=1}^n p_i\right) - 1/2}{n} \right\rfloor.

The "−1/2" in the formula serves to break ties in favor of zeros when the number of arguments ''n'' is even. If the term "−1/2" is omitted, the formula can be used for a function that breaks ties in favor of ones. Most applications deliberately force an odd number of inputs so they do not have to deal with the question of what happens when exactly half the inputs are 0 and exactly half are 1. The few systems that calculate the majority function on an even number of inputs are often biased toward 0, producing 0 when exactly half the inputs are 0.
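The floor formula can be checked mechanically against the direct "strictly more than half" count, with ties on even ''n'' going to 0 in both cases; the following Python sketch (function names are mine) verifies the agreement on all inputs of up to six arguments.

<syntaxhighlight lang="python">
from itertools import product
from math import floor

def maj_formula(bits):
    """The floor formula; the -1/2 breaks ties toward 0 when n is even."""
    n = len(bits)
    return floor(0.5 + (sum(bits) - 0.5) / n)

def maj_count(bits):
    """Direct definition: 1 iff strictly more than half the inputs are 1."""
    return 1 if 2 * sum(bits) > len(bits) else 0

# The two definitions agree on every Boolean input of length 1..6.
assert all(maj_formula(p) == maj_count(p)
           for n in range(1, 7) for p in product((0, 1), repeat=n))
</syntaxhighlight>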
Regular Language
In theoretical computer science and formal language theory, a regular language (also called a rational language) is a formal language that can be defined by a regular expression, in the strict sense in theoretical computer science (as opposed to many modern regular expression engines, which are augmented with features that allow the recognition of non-regular languages). Alternatively, a regular language can be defined as a language recognized by a finite automaton (a small example automaton is sketched after the definition below). The equivalence of regular expressions and finite automata is known as Kleene's theorem (after American mathematician Stephen Cole Kleene). In the Chomsky hierarchy, regular languages are the languages generated by Type-3 grammars.

Formal definition
The collection of regular languages over an alphabet Σ is defined recursively as follows:
* The empty language Ø is a regular language.
* For each ''a'' ∈ Σ (''a'' belongs to Σ), the singleton language {''a''} is a regular language.
* If ''A'' is a regular language, ''A''* (Kleene star) is a regular language.
* If ''A'' and ''B'' are regular languages, then ''A'' ∪ ''B'' (union) and ''A'' • ''B'' (concatenation) are regular languages.
* No other languages over Σ are regular.
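As a small worked example of the finite-automaton view, here is a DFA in Python recognizing one particular regular language, binary strings containing an even number of 1s; the language and all names are illustrative choices, not anything fixed by the theory.

<syntaxhighlight lang="python">
# DFA for the regular language of binary strings with an even number of 1s.
# States: 'even' (the start state, also accepting) and 'odd'.
TRANSITIONS = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}

def accepts(word, start='even', accepting=frozenset({'even'})):
    """A word is in the language iff the run ends in an accepting state."""
    state = start
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

print(accepts('1100'))  # True: two 1s
print(accepts('10'))    # False: one 1
</syntaxhighlight>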
Leonid Levin
Leonid Anatolievich Levin (Russian: Леони́д Анато́льевич Ле́вин; Ukrainian: Леоні́д Анато́лійович Ле́він; born November 2, 1948) is a Soviet-American mathematician and computer scientist. He is known for his work on randomness in computing, algorithmic complexity and intractability, average-case complexity, foundations of mathematics and computer science, algorithmic probability, theory of computation, and information theory. He obtained his master's degree at Moscow University in 1970, where he studied under Andrey Kolmogorov, and completed the Candidate Degree academic requirements in 1972. He and Stephen Cook independently discovered the existence of NP-complete problems. This NP-completeness theorem, often called the Cook–Levin theorem, was a basis for one of the seven Millennium Prize Problems declared by the Clay Mathematics Institute with a $1,000,000 prize offered. The Cook–Levin theorem was a breakthrough in computer science.
Melanie Mitchell
Melanie Mitchell is an American scientist. She is the Davis Professor of Complexity at the Santa Fe Institute. Her major work has been in the areas of analogical reasoning, complex systems, genetic algorithms, and cellular automata, and her publications in those fields are frequently cited. She received her PhD in 1990 from the University of Michigan under Douglas Hofstadter and John Holland, for which she developed the Copycat cognitive architecture. She is the author of ''Analogy-Making as Perception'', essentially a book about Copycat. She has also critiqued Stephen Wolfram's ''A New Kind of Science'' and showed that genetic algorithms could find better solutions to the majority problem for one-dimensional cellular automata. She is the author of ''An Introduction to Genetic Algorithms'', a widely known introductory book published by MIT Press in 1996. She is also the author of ''Complexity: A Guided Tour'' (Oxford University Press, 2009), which won the 2010 Phi Beta Kappa Science Book Award.
Genetic Algorithm
In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover, and selection. Some examples of GA applications include optimizing decision trees for better performance, solving Sudoku puzzles, and hyperparameter optimization.

Methodology

Optimization problems
In a genetic algorithm, a population of candidate solutions (called individuals, creatures, organisms, or phenotypes) to an optimization problem is evolved toward better solutions. Each candidate solution has a set of properties (its chromosomes or genotype) which can be mutated and altered; traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible.
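The operators named above, selection, crossover, and mutation, fit together in a short evolutionary loop. The following is a minimal Python sketch on the classic OneMax toy problem (maximize the number of 1s in a bitstring); every parameter value and helper name here is an illustrative assumption, not a reference implementation.

<syntaxhighlight lang="python">
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=60,
           mutation_rate=0.02, tournament=3):
    """Minimal GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            return max(random.sample(pop, tournament), key=fitness)
        next_pop = []
        while len(next_pop) < pop_size:
            a, b = select(), select()
            cut = random.randrange(1, n_bits)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# OneMax: fitness is simply the number of 1s in the chromosome.
best = evolve(fitness=sum)
print(best, sum(best))
</syntaxhighlight>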
Deterministic Algorithm
In computer science, a deterministic algorithm is an algorithm that, given a particular input, will always produce the same output, with the underlying machine always passing through the same sequence of states. Deterministic algorithms are by far the most studied and familiar kind of algorithm, as well as one of the most practical, since they can be run on real machines efficiently. Formally, a deterministic algorithm computes a mathematical function; a function has a unique value for any input in its domain, and the algorithm is a process that produces this particular value as output.

Formal definition
Deterministic algorithms can be defined in terms of a state machine: a ''state'' describes what a machine is doing at a particular instant in time. State machines pass in a discrete manner from one state to another. Just after we enter the input, the machine is in its ''initial state'' or ''start state''. If the machine is deterministic, this means that from this point onwards, its current state determines what its next state will be; its course through the set of states is predetermined.
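The contrast is easy to demonstrate. In the Python sketch below (illustrative names), the first routine always yields the same output and passes through the same sequence of states for a given input, while the second consults a source of randomness and may not.

<syntaxhighlight lang="python">
import random

def deterministic_sum(values):
    """Same input, same state sequence, same output -- every time."""
    total = 0
    for v in values:          # visits the values in one fixed order
        total += v
    return total

def randomized_pick(values):
    """Not deterministic: repeated calls on the same input may differ."""
    return random.choice(values)

data = [3, 1, 4, 1, 5]
assert deterministic_sum(data) == deterministic_sum(data)  # always holds
print(randomized_pick(data), randomized_pick(data))        # may disagree
</syntaxhighlight>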
Rule 184
Rule 184 is a one-dimensional binary cellular automaton rule, notable for solving the majority problem as well as for its ability to simultaneously describe several seemingly quite different particle systems (the rule itself is sketched in code after the list):
* Rule 184 can be used as a simple model for traffic flow in a single lane of a highway, and forms the basis for many more sophisticated cellular automaton models of traffic flow. In this model, particles (representing vehicles) move in a single direction, stopping and starting depending on the cars in front of them. The number of particles remains unchanged throughout the simulation. Because of this application, Rule 184 is sometimes called the "traffic rule".
* Rule 184 also models a form of deposition of particles onto an irregular surface, in which each local minimum of the surface is filled with a particle at each step. At each step of the simulation, the number of particles increases. Once placed, a particle never moves.
* Rule 184 can be understood in terms of ballistic annihilation, a system of particles moving both leftwards and rightwards through a one-dimensional medium; when two such particles collide, they annihilate each other.
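A short sketch of the rule itself: the lookup table below is the standard Wolfram encoding of rule number 184 (binary 10111000), and rendering 1s as cars is just the traffic reading described above; the variable names are mine.

<syntaxhighlight lang="python">
# One synchronous Rule 184 step on a ring. The table is the Wolfram
# encoding of 184 = 0b10111000: a car (1) advances iff the cell ahead is empty.
RULE_184 = {(1, 1, 1): 1, (1, 1, 0): 0, (1, 0, 1): 1, (1, 0, 0): 1,
            (0, 1, 1): 1, (0, 1, 0): 0, (0, 0, 1): 0, (0, 0, 0): 0}

def step_184(cells):
    n = len(cells)
    return [RULE_184[(cells[(k - 1) % n], cells[k], cells[(k + 1) % n])]
            for k in range(n)]

# Traffic reading: '>' is a car; note that the car count is conserved.
road = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]
for _ in range(5):
    print(''.join('>' if c else '.' for c in road))
    road = step_184(road)
</syntaxhighlight>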
Periodic Boundary Conditions
Periodic boundary conditions (PBCs) are a set of boundary conditions which are often chosen for approximating a large (infinite) system by using a small part called a ''unit cell''. PBCs are often used in computer simulations and mathematical models. The topology of two-dimensional PBC is equal to that of a ''world map'' in some video games; the geometry of the unit cell satisfies perfect two-dimensional tiling, and when an object passes through one side of the unit cell, it re-appears on the opposite side with the same velocity. In topological terms, the space made by two-dimensional PBCs can be thought of as being mapped onto a torus (compactification). The large systems approximated by PBCs consist of an infinite number of unit cells. In computer simulations, one of these is the original simulation box, and the others are copies called ''images''. During the simulation, only the properties of the original simulation box need to be recorded and propagated. The ''minimum-image convention'' is a common form of PBC particle bookkeeping in which each individual particle in the simulation interacts with the closest image of the remaining particles in the system.
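In one dimension, the two basic bookkeeping operations, wrapping a coordinate back into the unit cell and taking the nearest image of a displacement, look as follows. This is a sketch assuming an orthorhombic box (applied coordinate-wise in higher dimensions), with function names of my choosing.

<syntaxhighlight lang="python">
def wrap(x, box):
    """Map a coordinate back into the primary cell [0, box): an object
    leaving through one side re-enters through the opposite side."""
    return x % box

def minimum_image(x1, x2, box):
    """Displacement from x1 to the *nearest image* of x2 under PBC;
    the result always lies in [-box/2, box/2)."""
    d = (x2 - x1) % box
    return d - box if d >= box / 2 else d

box = 10.0
print(wrap(11.5, box))               # 1.5
print(minimum_image(1.0, 9.0, box))  # -2.0: the image of 9.0 at -1.0 is closest
</syntaxhighlight>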