Complexity

Complexity characterises the behaviour of a system or model whose components interact in multiple ways and follow local rules, leading to nonlinearity, randomness, collective dynamics, hierarchy, and emergence. The term is generally used to characterize something with many parts where those parts interact with each other in multiple ways, culminating in a higher order of emergence greater than the sum of its parts. The study of these complex linkages at various scales is the main goal of complex systems theory. The intuitive criterion of complexity can be formulated as follows: a system would be more complex if more parts could be distinguished, and if more connections between them existed.

Science takes a number of approaches to characterizing complexity; Zayed ''et al.'' reflect many of these. Neil Johnson states that "even among scientists, there is no unique definition of complexity – and the scientific notion has traditionally been conveyed using particular examples..." Ultimately Johnson adopts the definition of "complexity science" as "the study of the phenomena which emerge from a collection of interacting objects".


Overview

Definitions of complexity often depend on the concept of a "system" – a set of parts or elements that have relationships among them differentiated from relationships with other elements outside the relational regime. Many definitions tend to postulate or assume that complexity expresses a condition of numerous elements in a system and numerous forms of relationships among the elements. However, what one sees as complex and what one sees as simple is relative and changes with time.

Warren Weaver posited in 1948 two forms of complexity: disorganized complexity, and organized complexity. Phenomena of 'disorganized complexity' are treated using probability theory and statistical mechanics, while 'organized complexity' deals with phenomena that escape such approaches and confront "dealing simultaneously with a sizable number of factors which are interrelated into an organic whole". Weaver's 1948 paper has influenced subsequent thinking about complexity.

The approaches that embody concepts of systems, multiple elements, multiple relational regimes, and state spaces might be summarized as implying that complexity arises from the number of distinguishable relational regimes (and their associated state spaces) in a defined system. Some definitions relate to the algorithmic basis for the expression of a complex phenomenon or model or mathematical expression, as set out later herein.


Disorganized vs. organized

One of the problems in addressing complexity issues has been formalizing the intuitive conceptual distinction between the large number of variances in relationships extant in random collections, and the sometimes large, but smaller, number of relationships between elements in systems where constraints (related to correlation of otherwise independent elements) simultaneously reduce the variations from element independence and create distinguishable regimes of more-uniform, or correlated, relationships or interactions.

Weaver perceived and addressed this problem, in at least a preliminary way, in drawing a distinction between "disorganized complexity" and "organized complexity". In Weaver's view, disorganized complexity results from the particular system having a very large number of parts, say millions of parts, or many more. Though the interactions of the parts in a "disorganized complexity" situation can be seen as largely random, the properties of the system as a whole can be understood by using probability and statistical methods. A prime example of disorganized complexity is a gas in a container, with the gas molecules as the parts. Some would suggest that a system of disorganized complexity may be compared with the (relative) simplicity of planetary orbits – the latter can be predicted by applying Newton's laws of motion. Of course, most real-world systems, including planetary orbits, eventually become theoretically unpredictable even using Newtonian dynamics, as discovered by modern chaos theory.

Organized complexity, in Weaver's view, resides in nothing else than the non-random, or correlated, interaction between the parts. These correlated relationships create a differentiated structure that can, as a system, interact with other systems. The coordinated system manifests properties not carried or dictated by individual parts. The organized aspect of this form of complexity vis-à-vis other systems, rather than the subject system, can be said to "emerge," without any "guiding hand". The number of parts does not have to be very large for a particular system to have emergent properties. A system of organized complexity may be understood in its properties (behavior among the properties) through modeling and simulation, particularly modeling and simulation with computers. An example of organized complexity is a city neighborhood as a living mechanism, with the neighborhood people among the system's parts.


Sources and factors

There are generally rules which can be invoked to explain the origin of complexity in a given system. The source of disorganized complexity is the large number of parts in the system of interest, and the lack of correlation between elements in the system. In the case of self-organizing living systems, usefully organized complexity comes from beneficially mutated organisms being selected to survive by their environment for their differential reproductive ability, or at least success, over inanimate matter or less organized complex organisms. See, e.g., Robert Ulanowicz's treatment of ecosystems.

Complexity of an object or system is a relative property. For instance, for many functions (problems), a computational complexity such as time of computation is smaller when multitape Turing machines are used than when single-tape Turing machines are used. Random-access machines allow time complexity to be decreased further (Greenlaw and Hoover 1998: 226), while inductive Turing machines can decrease even the complexity class of a function, language or set (Burgin 2005). This shows that tools of activity can be an important factor of complexity.


Varied meanings

In several scientific fields, "complexity" has a precise meaning:

* In computational complexity theory, the amounts of resources required for the execution of algorithms are studied. The most popular types of computational complexity are the time complexity of a problem, equal to the number of steps that it takes to solve an instance of the problem as a function of the size of the input (usually measured in bits), using the most efficient algorithm, and the space complexity of a problem, equal to the volume of the memory used by the algorithm (e.g., cells of the tape) needed to solve an instance of the problem, again as a function of the size of the input and using the most efficient algorithm. This allows classification of computational problems by complexity class (such as P, NP, etc.). An axiomatic approach to computational complexity was developed by Manuel Blum. It allows one to deduce many properties of concrete computational complexity measures, such as time complexity or space complexity, from properties of axiomatically defined measures. (A step-counting sketch of time complexity follows this list.)
* In algorithmic information theory, the ''Kolmogorov complexity'' (also called ''descriptive complexity'', ''algorithmic complexity'' or ''algorithmic entropy'') of a string is the length of the shortest binary program that outputs that string. Minimum message length is a practical application of this approach. Different kinds of Kolmogorov complexity are studied: the uniform complexity, prefix complexity, monotone complexity, time-bounded Kolmogorov complexity, and space-bounded Kolmogorov complexity. An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov. The axiomatic approach encompasses other approaches to Kolmogorov complexity: it is possible to treat different kinds of Kolmogorov complexity as particular cases of axiomatically defined generalized Kolmogorov complexity. Instead of proving similar theorems, such as the basic invariance theorem, for each particular measure, all such results can easily be deduced from one corresponding theorem proved in the axiomatic setting – a general advantage of the axiomatic approach in mathematics. The axiomatic approach was further developed in the book (Burgin 2005) and applied to software metrics (Burgin and Debnath, 2003; Debnath and Burgin, 2003).
* In information theory, information fluctuation complexity is the fluctuation of information about information entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system and has been used as a measure of complexity in many diverse fields. (A short worked sketch also follows this list.)
* In information processing, complexity is a measure of the total number of properties transmitted by an object and detected by an observer. Such a collection of properties is often referred to as a state.
* In physical systems, complexity is a measure of the probability of the state vector of the system. This should not be confused with entropy; it is a distinct mathematical measure, one in which two distinct states are never conflated and considered equal, as is done for the notion of entropy in statistical mechanics.
* In dynamical systems, statistical complexity measures the size of the minimum program able to statistically reproduce the patterns (configurations) contained in the data set (sequence). While algorithmic complexity implies a deterministic description of an object (it measures the information content of an individual sequence), statistical complexity, like forecasting complexity, implies a statistical description and refers to an ensemble of sequences generated by a certain source. Formally, statistical complexity reconstructs a minimal model comprising the collection of all histories sharing a similar probabilistic future, and measures the entropy of the probability distribution of the states within this model. It is a computable and observer-independent measure based only on the internal dynamics of the system, and has been used in studies of emergence and self-organization.
* In mathematics, Krohn–Rhodes complexity is an important topic in the study of finite semigroups and automata.
* In network theory, complexity is the product of richness in the connections between components of a system, and is defined by a very unequal distribution of certain measures (some elements being highly connected and others very sparsely; see complex network).
* In software engineering, programming complexity is a measure of the interactions of the various elements of the software. This differs from the computational complexity described above in that it is a measure of the design of the software.

Other fields introduce less precisely defined notions of complexity:

* A complex adaptive system has some or all of the following attributes:
** The number of parts (and types of parts) in the system and the number of relations between the parts is non-trivial – however, there is no general rule to separate "trivial" from "non-trivial";
** The system has memory or includes feedback;
** The system can adapt itself according to its history or feedback;
** The relations between the system and its environment are non-trivial or non-linear;
** The system can be influenced by, or can adapt itself to, its environment;
** The system is highly sensitive to initial conditions.
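
As a concrete reading of time complexity as a step count, the sketch below (illustrative only; the function names and step-counting convention are ours, not from any source cited here) counts the comparisons made by linear and binary search over a sorted list of n items. The first grows linearly in n, the second only logarithmically – the kind of growth-rate distinction that time complexity formalizes.

    def linear_search_steps(xs, target):
        """Count the comparisons a linear scan makes before finding target."""
        steps = 0
        for x in xs:
            steps += 1
            if x == target:
                break
        return steps

    def binary_search_steps(xs, target):
        """Count the comparisons binary search makes on the sorted list xs."""
        lo, hi, steps = 0, len(xs) - 1, 0
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if xs[mid] == target:
                break
            if xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return steps

    for n in (1_000, 1_000_000):
        xs = list(range(n))
        # worst case for the linear scan: the target is the last element
        print(n, linear_search_steps(xs, n - 1), binary_search_steps(xs, n - 1))

For a million items the linear scan makes a million comparisons while binary search makes about twenty.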
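And as a worked illustration of information fluctuation complexity, the sketch below (our reading of the definition, not code from a cited source) computes the Shannon entropy H of a discrete state distribution together with the standard deviation of the per-state information Gamma_i = -log2(p_i) about its mean H. A uniform distribution maximizes entropy yet has zero fluctuation complexity, since every state carries exactly the same information.

    import math

    def entropy_and_fluctuation(p):
        """Return (H, sigma): Shannon entropy H = <Gamma> and information
        fluctuation complexity sigma = sqrt(<Gamma^2> - H^2), where
        Gamma_i = -log2(p_i) is the information carried by state i."""
        pairs = [(pi, -math.log2(pi)) for pi in p if pi > 0]
        h = sum(pi * g for pi, g in pairs)
        m2 = sum(pi * g * g for pi, g in pairs)
        return h, math.sqrt(max(m2 - h * h, 0.0))

    print(entropy_and_fluctuation([0.25, 0.25, 0.25, 0.25]))   # (2.0, 0.0)
    print(entropy_and_fluctuation([0.5, 0.25, 0.125, 0.125]))  # (1.75, ~0.83)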


Study

Complexity has always been a part of our environment, and therefore many scientific fields have dealt with complex systems and phenomena. From one perspective, that which is somehow complex – displaying variation without being random – is most worthy of interest given the rewards found in the depths of exploration.

The use of the term complex is often confused with the term complicated. In today's systems, this is the difference between myriad connecting "stovepipes" and effective "integrated" solutions. This means that complex is the opposite of independent, while complicated is the opposite of simple. While this has led some fields to come up with specific definitions of complexity, there is a more recent movement to regroup observations from different fields to study complexity in itself, whether it appears in anthills, human brains, economic systems, or social systems. One such interdisciplinary group of fields is relational order theories.


Topics


Behaviour

The behavior of a complex system is often said to be due to emergence and self-organization. Chaos theory has investigated the sensitivity of systems to variations in initial conditions as one cause of complex behaviour.


Mechanisms

Recent developments in artificial life, evolutionary computation and genetic algorithms have led to an increasing emphasis on complexity and complex adaptive systems.


Simulations

In social science, the study of the emergence of macro-properties from micro-properties is known as the macro-micro view in sociology. The topic is commonly recognized as social complexity, which is often related to the use of computer simulation in social science, i.e. computational sociology.


Systems

Systems theory has long been concerned with the study of complex systems (in recent times, ''complexity theory'' and ''complex systems'' have also been used as names of the field). These systems are present in the research of a variety of disciplines, including biology, economics, social studies and technology. Recently, complexity has become a natural domain of interest of real-world socio-cognitive systems and emerging systemics research. Complex systems tend to be high-dimensional, non-linear, and difficult to model. In specific circumstances, they may exhibit low-dimensional behaviour.


Data

In information theory, algorithmic information theory is concerned with the complexity of strings of data. Complex strings are harder to compress. While intuition tells us that this may depend on the codec used to compress a string (a codec could be theoretically created in any arbitrary language, including one in which the very small command "X" could cause the computer to output a very complicated string like "18995316"), any two Turing-complete languages can be implemented in each other, meaning that the lengths of two encodings in different languages will vary by at most the length of the "translation" language – which will end up being negligible for sufficiently large data strings.

These algorithmic measures of complexity tend to assign high values to random noise. However, those studying complex systems would not consider randomness as complexity. Information entropy is also sometimes used in information theory as indicative of complexity, but entropy is also high for randomness. Information fluctuation complexity, fluctuations of information about entropy, does not consider randomness to be complex and has been useful in many applications.
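
A minimal sketch of this point, using Python's standard zlib compressor as a stand-in codec (compressed length is an upper bound on algorithmic complexity, not an exact measure): a highly regular string compresses to a small fraction of its raw length, while random bytes barely compress at all – which is exactly why compression-based measures rate random noise as maximally complex.

    import os
    import zlib

    def compressed_fraction(data: bytes) -> float:
        """Compressed length relative to raw length: near 0 for highly
        regular data, near (or even slightly above) 1 for incompressible noise."""
        return len(zlib.compress(data, level=9)) / len(data)

    structured = b"the quick brown fox " * 500  # 10,000 bytes of pure repetition
    noise = os.urandom(10_000)                  # 10,000 random bytes

    print(f"structured: {compressed_fraction(structured):.3f}")  # roughly 0.01
    print(f"noise:      {compressed_fraction(noise):.3f}")       # roughly 1.0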
Recent work in machine learning has examined the complexity of the data as it affects the performance of supervised classification algorithms. Ho and Basu present a set of complexity measures for binary classification problems. The complexity measures broadly cover:

* the overlaps in feature values from differing classes;
* the separability of the classes;
* measures of geometry, topology, and density of manifolds.

Instance hardness is another approach that seeks to characterize data complexity, with the goal of determining how hard a data set is to classify correctly; it is not limited to binary problems. Instance hardness is a bottom-up approach that first seeks to identify instances that are likely to be misclassified (or, in other words, which instances are the most complex). The characteristics of the instances that are likely to be misclassified are then measured based on the output from a set of hardness measures. The hardness measures are based on several supervised learning techniques, such as measuring the number of disagreeing neighbors or the likelihood of the assigned class label given the input features (the first of these is sketched below). The information provided by the complexity measures has been examined for use in meta-learning to determine for which data sets filtering (or removing suspected noisy instances from the training set) is the most beneficial, and could be expanded to other areas.
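
The disagreeing-neighbors idea can be sketched in a few lines. The following is an illustrative implementation (the function name and the choice k = 5 are ours, not from Ho and Basu or the instance-hardness literature): an instance whose nearest neighbors mostly carry a different label is likely to be misclassified, and is therefore scored as hard.

    import math

    def k_disagreeing_neighbors(points, labels, i, k=5):
        """Fraction of the k nearest neighbors of instance i whose label
        differs from labels[i] -- a simple instance-hardness score in [0, 1]."""
        dists = sorted(
            (math.dist(points[i], points[j]), j)
            for j in range(len(points))
            if j != i
        )
        neighbors = [j for _, j in dists[:k]]
        return sum(labels[j] != labels[i] for j in neighbors) / k

    # A point sitting inside the region of the opposite class scores high
    # (hard); a point surrounded by its own class scores near 0 (easy).
    points = [(0, 0), (0, 1), (1, 0), (1, 1), (5, 5), (5, 6), (6, 5), (0.5, 0.5)]
    labels = [0, 0, 0, 0, 1, 1, 1, 1]  # last instance lies deep in class 0's region
    print(k_disagreeing_neighbors(points, labels, 7))  # 0.8 -> likely misclassified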


In molecular recognition

A recent study based on molecular simulations and compliance constants describes molecular recognition as a phenomenon of organisation. Even for small molecules like carbohydrates, the recognition process cannot be predicted or designed, even assuming that each individual hydrogen bond's strength is exactly known.


The law of requisite complexity

Deriving from the law of requisite variety, Boisot and McKelvey formulated the 'Law of Requisite Complexity', which holds that, in order to be efficaciously adaptive, the internal complexity of a system must match the external complexity it confronts.


Positive, appropriate and negative complexity

The application in project management of the Law of Requisite Complexity, as proposed by Stefan Morcov, is the analysis of positive, appropriate and negative complexity (Morcov, S. (2021). ''Managing Positive and Negative Complexity: Design and Validation of an IT Project Complexity Management Framework''. KU Leuven University. Available at https://lirias.kuleuven.be/retrieve/637007).


In project management

Project complexity is the property of a project which makes it difficult to understand, foresee, and keep under control its overall behavior, even when given reasonably complete information about the project system.


In systems engineering

Maik Maurer considers complexity as a reality in engineering. He proposed a methodology for managing complexity in systems engineering:
1. Define the system.
2. Identify the type of complexity.
3. Determine the strategy.
4. Determine the method.
5. Model the system.
6. Implement the method.


Applications

Computational complexity theory is the study of the complexity of problems – that is, the difficulty of solving them. Problems can be classified by complexity class according to the time it takes for an algorithm – usually a computer program – to solve them as a function of the problem size. Some problems are difficult to solve, while others are easy. For example, some difficult problems need algorithms that take an exponential amount of time in terms of the size of the problem to solve. Take the travelling salesman problem, for example. It can be solved in time O(n^2 2^n) (where ''n'' is the size of the network to visit – the number of cities the travelling salesman must visit exactly once), as sketched below. As the size of the network of cities grows, the time needed to find the route grows (more than) exponentially.
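
That O(n^2 2^n) bound is achieved by the Held–Karp dynamic programming algorithm. The sketch below is a standard textbook formulation, with variable names and a small test matrix of our own choosing: it tabulates, for every subset of cities containing the start and every endpoint in that subset, the cheapest path visiting exactly that subset, then closes the tour back to the start – still exponential, but far better than checking all (n-1)! orderings.

    from itertools import combinations

    def held_karp(dist):
        """Shortest closed tour over all cities in O(n^2 * 2^n) time.
        dist[i][j] is the distance from city i to city j; the tour starts
        and ends at city 0."""
        n = len(dist)
        # best[(mask, j)]: cheapest path from 0 visiting exactly the cities
        # in bitmask mask (which always includes city 0) and ending at j
        best = {(1 << j | 1, j): dist[0][j] for j in range(1, n)}
        for size in range(3, n + 1):
            for subset in combinations(range(1, n), size - 1):
                mask = 1 | sum(1 << j for j in subset)
                for j in subset:
                    prev = mask ^ (1 << j)
                    best[(mask, j)] = min(
                        best[(prev, k)] + dist[k][j] for k in subset if k != j
                    )
        full = (1 << n) - 1
        return min(best[(full, j)] + dist[j][0] for j in range(1, n))

    dist = [[0, 2, 9, 10],
            [1, 0, 6, 4],
            [15, 7, 0, 8],
            [6, 3, 12, 0]]
    print(held_karp(dist))  # 21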
Even though a problem may be computationally solvable in principle, in actual practice it may not be that simple. These problems might require large amounts of time or an inordinate amount of space. Computational complexity may be approached from many different aspects. Computational complexity can be investigated on the basis of time, memory or other resources used to solve the problem. Time and space are two of the most important and popular considerations when problems of complexity are analyzed. There exists a certain class of problems that, although they are solvable in principle, require so much time or space that it is not practical to attempt to solve them. These problems are called intractable. There is another form of complexity called hierarchical complexity. It is orthogonal to the forms of complexity discussed so far, which are called horizontal complexity.


See also

* Assembly theory
* Chaos theory
* Complexity theory (disambiguation page)
* Complex network
* Complex system
* Cyclomatic complexity
* Digital morphogenesis
* Dual-phase evolution
* Emergence
* Evolution of complexity
* Fractal
* Game complexity
* Holism in science
* Law of Complexity/Consciousness
* Model of hierarchical complexity
* Names of large numbers
* Network science
* Network theory
* Novelty theory
* Occam's razor
* Percolation theory
* Process architecture
* Programming complexity
* Sociology and complexity science
* Systems theory
* Thorngate's postulate of commensurate complexity
* Variety (cybernetics)
* Volatility, uncertainty, complexity and ambiguity
* Arthur Winfree
* Computational irreducibility
* Zero-Force Evolutionary Law
* Project complexity


References


Further reading

* Burgin, M. (1982) "Generalized Kolmogorov complexity and duality in theory of computations", Notices of the Russian Academy of Sciences, v. 25, no. 3, pp. 19–23
* Meyers, R.A. (2009) ''Encyclopedia of Complexity and Systems Science''
* Mitchell, M. (2009) ''Complexity: A Guided Tour''. Oxford University Press, Oxford, UK
* Gershenson, C. (ed.) (2008) ''Complexity: 5 Questions''. Automatic Press / VIP


External links


* Complexity Measures – an article about the abundance of not-that-useful complexity measures
* Introductory complex system course by Melanie Mitchell
* Santa Fe Institute – focusing on the study of complexity science
* Lecture Videos
* Human Sciences and Complexity