
Lateral computing is a lateral thinking approach to solving computing problems. Lateral thinking was popularized by Edward de Bono; the technique is applied to generate creative ideas and solve problems. Similarly, by applying lateral-computing techniques to a problem, it can become much easier to arrive at a computationally inexpensive, easy-to-implement, efficient, innovative or unconventional solution. The traditional or conventional approach to solving computing problems is either to build mathematical models or to use an IF-THEN-ELSE structure. For example, a brute-force search is used in many chess engines, but this approach is computationally expensive and sometimes arrives at poor solutions. It is for problems like this that lateral computing can be useful for finding a better solution. A simple problem of truck backup can be used to illustrate lateral computing: this is one of the difficult tasks for traditional computing techniques, and it has been efficiently solved by the use of fuzzy logic (which is a lateral-computing technique). Lateral computing sometimes arrives at a novel solution to a particular computing problem by using models of how living beings, such as humans, ants and honeybees, solve a problem; how pure crystals are formed by annealing; the evolution of living beings; or quantum mechanics.


From lateral-thinking to lateral-computing

Lateral thinking is a technique for creative thinking and problem solving. The brain, as the centre of thinking, has a self-organizing information system. It tends to create patterns, and the traditional thinking process uses them to solve problems. The lateral-thinking technique proposes to escape from this patterning to arrive at better solutions through new ideas. Provocative use of information processing is the basic underlying principle of lateral thinking. The provocative operator (PO) is something which characterizes lateral thinking. Its function is to generate new ideas by provocation and by providing an escape route from old ideas. It creates a provisional arrangement of information. Water logic is contrasted with traditional or rock logic. Water logic has boundaries which depend on circumstances and conditions, while rock logic has hard boundaries. Water logic, in some ways, resembles fuzzy logic.


Transition to lateral-computing

Lateral computing makes a provocative use of information processing similar to lateral thinking. This can be explained with evolutionary computing, which is a very useful lateral-computing technique. Evolution proceeds by change and selection: while random mutation provides change, the selection is through survival of the fittest. The random mutation works as a provocative information process and provides a new avenue for generating better solutions to the computing problem. The term "lateral computing" was first proposed by Prof. C.R. Suthikshn Kumar, and the First World Congress on Lateral Computing (WCLC 2004) was organized with international participants during December 2004.

Lateral computing takes analogies from real-world examples such as:
* How slow cooling of a hot gaseous state results in pure crystals (annealing)
* How neural networks in the brain solve problems such as face and speech recognition
* How simple insects such as ants and honeybees solve some sophisticated problems
* How the evolution of human beings from molecular life forms is mimicked by evolutionary computing
* How living organisms defend themselves against diseases and heal their wounds
* How electricity is distributed by grids

Differentiating factors of "lateral computing":
* It does not directly approach the problem through mathematical means.
* It uses indirect models or looks for analogies to solve the problem.
* It is radically different from what is in vogue, such as using photons for computing in optical computing. This is rare, as most conventional computers use electrons to carry signals.
* Sometimes the lateral-computing techniques are surprisingly simple and deliver high-performance solutions to very complex problems.
* Some of the techniques in lateral computing use "unexplained jumps". These jumps may not look logical. An example is the use of the "mutation" operator in genetic algorithms.


Convention – lateral

It is very hard to draw a clear boundary between conventional and lateral computing. Over a period of time, some unconventional computing techniques become an integral part of mainstream computing, so there will always be an overlap between conventional and lateral computing. It is a tough task to classify a computing technique as either conventional or lateral. The boundaries are fuzzy, and one may approach the classification with fuzzy sets.


Formal definition

Lateral computing is a fuzzy set of all computing techniques which use an unconventional computing approach. Hence lateral computing includes those techniques which use semi-conventional or hybrid computing. The degree of membership for lateral-computing techniques is greater than 0 in the fuzzy set of unconventional computing techniques. The following brings out some important differentiators for lateral computing.

Conventional computing:
* The problem and technique are directly correlated.
* The problem is treated with rigorous mathematical analysis.
* Mathematical models are created.
* The computing technique can be analyzed mathematically.

Lateral computing:
* The problem may hardly have any relation to the computing technique used.
* Problems are approached by analogies, such as the human information processing model, annealing, etc.
* Sometimes the computing technique cannot be mathematically analyzed.


Lateral computing and parallel computing

Parallel computing focuses on improving the performance of computers and algorithms through the use of several computing elements (such as processing elements). The computing speed is improved by using several computing elements. Parallel computing is an extension of conventional sequential computing. In lateral computing, however, the problem is solved using unconventional information processing, whether with sequential or parallel computing.


A review of lateral-computing techniques

There are several computing techniques which fit the lateral-computing paradigm. Here is a brief description of some of these techniques:


Swarm intelligence

Swarm intelligence (SI) is the property of a system whereby the collective behaviours of (unsophisticated) agents, interacting locally with their environment, cause coherent functional global patterns to emerge. SI provides a basis with which it is possible to explore collective (or distributed) problem solving without centralized control or the provision of a global model.

One interesting swarm-intelligence technique is the ant colony optimization algorithm:
* Ants are behaviourally unsophisticated; collectively they perform complex tasks. Ants have highly developed, sophisticated sign-based communication.
* Ants communicate using pheromones; trails are laid that can be followed by other ants.
* Routing problem: ants drop different pheromones, used to compute the "shortest" path from source to destination(s) (a minimal sketch of this idea appears below).
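
The routing idea above can be sketched in a few lines of Python. This is a loose, illustrative ant-colony-style search on a toy graph, not the canonical ACO formulation; the graph, parameter values and function names are assumptions made for this example.

```python
# Minimal ant-colony-style shortest-path sketch (illustrative, not canonical ACO).
import random

graph = {                      # edge weights (distances) of a tiny directed graph
    "A": {"B": 1.0, "C": 4.0},
    "B": {"C": 1.0, "D": 5.0},
    "C": {"D": 1.0},
    "D": {},
}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def walk(src, dst):
    """One ant builds a path, preferring edges with more pheromone and shorter length."""
    path, node = [src], src
    while node != dst:
        nxt = graph[node]
        weights = [pheromone[(node, v)] / graph[node][v] for v in nxt]
        node = random.choices(list(nxt), weights=weights)[0]
        path.append(node)
    return path

best = None
for _ in range(200):                       # many ants explore the graph
    path = walk("A", "D")
    length = sum(graph[u][v] for u, v in zip(path, path[1:]))
    for edge in zip(path, path[1:]):       # shorter paths deposit more pheromone
        pheromone[edge] += 1.0 / length
    for edge in pheromone:                 # evaporation keeps old trails from dominating
        pheromone[edge] *= 0.95
    if best is None or length < best[1]:
        best = (path, length)

print(best)   # typically (['A', 'B', 'C', 'D'], 3.0)
```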


Agent-based systems

Agents are encapsulated computer systems that are situated in some environment and are capable of flexible, autonomous action in that environment in order to meet their design objectives. Agents are considered to be autonomous (independent, not controllable), reactive (responding to events), pro-active (initiating actions of their own volition), and social (communicative). Agents vary in their abilities: they can be static or mobile, and may or may not be intelligent. Each agent may have its own task and/or role. Agents, and multi-agent systems, are used as a metaphor to model complex distributed processes. Such agents invariably need to interact with one another in order to manage their inter-dependencies. These interactions involve agents cooperating, negotiating and coordinating with one another.

Agent-based systems are computer programs that try to simulate various complex phenomena via virtual "agents" that represent the components of a business system. The behaviours of these agents are programmed with rules that realistically depict how business is conducted. As widely varied individual agents interact in the model, the simulation shows how their collective behaviours govern the performance of the entire system; for instance, the emergence of a successful product or an optimal schedule. These simulations are powerful strategic tools for "what-if" scenario analysis: as managers change agent characteristics or "rules", the impact of the change can easily be seen in the model output.
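
As an illustration of how simple local rules can produce emergent global behaviour, here is a minimal agent-based sketch in Python. The agents, their opinion rule and all parameters are assumptions invented for this example, not taken from the original text.

```python
# A toy agent-based simulation: each agent follows a simple local rule,
# and a global pattern (consensus) emerges from their interactions.
import random

class Agent:
    def __init__(self):
        self.opinion = random.choice([0, 1])   # each agent starts with a random opinion

    def step(self, neighbours):
        # Local rule: adopt the majority opinion of the neighbours.
        votes = sum(n.opinion for n in neighbours)
        if votes * 2 > len(neighbours):
            self.opinion = 1
        elif votes * 2 < len(neighbours):
            self.opinion = 0

agents = [Agent() for _ in range(50)]
for _ in range(100):                            # repeated local interactions
    for i, agent in enumerate(agents):
        neighbours = [agents[(i - 1) % 50], agents[(i + 1) % 50]]
        agent.step(neighbours)

print(sum(a.opinion for a in agents), "agents now hold opinion 1")
```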


Grid computing

By analogy, a computational grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities. The applications of grid computing are in:
* Chip design, cryptographic problems, medical instrumentation, and supercomputing.
* Distributed supercomputing applications that use grids to aggregate substantial computational resources in order to tackle problems that cannot be solved on a single system.


Autonomic computing

The autonomic nervous system governs our heart rate and body temperature, thus freeing our conscious brain from the burden of dealing with these and many other low-level, yet vital, functions. The essence of autonomic computing is self-management, the intent of which is to free system administrators from the details of system operation and maintenance. The four aspects of autonomic computing are:
* Self-configuration
* Self-optimization
* Self-healing
* Self-protection
This is a grand challenge promoted by IBM.


Optical computing

Optical computing uses photons rather than conventional electrons for computing. There are quite a few instances of optical computers and successful uses of them. Conventional logic gates use semiconductors, which use electrons for transporting signals. In the case of optical computers, the photons in a light beam are used to do computation. There are numerous advantages of using optical devices for computing, such as immunity to electromagnetic interference, large bandwidth, etc.


DNA computing

DNA computing uses strands of DNA to encode the instance of the problem and to manipulate them, using techniques commonly available in any molecular biology laboratory, in order to simulate operations that select the solution of the problem if it exists. Since the DNA molecule is also a code, but one made up of a sequence of four bases that pair up in a predictable manner, many scientists have thought about the possibility of creating a molecular computer. These computers rely on the much faster reactions of DNA nucleotides binding with their complements, a brute-force method that holds enormous potential for creating a new generation of computers that would be 100 billion times faster than today's fastest PC. DNA computing has been heralded as the "first example of true nanotechnology", and even the "start of a new era", which forges an unprecedented link between computer science and life science. Example applications of DNA computing are in solving the Hamiltonian path problem, which is a known NP-complete problem; the number of required lab operations using DNA grows linearly with the number of vertices of the graph. Molecular algorithms have been reported that solve the cryptographic problem in a polynomial number of steps. As is known, factoring large numbers is a relevant problem in many cryptographic applications.
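
The Hamiltonian path application can be mimicked in silico with a generate-and-test sketch in the spirit of Adleman's DNA experiment: in the laboratory, random paths are encoded as DNA strands and filtered chemically, whereas the sketch below simply enumerates candidate paths. The example graph is an assumption made for illustration.

```python
# Generate-and-test sketch of the Hamiltonian path problem, mimicking the
# DNA-computing approach in software rather than in a test tube.
from itertools import permutations

edges = {("s", "a"), ("a", "b"), ("b", "c"), ("c", "t"), ("a", "c"), ("s", "b")}
vertices = ["s", "a", "b", "c", "t"]

def hamiltonian_paths(start, end):
    """Yield every ordering of the vertices that is a valid path from start to end."""
    for perm in permutations(vertices):
        if perm[0] != start or perm[-1] != end:
            continue                      # "filter out" strands with the wrong endpoints
        if all((u, v) in edges for u, v in zip(perm, perm[1:])):
            yield perm                    # remaining strands encode Hamiltonian paths

print(list(hamiltonian_paths("s", "t")))  # e.g. [('s', 'a', 'b', 'c', 't')]
```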


Quantum computing

In a quantum computer, the fundamental unit of information (called a quantum bit or qubit) is not binary but rather more quaternary in nature. This qubit property arises as a direct consequence of its adherence to the laws of quantum mechanics, which differ radically from the laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend or quantum superposition of these classical states. In other words, a qubit can exist as a zero, a one, or simultaneously as both 0 and 1, with a numerical coefficient representing the probability of each state. A quantum computer manipulates qubits by executing a series of quantum gates, each a unitary transformation acting on a single qubit or pair of qubits. In applying these gates in succession, a quantum computer can perform a complicated unitary transformation on a set of qubits in some initial state.
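
A single qubit and one quantum gate can be sketched classically as a state vector and a unitary matrix. This is only a simulation of the mathematics, assuming NumPy is available; it does not capture the physics of a real quantum computer.

```python
# Minimal state-vector sketch of a single qubit and one quantum gate.
# A qubit state is a length-2 complex vector; gates are 2x2 unitary matrices.
import numpy as np

zero = np.array([1.0, 0.0], dtype=complex)        # |0>
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = hadamard @ zero                            # put the qubit into superposition
probabilities = np.abs(state) ** 2                 # Born rule: |amplitude|^2
print(state)          # [0.707+0j, 0.707+0j]  -> equal blend of |0> and |1>
print(probabilities)  # [0.5, 0.5]            -> 50% chance of measuring 0 or 1
```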


Reconfigurable computing

Field-programmable gate arrays (FPGAs) are making it possible to build truly reconfigurable computers. The computer architecture is transformed by on-the-fly reconfiguration of the FPGA circuitry. The optimal matching between architecture and algorithm improves the performance of the reconfigurable computer. The key feature is hardware performance with software flexibility. For several applications, such as fingerprint matching, DNA sequence comparison, etc., reconfigurable computers have been shown to perform several orders of magnitude better than conventional computers.


Simulated annealing

The simulated annealing algorithm is designed by looking at how pure crystals form from a heated gaseous state while the system is cooled slowly. The computing problem is recast as a simulated annealing exercise and the solutions are arrived at. The working principle of simulated annealing is borrowed from metallurgy: a piece of metal is heated (the atoms are given thermal agitation), and then the metal is left to cool slowly. The slow and regular cooling of the metal allows the atoms to slide progressively into their most stable ("minimal energy") positions. (Rapid cooling would have "frozen" them in whatever position they happened to be in at that time.) The resulting structure of the metal is stronger and more stable. By simulating the process of annealing inside a computer program, it is possible to find answers to difficult and very complex problems. Instead of minimizing the energy of a block of metal or maximizing its strength, the program minimizes or maximizes some objective relevant to the problem at hand.
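
A minimal simulated-annealing sketch in Python might look as follows; the objective function, cooling schedule and step size are illustrative assumptions rather than recommended settings.

```python
# Minimal simulated annealing: minimize a bumpy one-dimensional "energy" function.
import math
import random

def energy(x):
    return x * x + 10 * math.sin(3 * x)       # many local minima

x = random.uniform(-10, 10)                    # start from a random solution
temperature = 10.0
while temperature > 1e-3:
    candidate = x + random.uniform(-1, 1)      # "thermal agitation": a random move
    delta = energy(candidate) - energy(x)
    # Accept improvements always; accept worse moves with a temperature-dependent
    # probability, which lets the search escape local minima while the system is hot.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.99                        # slow, regular cooling

print(round(x, 3), round(energy(x), 3))
```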


Soft computing

One of the main components of lateral computing is soft computing, which approaches problems with the human information processing model. Soft computing comprises fuzzy logic, neuro-computing, evolutionary computing, machine learning and probabilistic-chaotic computing.


Neuro computing

Instead of solving a problem by creating a non-linear equation model of it, the biological neural-network analogy is used to solve the problem. The neural network is trained like a human brain to solve a given problem. This approach has become highly successful in solving some pattern recognition problems.


Evolutionary computing

The genetic algorithm (GA) resembles natural evolution to provide universal optimization. Genetic algorithms start with a population of chromosomes which represent the various solutions. The solutions are evaluated using a fitness function, and a selection process determines which solutions are to be used for the competition process. These algorithms are highly successful in solving search and optimization problems. New solutions are created using evolutionary principles such as mutation and crossover.
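
A small genetic-algorithm sketch illustrates the selection, crossover and mutation steps described above. The fitness function (counting ones in a bit string), population size and rates are assumptions chosen for illustration.

```python
# Minimal genetic algorithm: evolve a bit string toward all ones ("OneMax").
import random

GENES, POP, GENERATIONS = 20, 30, 60

def fitness(chrom):
    return sum(chrom)                               # count of 1s

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.05):
    return [1 - g if random.random() < rate else g for g in chrom]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                # selection: fittest half survives
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best), best)                          # usually 20, i.e. all ones
```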


Fuzzy logic

Fuzzy logic is based on the fuzzy-set concepts proposed by Lotfi Zadeh. The degree-of-membership concept is central to fuzzy sets. Fuzzy sets differ from crisp sets in that they allow an element to belong to a set to a degree (its degree of membership). This approach finds good applications in control problems. Fuzzy logic has found enormous applications and already has a big market presence in consumer electronics such as washing machines, microwave ovens, mobile phones, televisions, camcorders, etc.
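
The degree-of-membership idea can be shown with a short sketch: triangular membership functions and a simple blended ("defuzzified") output, as might be used in a fan-speed controller. The sets, ranges and rule weights are illustrative assumptions.

```python
# Fuzzy membership sketch: "cold", "warm" and "hot" sets and a blended fan-speed rule.
def triangle(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fan_speed(temp_c):
    cold = triangle(temp_c, -10, 0, 20)
    warm = triangle(temp_c, 10, 25, 35)
    hot  = triangle(temp_c, 30, 45, 60)
    # Each fuzzy set "votes" for a speed; the output blends the votes by membership.
    total = cold + warm + hot
    return (cold * 0 + warm * 50 + hot * 100) / total if total else 0.0

print(fan_speed(18))   # an 18 °C reading is partly cold, partly warm -> low speed
print(fan_speed(33))   # partly warm, partly hot -> higher speed
```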


Probabilistic/chaotic computing

Probabilistic computing engines use, for example, probabilistic graphical models such as Bayesian networks. Such computational techniques are referred to as randomization, yielding probabilistic algorithms. When interpreted as a physical phenomenon through classical statistical thermodynamics, such techniques lead to energy savings that are proportional to the probability p with which each primitive computational step is guaranteed to be correct (or, equivalently, to the probability of error, 1 - p). Chaotic computing is based on chaos theory.
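
As a sketch of probabilistic computing with a graphical model, the following toy Bayesian network with two binary variables (Rain and WetGrass) answers a query by exact enumeration; the probabilities are invented for illustration.

```python
# A tiny Bayesian network: Rain -> WetGrass, queried by exact enumeration.
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def p_rain_given_wet(wet=True):
    """P(Rain | WetGrass = wet), computed by enumerating the joint distribution."""
    joint = {rain: P_rain[rain] * P_wet_given_rain[rain][wet] for rain in (True, False)}
    return joint[True] / sum(joint.values())

print(round(p_rain_given_wet(True), 3))   # 0.529: wet grass makes rain more likely
```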


Fractals

Fractals are objects displaying self-similarity at different scales. Fractal generation involves small iterative algorithms. Fractals have dimensions greater than their topological dimensions. The length of a fractal is infinite and its size cannot be measured. It is described by an iterative algorithm, unlike a Euclidean shape, which is given by a simple formula. There are several types of fractals, and Mandelbrot sets are very popular. Fractals have found applications in image processing, image compression, music generation, computer games, etc. The Mandelbrot set is a fractal named after its creator. Unlike the other fractals, even though the Mandelbrot set is self-similar at magnified scales, the small-scale details are not identical to the whole; i.e., the Mandelbrot set is infinitely complex. But the process of generating it is based on an extremely simple equation. The Mandelbrot set ''M'' is a collection of complex numbers. The numbers ''Z'' which belong to ''M'' are computed by iteratively testing the Mandelbrot equation. ''C'' is a constant. If the equation converges for chosen ''Z'', then ''Z'' belongs to ''M''. The Mandelbrot equation is Z_{n+1} = Z_n^2 + C.
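
A membership test for the Mandelbrot set follows directly from the equation: iterate it a bounded number of times and treat points whose orbit stays small as members. The sketch below uses the usual convention of iterating from zero for a tested constant; the iteration limit and escape radius are standard choices but still assumptions of this example.

```python
# Mandelbrot membership sketch: iterate z -> z**2 + c and treat points whose
# orbit stays bounded as members of the set.
def in_mandelbrot(c, max_iter=100):
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # the orbit escaped, so c is not in the set
            return False
    return True                 # still bounded after max_iter steps: assume membership

print(in_mandelbrot(complex(0, 0)))        # True  (the origin is in the set)
print(in_mandelbrot(complex(1, 1)))        # False (escapes quickly)
```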


Randomized algorithm

A randomized algorithm makes arbitrary choices during its execution. This allows a saving in execution time at the beginning of a program. The disadvantage of this method is the possibility that an incorrect solution will occur. A well-designed randomized algorithm will have a very high probability of returning a correct answer. The two categories of randomized algorithms are:
* Monte Carlo algorithms
* Las Vegas algorithms
Consider an algorithm to find the ''k''th element of an array. A deterministic approach would be to choose a pivot element near the median of the list and partition the list around that element. The randomized approach to this problem would be to choose a pivot at random, thus saving time at the beginning of the process. Like approximation algorithms, randomized algorithms can be used to attack tough NP-complete problems more quickly. An advantage over approximation algorithms, however, is that a randomized algorithm will eventually yield an exact answer if executed enough times.
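
The randomized selection idea described above (quickselect with a random pivot) can be written as a short recursive sketch; the test data are an assumption for illustration.

```python
# Randomized selection (quickselect): find the k-th smallest element by
# partitioning around a randomly chosen pivot.
import random

def quickselect(items, k):
    """Return the k-th smallest element (k is 1-based) of a non-empty list."""
    pivot = random.choice(items)                   # random pivot instead of a median
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    if k <= len(smaller):
        return quickselect(smaller, k)
    if k <= len(smaller) + len(equal):
        return pivot
    return quickselect(larger, k - len(smaller) - len(equal))

data = [7, 2, 9, 4, 4, 1, 8]
print(quickselect(data, 3))    # 4, the third-smallest value
```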


Machine learning

Human beings and animals learn new skills, languages and concepts. Similarly, machine learning algorithms provide the capability to generalize from training data. There are two classes of machine learning (ML):
* Supervised ML
* Unsupervised ML
One well-known machine learning technique is the back-propagation algorithm. This mimics how humans learn from examples. The training patterns are repeatedly presented to the network. The error is back-propagated and the network weights are adjusted using gradient descent. The network converges through several hundred iterative computations.
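
A minimal supervised-learning sketch shows the gradient-descent weight adjustment in miniature: a single sigmoid neuron learning the logical OR function. Full back-propagation applies the same update rule layer by layer; the learning rate and iteration count here are illustrative assumptions.

```python
# One sigmoid neuron trained by gradient descent to learn logical OR.
import math, random

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]     # OR truth table
w1, w2, b = random.random(), random.random(), random.random()

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

for _ in range(5000):                          # repeatedly present the training patterns
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + b)
        error = out - target
        grad = error * out * (1 - out)          # derivative of squared error w.r.t. the net input
        w1 -= 0.5 * grad * x1                   # adjust weights down the gradient
        w2 -= 0.5 * grad * x2
        b  -= 0.5 * grad

print([round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data])   # [0, 1, 1, 1]
```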


Support vector machines

This is another class of highly successful machine learning techniques, successfully applied to tasks such as text classification, speaker recognition, image recognition, etc.


Example applications

There are several successful applications of lateral-computing techniques. Here is a small set of applications that illustrates lateral computing:
* Bubble sorting: here the computing problem of sorting is approached with the analogy of bubbles rising in water, by treating the numbers as bubbles and floating them to their natural positions (a minimal sketch appears after this list).
* Truck backup problem: this is an interesting problem of reversing a truck and parking it at a particular location. Traditional computing techniques have found it difficult to solve this problem, but it has been successfully solved by a fuzzy system.
* Balancing an inverted pendulum: this problem involves balancing an inverted pendulum. It has been efficiently solved by neural networks and fuzzy systems.
* Smart volume control for mobile phones: the volume control in mobile phones depends on the background noise level, noise class, hearing profile of the user and other parameters. The measurements of noise level and loudness involve imprecision and subjective measures. The authors have demonstrated the successful use of a fuzzy logic system for volume control in mobile handsets.
* Optimization using genetic algorithms and simulated annealing: problems such as the travelling salesman problem have been shown to be NP-complete. Such problems are solved using algorithms which benefit from heuristics. Some of the applications are in VLSI routing, partitioning, etc. Genetic algorithms and simulated annealing have been successful in solving such optimization problems.
* Programming The Unprogrammable (PTU), involving the automatic creation of computer programs for unconventional computing devices such as cellular automata, multi-agent systems, parallel systems, field-programmable gate arrays, field-programmable analog arrays, ant colonies, swarm intelligence, distributed systems, and the like (Koza et al., 2003).
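
Here is the bubble-sort sketch referred to in the list above; the input list is an arbitrary example.

```python
# Bubble sort: lighter "bubbles" (smaller numbers) repeatedly float up past their
# heavier neighbours until every element reaches its natural position.
def bubble_sort(items):
    items = list(items)
    for last in range(len(items) - 1, 0, -1):
        for i in range(last):
            if items[i] > items[i + 1]:            # swap out-of-order neighbours
                items[i], items[i + 1] = items[i + 1], items[i]
    return items

print(bubble_sort([5, 1, 4, 2, 8]))   # [1, 2, 4, 5, 8]
```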


Summary

Above is a review of lateral-computing techniques. Lateral computing is based on the lateral-thinking approach and applies unconventional techniques to solve computing problems. While most problems are solved with conventional techniques, there are problems which require lateral computing. Lateral computing provides the advantages of computational efficiency, low cost of implementation and better solutions, compared to conventional computing, for several problems. It successfully tackles a class of problems by exploiting tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness and low solution cost. Lateral-computing techniques which use human-like information processing models have been classified as "soft computing" in the literature. Lateral computing is valuable when solving numerous computing problems whose mathematical models are unavailable. It provides a way of developing innovative solutions, resulting in smart systems with very high machine IQ (VHMIQ). This article has traced the transition from lateral thinking to lateral computing, described several lateral-computing techniques, and then outlined their applications. Lateral computing is for building a new generation of artificial intelligence based on unconventional processing.


See also

* Calculation
* Computing
* Computationalism
* Real computation
* Reversible computation
* Hypercomputation
* Computation
* Computational problem
* Unconventional computing


References


Sources

* Suthikshn Kumar C.R. (2022); Lateral Computing Algorithms: Workbook for Programmers, Second Edition (last accessed on 28 November 2022).
* Proceedings of the IEEE (2001), September.
* T. Ross (2004); Fuzzy Logic With Engineering Applications, McGraw-Hill Inc. Publishers.
* B. Kosko (1994); Fuzzy Thinking, Flamingo Publishers.
* E. Aarts and J. Korst (1997); Simulated Annealing and Boltzmann Machines, John Wiley and Sons Publishers.
* K.V. Palem (2003); Energy Aware Computing through Probabilistic Switching: A Study of Limits, Technical Report GIT-CC-03-16, May 2003.
* M. Sima, S. Vassiliadis, S. Cotofona, J.T.J. Van Eijndoven, and K.A. Vissers (2000); A Taxonomy of Custom Computing Machines, in Proceedings of the Progress Workshop, October.
* J. Gleick (1998); Chaos: Making a New Science, Vintage Publishers.
* B. Mandelbrot (1997); The Fractal Geometry of Nature, Freeman Publishers, New York.
* D.R. Hofstadter (1999); Gödel, Escher, Bach: An Eternal Golden Braid, Harper Collins Publishers.
* R.A. Aliev and R.R. Aliev (2001); Soft Computing and Its Applications, World Scientific Publishers.
* Jyh-Shing Roger Jang, Chuen-Tsai Sun and Eiji Mizutani (1997); Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice Hall Publishers.
* John R. Koza, Martin A. Keane, Matthew J. Streeter, William Mydlowec, Jessen Yu, and Guido Lanza (2003); Genetic Programming IV: Routine Human-Competitive Machine Intelligence, Kluwer Academic.
* James Allen (1995); Natural Language Understanding, 2nd Edition, Pearson Education Publishers.
* R. Herken (1995); Universal Turing Machine, Springer-Verlag, 2nd Edition.
* Harry R. Lewis and Christos H. Papadimitriou (1997); Elements of the Theory of Computation, 2nd Edition, Prentice Hall Publishers.
* M. Garey and D. Johnson (1979); Computers and Intractability: A Theory of NP-Completeness, W.H. Freeman and Company Publishers.
* M. Sipser (2001); Introduction to the Theory of Computation, Thomson/Brooks/Cole Publishers.
* K. Compton and S. Hauck (2002); Reconfigurable Computing: A Survey of Systems and Software, ACM Computing Surveys, Vol. 34, No. 2, June 2002, pp. 171–210.
* D.W. Patterson (1990); Introduction to Artificial Intelligence and Expert Systems, Prentice Hall Inc. Publishers.
* E. Charniak and D. McDermott (1999); Introduction to Artificial Intelligence, Addison Wesley.
* R.L. Epstein and W.A. Carnielli (1989); Computability, Computable Functions, Logic and the Foundations of Mathematics, Wadsworth & Brooks/Cole Advanced Books and Software.
* T. Joachims (2002); Learning to Classify Text using Support Vector Machines, Kluwer Academic Publishers.
* T. Mitchell (1997); Machine Learning, McGraw Hill Publishers.
* R. Motwani and P. Raghavan (1995); Randomized Algorithms, Cambridge International Series in Parallel Computation, Cambridge University Press.
* Sun Microsystems (2003); Introduction to Throughput Computing, Technical Report.


Conferences

* ''First World Congress on Lateral Computing'', WCLC 2004, IISc, Bangalore, India, December 2004.
* ''Second World Congress on Lateral Computing'', WCLC 2005, PESIT, Bangalore, India.