A synthetic nervous system (SNS) is a computational neuroscience model that may be developed with the Functional Subnetwork Approach (FSA) to create biologically plausible models of circuits in a nervous system.
The FSA enables the direct analytical tuning of dynamical networks that perform specific operations within the nervous system, without the need for global optimization methods like genetic algorithms and reinforcement learning. The primary use case for an SNS is system control, where the system is most often a simulated biomechanical model or a physical robotic platform.
An SNS is a form of neural network, much like artificial neural networks (ANNs), convolutional neural networks (CNN), and recurrent neural networks (RNN). The building blocks for each of these neural networks are a series of nodes and connections denoted as neurons and synapses. More conventional artificial neural networks rely on training phases where they use large data sets to form correlations and thus “learn” to identify a given object or pattern. When done properly, this training results in systems that can produce a desired result, sometimes with impressive accuracy. However, the systems themselves are typically “black boxes”, meaning there is no readily distinguishable mapping between the structure and function of the network. This makes it difficult to alter the function without simply starting over, or to extract biological meaning, except in specialized cases. The SNS method differentiates itself by using details of both the structure and function of biological nervous systems. The neurons and synapse connections are intentionally designed rather than iteratively changed as part of a learning algorithm.

As in many other computational neuroscience models (Rybak, Eliasmith), the details of a neural model are informed by experimental data wherever possible. Not every study can measure every parameter of the network under investigation, requiring the modeler to make assumptions regarding plausible parameter values. Rybak uses a sampling method where each node is composed of many neurons and each particular neuron’s parameters are pulled from a probability distribution.
Eliasmith uses what they call the Neural Engineering Framework (NEF) in which the user specifies the functions of the network and the synaptic and neural properties are learned over time.
SNS follows a similar approach via the Functional Subnetwork Approach (FSA). FSA allows parameters within the network (e.g., membrane conductances, synaptic conductances) to be designed analytically based on their intended function. As a result, it is possible to use this approach to directly assemble networks that perform basic functions, like addition or subtraction, as well as dynamical operations like differentiation and integration.
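As an illustration of this design process, the following Python sketch tunes a single excitatory “transmission” pathway so that the postsynaptic depolarization equals a chosen gain times the presynaptic depolarization. The gain relation, the assumed operating range R, the reversal potential, and the unit leak conductance are assumptions in the spirit of the FSA literature, not values taken from this article.
<syntaxhighlight lang="python">
# Hypothetical FSA-style design of a single excitatory "transmission" synapse:
# g_max is chosen analytically so that, at steady state, the postsynaptic
# depolarization above rest equals k times the presynaptic depolarization.
# Assumes a leak conductance of 1 uS and the piecewise-linear graded synapse
# described later in this article; all names and values are illustrative.

R = 20.0       # assumed operating range of depolarization above rest (mV)
k = 1.0        # desired pathway gain (k = 1 gives a transmission/addition pathway)
dE_s = 100.0   # assumed synaptic reversal potential relative to rest (mV)

# Steady state of the postsynaptic leaky integrator (leak conductance = 1 uS):
#   U_post = g_syn * dE_s / (1 + g_syn), with g_syn = g_max * U_pre / R.
# Requiring U_post = k * U_pre at U_pre = R yields the design rule:
g_max = k * R / (dE_s - k * R)   # = 0.25 uS for these values

def steady_state_post(U_pre):
    """Steady-state postsynaptic depolarization (mV) for a presynaptic U_pre (mV)."""
    g_syn = g_max * min(max(U_pre / R, 0.0), 1.0)
    return g_syn * dE_s / (1.0 + g_syn)

for U_pre in (0.0, 10.0, 20.0):
    print(U_pre, round(steady_state_post(U_pre), 2))
# The design is exact at U_pre = 0 and U_pre = R and mildly nonlinear in between.
</syntaxhighlight>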
Background and History of Synthetic Nervous Systems
Background
The details of the underlying control networks for many biological systems are not very well understood. However, recent advancements in neuroscience tools and techniques have clarified the cellular and biophysical mechanisms of these networks, and their operation during behavior in complex environments. Although there is a long-standing interest in biologically-inspired robots and robotic platforms, there is a recent interest in incorporating features of biomechanics and neural control, e.g., biomimicry. The SNS method uses data from neuroscience in control systems for neuromechanical simulations and robots. Designing both a robot’s mechanics and controller to capture key aspects of a particular animal may lead to more flexible functionality while suggesting new hypotheses for how the animal’s nervous system works.
Keeping neural models simple facilitates analysis, real-time operation, and tuning. To this end, SNSs primarily model neurons as leaky integrators, which are reasonable approximations of sub-threshold passive membrane dynamics. The leaky integrator also models non-spiking interneurons, which contribute to motor control in some invertebrates (locust, stick insect, ''C. elegans''). If spiking needs to be incorporated into the model, nodes may be represented using leaky integrate-and-fire models.
In addition, other conductances like those of the
Hodgkin-Huxley model can be incorporated into the model.
A model may be initialized with simple components (e.g., leaky integrators), with additional biological details incorporated as needed. The modeler may then increase or decrease the level of biological detail depending upon the intended application. Keeping models simple in this way offers:
* The ability to use dynamical systems analysis by way of balancing biological detail with analytical tractability.
* Fast and computationally inexpensive network dynamic simulations to work effectively in a robotic controller. Thus, complex traditional models, like the cable equation or the full Hodgkin-Huxley action potential model, are typically avoided or simplified for the sake of computational efficiency.
* Sparse, function-dependent connectivity via the Functional Subnetwork Approach (FSA) instead of fully connected (i.e., all-to-all connected) topologies common in machine learning.
While the neuroscientific models are typically simplified for SNS, the method is flexible enough that more features can be incorporated. Consequently, the SNS method can accommodate demand-driven complexity, adding features only where they are needed. For example, persistent sodium channels can be added to just two neurons in a neural circuit to create a half-center oscillator pattern generator without changing the other neurons in the circuit. While these additions may increase computational cost, they grant the system the ability to perform a wider array of interesting behaviors.
History
The term “synthetic nervous system” (SNS) has appeared in the literature since the year 2000 to describe several different computational frameworks for mimicking the functionality of biological nervous systems.
Cynthia Breazeal developed a social robot named “Kismet” while at MIT in the early 2000s. She used the term SNS to refer to her biologically-inspired hierarchical model of cognition, which included systems for low-level sensory feature extraction, attention, perception, motivation, behavior, and motor output. Using this framework, Kismet could respond to people by abstracting its sensory information into motivation for responsive behaviors and the corresponding motor output.
In 2005, Inman Harvey used the term in a review article on his field, Evolutionary Robotics. In his article, Harvey uses the term SNS to refer to the evolved neural controller for a simulated agent. He does not explicitly define the term SNS; instead, he uses the term to differentiate the evolved neural controller from one created ''via'' alternative approaches, e.g.,
multi-layer perceptron (MLP) networks.
In 2008, Thomas R. Insel, MD, the director of the National Institute of Mental Health, was quoted in an American Academy of Neurology interview calling for a “clear moon shot… [t]o motivate a decade of new discovery [a]nd basic research on brain anatomy”. As part of that interview, Dr. Insel suggested building a “synthetic nervous system” as one such motivational moon shot to drive ongoing and future research. The technical details of what such an SNS would entail were not described.
An article published as part of the International Work-Conference on Artificial Neural Networks (IWANN) proposes a “synthetic nervous system” as an alternative to artificial neural networks (ANNs) based on machine learning. In particular, an SNS should be able to include or learn new information without forgetting what it has already learned. However, the authors do not propose a computational neuroscience framework for constructing such networks. Instead, they propose a homeostatic network of the robot’s “needs”, in which the robot takes actions to satisfy its needs and return to homeostasis. Over time, the robot learns which actions to take in response to its needs.
A dissertation from Prof. Joseph Ayers’ lab at Northeastern University uses a similar term in its title but never explicitly defines it. The topic of the dissertation is “RoboLobster, a biomimetic robot controlled by an electronic nervous system simulation”.
Other publications from Prof. Ayers use the term “electronic nervous system” (ENS) to describe similar work. In each of these studies, Prof. Ayers uses a robot that is controlled by a network of simplified dynamical neural models whose structure mimics specific networks from the model organism.
The choice of neural model reflects a balance between simulating the dynamics of the nervous system, which motivates mathematical complexity, and ensuring the simulation runs in real time, which motivates mathematical simplicity.
A 2017 research article from Prof. Alexander Hunt, Dr. Nicholas Szczecinski, and Prof. Roger Quinn uses the term SNS and implicitly defines it as “neural [o]r neuro-mechanical models…composed of non-spiking leaky integrator neuron models”.
Similar to work by Ayers et al., Hunt et al. apply the term SNS to refer to a simplified dynamical simulation of neurons and synapses used in the closed-loop control of robotic hardware. Subsequent articles by these authors present the Functional Subnetwork Approach for tuning SNSs constructed from these and other simplified dynamical neural models (e.g., leaky integrate-and-fire), as well as further SNS models of the nervous system.
Comparing the diversity of works that use the term SNS produces an implicit definition of SNS:
* Their network’s structure and behavioral goals are grounded in biology
* Their priority is to learn more about nervous system function, with the secondary goal of creating a more effective robot control system
* They are typically posed as an alternative to more abstracted neural networks with simplified (e.g., all-to-all) network structure (e.g., multi-layer perceptron networks, deep neural networks)
* They are computational models of the nervous system meant for closed-loop control of the behavior of a simulated or robotic agent within an environment.
Comparison to Other Neural Networks
SNSs share some features with machine learning networks like Artificial Neural Networks (ANN), Convolutional Neural Networks (CNN), and Recurrent Neural Networks (RNN). All of these networks are composed of neurons and synapses inspired in some way by biological nervous systems. These components are used to build neural circuits with the express purpose of accomplishing a specific task. ANN simply refers to a collection of nodes (neurons) connected such that they loosely model a biological brain. This is a rather broad definition and as a consequence there are many subcategories of ANN, two of which are CNN and RNN. CNNs are primarily used for image recognition and classification. Their layer-to-layer connections implement
convolutional kernels across small areas of the image, which map the input to the system (typically an image) onto a collection of features. ANNs and CNNs are only loosely associated with SNS in that they share the same general building blocks of neurons and synapses, though the methods used to model each component vary between the networks. Of the three, RNNs are the most closely related to SNS. SNSs use the same leaky-integrator neuron models utilized in RNNs. This is advantageous as neurons inherently act as low-pass filters, which is useful for robotic applications where such filtering is often applied to reduce noise for both sensing and control purposes. Both models also exhibit dynamic responses to inputs. While predicting the responses of a complicated network can be difficult, the dynamics of each node are relatively simple in that each is a system of first-order differential equations (as opposed to fractional derivatives). The key differences that distinguish SNS from these neural networks are the synaptic connections and the general architecture of the neural circuit.
RNN structures generally present as large, highly connected or even all-to-all connected layers of neurons. Instead of these layers, SNS relies on functional subnetworks which are tuned to perform specific operations and then assembled into larger networks with explainable functions. These are significantly more tractable than a typical machine learning network. The tradeoff of SNS is that it typically takes more time to design and tune the network, but it does not require a training phase involving large amounts of computing power and training data. The other key difference is that SNS synapses are conductance based rather than current based, which makes the dynamics non-linear, unlike an RNN. This allows for the modeling of
modulatory neural pathways since the synapses can alter the net membrane conductance of a postsynaptic neuron without injecting current. It also enables the functional subnetwork approach to encompass addition, subtraction, multiplication, division, differentiation, and integration operations using the same family of functions.
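As a rough illustration of this difference, the short Python sketch below contrasts a current-based synapse, whose output depends only on presynaptic activity, with a conductance-based synapse, whose current also depends on the postsynaptic voltage; the function names and numbers are illustrative assumptions rather than code from any particular SNS implementation.
<syntaxhighlight lang="python">
# Illustrative contrast between a current-based synapse (typical of RNNs) and a
# conductance-based synapse (used in SNS). Names and values are arbitrary.

def current_based_input(w, act_pre):
    # The delivered current depends only on the presynaptic activity.
    return w * act_pre

def conductance_based_input(g_syn, E_syn, V_post):
    # The delivered current also depends on the postsynaptic voltage. When
    # E_syn equals V_post no current flows, yet the added conductance still
    # changes the effective gain of the postsynaptic neuron, which is how
    # modulatory pathways can be represented.
    return g_syn * (E_syn - V_post)

print(current_based_input(0.5, 1.0))               # 0.5
print(conductance_based_input(0.5, -60.0, -60.0))  # 0.0 current, conductance still added
</syntaxhighlight>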
Neuron and Synapse Models
Non-spiking Neuron
SNS networks are composed mainly of non-spiking leaky integrator nodes to which complexity may be added if needed. Such dynamics model non-spiking neurons like those studied extensively in invertebrates (e.g., nematode, locust, cockroach) or may represent the mean activity of a population of spiking neurons. The dynamics of the membrane voltage V of a non-spiking neuron are governed by the differential equation

C_m \frac{dV}{dt} = I_{leak} + I_{syn} + I_{app}

where C_m is the membrane capacitance, I_app is an arbitrary current injected into the cell e.g., ''via'' a current clamp, and I_leak and I_syn are the leak and synaptic currents, respectively. The leak current

I_{leak} = G_m (E_r - V)

where G_m is the conductance of the cell membrane and E_r is the rest potential across the cell membrane. The synaptic current

I_{syn} = \sum_{i=1}^{n} G_{s,i} (E_{s,i} - V)

where n is the number of synapses that impinge on the cell, G_{s,i} is the instantaneous synaptic conductance of the i-th incoming synapse, and E_{s,i} is the reversal potential of the i-th incoming synapse.
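A minimal forward-Euler simulation of these dynamics is sketched below in Python; the parameter values and the single fixed-conductance synapse are illustrative assumptions.
<syntaxhighlight lang="python">
# Forward-Euler simulation of one non-spiking leaky-integrator neuron:
#   C_m dV/dt = G_m (E_r - V) + G_s (E_s - V) + I_app
# All parameter values are illustrative assumptions.

C_m   = 5.0     # membrane capacitance (nF)
G_m   = 1.0     # membrane (leak) conductance (uS)
E_r   = -60.0   # rest potential (mV)
G_s   = 0.5     # conductance of a single incoming synapse, held fixed here (uS)
E_s   = -40.0   # reversal potential of that synapse (mV)
I_app = 0.0     # applied (clamp) current (nA)

dt = 0.1        # time step (ms)
V = E_r
for _ in range(int(100.0 / dt)):          # simulate 100 ms
    I_leak = G_m * (E_r - V)
    I_syn  = G_s * (E_s - V)
    V += dt * (I_leak + I_syn + I_app) / C_m

print(round(V, 2))  # settles near (G_m*E_r + G_s*E_s + I_app)/(G_m + G_s) = -53.33 mV
</syntaxhighlight>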
Graded Chemical Synapse

Non-spiking neurons communicate ''via'' graded chemical synapses. Typically, synaptic conductances are modeled with a continuous function like a sigmoid, but in an SNS this conductance is approximated by the following piecewise-linear function:

G_s(V_{pre}) = g_{max} \cdot \min\left( \max\left( \frac{V_{pre} - E_{lo}}{E_{hi} - E_{lo}},\ 0 \right),\ 1 \right)

As shown in the corresponding figure, this allows the conductance to vary between 0 and a prescribed or designed maximum value (g_max) depending on the presynaptic potential (V_pre); here E_lo and E_hi denote the presynaptic potentials at which the synapse begins to conduct and at which it saturates, respectively. A piecewise approach is used to ensure exactly 0 conductance, and therefore current, at low activation potentials. This approximates a feature of spiking neuron activity in that no information is transmitted when the neuron isn’t spiking/active. Furthermore, this approximation eliminates transcendental functions, enabling analytical calculations of dynamical properties. While this does prevent the network activity from being differentiable, no gradient-based learning methods (like backpropagation) are employed, so this is not a drawback.
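The piecewise-linear conductance translates directly into code; in the Python sketch below the numerical values are illustrative assumptions.
<syntaxhighlight lang="python">
# Piecewise-linear graded synapse: the conductance rises linearly from 0 at the
# lower threshold E_lo to g_max at the upper threshold E_hi, and saturates
# outside that range. Parameter values are illustrative assumptions.

def graded_conductance(V_pre, g_max=0.5, E_lo=-60.0, E_hi=-40.0):
    frac = (V_pre - E_lo) / (E_hi - E_lo)
    frac = min(max(frac, 0.0), 1.0)   # clamp to [0, 1]
    return g_max * frac

for V_pre in (-70.0, -60.0, -50.0, -40.0, -30.0):
    print(V_pre, graded_conductance(V_pre))
# -70 and -60 give exactly 0 (no transmission below threshold);
# -50 gives g_max/2; -40 and above saturate at g_max.
</syntaxhighlight>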
Persistent Sodium Current
It was previously mentioned that additional ion channels could be incorporated to elicit more interesting behaviors from non-spiking neuron models. The persistent sodium current is one such addition. A persistent sodium current activates at sub-threshold membrane potentials and can depolarize the membrane enough to induce action potential firing, while also being slow to inactivate. In the context of neuroscientific models, this is useful for applications such as pattern generators, where it is desired that a neuron’s potential can be rapidly increased and remain elevated until inhibited by another neural signal or applied current.
The model for the behavior of this channel is based on the m and h gating present in the full Hodgkin-Huxley model. The main difference is that this model only uses one m gate instead of three. The equations governing this behavior can be found here and in this paper.
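The Python sketch below illustrates one common way such a channel is written, with an instantaneous activation gate m∞(V) and a slow inactivation gate h; the sigmoid forms and all parameter values are assumptions for illustration, not the exact equations from the cited sources.
<syntaxhighlight lang="python">
import math

# Illustrative persistent sodium (NaP) current with instantaneous activation
# m_inf(V) and one slow inactivation gate h:
#   I_NaP = G_Na * m_inf(V) * h * (E_Na - V)
#   dh/dt = (h_inf(V) - h) / tau_h
# Sigmoid shapes and all parameter values are assumptions for illustration.

G_Na  = 1.5     # maximal conductance (uS)
E_Na  = 50.0    # sodium reversal potential (mV)
tau_h = 300.0   # slow inactivation time constant (ms)

def m_inf(V):
    return 1.0 / (1.0 + math.exp(-(V + 40.0) / 5.0))  # depolarization opens the gate

def h_inf(V):
    return 1.0 / (1.0 + math.exp((V + 48.0) / 6.0))   # depolarization slowly closes it

def i_nap(V, h):
    return G_Na * m_inf(V) * h * (E_Na - V)

def step_h(V, h, dt):
    return h + dt * (h_inf(V) - h) / tau_h

# At a depolarized potential the current is initially large and then slowly
# decays as h inactivates.
V, h, dt = -40.0, 1.0, 1.0
for _ in range(5):
    print(round(i_nap(V, h), 2))
    h = step_h(V, h, dt)
</syntaxhighlight>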
Integrate-and-Fire
Unless explicitly studying or utilizing the Hodgkin-Huxley model for action potentials, spiking neurons can be modeled via the integrate-and-fire method. This is significantly more computationally efficient than Hodgkin-Huxley, making it easier to simulate much larger networks. In particular,
leaky integrate-and-fire (LIF) neurons are used for SNS.
As the name suggests, this model accounts for membrane potential leak behavior representing ion diffusion across the membrane. This integrate-and-fire model is very similar to the non-spiking neuron described above, with the key addition of a firing threshold parameter. When the neuron potential depolarizes to this threshold, the neuron “spikes” by instantaneously resetting to its resting potential.
While these do not provide the same diversity of dynamical responses as Hodgkin-Huxley, they are usually sufficient for SNS applications and can be analyzed mathematically, which is crucial for network tractability. Please refer to the linked Wikipedia article and paper for the equations associated with the LIF neuron model.
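A minimal LIF sketch in Python is shown below; the threshold, reset, and input values are illustrative assumptions.
<syntaxhighlight lang="python">
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane integrates like
# the non-spiking model above, but when V reaches the firing threshold the
# neuron "spikes" and V is instantaneously reset to the rest potential.
# All parameter values are illustrative assumptions.

C_m, G_m = 5.0, 1.0     # capacitance (nF) and leak conductance (uS)
E_r   = -60.0           # rest (and reset) potential (mV)
V_th  = -45.0           # firing threshold (mV)
I_app = 20.0            # constant applied current (nA)

dt, T = 0.1, 100.0      # time step and duration (ms)
V, spike_times = E_r, []

for step in range(int(T / dt)):
    V += dt * (G_m * (E_r - V) + I_app) / C_m
    if V >= V_th:
        spike_times.append(step * dt)   # record the spike time
        V = E_r                         # instantaneous reset

print(len(spike_times), "spikes in", T, "ms")
</syntaxhighlight>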
Izhikevich Model
Spiking neurons can also be modeled in a computationally efficient manner without sacrificing the rich behaviors exhibited in biological neural activity.
The Izhikevich model can produce spiking behaviors approximately as plausible as Hodgkin-Huxley but with comparable computational efficiency to the integrate-and-fire method. To accomplish this, Izhikevich reduces the Hodgkin-Huxley model to a two-dimensional set of ordinary differential equations via bifurcation methods.
These can be seen here:

\frac{dv}{dt} = 0.04v^2 + 5v + 140 - u + I

\frac{du}{dt} = a(bv - u)

where the membrane potential resets after spiking as described by:

\text{if } v \geq 30 \text{ mV, then } v \leftarrow c \text{ and } u \leftarrow u + d

v is a dimensionless variable representing the membrane potential. u is a dimensionless variable representing membrane recovery, which accounts for the ion current behaviors, specifically those of K⁺ and Na⁺. a, b, c, and d are dimensionless parameters that can be altered to shape the signal into different neuronal response patterns. This enables chattering, bursting, and continuous spiking with frequency adaptation, which constitute a richer array of behaviors than the basic integrate-and-fire method can produce.
The coefficients in the v equation were acquired via data fitting to a particular neuron’s spiking patterns (a cortical neuron in this case) to get the potentials in the mV range and time on the scale of ms. It is possible to use other neurons to fit the spike initiation dynamics; they will simply produce different coefficients.
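For illustration, these update equations can be integrated with a simple Euler scheme as in the Python sketch below; the regular-spiking parameter set (a = 0.02, b = 0.2, c = −65, d = 8) is one commonly quoted example, and the constant input current is an arbitrary assumption.
<syntaxhighlight lang="python">
# Izhikevich model integrated with a simple Euler scheme. The regular-spiking
# parameter set (a=0.02, b=0.2, c=-65, d=8) and the constant input are
# commonly quoted example values, used here only for illustration.

a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * -65.0      # initial membrane potential and recovery variable
I = 10.0                     # constant input current
dt = 0.25                    # time step (ms); kept small for Euler stability

spike_times = []
for step in range(4000):     # 1 s of simulated time
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * (a * (b * v - u))
    if v >= 30.0:            # spike: reset v and increment the recovery variable
        spike_times.append(step * dt)
        v, u = c, u + d

print(len(spike_times), "spikes in 1 s of simulated time")
</syntaxhighlight>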
For more information on the Izhikevich model and the bifurcation methods used to develop it, please read the following.
Rulkov Map
The Rulkov map forgoes complex ion channel-based models composed of many non-linear differential equations in favor of a two-dimensional map.
This map expresses slow and fast dynamics, which is vital for representing both slow oscillations and fast spikes and bursts. The model is shown below:

x_{n+1} = f(x_n,\ y_n + \beta_n)

y_{n+1} = y_n - \mu(x_n + 1) + \mu\sigma_n

x is the fast dynamical variable and represents the membrane potential, while y is the slow dynamical variable and does not have explicit biological meaning. β_n and σ_n are used to describe external influences and help model the dynamics of stimuli like injected/synaptic and tonic/bias currents. Small values of μ result in slow changes in y that account for its slower behavior. Assuming a constant external influence (β_n = β, σ_n = σ), the function f can be written as the following discontinuous function: