Hebbian theory is a neuroscientific theory claiming that an increase in synaptic efficacy arises from a presynaptic cell's repeated and persistent stimulation of a postsynaptic cell. It is an attempt to explain synaptic plasticity, the adaptation of brain neurons during the learning process. It was introduced by Donald Hebb in his 1949 book ''The Organization of Behavior''.
The theory is also called Hebb's rule, Hebb's postulate, and cell assembly theory. Hebb states it as follows:
Let us assume that the persistence or repetition of a reverberatory activity (or "trace") tends to induce lasting cellular changes that add to its stability. ... When an axon of cell ''A'' is near enough to excite a cell ''B'' and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that ''A''’s efficiency, as one of the cells firing ''B'', is increased.
The theory is often summarized as "Cells that fire together wire together." However, Hebb emphasized that cell ''A'' needs to "take part in firing" cell ''B'', and such causality can occur only if cell ''A'' fires just before, not at the same time as, cell ''B''. This aspect of causation in Hebb's work foreshadowed what is now known about ''spike-timing-dependent plasticity'', which requires temporal precedence.
The theory attempts to explain associative or ''Hebbian learning'', in which simultaneous activation of cells leads to pronounced increases in synaptic strength between those cells. It also provides a biological basis for errorless learning methods for education and memory rehabilitation. In the study of neural networks in cognitive function, it is often regarded as the neuronal basis of unsupervised learning.
Hebbian engrams and cell assembly theory
Hebbian theory concerns how neurons might connect themselves to become
engrams. Hebb's theories on the form and function of cell assemblies can be understood from the following:
The general idea is an old one, that any two cells or systems of cells that are repeatedly active at the same time will tend to become 'associated' so that activity in one facilitates activity in the other.
Hebb also wrote:
When one cell repeatedly assists in firing another, the axon of the first cell develops synaptic knobs (or enlarges them if they already exist) in contact with the soma of the second cell.
Alan Allport posits additional ideas regarding cell assembly theory and its role in forming engrams, along the lines of the concept of auto-association, described as follows:
If the inputs to a system cause the same pattern of activity to occur repeatedly, the set of active elements constituting that pattern will become increasingly strongly interassociated. That is, each element will tend to turn on every other element and (with negative weights) to turn off the elements that do not form part of the pattern. To put it another way, the pattern as a whole will become 'auto-associated'. We may call a learned (auto-associated) pattern an engram.
Work in the laboratory of Eric Kandel has provided evidence for the involvement of Hebbian learning mechanisms at synapses in the marine gastropod ''Aplysia californica''. Experiments on Hebbian synapse modification mechanisms at the central nervous system synapses of vertebrates are much more difficult to control than are experiments with the relatively simple peripheral nervous system synapses studied in marine invertebrates. Much of the work on long-lasting synaptic changes between vertebrate neurons (such as long-term potentiation) involves the use of non-physiological experimental stimulation of brain cells. However, some of the physiologically relevant synapse modification mechanisms that have been studied in vertebrate brains do seem to be examples of Hebbian processes. One such study reviews results from experiments indicating that long-lasting changes in synaptic strengths can be induced by physiologically relevant synaptic activity working through both Hebbian and non-Hebbian mechanisms.
Principles
From the point of view of artificial neurons and artificial neural networks, Hebb's principle can be described as a method of determining how to alter the weights between model neurons. The weight between two neurons increases if the two neurons activate simultaneously, and decreases if they activate separately. Nodes that tend to be either both positive or both negative at the same time have strong positive weights, while those that tend to be opposite have strong negative weights.
The following is a formulaic description of Hebbian learning (many other descriptions are possible):
: <math>w_{ij} = x_i x_j,</math>
where <math>w_{ij}</math> is the weight of the connection from neuron <math>j</math> to neuron <math>i</math> and <math>x_i</math> is the input for neuron <math>i</math>. Note that this is pattern learning (weights updated after every training example). In a Hopfield network, connections <math>w_{ij}</math> are set to zero if <math>i = j</math> (no reflexive connections allowed). With binary neurons (activations either 0 or 1), connections would be set to 1 if the connected neurons have the same activation for a pattern.
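As a concrete illustration (not part of Hebb's original formulation), the pattern-learning rule above can be sketched in a few lines of NumPy; the function name and the bipolar example pattern are illustrative choices only:
<syntaxhighlight lang="python">
import numpy as np

def hebbian_update(weights, x, eta=1.0):
    """One pattern-learning step: w_ij += eta * x_i * x_j (outer product of the pattern with itself)."""
    dw = eta * np.outer(x, x)      # Hebbian term x_i * x_j
    np.fill_diagonal(dw, 0.0)      # no reflexive connections (Hopfield convention)
    return weights + dw

# Example: one bipolar pattern over 5 model neurons (illustrative values)
x = np.array([1, -1, 1, 1, -1], dtype=float)
w = hebbian_update(np.zeros((5, 5)), x)
print(w)
</syntaxhighlight>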
When several training patterns are used the expression becomes an average over the individual patterns:
: <math>w_{ij} = \frac{1}{p}\sum_{k=1}^{p} x_i^k x_j^k = \langle x_i x_j \rangle,</math>
where <math>w_{ij}</math> is the weight of the connection from neuron <math>j</math> to neuron <math>i</math>, <math>p</math> is the number of training patterns, <math>x_i^k</math> is the <math>k</math>th input for neuron <math>i</math>, and <math>\langle \cdot \rangle</math> denotes the average over all training patterns. This is learning by epoch (weights updated after all the training examples are presented), with the averaged form applying to both discrete and continuous training sets. Again, in a Hopfield network, connections <math>w_{ij}</math> are set to zero if <math>i = j</math> (no reflexive connections).
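A similar sketch for learning by epoch, averaging the outer products over several stored patterns as in a Hopfield network; the two bipolar patterns are made up for the example:
<syntaxhighlight lang="python">
import numpy as np

def hebbian_epoch_weights(patterns):
    """w_ij = (1/p) * sum_k x_i^k x_j^k, with the diagonal zeroed (no reflexive connections)."""
    X = np.asarray(patterns, dtype=float)   # shape (p, N): one row per training pattern
    W = X.T @ X / len(X)                    # average of the outer products
    np.fill_diagonal(W, 0.0)
    return W

# Two made-up bipolar patterns over 4 neurons
patterns = [[1, -1, 1, -1],
            [1, 1, -1, -1]]
print(hebbian_epoch_weights(patterns))
</syntaxhighlight>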
A variation of Hebbian learning that takes into account phenomena such as blocking and many other neural learning phenomena is the mathematical model of Harry Klopf.
Klopf's model reproduces a great many biological phenomena, and is also simple to implement.
Relationship to unsupervised learning, stability, and generalization
Because of the simple nature of Hebbian learning, based only on the coincidence of pre- and post-synaptic activity, it may not be intuitively clear why this form of plasticity leads to meaningful learning. However, it can be shown that Hebbian plasticity does pick up the statistical properties of the input in a way that can be categorized as unsupervised learning.
This can be shown mathematically in a simplified example. Let us work under the simplifying assumption of a single rate-based neuron of rate <math>y(t)</math>, whose inputs have rates <math>x_1(t), \dots, x_N(t)</math>. The response of the neuron is usually described as a linear combination of its inputs, <math>\sum_i w_i x_i</math>, followed by a response function <math>f</math>:
: <math>y = f\left(\sum_{i=1}^{N} w_i x_i\right).</math>
As defined in the previous sections, Hebbian plasticity describes the evolution in time of the synaptic weight <math>w_i</math>:
: <math>\frac{dw_i}{dt} = \eta\, x_i y,</math>
where <math>\eta</math> is the learning rate.
Assuming, for simplicity, an identity response function <math>f(a) = a</math>, we can write
: <math>\frac{dw_i}{dt} = \eta\, x_i \sum_{j=1}^{N} w_j x_j,</math>
or in matrix form:
: <math>\frac{d\mathbf{w}}{dt} = \eta\, \mathbf{x}\mathbf{x}^T \mathbf{w}.</math>
As in the previous sections, if training by epoch is done, an average <math>\langle \cdot \rangle</math> over the discrete or continuous (time) training set of <math>\mathbf{x}</math> can be taken:
: <math>\frac{d\mathbf{w}}{dt} = \eta\, \langle \mathbf{x}\mathbf{x}^T \rangle \mathbf{w} = \eta\, C \mathbf{w},</math>
where <math>C = \langle \mathbf{x}\mathbf{x}^T \rangle</math> is the correlation matrix of the input under the additional assumption that <math>\langle \mathbf{x} \rangle = 0</math> (i.e. the average of the inputs is zero). This is a system of <math>N</math> coupled linear differential equations. Since <math>C</math> is symmetric, it is also diagonalizable, and the solution can be found, by working in its eigenvector basis, to be of the form
: <math>\mathbf{w}(t) = k_1 e^{\eta\alpha_1 t}\mathbf{c}_1 + k_2 e^{\eta\alpha_2 t}\mathbf{c}_2 + \dots + k_N e^{\eta\alpha_N t}\mathbf{c}_N,</math>
where <math>k_i</math> are arbitrary constants, <math>\mathbf{c}_i</math> are the eigenvectors of <math>C</math>, and <math>\alpha_i</math> are their corresponding eigenvalues.
Since a correlation matrix is always positive semi-definite, the eigenvalues are all non-negative, and one can easily see that the above solution is, in general, exponentially divergent in time.
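A small numerical sketch of this divergence (the input statistics, learning rate, and step size below are arbitrary choices for illustration): integrating <math>d\mathbf{w}/dt = \eta C \mathbf{w}</math> with forward Euler makes the weight norm grow exponentially while the weight direction aligns with the leading eigenvector of <math>C</math>.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3)) @ np.diag([3.0, 1.0, 0.5])   # zero-mean inputs with unequal variances
C = X.T @ X / len(X)                                        # input correlation matrix

eta, dt = 0.1, 0.01
w = rng.normal(size=3)
for _ in range(2000):               # forward-Euler integration of dw/dt = eta * C w
    w = w + dt * eta * (C @ w)

top = np.linalg.eigh(C)[1][:, -1]   # eigenvector of the largest eigenvalue of C
print(np.linalg.norm(w))                        # huge: the plain Hebbian rule diverges
print(abs(np.dot(w / np.linalg.norm(w), top)))  # close to 1: direction aligns with c*
</syntaxhighlight>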
This is an intrinsic problem due to this version of Hebb's rule being unstable: in any network with a dominant signal the synaptic weights will increase or decrease exponentially. Intuitively, this is because whenever the presynaptic neuron excites the postsynaptic neuron, the weight between them is reinforced, causing an even stronger excitation in the future, and so forth, in a self-reinforcing way. One may think a solution is to limit the firing rate of the postsynaptic neuron by adding a non-linear, saturating response function <math>f</math>, but in fact, it can be shown that for ''any'' neuron model, Hebb's rule is unstable. Therefore, network models of neurons usually employ other learning theories such as BCM theory, Oja's rule, or the generalized Hebbian algorithm.
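For comparison, a minimal sketch of Oja's rule, one of the stabilized variants mentioned above; its decay term <math>-y^2 w_i</math> keeps the weight vector bounded (the data stream and learning rate are illustrative):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 3)) @ np.diag([2.0, 1.0, 0.5])   # zero-mean input stream

eta = 0.005
w = rng.normal(size=3)
w /= np.linalg.norm(w)
for x in X:                        # Oja's rule: dw = eta * y * (x - y * w)
    y = w @ x
    w += eta * y * (x - y * w)

print(np.linalg.norm(w))           # stays close to 1 instead of diverging
</syntaxhighlight>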
Regardless, even for the unstable solution above, one can see that, when sufficient time has passed, one of the terms dominates over the others, and
: <math>\mathbf{w}(t) \approx e^{\eta\alpha^* t}\mathbf{c}^*,</math>
where <math>\alpha^*</math> is the ''largest'' eigenvalue of <math>C</math>. At this time, the postsynaptic neuron performs the following operation:
: <math>y(t) \approx e^{\eta\alpha^* t}\,\mathbf{c}^* \cdot \mathbf{x}(t).</math>
Because, again, <math>\mathbf{c}^*</math> is the eigenvector corresponding to the largest eigenvalue of the correlation matrix of the <math>x_i</math>, this corresponds exactly to computing the first principal component of the input.
This mechanism can be extended to performing a full PCA (principal component analysis) of the input by adding further postsynaptic neurons, provided the postsynaptic neurons are prevented from all picking up the same principal component, for example by adding lateral inhibition in the postsynaptic layer. We have thus connected Hebbian learning to PCA, which is an elementary form of unsupervised learning, in the sense that the network can pick up useful statistical aspects of the input and "describe" them in a distilled way in its output.
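A hedged sketch of such an extension using the generalized Hebbian algorithm (Sanger's rule): where the text above mentions lateral inhibition as one way to keep postsynaptic neurons from learning the same component, this rule achieves the same decorrelation through a built-in deflation term (the data and parameters below are illustrative):
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(5000, 4)) @ np.diag([3.0, 2.0, 1.0, 0.5])   # zero-mean inputs

n_components, eta = 2, 0.002
W = rng.normal(scale=0.1, size=(n_components, 4))   # one weight row per postsynaptic neuron

for x in X:                # Sanger's rule: dW = eta * (y x^T - lower_triangular(y y^T) W)
    y = W @ x
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Each learned row should align with a distinct leading eigenvector (principal component)
evecs = np.linalg.eigh(X.T @ X / len(X))[1][:, ::-1]
print([abs(W[i] / np.linalg.norm(W[i]) @ evecs[:, i]) for i in range(n_components)])
</syntaxhighlight>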
Limitations
Despite the common use of Hebbian models for long-term potentiation, Hebb's principle does not cover all forms of synaptic long-term plasticity. Hebb did not postulate any rules for inhibitory synapses, nor did he make predictions for anti-causal spike sequences (presynaptic neuron fires ''after'' the postsynaptic neuron). Synaptic modification may not simply occur only between activated neurons A and B, but at neighboring synapses as well. All forms of heterosynaptic and homeostatic plasticity are therefore considered non-Hebbian. An example is retrograde signaling to presynaptic terminals. The compound most commonly identified as fulfilling this retrograde transmitter role is nitric oxide, which, due to its high solubility and diffusivity, often exerts effects on nearby neurons. This type of diffuse synaptic modification, known as volume learning, is not included in the traditional Hebbian model.
Hebbian learning account of mirror neurons
Hebbian learning and spike-timing-dependent plasticity have been used in an influential theory of how mirror neurons emerge. Mirror neurons are neurons that fire both when an individual performs an action and when the individual sees or hears another perform a similar action. The discovery of these neurons has been very influential in explaining how individuals make sense of the actions of others, by showing that, when a person perceives the actions of others, the person activates the motor programs which they would use to perform similar actions. The activation of these motor programs then adds information to the perception and helps predict what the person will do next based on the perceiver's own motor program. A challenge has been to explain how individuals come to have neurons that respond both while performing an action and while hearing or seeing another perform similar actions.
Christian Keysers and David Perrett suggested that as an individual performs a particular action, the individual will see, hear, and feel the performing of the action. These re-afferent sensory signals will trigger activity in neurons responding to the sight, sound, and feel of the action. Because the activity of these sensory neurons will consistently overlap in time with those of the motor neurons that caused the action, Hebbian learning predicts that the synapses connecting neurons responding to the sight, sound, and feel of an action and those of the neurons triggering the action should be potentiated. The same is true while people look at themselves in the mirror, hear themselves babble, or are imitated by others. After repeated experience of this re-afference, the synapses connecting the sensory and motor representations of an action are so strong that the motor neurons start firing to the sound or the vision of the action, and a mirror neuron is created.
Evidence for this perspective comes from many experiments showing that motor programs can be triggered by novel auditory or visual stimuli after repeated pairing of the stimulus with the execution of the motor program (for a review of the evidence, see Giudice et al., 2009). For instance, people who have never played the piano do not activate brain regions involved in playing the piano when listening to piano music. Five hours of piano lessons, in which the participant is exposed to the sound of the piano each time they press a key, have proven sufficient to trigger activity in motor regions of the brain when piano music is heard later. Consistent with the fact that spike-timing-dependent plasticity occurs only if the presynaptic neuron's firing predicts the postsynaptic neuron's firing, the link between sensory stimuli and motor programs also seems to be potentiated only if the stimulus is contingent on the motor program.
See also
* Dale's principle
* Coincidence detection in neurobiology
* Leabra
* Metaplasticity
* Tetanic stimulation
* Synaptotropic hypothesis
* Neuroplasticity
* Behaviorism
References
External links
* Overview
* Hebbian Learning tutorial (Part 1: Novelty Filtering; Part 2: PCA)
{{DEFAULTSORT:Hebbian Theory}}
Unsupervised learning
Neuroplasticity