Bidirectional associative memory (BAM) is a type of recurrent neural network. BAM was introduced by Bart Kosko in 1988. There are two types of associative memory, auto-associative and hetero-associative. BAM is hetero-associative, meaning that given a pattern it can return another pattern which is potentially of a different size. It is similar to the Hopfield network in that they are both forms of associative memory. However, Hopfield nets return patterns of the same size. It is said to be bi-directional as it can respond to inputs from either the input or the output layer.


Topology

A BAM contains two layers of neurons, which we shall denote X and Y. Layers X and Y are fully connected to each other. Once the weights have been established, input into layer X presents the pattern in layer Y, and vice versa. The layers are connected in both directions (bidirectionally): the weight matrix for signals sent from layer X to layer Y is W, and the weight matrix for signals sent from layer Y to layer X is its transpose W^T. Thus, a single weight matrix serves both directions.
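As a rough sketch of this topology (not from the original article), the bidirectional pass can be written in a few lines of NumPy; the function names bam_forward and bam_backward are hypothetical, and W is assumed to be an n-by-m weight matrix between an n-unit X layer and an m-unit Y layer with bipolar (+1/-1) activations.

    import numpy as np

    def bam_forward(x, W):
        # Send a bipolar pattern from layer X to layer Y through W.
        # (A full implementation would keep the previous state when the
        # net input is exactly zero; np.sign returns 0 in that case.)
        return np.sign(x @ W)

    def bam_backward(y, W):
        # Send a bipolar pattern from layer Y back to layer X through W^T.
        return np.sign(y @ W.T)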


Procedure


Learning

Imagine we wish to store two associations, A1:B1 and A2:B2.

* A1 = (1, 0, 1, 0, 1, 0), B1 = (1, 1, 0, 0)
* A2 = (1, 1, 1, 0, 0, 0), B2 = (1, 0, 1, 0)

These are then transformed into the bipolar forms:

* X1 = (1, -1, 1, -1, 1, -1), Y1 = (1, 1, -1, -1)
* X2 = (1, 1, 1, -1, -1, -1), Y2 = (1, -1, 1, -1)

From there, we calculate M = \sum_i X_i^T Y_i, where X_i^T denotes the transpose of X_i. So,

M = \begin{bmatrix} 2 & 0 & 0 & -2 \\ 0 & -2 & 2 & 0 \\ 2 & 0 & 0 & -2 \\ -2 & 0 & 0 & 2 \\ 0 & 2 & -2 & 0 \\ -2 & 0 & 0 & 2 \end{bmatrix}
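As a check on the construction above, here is a minimal NumPy sketch (an illustration, not code from the article) that builds M as the sum of outer products of the bipolar pairs; the variable names are illustrative only.

    import numpy as np

    X = np.array([[1, -1, 1, -1, 1, -1],   # X1
                  [1,  1, 1, -1, -1, -1]]) # X2
    Y = np.array([[1,  1, -1, -1],         # Y1
                  [1, -1,  1, -1]])        # Y2

    # M = sum_i X_i^T Y_i: stacking the pairs as rows makes this X^T Y (6 x 4).
    M = X.T @ Y
    print(M)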


Recall

To retrieve the association for A1, we multiply A1 by M to get (4, 2, -2, -4), which, when run through a threshold, yields (1, 1, 0, 0), which is B1. To find the reverse association, multiply this result by the transpose of M.
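Continuing in the same vein (again a sketch rather than the article's own code), recall is a matrix product followed by a hard threshold; thresholding at zero and mapping back to binary 0/1 is an assumption that is consistent with the worked numbers above.

    import numpy as np

    # Weight matrix from the learning step (sum of bipolar outer products).
    X = np.array([[1, -1, 1, -1, 1, -1],
                  [1,  1, 1, -1, -1, -1]])
    Y = np.array([[1,  1, -1, -1],
                  [1, -1,  1, -1]])
    M = X.T @ Y

    A1 = np.array([1, 0, 1, 0, 1, 0])
    raw = A1 @ M                            # (4, 2, -2, -4)
    B1 = (raw > 0).astype(int)              # threshold at zero: (1, 1, 0, 0)

    # Reverse direction: multiply by the transpose of M.
    A1_back = ((B1 @ M.T) > 0).astype(int)  # recovers (1, 0, 1, 0, 1, 0)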


Capacity

The memory or storage capacity of BAM may be given as \min(m,n), where n is the number of units in the X layer and m is the number of units in the Y layer. The internal matrix has n \times m independent degrees of freedom, where n is the dimension of the first vector (6 in this example) and m is the dimension of the second vector (4). This allows the BAM to reliably store and recall up to \min(n,m) independent vector pairs, or \min(6,4) = 4 in this example. The capacity can be increased above this bound by sacrificing reliability (accepting some incorrect bits in the output).
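One way to probe this bound empirically (a sketch under assumptions, not a result from the article) is to store an increasing number of random bipolar pairs with n = 6 and m = 4 and count recall errors; note that random pairs are generally not mutually independent or orthogonal, so some errors can appear even below \min(n,m).

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 6, 4

    def recall_errors(num_pairs):
        # Store num_pairs random bipolar pairs, then count mismatched bits
        # when recalling each Y pattern from its X pattern in a single pass.
        X = rng.choice([-1, 1], size=(num_pairs, n))
        Y = rng.choice([-1, 1], size=(num_pairs, m))
        M = X.T @ Y
        recalled = np.sign(X @ M)   # np.sign returns 0 on ties, counted as errors here
        return int(np.sum(recalled != Y))

    for k in range(1, 9):
        print(k, recall_errors(k))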


Stability

A pair (A, B) defines the state of a BAM. To store a pattern, the energy function value for that pattern has to occupy a minimum point in the energy landscape. The stability analysis of a BAM is based on the definition of a Lyapunov function (energy function) E, with a value for each state (A, B). When a paired pattern (A, B) is presented to the BAM, the neurons change states until a bidirectionally stable state (A_f, B_f) is reached, which Kosko proved to correspond to a local minimum of the energy function. The discrete BAM is proved to converge to a stable state. The energy function proposed by Kosko is E(A,B) = -AMB^T for the bidirectional case, which for the particular case A = B reduces to Hopfield's auto-associative energy function, i.e. E(A,A) = -AMA^T.


See also

* Autoassociative memory
* Self-organizing feature map


References

{{Reflist|2}}


External links


Bidirectional Associative Memory – Python source code for the Wiki article

Bidirectional associative memories – ACM Portal Reference