Hyperdimensional Computing

Hyperdimensional computing (HDC) is an approach to computation, particularly artificial intelligence, in which information is represented as a hyperdimensional (long) vector, an array of numbers. A hyperdimensional vector (hypervector) can comprise thousands of numbers, representing a point in a space of thousands of dimensions. Vector symbolic architectures is an older name for the same broad approach.


Process

Data is mapped from the input space to sparse HD space under an encoding function φ : X → H. HD representations are stored in data structures that are subject to corruption by noise and hardware failures. Noisy or corrupted HD representations can still serve as input for learning, classification, etc. They can also be decoded to recover the input data. H is typically restricted to range-limited integers [−v, v]. This is analogous to the learning process conducted by the fruit fly's olfactory system, where the input is a roughly 50-dimensional vector corresponding to odor receptor neuron types and the HD representation uses ~2,000 dimensions.
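A minimal sketch of such an encoding function φ, using NumPy. The projection matrix, the clipping bound v = 3, and the 50-to-2,000 dimensions (loosely following the fruit-fly analogy above) are illustrative assumptions, not a prescribed HDC encoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions, loosely following the fruit-fly analogy:
# a ~50-dimensional input expanded into a ~2,000-dimensional representation.
D_IN, D_HD = 50, 2000

# One simple choice of phi: a fixed random projection followed by
# rounding and clipping to a small integer range [-v, v].
projection = rng.standard_normal((D_HD, D_IN))
v = 3

def phi(x):
    """Map an input vector x in X to a range-limited integer hypervector in H."""
    return np.clip(np.rint(projection @ x), -v, v).astype(int)

x = rng.standard_normal(D_IN)   # a synthetic "odor" input
h = phi(x)
assert h.shape == (D_HD,) and h.min() >= -v and h.max() <= v
```

Any fixed random projection works similarly here; the key property is only that distinct inputs land on well-separated points of H.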


Transparency

HDC algebra reveals the logic of how and why systems make decisions, unlike artificial neural networks. Physical-world objects can be mapped to hypervectors, to be processed by the algebra.


Performance

HDC is suitable for "in-memory computing systems", which compute and hold data on a single chip, avoiding data-transfer delays. Analog devices operate at low voltages; they are energy-efficient but prone to error-generating noise. HDC can tolerate such errors. Various teams have developed low-power HDC hardware accelerators. Nanoscale memristive devices can be exploited to perform computation. An in-memory hyperdimensional computing system can implement operations on two memristive crossbar engines together with peripheral digital CMOS circuits. Experiments using 760,000 phase-change memory devices performing analog in-memory computing achieved accuracy comparable to software implementations.


Errors

HDC is robust to errors such as an individual bit error (a 0 flips to 1 or vice versa) missed by error-correcting mechanisms. Eliminating such error-correcting mechanisms can save up to 25% of compute cost. This is possible because such errors leave the result "close" to the correct vector, so reasoning using vectors is not compromised. HDC is at least 10x more error-tolerant than traditional artificial neural networks, which are already orders of magnitude more tolerant than conventional computing.


Example

A simple example considers images containing black circles and white squares. Hypervectors can represent SHAPE and COLOR variables and hold the corresponding values: CIRCLE, SQUARE, BLACK and WHITE. Bound hypervectors can hold the pairs BLACK and CIRCLE, etc.
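A rough sketch of this example in NumPy. The bipolar (+1/−1) hypervectors, the dimension of 10,000, and the use of element-wise multiplication for binding are illustrative assumptions; the point is only that a bound pair can later be queried:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

def hv():
    """A random bipolar hypervector (+1/-1 entries)."""
    return rng.choice([-1, 1], size=D)

# Variables and values, each assigned a random hypervector.
SHAPE, COLOR = hv(), hv()
CIRCLE, SQUARE, BLACK, WHITE = hv(), hv(), hv(), hv()

# Bind each variable to its value (element-wise multiply), then bundle
# the pairs by addition to represent "a black circle".
black_circle = SHAPE * CIRCLE + COLOR * BLACK

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Unbinding: multiplying by SHAPE (its own inverse for +/-1 entries)
# recovers something closer to CIRCLE than to SQUARE.
recovered = SHAPE * black_circle
assert cosine(recovered, CIRCLE) > cosine(recovered, SQUARE)
```

Answering "is the shape a circle?" thus reduces to an unbinding followed by a similarity comparison against the candidate value vectors.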


Orthogonality

High-dimensional space allows many mutually orthogonal vectors. However, if vectors are instead allowed to be ''nearly orthogonal'', the number of distinct vectors in high-dimensional space is vastly larger. HDC uses the concept of distributed representations, in which an object/observation is represented by a pattern of values across many dimensions rather than a single constant.


Operations

HDC can combine hypervectors into new hypervectors using well-defined vector space operations. Groups, rings, and fields over hypervectors become the underlying computing structures, with addition, multiplication, permutation, mapping, and inverse as primitive computing operations. All computational tasks are performed in high-dimensional space using simple operations like element-wise additions and dot products.

Binding creates ordered point tuples and is also a function ⊗ : H × H → H. The input is two points in H, while the output is a dissimilar point. Multiplying the SHAPE vector with CIRCLE ''binds'' the two, representing the idea "SHAPE is CIRCLE". This vector is "nearly orthogonal" to both SHAPE and CIRCLE. The components are recoverable from the vector (e.g., to answer the question "is the shape a circle?").

Addition creates a vector that combines concepts. For example, adding "SHAPE is CIRCLE" to "COLOR is RED" creates a vector that represents a red circle.

Permutation rearranges the vector elements. For example, permuting a three-dimensional vector with values labeled ''x'', ''y'' and ''z'' can interchange ''x'' to ''y'', ''y'' to ''z'', and ''z'' to ''x''. Events represented by hypervectors A and B can be added, forming one vector, but that would sacrifice the event sequence. Combining addition with permutation preserves the order; the event sequence can be retrieved by reversing the operations.

Bundling combines a set of elements in H as a function ⊕ : H × H → H. The input is two points in H and the output is a third point that is similar to both.
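The order-preserving use of permutation described above can be sketched as follows. A cyclic shift (np.roll) stands in for a generic permutation, and bipolar vectors of dimension 10,000 are an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10_000

def hv():
    return rng.choice([-1, 1], size=D)

A, B = hv(), hv()   # two events

# Plain addition loses the order: A + B == B + A.
# Permuting the earlier event (here, a cyclic shift) encodes its position.
seq_AB = np.roll(A, 1) + B   # "A then B"
seq_BA = np.roll(B, 1) + A   # "B then A"

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Reversing the permutation retrieves the first event:
# np.roll(seq_AB, -1) is much closer to A than to B.
first = np.roll(seq_AB, -1)
assert cosine(first, A) > cosine(first, B)

# The two orders yield clearly distinguishable vectors.
assert cosine(seq_AB, seq_BA) < 0.5
```

Longer sequences follow the same pattern: permute k times for the event k steps in the past, then bundle.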


History

Vector symbolic architectures (VSA) provided a systematic approach to high-dimensional symbol representations to support operations such as establishing relationships. Early examples include holographic reduced representations, binary spatter codes, and matrix binding of additive terms. HD computing advanced these models, particularly emphasizing hardware efficiency. In 2018, Eric Weiss showed how to fully represent an image as a hypervector. A vector could contain information about all the objects in the image, including properties such as color, position, and size. In 2023, Abbas Rahimi et al. used HDC with neural networks to solve Raven's progressive matrices.


Applications


Image recognition

HDC algorithms can replicate tasks long completed by deep neural networks, such as classifying images. Classifying an annotated set of handwritten digits uses an algorithm to analyze the features of each image, yielding a hypervector per image. The algorithm then adds the hypervectors for all labeled images of, e.g., zero, to create a prototypical hypervector for the concept of zero, and repeats this for the other digits. Classifying an unlabeled image involves creating a hypervector for it and comparing it to the reference hypervectors. This comparison identifies the digit that the new image most resembles. Given a labeled example set S = {(x_i, y_i)}_{i=1}^N, where x_i ∈ X and y_i ∈ {1, …, K} is the class of a particular x_i, and a query x_q ∈ X, the most similar prototype can be found with k* = argmax_{k=1,…,K} ρ(φ(x_q), c_k), where c_k is the prototype hypervector of class k. The similarity metric ρ is typically the dot product.
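The prototype scheme above can be sketched end to end. Everything here is an illustrative assumption: the feature encoder φ is a random sign projection (not a real image-feature extractor), and the "digits" are synthetic noisy class templates rather than actual handwritten data:

```python
import numpy as np

rng = np.random.default_rng(4)
D, K, D_IN = 10_000, 10, 64   # hypervector dim, classes, input features

# Stand-in encoder phi: fixed random projection followed by sign,
# giving bipolar hypervectors.
proj = rng.standard_normal((D, D_IN))

def phi(x):
    return np.sign(proj @ x)

# Synthetic labeled data: each class k is a noisy variant of a template.
templates = rng.standard_normal((K, D_IN))

def sample(k):
    return templates[k] + 0.3 * rng.standard_normal(D_IN)

# Build one prototype hypervector per class by summing encodings.
prototypes = np.zeros((K, D))
for k in range(K):
    for _ in range(20):
        prototypes[k] += phi(sample(k))

# Classify a query by the most similar prototype (dot product as rho).
def classify(x_q):
    return int(np.argmax(prototypes @ phi(x_q)))

assert classify(sample(7)) == 7
```

On real digit data the same structure applies; only φ and the data loading change.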


Reasoning

Hypervectors can also be used for reasoning. Raven's progressive matrices present images of objects in a grid. One position in the grid is blank. The test is to choose from candidate images the one that best fits. A dictionary of hypervectors represents individual objects. Each hypervector represents an object concept with its attributes. For each test image a neural network generates a binary hypervector (values are +1 or −1) that is as close as possible to some set of dictionary hypervectors. The generated hypervector thus describes all the objects and their attributes in the image. Another algorithm creates probability distributions for the number of objects in each image and their characteristics. These probability distributions describe the likely characteristics of both the context and candidate images. They too are transformed into hypervectors; algebra then predicts the most likely candidate image to fill the slot. This approach achieved 88% accuracy on one problem set, beating neural network–only solutions that were 61% accurate. For 3-by-3 grids, the system was 250x faster than a method that used symbolic logic to reason, because of the size of the associated rulebook.


Other

Other applications include bio-signal processing, natural language processing, and robotics.


See also

* Support vector machine


References


External links

* {{Cite magazine |last=Ananthaswamy |first=Anil |title=A New Approach to Computation Reimagines Artificial Intelligence |language=en-US |magazine=Quanta Magazine |url=https://www.quantamagazine.org/a-new-approach-to-computation-reimagines-artificial-intelligence-20230413/ |access-date=2023-06-13 |date=2023-04-13}}