
A physical neural network is a type of artificial neural network in which an electrically adjustable material is used to emulate the function of a neural synapse or a higher-order (dendritic) neuron model. "Physical" neural network is used to emphasize the reliance on physical hardware to emulate neurons, as opposed to software-based approaches. More generally, the term is applicable to other artificial neural networks in which a memristor or other electrically adjustable resistance material is used to emulate a neural synapse.


Types of physical neural networks


ADALINE

In the 1960s Bernard Widrow and Ted Hoff developed ADALINE (Adaptive Linear Neuron), which used electrochemical cells called memistors (memory resistors) to emulate the synapses of an artificial neuron. The memistors were implemented as three-terminal devices operating on the reversible electroplating of copper, such that the resistance between two of the terminals is controlled by the integral of the current applied via the third terminal. The ADALINE circuitry was briefly commercialized by the Memistor Corporation in the 1960s, enabling some applications in pattern recognition. However, since the memistors were not fabricated using integrated-circuit fabrication techniques, the technology was not scalable and was eventually abandoned as solid-state electronics matured.
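
As a rough software analogy of how ADALINE learns, the sketch below trains a single linear neuron with the Widrow–Hoff (LMS) rule, treating each weight as a memistor-like conductance that accumulates the "programming current" applied to it. The task, constants, and update scaling are illustrative assumptions, not a model of the original electrochemical cells.

<syntaxhighlight lang="python">
import numpy as np

# Toy ADALINE trained with the Widrow-Hoff (LMS) rule.
# Each weight is stored as a "memistor-like" conductance whose value is the
# running integral of the programming current applied to it -- a software
# analogy of the three-terminal electroplating cell, not a device model.

rng = np.random.default_rng(0)

# Simple linearly separable task (assumed): learn y = sign(x1 + x2 - 0.5)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0.5, 1.0, -1.0)

conductance = np.zeros(3)          # two inputs + bias, all start at zero
learning_rate = 0.05               # illustrative value

for epoch in range(20):
    for xi, target in zip(X, y):
        x = np.append(xi, 1.0)                 # bias input
        output = conductance @ x               # weighted sum (linear neuron)
        error = target - output
        # LMS update: the "programming current" is proportional to error * x,
        # and each conductance integrates it over time.
        programming_current = learning_rate * error * x
        conductance += programming_current

predictions = np.sign(np.c_[X, np.ones(len(X))] @ conductance)
print("training accuracy:", np.mean(predictions == y))
</syntaxhighlight>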


Analog VLSI

In 1989 Carver Mead published his book ''Analog VLSI and Neural Systems'', which spun off perhaps the most common variant of analog neural networks. The physical realization is implemented in analog VLSI, often as field-effect transistors operating in weak inversion. Such devices can be modelled as translinear circuits, a technique described by Barrie Gilbert in several papers around the mid-1970s, and in particular in his ''Translinear Circuits'' from 1981. With this method, circuits can be analyzed as a set of well-defined functions in steady state, and such circuits can be assembled into complex networks.
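
As a hedged illustration of the translinear principle this approach relies on, the following sketch assumes an exponential current–voltage law (as for transistors in weak inversion) and verifies numerically that a four-junction loop acts as a one-quadrant multiplier/divider. The component values are arbitrary and not taken from any published circuit.

<syntaxhighlight lang="python">
import numpy as np

# Numerical illustration of the translinear principle for devices with an
# exponential I-V law:  I = I_s * exp(V / V_T).
# In a closed loop of such junctions, with equal numbers oriented each way,
# the product of clockwise currents equals the product of counterclockwise
# currents. All values below are illustrative assumptions.

V_T = 0.026          # thermal voltage at room temperature, volts
I_s = 1e-15          # saturation current, amperes (assumed)

def junction_voltage(current):
    return V_T * np.log(current / I_s)

def junction_current(voltage):
    return I_s * np.exp(voltage / V_T)

# Four-junction loop arranged as a multiplier/divider: I1 * I2 = I3 * I4
I1, I2, I3 = 2e-6, 3e-6, 1.5e-6

# Kirchhoff's voltage law around the loop forces V4 = V1 + V2 - V3
V4 = junction_voltage(I1) + junction_voltage(I2) - junction_voltage(I3)
I4 = junction_current(V4)

print("translinear prediction I1*I2/I3:", I1 * I2 / I3)
print("current from exponential model :", I4)
</syntaxhighlight>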


Nugent and Molter's physical neural network

Alex Nugent describes a physical neural network as one or more nonlinear neuron-like nodes used to sum signals, together with nanoconnections formed from nanoparticles, nanowires, or nanotubes that determine the strength of the signals fed into the nodes. Alignment or self-assembly of the nanoconnections is determined by the history of the applied electric field, performing a function analogous to neural synapses. Numerous applications for such physical neural networks are possible. For example, a temporal summation device can be composed of one or more nanoconnections having an input and an output, wherein an input signal provided to the input causes one or more of the nanoconnections to increase in connection strength over time. Another example of a physical neural network is taught by U.S. Patent No. 7,039,619, entitled "Utilized nanotechnology apparatus using a neural network, a solution and a connection gap," which was issued to Alex Nugent by the U.S. Patent & Trademark Office on May 2, 2006. A further application of physical neural networks is shown in U.S. Patent No. 7,412,428, entitled "Application of Hebbian and anti-Hebbian learning to nanotechnology-based physical neural networks," which issued on August 12, 2008. Nugent and Molter have shown that universal computing and general-purpose machine learning are possible from operations available through simple memristive circuits operating according to the AHaH plasticity rule. More recently, it has been argued that complex networks of purely memristive circuits can also serve as neural networks.
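
The toy sketch below conveys the general flavour of a Hebbian/anti-Hebbian update acting on synapses built from differential pairs of memristor-like conductances. It is a simplified, assumption-laden illustration, not the AHaH circuit published by Nugent and Molter; all constants are invented for the example.

<syntaxhighlight lang="python">
import numpy as np

# Toy sketch of a Hebbian / anti-Hebbian style update on synapses built from
# differential pairs of memristor-like conductances (w_i = Ga_i - Gb_i).
# Simplified illustration only; constants and update laws are assumptions.

rng = np.random.default_rng(1)
n_inputs = 4

Ga = rng.uniform(0.4, 0.6, n_inputs)   # conductances of the "positive" devices
Gb = rng.uniform(0.4, 0.6, n_inputs)   # conductances of the "negative" devices

alpha = 0.02   # Hebbian reinforcement strength (assumed)
beta = 0.005   # anti-Hebbian relaxation applied on every read (assumed)

pattern = np.array([1.0, 0.0, 1.0, 0.0])   # one binary input pattern

for _ in range(50):
    w = Ga - Gb
    y = 1.0 if w @ pattern >= 0 else -1.0   # node output: sign of weighted sum
    # Anti-Hebbian part: reading relaxes the accessed conductances toward rest.
    Ga -= beta * pattern * (Ga - 0.5)
    Gb -= beta * pattern * (Gb - 0.5)
    # Hebbian part: a feedback pulse reinforces whichever side "won".
    if y > 0:
        Ga += alpha * pattern
    else:
        Gb += alpha * pattern

print("final decision on the pattern:", y)
print("final differential weights   :", np.round(Ga - Gb, 3))
</syntaxhighlight>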


Phase change neural network

In 2002, Stanford Ovshinsky described an analog neural computing medium in which phase-change material has the ability to cumulatively respond to multiple input signals. An electrical alteration of the resistance of the phase-change material is used to control the weighting of the input signals.
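
A minimal sketch of the cumulative behaviour described above: each programming pulse partially crystallizes a phase-change cell, nudging its conductance toward a maximum, and the stored conductance then weights an input signal. The saturating update law and constants are assumptions chosen for illustration, not a model of Ovshinsky's material.

<syntaxhighlight lang="python">
# Phase-change synapse sketch: conductance accumulates over repeated pulses
# and saturates; a RESET (melt-quench) pulse would return it to G_MIN.
# All values are illustrative assumptions in arbitrary units.

G_MIN, G_MAX = 0.05, 1.0       # conductance range (assumed)
CRYSTALLIZATION_STEP = 0.15    # fraction of the remaining range gained per pulse

def apply_set_pulse(conductance):
    """Partial-SET pulse: cumulative, saturating increase in conductance."""
    return conductance + CRYSTALLIZATION_STEP * (G_MAX - conductance)

g = G_MIN
history = []
for pulse in range(10):        # ten identical programming pulses
    g = apply_set_pulse(g)
    history.append(round(g, 3))

print("conductance after each pulse:", history)

# The stored conductance acts as the synaptic weight on an input signal:
input_signal = 0.8
print("weighted input (I = G * V):", g * input_signal)
</syntaxhighlight>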


Memristive neural network

Greg Snider of HP Labs describes a system of cortical computing with memristive nanodevices. The memristors (memory resistors) are implemented by thin-film materials in which the resistance is electrically tuned via the transport of ions or oxygen vacancies within the film. DARPA's SyNAPSE project has funded IBM Research and HP Labs, in collaboration with the Boston University Department of Cognitive and Neural Systems (CNS), to develop neuromorphic architectures which may be based on memristive systems.
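
To make the computing role of such devices concrete, the sketch below shows how a crossbar of memristive conductances performs a vector–matrix multiplication in one step, with Ohm's law generating per-device currents and Kirchhoff's current law summing them along each column. The conductance range and voltages are assumed values for illustration only.

<syntaxhighlight lang="python">
import numpy as np

# Memristive crossbar sketch: input voltages drive the rows, each crosspoint
# contributes a current I = G * V (Ohm's law), and the column wires sum those
# currents (Kirchhoff's current law), yielding a vector-matrix product.

rng = np.random.default_rng(2)

n_rows, n_cols = 4, 3
G = rng.uniform(1e-6, 1e-4, size=(n_rows, n_cols))   # crosspoint conductances (S), assumed range
V = np.array([0.1, 0.0, 0.2, 0.05])                  # input voltages on the rows (V)

# Column currents: each output is the conductance-weighted sum of row voltages.
I_columns = V @ G
print("column currents (A):", I_columns)

# A weight update corresponds to shifting a device's conductance, e.g. by a
# programming pulse that drives ion or oxygen-vacancy migration in the film.
delta_G = 5e-6
G[2, 1] = np.clip(G[2, 1] + delta_G, 1e-6, 1e-4)
print("updated conductance G[2,1] (S):", G[2, 1])
</syntaxhighlight>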


Protonic artificial synapses

In 2022, researchers reported the development of nanoscale brain-inspired artificial synapses that use protons (H⁺) as the charge carrier, for 'analog deep learning'.


See also

* AI accelerator
* Brain simulation
* Neuromorphic engineering
* Optical neural network
* Quantum neural network



External links


Information on DARPA's SyNAPSE project