These Scientists Have Created An Artificial Brain With A Single Neuron

Researchers at the Technische Universität Berlin (TU Berlin) have used a single software-simulated neuron to mimic a neural network comprising hundreds of nerve cells. The neuron is stimulated repeatedly through time-delayed feedback, so it can fire many times within a split second. The result is a new kind of neural network that is distributed in time rather than in space.

In the long term, this technique might enable innovative ways to integrate artificial neurons directly into hardware platforms. It also promises to be energy-efficient and therefore eco-friendly.

For years, researchers have been developing artificial neural networks that can “learn” much like their biological counterparts, and they have long known that the connections between brain cells vary in strength. After initial training, such a network can recognise objects, categorise photos, and generate sentences on its own.

“Our research addresses two limitations current neural networks face,” says PD Dr Serhiy Yanchuk, head of the Applied Dynamical Systems Research Area at TU Berlin.

A single training cycle of one of the best AI algorithms for language generation consumes as much energy as driving a car 700,000 kilometres, according to a University of Copenhagen study. If such an AI application is then deployed in industry, the total power consumption may be even higher.

“On the other hand, there are also neural networks where the neurons are built as real physical systems,” explains Yanchuk. “These can be realised as purely electrical semiconductor devices or with the aid of optical technologies based on lasers. And, of course, there are limits to the number of these hardware neurons and the connections between them.”

Although supercomputers can simulate billions of neurons, current hardware implementations have achieved only a few thousand artificial nerve cells. According to the TU Berlin researchers, both of these problems could be addressed by a single neuron that takes on the role of every neuron in the network through a time-delayed input/output feedback loop.

“We have now demonstrated with the computer that this is in principle possible. Laser-based circuits would be particularly suitable for implementation in hardware because they are so fast that the time delays are particularly short,” explains Florian Stelzer, lead author of the study.

According to this technique, the spatial gap between two neurons in the network would be replaced by a temporal delay.

“The option to influence the strength of individual neural connections, which is essential to training,” Stelzer adds, “can be achieved here by further manipulating the time delays.”
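The folded-in-time idea can be sketched in a few lines of code. In this illustrative toy (the variable names, sizes, and weight scheme below are assumptions for clarity, not the paper's actual implementation), one nonlinear node fires once per time slot; the last N outputs in its history play the role of the previous layer's activations, and weighted taps on those delayed outputs stand in for the connections between layers:

```python
import numpy as np

def f(x):
    # the single neuron is just a fixed nonlinear function
    return np.tanh(x)

N = 5   # virtual neurons per layer (time slots)
L = 3   # emulated network layers
rng = np.random.default_rng(0)
# one weight per (layer, target slot, source slot): each weighted,
# time-delayed tap on the node's own past output emulates one connection
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(L, N, N))

x = rng.normal(size=N)   # input signal, one value per time slot
state = list(x)          # history of the node's outputs over time

for layer in range(L):
    prev = state[-N:]    # last N outputs = previous layer's activations
    for i in range(N):
        # the single neuron fires once per slot, driven only by
        # delayed feedback from its own earlier outputs
        drive = sum(W[layer, i, j] * prev[j] for j in range(N))
        state.append(f(drive))

output = np.array(state[-N:])   # activations of the final emulated layer
```

In this picture, training means adjusting the strength of each delayed tap, which is the role the manipulated time delays play in the hardware version described above.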


In hardware, only a single neuron would need to be fabricated, which should simplify implementation. The researchers also believe their strategy will allow artificial neural networks to use less energy than traditional software solutions.

“Our system functions as a sort of expansion to reservoir computing with an artificial nerve cell at the core,” says Stelzer.

“This artificial nerve cell is nothing more than a mathematical, non-linear function that we can precisely define. Additionally, we are able for the first time to simulate the different network levels of a deep neural network.”

This is called a “Folded-in-time Deep Neural Network” (Fit-DNN). The researchers credit their accomplishment to their team’s multidisciplinary composition of mathematicians, physicists, and software engineers.

Beyond AI system improvements, more research into the folded-in-time neural network might lead to a plethora of new discoveries.

“If the time delay between two neurons ‘located’ directly next to each other in time is shortened further, it would theoretically be possible to create a limitless number of neurons,” explains Serhiy Yanchuk.

The findings were recently recognised in Nature Communications as an “Editors’ Highlight”.

Source: TU Berlin
