Artificial Intelligence: Faster Computation with Much Less Energy

As scientists push the boundaries of artificial intelligence, the amount of time, energy, and money required to train increasingly complex neural network models is skyrocketing. A new field of artificial intelligence called analog deep learning promises faster computation with less energy.

Programmable resistors are the key elements of analog deep learning, just as transistors are the building blocks of digital processors. By repeating arrays of programmable resistors in complex layers, researchers can create a network of analog artificial “neurons” and “synapses” that performs calculations just like a digital neural network. This network can then be trained to carry out complex AI tasks such as image recognition and natural language processing.

An interdisciplinary team of researchers at the Massachusetts Institute of Technology set out to push the speed limits of a human-made analog synapse they had previously developed. They used a practical inorganic material in the fabrication process that allows their devices to run a million times faster than previous versions, which is also about a million times faster than the synapses of the human brain.

In addition, this inorganic material also makes the resistor extremely energy efficient. Unlike the materials used in the previous version of their device, the new material is compatible with silicon fabrication technologies. This change has enabled the fabrication of nanoscale devices and may pave the way for integration into commercial computing hardware for deep learning applications.

“Thanks to this key insight and the very powerful nanofabrication techniques we have at MIT.nano, we were able to put the pieces together and demonstrate that these devices are intrinsically very fast and operate at reasonable voltages,” says senior author Jesús A. del Alamo, the Donner Professor in the Department of Electrical Engineering and Computer Science (EECS) at the Massachusetts Institute of Technology. “This work has really brought these devices to a point where they now look promising for future applications.”

“The working mechanism of the device is the electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Because we work with very thin devices, we can accelerate the motion of this ion by using a strong electric field and push these ionic devices into the nanosecond operating regime,” says senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.

“The action potential in biological cells rises and falls on a timescale of milliseconds, since the voltage difference of about 0.1 volt is constrained by the stability of water,” explains senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering. “The stronger the field, the faster these ionic devices work.”

These programmable resistors greatly increase the learning rate of the neural network while drastically reducing the cost and energy required to train it. This could help scientists develop deep learning models much faster, which could then be applied to applications such as self-driving cars, fraud detection, or medical image analysis.

“Once you have an analog processor, you will no longer be training the networks everyone else is working on. You will be training networks of unprecedented complexity that no one else can afford, and therefore vastly outperform them all. In other words, this is not a faster car, it’s a spaceship,” adds lead author Murat Onen, an MIT postdoc.

Co-authors include Frances M. Ross, the Ellen Swallow Richards Professor in the Department of Materials Science and Engineering; postdocs Nicolas Emond and Baoming Wang; and EECS graduate student Difei Zhang. The study is published today in the journal Science.

Accelerating deep learning

Analog deep learning is faster and more energy-efficient than its digital counterpart for two main reasons. First, computation is performed in memory, so huge amounts of data are not shuttled back and forth between memory and a processor. Second, analog processors operate in parallel: as the weight matrix grows, the processor needs no additional time for the extra operations, because all the calculations are performed simultaneously.
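As a rough illustration of that in-memory, parallel idea, the following sketch (plain NumPy, with made-up conductance and voltage values rather than anything from the MIT device) treats a grid of programmable resistors as a stored weight matrix: input voltages are applied to the rows, and Ohm’s law plus current summation on each column wire yields every weighted sum at once.

```python
import numpy as np

# Hypothetical crossbar: each entry is the conductance (in siemens) of one
# programmable resistor; the grid stores the weight matrix in place.
conductances = np.array([
    [1.0e-6, 0.2e-6, 0.7e-6],
    [0.5e-6, 1.3e-6, 0.1e-6],
])  # 2 rows (inputs) x 3 columns (outputs)

# Input activations encoded as row voltages (illustrative values, in volts).
voltages = np.array([0.3, 0.8])

# Ohm's law per resistor (I = G * V) plus current summation on each column
# wire gives all output currents at once -- an analog matrix-vector multiply
# with no data shuttled between separate memory and processor.
column_currents = voltages @ conductances

print(column_currents)  # one weighted sum per output column, in amperes
```

In the physical array all of the column currents settle simultaneously, which is why a larger matrix does not make the operation slower; the NumPy version only mimics the arithmetic, not the physics.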

The key element of MIT’s new analog processor technology is known as the protonic programmable resistor. These resistors, only nanometers in size (a nanometer is one billionth of a meter), are arranged in an array, like a chessboard.

In the human brain, learning occurs through the strengthening and weakening of connections between neurons, called synapses. Deep neural networks have long adopted this strategy, with the network weights programmed by learning algorithms. In this new processor, increasing and decreasing the electrical conductance of the protonic resistors is what enables analog machine learning.

Conductance is controlled by the movement of protons. To increase the conductance, more protons are pushed into a channel in the resistor; to decrease it, protons are removed. This is accomplished with an electrolyte (similar to that of a battery) that conducts protons but blocks electrons.
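A minimal sketch of that programming idea, using invented constants rather than the paper’s measured device physics: each resistor’s conductance is modeled as growing with the number of protons pushed into its channel and shrinking as they are pulled back out, which is how a learning algorithm would nudge the corresponding weight.

```python
# Toy model of a protonic programmable resistor: conductance rises as protons
# are pushed into the channel and falls as they are pulled back out.
# All constants are illustrative placeholders, not measured device values.

class ProtonicResistor:
    G_MIN = 1e-7        # baseline conductance in siemens (assumed)
    G_PER_PULSE = 5e-9  # conductance change per programming pulse (assumed)

    def __init__(self) -> None:
        self.pulses = 0  # net protons-in minus protons-out programming pulses

    def program(self, n_pulses: int) -> None:
        """Positive pulses insert protons (potentiation); negative pulses
        remove them (depression)."""
        self.pulses = max(self.pulses + n_pulses, 0)  # not below baseline

    @property
    def conductance(self) -> float:
        return self.G_MIN + self.G_PER_PULSE * self.pulses


# Using the conductance as a trainable weight: a gradient-style update is
# applied by choosing how many programming pulses to send.
r = ProtonicResistor()
r.program(+20)  # strengthen this "synapse"
r.program(-5)   # weaken it slightly
print(f"conductance ~ {r.conductance:.2e} S")
```

Here the pulse count is just a stand-in for the amount of charge moved through the electrolyte; the real device’s conductance response is set by its electrochemistry, not by a fixed per-pulse constant.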

To develop an ultra-fast and highly energy-efficient programmable proton resistor, the researchers considered various electrolyte materials. While other devices used organic compounds, Onen focused on inorganic phosphosilicate glass (PSG).

PSG is basically silicon dioxide, the powdery desiccant found in the tiny packets that come with new furniture to absorb moisture. It is also the best-known oxide used in silicon processing. To make PSG, a tiny amount of phosphorus is added to the silicon dioxide, giving it special proton-conducting characteristics.

Onen suggested that the optimized PSG could have high proton conductivity at room temperature without the need for water, making it an ideal solid electrolyte for this application. He was right.

Amazing speed

PSG enables ultra-fast proton movement because it contains a multitude of nanometer-sized pores whose surfaces provide pathways for protons to diffuse. It can also withstand very strong pulsed electric fields. This is critical, Onen says, because applying more voltage to the device allows the protons to travel at blinding speeds.
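A back-of-the-envelope drift estimate helps show why a thin film plus a strong field lands in the nanosecond range. The numbers below are purely illustrative assumptions (a ~10 nm film, a few volts, a guessed proton mobility), not figures from the Science paper; only the textbook relations v = μE and t = d / v are taken as given.

```python
# Rough drift-time estimate for an ion crossing a thin electrolyte film.
# Illustrative assumptions only -- not the parameters of the MIT device.

film_thickness = 10e-9   # meters (assumed ~10 nm film)
applied_voltage = 5.0    # volts (assumed strong programming pulse)
proton_mobility = 1e-8   # m^2 / (V s)  (hypothetical mobility)

electric_field = applied_voltage / film_thickness   # E = V / d, in V/m
drift_velocity = proton_mobility * electric_field   # v = mu * E, in m/s
transit_time = film_thickness / drift_velocity      # t = d / v, in seconds

print(f"field   ~ {electric_field:.1e} V/m")
print(f"transit ~ {transit_time:.1e} s")  # about 2e-9 s with these assumptions
```

Since t = d² / (μV), shrinking the film and raising the applied voltage both shorten the transit time, which is exactly the article’s argument for why very thin devices driven by strong fields can reach nanosecond operation.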

“The speed certainly was surprising. Normally, we would not apply such extreme fields to devices for fear of reducing them to ash. Instead, the protons moved through the device stack at tremendous speed, a million times faster, to be exact, than what we had before. And this movement does not damage anything, thanks to the protons’ small size and low mass. It is almost like teleportation,” he explains.

“The nanosecond time scale means that we are close to the ballistic regime, or even the proton quantum tunneling regime in such an extreme field,” Li adds.

Because the protons do not damage the material, the resistor can run for millions of cycles without breaking down. This new electrolyte enabled a programmable protonic resistor that is a million times faster than the team’s previous device and can operate efficiently at room temperature, which is important for incorporating it into computing hardware.

Due to the insulating properties of PSG, almost no electric current passes through the material during the movement of protons. This makes the device extremely energy efficient, Onen adds.

Now that they have demonstrated the effectiveness of these programmable protonic resistors, the researchers plan to reengineer them for high-volume manufacturing, del Alamo says. They can then study the properties of resistor arrays and scale them up so they can be embedded in systems.

At the same time, they plan to study the materials themselves to remove the bottlenecks that limit the voltage required to efficiently transport protons into, through, and out of the electrolyte.

“Another interesting avenue that these ionic devices could open up is energy-efficient hardware to emulate neural circuits and the rules of synaptic plasticity that are inferred in neuroscience, in addition to analog deep neural networks,” Yildiz adds.

“The collaboration we have will be essential to innovating in the future. The road ahead will remain very challenging, but at the same time it is very exciting,” adds del Alamo.

[ Press release ] Main link: www.mit.edu
Other link: dx.doi.org/10.1126/science.abp8064
