Next-gen Analog AI Chip Will Be Faster Than the Human Brain

Depending on who you talk to, AI is either fast approaching its limits, or there is room yet for a new generation of jaw-dropping performance as AI models go over the trillion-parameter mark.

But there is one thing that everyone will agree on: The most powerful AI models today require massive amounts of energy to train, with hardware requirements that put them far beyond the reach of most individuals and corporations, even those with the know-how.

Analog deep learning

It turns out that a new area of artificial intelligence known as analog deep learning might one day offer faster computation with far less energy consumption.

According to a report on MIT News, a multidisciplinary team of MIT researchers has improved on its previous work to create a type of human-made analog synapse that runs a million times faster than the earlier iteration. And that earlier iteration already runs a million times faster than the synapses in the human brain.

The researchers achieved this with an inorganic material, using it to make programmable resistors, much like how transistors are used to build CPUs and GPUs today. The hope is to leverage this as the building block to create a specialized analog processor for deep learning.

“By repeating arrays of programmable resistors in complex layers, researchers can create a network of analog artificial ‘neurons’ and ‘synapses’ that execute computations just like a digital neural network. This network can then be trained to achieve complex AI tasks like image recognition and natural language processing,” said the report.
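
To make that idea concrete, here is a minimal Python sketch, illustrative only and not the team’s actual design, of what such a crossbar of programmable resistors computes. Applying input voltages across a grid of conductances yields output currents that are, by Ohm’s and Kirchhoff’s laws, a matrix-vector product, which is the core operation of a neural network layer. All shapes and values below are made up for illustration.

```python
import numpy as np

# Each synapse is a programmable resistor; its conductance G[i, j]
# (in siemens) plays the role of a trainable weight.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-3, size=(4, 3))  # 4 outputs x 3 inputs (illustrative)

# Inputs are encoded as voltages applied to the crossbar's rows.
V = np.array([0.2, -0.5, 0.8])  # volts (illustrative)

# Ohm's law gives each resistor's current (I = G * V); Kirchhoff's
# current law sums the currents flowing into each output line.
# Together, the crossbar computes a matrix-vector product in analog.
I = G @ V  # output currents, one per artificial "neuron"

# A nonlinearity (applied digitally here) completes one neural layer.
activation = np.maximum(I, 0.0)
print(activation)
```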

In the human brain, learning happens through the strengthening and weakening of the connections between neurons, known as synapses. A deep neural network takes the same approach, albeit with artificial weights that are set through “training”.

According to the report, a future analog processor would adopt the same design: increasing and decreasing the electrical conductance of the resistors is what enables analog machine learning.
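
As a rough sketch of that mechanism, rather than the published device physics, a training step on such a processor amounts to nudging each conductance up or down against the loss gradient, within the physical range the resistor can reach. All constants below are invented for illustration.

```python
import numpy as np

def program_resistors(G, gradient, lr=1e-5, g_min=1e-6, g_max=1e-3):
    """One analog 'training' step: nudge each conductance against the
    loss gradient, then clip to the resistor's physical range.
    The learning rate and conductance bounds are illustrative only."""
    G = G - lr * gradient
    return np.clip(G, g_min, g_max)

# Example: one gradient step for a single layer of the crossbar.
rng = np.random.default_rng(1)
G = rng.uniform(1e-6, 1e-3, size=(4, 3))   # conductances act as weights
grad = rng.normal(size=(4, 3))             # gradient from backpropagation
G = program_resistors(G, grad)
```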

Crucially, the inorganic material also makes the resistor extremely energy-efficient, and it could potentially be built with current silicon fabrication techniques, opening the door to commercialization for deep learning.

But unlike the human brain, it could be far smaller and faster: “[The resistors for] analog deep learning are 1,000 times smaller than biological cells… Thus, the space-time-energy performance of the all-solid-state artificial synapses can greatly exceed that of their biological counterparts.”

The research is published here (subscription required).

Image credit: iStockphoto/metamorworks