
MIT Researchers Discover A New, Faster AI Using "Liquid" Neural Networks

Key idea: The “liquid” neural network allows AI algorithms to adapt to new input data.

Original author and publication date: Jace Dela Cruz (TechTimes) – November 16, 2022

Futurizonte Editor’s Note: Let’s be honest: a real-life positronic brain (or whatever name ends up being used for the new brain) is just around the corner.

From the article:   

Artificial neural networks are a method that artificial intelligence utilizes to simulate how the human brain functions. A neural network “learns” from input datasets and produces a forecast based on the available data.

But now, MIT Computer Science and Artificial Intelligence Lab (MIT CSAIL) researchers have found a faster method to solve an equation that is employed in the algorithms for “liquid” neural networks, according to a report by Interesting Engineering.

Liquid Neural Networks
Researchers from MIT created liquid neural networks last year, drawing inspiration from the brains of microscopic organisms.

It is described as “liquid” since the algorithm can modify the equations in response to fresh information, allowing it to adapt to changes encountered in real-world systems.

The researchers who developed the liquid networks have found a technique to simplify the differential equations underlying the interaction of two neurons through synapses.
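To make the "liquid" idea concrete, here is a minimal toy sketch of a single liquid time-constant neuron, simulated step by step. Everything in it (the function name, the parameter values, and the use of `tanh` as the nonlinearity) is an illustrative assumption, not the researchers' actual model; it only shows how a neuron whose effective time constant depends on the incoming data can change its behavior when the input changes.

```python
import math

def liquid_neuron_step(x, I, dt=0.01, tau=1.0, A=1.0):
    """One Euler step of a toy liquid time-constant neuron.

    The effective decay rate (1/tau + f) depends on the input I,
    so the neuron's dynamics shift as the data shifts -- a toy
    version of the "liquid" behavior described above.
    """
    f = math.tanh(I)                     # input-dependent nonlinearity
    dxdt = -(1.0 / tau + f) * x + f * A  # liquid time-constant form
    return x + dt * dxdt

# Simulate the neuron on an input signal that changes mid-stream.
x = 0.0
for t in range(1000):
    I = 1.0 if t < 500 else -1.0         # input flips halfway through
    x = liquid_neuron_step(x, I)
```

Because the input appears inside the equation's coefficients, not just as an additive term, a change in the data effectively rewrites the equation the neuron is solving.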

Differential equations make it possible to calculate the state of the world or a phenomenon across time as it develops step-by-step rather than from beginning to end, as noted by Interesting Engineering.
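The step-by-step nature of solving a differential equation is also why a simplification can yield a large speedup. The toy comparison below uses a deliberately simple equation, dx/dt = -x, whose exact solution is known; the real synapse equations in the MIT work are far more involved, so this is only an illustration of the principle. A numerical (Euler) solver must walk through every intermediate time step, while a closed-form solution jumps straight to any point in time with a single evaluation.

```python
import math

def euler_solve(x0, t_end, dt=0.001):
    """Solve dx/dt = -x step by step with the Euler method."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (-x)  # one small step forward in time
        t += dt
    return x

x0, t_end = 1.0, 5.0
numeric = euler_solve(x0, t_end)  # thousands of tiny steps
exact = x0 * math.exp(-t_end)     # one evaluation of the closed form
```

Both routes land on (nearly) the same answer, but the closed form does so without traversing the intermediate states, which is the kind of saving a simplified equation can offer.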

This gave them access to a new class of quicker artificial intelligence algorithms. Since they are adaptable and understandable, the models have similar qualities to liquid neural nets, but what makes them novel is how much faster and more scalable they are.

The liquid neural network is a cutting-edge type of neural network that can change its behavior after reviewing input data.

Read the complete article here