MemComputing Introduces Superior Method for Training Neural Networks

SAN DIEGO, Feb. 12, 2020 /PRNewswire/ -- MemComputing, Inc., developer of disruptive high-performance computing technology, today released its whitepaper entitled "An Efficient Approach to the Supervised Training of Deep Neural Networks using MemComputing," highlighting the advantages of MemComputing's new training approach compared to traditional deep learning methods.

The paper addresses the inherent limitations of today's most popular gradient-based methods, such as Adaptive Moment Estimation (ADAM) and Stochastic Gradient Descent (SGD), which rely on backpropagation. MemComputing's approach instead employs a more global, parallelized optimization algorithm, made possible by its entirely new computing architecture. Every data point contributes to the network update simultaneously, resulting in faster and more robust learning.
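For context, the sketch below illustrates the conventional baseline the whitepaper compares against: a mini-batch backpropagation loop driven by the Adam optimizer (shown here in PyTorch). It is not MemComputing's method; the toy network, data shapes, and hyperparameters are illustrative assumptions rather than details from the whitepaper. Note how each parameter update is performed sequentially, one mini-batch at a time.

    # Conventional gradient-based training (the baseline, not MemComputing's approach).
    # Toy setup: the dataset, architecture, and hyperparameters are assumptions.
    import torch
    import torch.nn as nn

    X = torch.randn(1024, 28)                 # 1024 samples, 28 input features (assumed)
    y = torch.randint(0, 2, (1024,)).float()  # binary labels

    model = nn.Sequential(
        nn.Linear(28, 64), nn.ReLU(),
        nn.Linear(64, 1)
    )
    loss_fn = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Standard training loop: every mini-batch triggers a backpropagation pass
    # followed by an incremental, sequential parameter update.
    for epoch in range(10):
        for i in range(0, len(X), 128):
            xb, yb = X[i:i+128], y[i:i+128]
            optimizer.zero_grad()
            loss = loss_fn(model(xb).squeeze(1), yb)
            loss.backward()   # backpropagation
            optimizer.step()  # per-batch update

By contrast, the whitepaper describes an update in which all data points contribute to the optimization of the network's parameters in parallel rather than batch by batch.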

Fabio Traversa, CTO and co-inventor of MemComputing, commented, "MemComputing represents an alternative to gradient descent-based methods. Our approach allows the trainable parameters of the network to be optimized together in a truly parallelized fashion, allowing for the training of deeper networks without the need for most of the hotfixes used in Deep Neural Network training."

As a case study, MemComputing reported its performance on the detection of exotic high-energy particles, comparing its training approach to ADAM and SGD. Traversa added, "MemComputing training was 5 times faster than traditional methods, with higher accuracy."

Although MemComputing has so far focused on solving combinatorial optimization problems, John Beane, CEO of MemComputing, commented, "This opens a whole new door for our technology. When you look at the Artificial Intelligence market, we have been using the same methods to train our neural networks for over three decades. Even ignoring our superior performance, the introduction of an alternative method is significant in and of itself."

About MemComputing

MemComputing, Inc.'s disruptive technology is accelerating the time to find practical solutions to the world's most challenging computational problems. The company was formed by Dr. Massimiliano Di Ventra and Dr. Fabio Traversa, with John A. Beane, former Entrepreneur-in-Residence at UC San Diego. Di Ventra and Traversa are the inventors of "memcomputing", a new computing paradigm in which memory performs the tasks of both storing and processing information (as compared to traditional computing based on the von Neumann architecture, where memory and processing are separate). For more information, visit www.memcomputing.com.

The report can be downloaded for free at: https://www.memcpu.com/white-papers/

Press Contact:

Johnny Aiken
MemComputing, Inc.
jaiken@memcpu.com

View original content to download multimedia: http://www.prnewswire.com/news-releases/memcomputing-introduces-superior-method-for-training-neural-networks-301003522.html

SOURCE MemComputing Inc.