EPFL researchers have developed an algorithm that trains analog neural networks as accurately as digital ones, offering a more efficient alternative to power-hungry deep-learning hardware.
Researchers at EPFL (École Polytechnique Fédérale de Lausanne) have developed an algorithm, called PhyLL, that trains analog neural networks as accurately as digital ones, offering a more efficient alternative to power-hungry deep-learning hardware. Instead of backpropagation, PhyLL updates parameters using a second forward pass through the physical system, which cuts power consumption and more closely mirrors how humans learn. In experiments, the algorithm matched the accuracy of backpropagation-based training while proving robust and adaptable.
The team successfully trained analog networks with PhyLL on three wave-based physical systems: sound waves, light waves, and microwaves, and the approach is general enough to apply to any physical system. Because the second forward pass runs through the physical hardware itself, PhyLL also eliminates the need for a digital twin of the system, further reducing power usage and improving efficiency. Compared with other physical training methods, it showed gains in speed, robustness, and power consumption.
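The idea of replacing backpropagation with a second forward pass can be illustrated with a toy, layer-local training loop in the spirit of forward-only learning. This is a hedged sketch, not the published PhyLL algorithm: the dense layer, the "goodness" objective, the logistic loss, and all hyperparameters (`theta`, `lr`) are assumptions invented for demonstration, and an ordinary matrix multiply stands in for the physical wave system.

```python
# Toy sketch of forward-only, layer-local training (an assumption-laden
# illustration of the "second forward pass" idea, NOT the PhyLL method).
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # Per-sample "goodness" score: sum of squared activations.
    return (h ** 2).sum(axis=1)

class LocalLayer:
    """A layer trained with two forward passes and no backpropagation."""

    def __init__(self, n_in, n_out, lr=0.03):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr

    def forward(self, x):
        # Stand-in for the physical system's forward transform.
        return np.maximum(x @ self.W, 0.0)  # ReLU

    def train_step(self, x_pos, x_neg, theta=2.0):
        # Two forward passes: one on "positive" data, one on "negative".
        # The local objective pushes positive goodness above theta and
        # negative goodness below it; no gradients cross layer boundaries.
        for x, positive in ((x_pos, True), (x_neg, False)):
            h = self.forward(x)
            g = goodness(h)
            p = 1.0 / (1.0 + np.exp(-(g - theta)))   # P(sample is positive)
            dL_dg = (p - 1.0) if positive else p     # logistic-loss gradient
            grad_pre = (dL_dg[:, None] * 2.0 * h) * (h > 0.0)
            self.W -= self.lr * (x.T @ grad_pre) / len(x)

# Demo: positives cluster around a fixed direction v, negatives are noise.
d = 8
v = rng.normal(size=d)
layer = LocalLayer(d, 16)
for _ in range(300):
    layer.train_step(v + rng.normal(0.0, 0.1, (32, d)),
                     rng.normal(0.0, 1.0, (32, d)))

g_pos = goodness(layer.forward(v + rng.normal(0.0, 0.1, (64, d)))).mean()
g_neg = goodness(layer.forward(rng.normal(0.0, 1.0, (64, d)))).mean()
```

After training, positive inputs should score a higher mean goodness than negative ones, which is the only signal the layer ever needed; in an analog realization, the forward transform would be executed by the physical system rather than a matrix multiply.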
Traditional deep neural networks face challenges of size, complexity, and energy consumption, which has driven the search for physical alternatives to digital deep neural networks. The algorithm developed at EPFL's Laboratory of Wave Engineering (LWE) offers a promising answer to these challenges. The LWE approach still requires some digital updates of the parameters, but the aim is to minimize digital computation as much as possible. Future research will focus on implementing the algorithm on larger-scale optical systems and on overcoming technical limits to network scalability.
Webdesk AI News: Physical Neural Networks, December 7, 2023