Noise propagation in feedforward and reservoir neural networks
20 August 2020
Nadezhda Semenova, Xavier Porte, Maxime Jacquot, Laurent Larger, Daniel Brunner
Abstract
Maximal computing performance can only be achieved if neural networks are fully implemented in hardware. Besides the potentially large benefits, such parallel and analogue hardware platforms face new, fundamental challenges. An important concern is that such systems might ultimately succumb to the detrimental impact of noise. We study noise propagation through deep neural networks with various neuron nonlinearities, trained via back-propagation for image recognition and time-series prediction. We consider correlated and uncorrelated, multiplicative and additive noise, and use noise amplitudes extracted from a physical experiment. The developed analytical framework is of great relevance for future hardware neural networks: it allows predicting the noise level at the system's output based on the properties of its constituents. As such, it is an essential tool for future hardware neural network engineering and performance estimation.
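The setting described in the abstract can be illustrated with a small numerical sketch: uncorrelated additive and multiplicative Gaussian noise is injected at every neuron of a feedforward network, and the output noise statistics are estimated over many trials. All concrete choices below (tanh nonlinearity, noise amplitudes, layer sizes, random weights) are assumptions for illustration only, not the paper's actual framework or experimental values.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_forward(x, weights, sigma_add=0.01, sigma_mul=0.01, n_trials=1000):
    """Propagate input x through a feedforward network whose neurons
    suffer uncorrelated additive and multiplicative Gaussian noise.
    Returns per-trial outputs so output noise can be estimated
    empirically. Illustrative sketch only: the tanh nonlinearity and
    noise amplitudes are assumptions, not the paper's values."""
    outputs = np.empty((n_trials, weights[-1].shape[0]))
    for t in range(n_trials):
        a = x
        for W in weights:
            y = np.tanh(W @ a)                            # noiseless neuron response
            mul = 1.0 + sigma_mul * rng.standard_normal(y.shape)
            add = sigma_add * rng.standard_normal(y.shape)
            a = y * mul + add                             # noisy neuron output
        outputs[t] = a
    return outputs

# Hypothetical 3-layer network with random (untrained) weights
layers = [8, 16, 16, 4]
weights = [rng.standard_normal((m, n)) / np.sqrt(n)
           for n, m in zip(layers[:-1], layers[1:])]
x = rng.standard_normal(8)

out = noisy_forward(x, weights)
print("output std per neuron:", out.std(axis=0))
```

Comparing `out.std(axis=0)` for different `sigma_add`/`sigma_mul` settings gives an empirical view of how noise accumulates layer by layer, which is the quantity an analytical framework like the one in the abstract would predict from the constituents' properties.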
Conference Presentation
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Nadezhda Semenova, Xavier Porte, Maxime Jacquot, Laurent Larger, and Daniel Brunner "Noise propagation in feedforward and reservoir neural networks", Proc. SPIE 11469, Emerging Topics in Artificial Intelligence 2020, 114690J (20 August 2020); https://doi.org/10.1117/12.2570727
KEYWORDS: Neural networks, Neurons, Chaos, Numerical simulations