Maximal computing performance can only be achieved if neural networks are fully implemented in hardware. Besides the potentially large benefits, such parallel and analogue hardware platforms face new, fundamental challenges; an important concern is that such systems might ultimately succumb to the detrimental impact of noise. We study noise propagation through deep neural networks with various neuron nonlinearities, trained via back-propagation for image recognition and time-series prediction. We consider correlated and uncorrelated, multiplicative and additive noise, and use noise amplitudes extracted from a physical experiment. The developed analytical framework is of great relevance for future hardware neural networks: it allows predicting the noise level at the system's output based on the properties of its constituents. As such, it is an essential tool for engineering future hardware neural networks and estimating their performance.
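The abstract distinguishes multiplicative from additive noise acting on each neuron. As a minimal sketch of this setting (not the authors' actual model), the snippet below injects uncorrelated Gaussian noise of both kinds into every layer of a small feedforward network and estimates the resulting output noise level by repeated stochastic forward passes; the tanh nonlinearity, layer sizes, and noise amplitudes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_layer(x, W, sigma_add=0.01, sigma_mul=0.01):
    """One feedforward layer whose neuron outputs are perturbed by
    uncorrelated multiplicative and additive Gaussian noise
    (assumed noise model, for illustration only)."""
    y = np.tanh(W @ x)                                       # noiseless activation
    y = y * (1 + sigma_mul * rng.standard_normal(y.shape))   # multiplicative noise
    y = y + sigma_add * rng.standard_normal(y.shape)         # additive noise
    return y

def propagate(x, weights, **noise_kw):
    """Propagate an input through all noisy layers."""
    for W in weights:
        x = noisy_layer(x, W, **noise_kw)
    return x

# Toy 4-layer network with fixed random weights
weights = [rng.standard_normal((8, 8)) / np.sqrt(8) for _ in range(4)]
x0 = rng.standard_normal(8)

# Empirical output noise: standard deviation over many stochastic passes
outs = np.stack([propagate(x0, weights) for _ in range(1000)])
output_noise = outs.std(axis=0).mean()
print(f"mean output noise level: {output_noise:.4f}")
```

Comparing such a Monte Carlo estimate against an analytical prediction of the output noise is the kind of check the framework described above enables.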
Nadezhda Semenova, Xavier Porte, Maxime Jacquot, Laurent Larger, Daniel Brunner, "Noise propagation in feedforward and reservoir neural networks," Proc. SPIE 11469, Emerging Topics in Artificial Intelligence 2020, 114690J (20 August 2020); https://doi.org/10.1117/12.2570727