Presentation + Paper
18 June 2024
Large-scale neural network in passive silicon photonics for biologically plausible learning
Alessio Lugnan, Alessandro Foradori, Stefano Biasi, Peter Bienstman, Lorenzo Pavesi
Abstract
Neuromorphic computing hardware that requires conventional training procedures based on backpropagation is difficult to scale, because it demands full observability of network states and programmability of network parameters. The search for hardware-friendly, biologically plausible learning schemes, and for platforms suited to them, is therefore pivotal for the future development of the field. We present a novel experimental study of a photonic integrated neural network featuring rich recurrent nonlinear dynamics and both short- and long-term plasticity. Scalability in these architectures is greatly enhanced by the capability to process inputs and generate outputs encoded concurrently in the temporal, spatial, and wavelength domains. Moreover, we discuss a novel biologically plausible, backpropagation-free, and hardware-friendly learning procedure based on our neuromorphic hardware.
Conference Presentation
© (2024) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Alessio Lugnan, Alessandro Foradori, Stefano Biasi, Peter Bienstman, and Lorenzo Pavesi "Large-scale neural network in passive silicon photonics for biologically plausible learning", Proc. SPIE 13017, Machine Learning in Photonics, 130170S (18 June 2024); https://doi.org/10.1117/12.3017123
KEYWORDS
Artificial neural networks
Photonics
Machine learning
Nonlinear optics
Silicon photonics
Neural networks
Nonlinear dynamics