Sigmoid-like activation functions implemented in analog hardware differ in several ways from the standard sigmoid function: they are asymmetric, truncated, and have a non-standard gain. It is demonstrated how the backpropagation learning rule can be adapted to compensate for the non-standard sigmoids available in hardware. This method is applied to multilayer neural networks with all-optical forward propagation and liquid crystal light valves (LCLV) as optical thresholding devices. In this paper, results of software simulations of a backpropagation neural network with five different LCLV activation functions are presented, and it is shown that the adapted learning rule performs well with these LCLV curves.
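The abstract does not spell out the adapted rule, but its core idea, replacing the textbook sigmoid derivative in the delta rule with the local slope of the actual device response, can be sketched as follows. This is only an illustration: the `lclv_activation` parameterization (gain, output floor and ceiling, skew) is a hypothetical stand-in for a measured LCLV transfer curve, not one of the five curves used in the paper, and the XOR task and hyperparameters are chosen purely for a self-contained demo.

```python
import numpy as np

def lclv_activation(x, gain=3.0, low=0.1, high=0.9, skew=0.6):
    # Hypothetical stand-in for a measured LCLV response: an asymmetric
    # (skewed) logistic with non-unit gain, mapped onto a truncated
    # output range [low, high].
    return low + (high - low) / (1.0 + np.exp(-gain * x)) ** skew

def numerical_slope(f, x, eps=1e-4):
    # Backpropagation only needs the local slope of the (possibly
    # non-analytic) hardware curve, so a finite difference suffices.
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

def train_xor(epochs=5000, lr=0.5, hidden=4, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    # Targets placed inside the truncated output range of the device.
    T = np.array([[0.15], [0.85], [0.85], [0.15]])
    W1 = rng.normal(0, 0.5, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        a1 = X @ W1 + b1; h = lclv_activation(a1)
        a2 = h @ W2 + b2; y = lclv_activation(a2)
        # Adapted delta rule: use the slope of the actual device curve
        # instead of the textbook sigma * (1 - sigma) derivative.
        d2 = (y - T) * numerical_slope(lclv_activation, a2)
        d1 = (d2 @ W2.T) * numerical_slope(lclv_activation, a1)
        W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
        W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)
    return lclv_activation(lclv_activation(X @ W1 + b1) @ W2 + b2)

if __name__ == "__main__":
    # Prints the trained network's outputs for the four XOR patterns;
    # with the targets above they should move toward 0.15 / 0.85.
    print(train_xor())
```

The point of the sketch is that the learning rule never assumes the standard logistic derivative; it follows whatever slope the hardware curve actually has, which is how the adaptation compensates for asymmetry, truncation, and non-standard gain.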