Sigmoid-like activation functions, as available in analog hardware, differ in various ways from the standard sigmoidal function: they are usually asymmetric, truncated, and have a non-standard gain. We present an adaptation of the backpropagation learning rule that compensates for these nonstandard sigmoids. The method is applied to multilayer neural networks with all-optical forward propagation and liquid-crystal light valves (LCLVs) as optical thresholding devices. We present simulation results for a backpropagation neural network using five different LCLV response curves as activation functions. Although LCLVs perform poorly with the standard backpropagation algorithm, we show that our adapted learning rule performs well with these LCLV curves.
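The core idea can be sketched as follows: instead of assuming the standard logistic derivative s(1 - s), backpropagation uses the exact derivative of the nonstandard activation. This is a minimal illustrative sketch, not the paper's implementation; the activation parameters (amplitude `a`, offset `b`, gain `g`, shift `c`) are hypothetical stand-ins for a measured LCLV response curve, and the XOR task is chosen only as a toy example.

```python
import numpy as np

# Hypothetical LCLV-like activation: asymmetric (shifted by c), truncated
# (output range [b, a + b] rather than [0, 1]), with non-unit gain g.
# These parameter values are illustrative, not taken from any LCLV data.
def lclv(x, a=0.8, b=0.1, g=2.5, c=0.3):
    return b + a / (1.0 + np.exp(-g * (x - c)))

def lclv_deriv(x, a=0.8, b=0.1, g=2.5, c=0.3):
    # Exact derivative of the curve above, used in place of s * (1 - s).
    s = 1.0 / (1.0 + np.exp(-g * (x - c)))
    return a * g * s * (1.0 - s)

# One-hidden-layer network trained on XOR. Targets are placed inside the
# truncated output range [0.1, 0.9] so they remain reachable.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0.2], [0.8], [0.8], [0.2]])

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(20000):
    # Forward pass through the nonstandard activation at both layers.
    z1 = X @ W1 + b1; h = lclv(z1)
    z2 = h @ W2 + b2; y = lclv(z2)
    # Backward pass: delta rule with the true derivative of the curve.
    d2 = (y - T) * lclv_deriv(z2)
    d1 = (d2 @ W2.T) * lclv_deriv(z1)
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)
```

After training, thresholding the outputs at the midpoint of the truncated range separates the two XOR classes; using the logistic derivative s(1 - s) here would mismatch the actual slope of the curve and slow or stall learning.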