On the Robustness of Perceptron Learning Recurrent Networks
Related publications (35)
With ever greater computational resources and more accessible software, deep neural networks have become ubiquitous across industry and academia. Their remarkable ability to generalize to new samples defies the conventional view, which holds that complex, ...
How can we decipher the hidden structure of a network based on limited observations? This question arises in many scenarios, ranging from social networks to wireless and neural networks. In such settings, we typically observe the nodes’ behaviors (e.g., the time ...
Neuroscientific theories aim to explain paradigm cases of consciousness such as masking, binocular rivalry or the transition from dreamless sleep to wakefulness. The most popular theories are based on computational principles. Recurrent processing is a key ...
This paper provides a time-domain feedback analysis of the perceptron learning algorithm and of training schemes for dynamic networks with output feedback. It studies the robustness performance of the algorithms in the presence of uncertainties that might ...
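As background for the perceptron learning algorithm the paper above analyzes, here is a minimal sketch of the classical update rule, assuming binary labels in {-1, +1} and a separable toy dataset; the paper's time-domain feedback analysis and uncertainty model are not reproduced here.

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Classical perceptron learning rule: update weights only on
    misclassified samples.  X: (n_samples, n_features), y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:      # misclassified sample
                w += lr * yi * xi           # standard perceptron update
                b += lr * yi
                errors += 1
        if errors == 0:                     # converged on separable data
            break
    return w, b

# Toy separable data: positive vs. negative quadrant.
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
```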
The goal of the scene labeling task is to assign a class label to each pixel in an image. To ensure good visual coherence and high class accuracy, it is essential for a model to capture long-range (pixel) label dependencies in images. In a feed-forward ...
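To illustrate the recurrent idea behind such models, here is a toy sketch in which each iteration's per-pixel class probabilities are fed back as extra input to the classifier. All shapes and the linear model are illustrative assumptions; real systems use convolutional features over spatial neighborhoods, which this toy omits.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def recurrent_labeling(pixels, W, V, n_iters=3):
    """Toy recurrent labeling: class scores depend on pixel features (W)
    plus the previous iteration's label probabilities (V), so label
    information propagates across iterations."""
    n, _ = pixels.shape
    n_classes = W.shape[1]
    probs = np.full((n, n_classes), 1.0 / n_classes)   # uninformative start
    for _ in range(n_iters):
        probs = softmax(pixels @ W + probs @ V)        # feed labels back in
    return probs.argmax(axis=1)

rng = np.random.default_rng(0)
pixels = rng.normal(size=(6, 4))    # 6 "pixels", 4 features each
W = rng.normal(size=(4, 3))         # 3 classes
V = rng.normal(size=(3, 3))
print(recurrent_labeling(pixels, W, V))
```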
We investigate generic models for cortical microcircuits, i.e., recurrent circuits of integrate-and-fire neurons with dynamic synapses. These complex dynamic systems subserve the amazing information processing capabilities of the cortex, but are at the pre ...
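For readers unfamiliar with the neuron model named in the abstract above, a minimal sketch of a single leaky integrate-and-fire neuron driven by a constant input current; the parameter values are illustrative, and the dynamic synapses studied in the paper are not modeled.

```python
def simulate_lif(I=1.5, tau_m=10.0, v_rest=0.0, v_th=1.0, v_reset=0.0,
                 dt=0.1, t_max=100.0):
    """Leaky integrate-and-fire neuron: tau_m * dV/dt = -(V - v_rest) + I.
    Emits a spike and resets whenever V crosses the threshold v_th.
    All parameter values are illustrative, not taken from the paper."""
    steps = int(t_max / dt)
    v = v_rest
    spike_times = []
    for k in range(steps):
        v += dt / tau_m * (-(v - v_rest) + I)   # forward-Euler integration
        if v >= v_th:
            spike_times.append(k * dt)
            v = v_reset                          # reset after each spike
    return spike_times

print(simulate_lif())  # regular spiking for a suprathreshold input
```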
The optimal settings of the initial weights, learning rate, and gain of the activation function, key parameters of a neural network that influence training time and generalization performance, are investigated by means of a large number of experiments ...
Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks like high order and multilayer perceptrons. This publication aims at determining the optimal variance (or range) for the initial weights a ...
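As a concrete instance of the variance-scaled initialization the two preceding abstracts investigate, a sketch that draws initial weights with variance inversely proportional to a unit's fan-in (the familiar 1/fan_in scaling); the exact optimal variance derived in the cited papers may differ.

```python
import numpy as np

def init_weights(fan_in, fan_out, rng=np.random.default_rng(0)):
    """Draw initial weights with Var(w) = 1 / fan_in, so the variance of a
    unit's pre-activation stays roughly independent of its number of inputs.
    The optimal variance derived in the cited papers may differ."""
    std = np.sqrt(1.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W1 = init_weights(784, 128)   # example layer sizes (illustrative)
print(W1.std())               # close to 1/sqrt(784) ≈ 0.0357
```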
Neural networks are widely applied in research and industry. However, their broader application is hampered by various technical details. Among these details are several training parameters and the choice of the topology of the network. The subject of this ...
Quantization of the parameters of a perceptron is a central problem in hardware implementation of neural networks using a numerical technology. A neural model with each weight limited to a small integer range will require little silicon area. Moreover, ...
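To make the hardware constraint concrete, a sketch of uniform quantization of trained weights to a small signed-integer range, here a hypothetical 4-bit range [-8, 7]; the paper's actual quantization scheme is not given in this snippet and may differ.

```python
import numpy as np

def quantize_weights(w, n_bits=4):
    """Uniformly quantize real-valued weights to signed integers in
    [-2**(n_bits-1), 2**(n_bits-1) - 1], returning the integer codes
    and the scale needed to map them back to real values."""
    q_min = -(2 ** (n_bits - 1))
    q_max = 2 ** (n_bits - 1) - 1
    scale = np.abs(w).max() / q_max          # one scale per weight vector
    q = np.clip(np.round(w / scale), q_min, q_max).astype(int)
    return q, scale

w = np.array([0.31, -0.72, 0.05, 0.98])
q, s = quantize_weights(w)
print(q, q * s)  # integer codes and their dequantized approximation
```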