Recurrent neural networks based on reservoir computing are increasingly used in many applications. Optimizing the topological structure of the reservoir and its internal connection weights for a given task is one of the most important problems in reservoir computing. In this paper, exploiting the fact that a large matrix can be constructed as the Kronecker product of several small matrices, we propose a method to optimize the reservoir. Because only a small number of parameters need to be optimized, a gradient-based algorithm is applied to optimize these parameters and, consequently, the reservoir. Besides reducing the number of parameters to optimize, the method can potentially control several other properties of the reservoir, such as the spectral radius, sparsity, weight distribution, and the underlying connections, i.e. the connection topology. To demonstrate the effectiveness of the proposed optimization method, we consider its application to two tasks: nonlinear autoregressive moving average and multiple superimposed oscillators. Simulation results show satisfactory performance of the method.
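The following is a minimal sketch (not the authors' implementation) of the idea the abstract describes: the large reservoir matrix is formed as a Kronecker product of small factor matrices, and those factors are tuned with a gradient-based update. The factor sizes, the toy input/target sequence, the spectral-norm rescaling, and all hyperparameters are illustrative assumptions only.

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
k1, k2, k3, k4 = jax.random.split(key, 4)

# Small factor matrices; their Kronecker product gives a 100x100 reservoir.
A = 0.5 * jax.random.normal(k1, (10, 10))
B = 0.5 * jax.random.normal(k2, (10, 10))
W_in = jax.random.normal(k3, (100, 1))        # fixed input weights

# Toy input/target sequence (a stand-in for a NARMA-style benchmark).
u = jax.random.uniform(k4, (200, 1), minval=0.0, maxval=0.5)
y_target = jnp.roll(u, 2, axis=0)             # illustrative target only


def run_reservoir(params, u):
    A, B, W_out = params
    W = jnp.kron(A, B)                        # large reservoir from small factors
    # Rescale by the spectral norm (differentiable upper bound on the
    # spectral radius) to keep the echo state property.
    W = 0.9 * W / jnp.linalg.norm(W, ord=2)

    def step(x, u_t):
        x_new = jnp.tanh(W @ x + W_in @ u_t)
        return x_new, x_new

    x0 = jnp.zeros(W.shape[0])
    _, states = jax.lax.scan(step, x0, u)
    return states @ W_out                     # linear readout


def loss(params, u, y):
    y_hat = run_reservoir(params, u)
    return jnp.mean((y_hat - y) ** 2)


W_out = jnp.zeros((100, 1))
params = (A, B, W_out)
grad_fn = jax.jit(jax.grad(loss))

# Plain gradient descent on the small factors and the readout.
for i in range(200):
    grads = grad_fn(params, u, y_target)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-2 * g, params, grads)

print("final MSE:", loss(params, u, y_target))
```

Because only the small factors A and B (plus the readout) are trained, the number of free parameters grows with the factor sizes rather than with the full reservoir dimension, which is the parameter reduction the abstract refers to.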
Volkan Cevher, Grigorios Chrysos, Fanghui Liu