Extension of Recurrent Kernels to different Reservoir Computing topologies
by Giuseppe Alessio D'Inverno and 1 other authors
Abstract: Reservoir Computing (RC) has become popular in recent years due to its fast and efficient computational capabilities. Standard RC has been shown to be equivalent in the asymptotic limit to Recurrent Kernels, which helps in analyzing its expressive power. However, many well-established RC paradigms, such as Leaky RC, Sparse RC, and Deep RC, have yet to be analyzed in this way. This study aims to fill that gap by providing an empirical analysis of the equivalence of specific RC architectures with their corresponding Recurrent Kernel formulations. We conduct a convergence study by varying the activation function implemented in each architecture. Our study also sheds light on the role of sparse connections in RC architectures and proposes an optimal sparsity level that depends on the reservoir size. Furthermore, our systematic analysis shows that in Deep RC models, convergence is better achieved with successive reservoirs of decreasing sizes.
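To make the setting concrete, the following is a minimal sketch of a leaky, sparse reservoir update of the kind the abstract refers to. All parameter names and scaling choices here are illustrative assumptions, not the paper's exact configuration: the recurrent weights are i.i.d. Gaussian with a sparsity mask and variance scaled so that the state norm stays controlled as the reservoir grows, which is the regime in which equivalence with a Recurrent Kernel is studied.

```python
import numpy as np

def leaky_sparse_reservoir(inputs, n_res=500, leak=0.5, sparsity=0.1, seed=0):
    """Illustrative leaky sparse reservoir (parameter names are assumptions).

    State update: x_{t+1} = (1 - leak) * x_t + leak * tanh(W x_t + W_in u_t),
    where W is a sparse random matrix whose nonzero entries are Gaussian,
    scaled by 1/sqrt(sparsity * n_res) to keep the state magnitude stable.
    """
    rng = np.random.default_rng(seed)
    n_in = inputs.shape[1]
    # Sparse recurrent weights: keep each entry with probability `sparsity`
    mask = rng.random((n_res, n_res)) < sparsity
    W = rng.standard_normal((n_res, n_res)) * mask / np.sqrt(sparsity * n_res)
    # Dense input weights
    W_in = rng.standard_normal((n_res, n_in)) / np.sqrt(n_in)
    x = np.zeros(n_res)
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)
    return x

# Drive the reservoir with a short random input sequence
u_seq = np.random.default_rng(1).standard_normal((20, 3))
state = leaky_sparse_reservoir(u_seq)
print(state.shape)  # (500,)
```

In the Recurrent Kernel view, one tracks the inner product of states for two input sequences and takes the expectation over the random weights as the reservoir size goes to infinity; the empirical study compares this limit against finite reservoirs like the one above.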
Submission history
From: Jonathan Dong
[v1] Thu, 25 Jan 2024 22:54:39 UTC (699 KB)
[v2] Fri, 6 Sep 2024 13:49:48 UTC (1,464 KB)