NeuralSolver: Learning Algorithms For Consistent and Efficient Extrapolation Across General Tasks, by Bernardo Esteves and 2 other authors
Abstract: We contribute NeuralSolver, a novel recurrent solver that can efficiently and consistently extrapolate, i.e., learn algorithms from smaller problems (in terms of observation size) and execute those algorithms on larger problems. Contrary to previous recurrent solvers, NeuralSolver can be naturally applied both to same-size problems, where the input and output sizes are the same, and to different-size problems, where the sizes of the input and output differ. To allow for this versatility, we design NeuralSolver with three main components: a recurrent module, which iteratively processes input information at different scales; a processing module, responsible for aggregating the previously processed information; and a curriculum-based training scheme, which improves the extrapolation performance of the method. To evaluate our method, we introduce a set of novel different-size tasks and show that NeuralSolver consistently outperforms prior state-of-the-art recurrent solvers in extrapolating to larger problems, while training on smaller problems and requiring fewer parameters than other approaches.
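The abstract's two-module structure can be illustrated with a minimal sketch. This is a hypothetical toy rendition, not the paper's actual architecture: `recurrent_module` applies the same weights repeatedly (so the learned procedure is independent of input size), and `processing_module` aggregates the result into a fixed-size output, which is what makes different-size tasks possible. All function names and shapes here are assumptions for illustration.

```python
import numpy as np

def recurrent_module(state, weights, steps):
    # Iteratively refine the state with shared weights; because the same
    # update is reused at every step, the procedure applies unchanged to
    # inputs of any size (the extrapolation property the paper targets).
    for _ in range(steps):
        state = np.tanh(state @ weights)
    return state

def processing_module(state):
    # Aggregate the per-element states into a fixed-size vector, so the
    # output size need not match the input size (different-size tasks).
    return state.mean(axis=0)

def neural_solver(x, weights, steps=8):
    # x: (n, d) array of n input elements with d features each.
    return processing_module(recurrent_module(x, weights, steps))
```

Note that `neural_solver` produces a `(d,)`-shaped output regardless of `n`, so the same learned weights can be evaluated on inputs far larger than those seen in training; the paper's curriculum-based training scheme (not sketched here) further improves that extrapolation.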
Submission history
From: Bernardo Esteves
[v1]
Fri, 23 Feb 2024 15:51:45 UTC (2,417 KB)
[v2]
Fri, 7 Jun 2024 17:10:09 UTC (3,075 KB)
[v3]
Wed, 30 Oct 2024 13:42:44 UTC (2,752 KB)