Balanced Neural ODEs: nonlinear model order reduction and Koopman operator approximations, by Julius Aka and 4 other authors
Abstract: Variational Autoencoders (VAEs) are a powerful framework for learning latent representations of reduced dimensionality, while Neural ODEs excel at learning transient system dynamics. This work combines the strengths of both to generate fast surrogate models with adjustable complexity that react to time-varying input signals. By leveraging the VAE's dimensionality reduction with a nonhierarchical prior, our method adaptively assigns stochastic noise, naturally complementing known Neural ODE training enhancements and enabling probabilistic time series modeling. We show that standard Latent ODEs struggle with dimensionality reduction in systems with time-varying inputs. Our approach mitigates this by continuously propagating variational parameters through time, establishing fixed information channels in latent space. This results in a flexible and robust method that can learn different system complexities, e.g. deep neural networks or linear matrices. It thereby enables efficient approximation of the Koopman operator without requiring its dimensionality to be predefined. Because our method balances dimensionality reduction against reconstruction accuracy, we call it Balanced Neural ODE (B-NODE). We demonstrate the effectiveness of this method on several academic and real-world test cases, e.g. a power plant and MuJoCo data.
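The following is a minimal sketch of the idea outlined in the abstract, not the authors' implementation: an encoder maps the full state to variational parameters (mean, log-variance), a latent ODE propagates those parameters continuously through time under a time-varying input, and a decoder reconstructs the state from a reparameterized sample. All names (e.g. BalancedNeuralODESketch), layer sizes, and the explicit Euler integrator are assumptions made for brevity.

```python
import torch
import torch.nn as nn


class BalancedNeuralODESketch(nn.Module):
    """Hypothetical illustration of propagating variational parameters through time."""

    def __init__(self, state_dim: int, input_dim: int, latent_dim: int):
        super().__init__()
        # Encoder: full state -> variational parameters (mu, log sigma^2).
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, 64), nn.Tanh(), nn.Linear(64, 2 * latent_dim)
        )
        # Latent dynamics: d/dt [mu, logvar] = f([mu, logvar], u(t)).
        # Propagating both parameters keeps fixed stochastic channels in latent space.
        self.dynamics = nn.Sequential(
            nn.Linear(2 * latent_dim + input_dim, 64),
            nn.Tanh(),
            nn.Linear(64, 2 * latent_dim),
        )
        # Decoder: sampled latent state -> reconstructed full state.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.Tanh(), nn.Linear(64, state_dim)
        )

    def forward(self, x0, u_seq, dt: float):
        # x0: (batch, state_dim); u_seq: (steps, batch, input_dim)
        z = self.encoder(x0)  # variational parameters at t = 0
        recons = []
        for u in u_seq:
            # Explicit Euler step on the variational parameters (a solver stand-in).
            z = z + dt * self.dynamics(torch.cat([z, u], dim=-1))
            mu, logvar = z.chunk(2, dim=-1)
            # Reparameterization trick: sample a latent state, then decode it.
            sample = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
            recons.append(self.decoder(sample))
        return torch.stack(recons), mu, logvar


# Toy usage: a 10-dimensional system with a 2-dimensional input signal,
# reduced to a 3-dimensional latent space over 50 Euler steps.
model = BalancedNeuralODESketch(state_dim=10, input_dim=2, latent_dim=3)
x0 = torch.randn(8, 10)
u_seq = torch.randn(50, 8, 2)
recon, mu, logvar = model(x0, u_seq, dt=0.01)
print(recon.shape)  # torch.Size([50, 8, 10])
```

Replacing the nonlinear `dynamics` network with a single linear layer acting on the latent state would correspond to the linear (Koopman-style) variant mentioned in the abstract.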
Submission history
From: Julius Aka
[v1] Mon, 14 Oct 2024 05:45:52 UTC (9,885 KB)
[v2] Tue, 15 Oct 2024 10:15:12 UTC (9,885 KB)
[v3] Tue, 14 Jan 2025 13:11:05 UTC (10,485 KB)