Grounding Continuous Representations in Geometry: Equivariant Neural Fields
by David R. Wessels and 6 other authors
Abstract: Conditional Neural Fields (CNFs) are increasingly being leveraged as continuous signal representations, by associating each data sample with a latent variable that conditions a shared backbone Neural Field (NeF) to reconstruct the sample. However, existing CNF architectures face limitations when using this latent in downstream tasks requiring fine-grained geometric reasoning, such as classification and segmentation. We posit that this results from a lack of explicit modelling of geometric information (e.g. locality in the signal or the orientation of a feature) in the latent space of CNFs. As such, we propose Equivariant Neural Fields (ENFs), a novel CNF architecture which uses a geometry-informed cross-attention to condition the NeF on a geometric variable, a latent point cloud of features, that enables an equivariant decoding from latent to field. We show that this approach induces a steerability property by which both field and latent are grounded in geometry and amenable to transformation laws: if the field transforms, the latent representation transforms accordingly, and vice versa. Crucially, this equivariance relation ensures that the latent is capable of (1) representing geometric patterns faithfully, allowing for geometric reasoning in latent space, and (2) weight-sharing over similar local patterns, allowing for efficient learning over datasets of fields. We validate these main properties in a range of tasks including classification, segmentation, forecasting and reconstruction, showing clear improvement over baselines with a geometry-free latent space.
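The mechanism described above, conditioning a neural field on a latent point cloud via geometry-informed cross-attention, can be illustrated with a short sketch. The code below is a minimal, hypothetical illustration and not the authors' implementation: the class name RelativeCrossAttentionNeF and all hyperparameters are placeholders, and only translation equivariance is shown, whereas the paper's construction targets more general transformation groups. The key idea is that attention queries are built from relative coordinates x - p, so the decoded field depends on the latent positions only through relative pose.

```python
# Minimal sketch (assumptions, not the authors' code): a translation-equivariant
# cross-attention decoder. A latent "point cloud" (positions p, context vectors c)
# conditions the field; queries attend via RELATIVE coordinates x - p, so shifting
# both the query coordinates and the latent positions by the same vector leaves
# the output unchanged -- a simple instance of the steerability property.
import torch
import torch.nn as nn

class RelativeCrossAttentionNeF(nn.Module):  # hypothetical name
    def __init__(self, latent_dim=64, hidden=128, out_dim=3, coord_dim=2):
        super().__init__()
        # Query network sees only relative coordinates, never absolute ones.
        self.q = nn.Sequential(nn.Linear(coord_dim, hidden), nn.GELU(),
                               nn.Linear(hidden, hidden))
        self.k = nn.Linear(latent_dim, hidden)
        self.v = nn.Linear(latent_dim, hidden)
        self.out = nn.Linear(hidden, out_dim)

    def forward(self, x, p, c):
        # x: (B, N, coord_dim) query coordinates of the field
        # p: (B, M, coord_dim) latent positions; c: (B, M, latent_dim) contexts
        rel = x[:, :, None, :] - p[:, None, :, :]           # (B, N, M, coord_dim)
        q = self.q(rel)                                     # queries from relative pose only
        k = self.k(c)[:, None, :, :]                        # (B, 1, M, hidden)
        v = self.v(c)[:, None, :, :]                        # (B, 1, M, hidden)
        attn = torch.softmax((q * k).sum(-1) / q.shape[-1] ** 0.5, dim=-1)
        return self.out((attn[..., None] * v).sum(2))       # (B, N, out_dim)
```

Because the decoder consumes x and p only through their difference, applying the same translation t to both the field coordinates and the latent positions reproduces the original output exactly; this is the "if the field transforms, the latent transforms accordingly" relation in its simplest (translation-only) form.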
Submission history
From: David Wessels
[v1] Sun, 9 Jun 2024 12:16:30 UTC (2,622 KB)
[v2] Tue, 11 Jun 2024 12:45:08 UTC (3,095 KB)
[v3] Mon, 17 Jun 2024 07:28:40 UTC (3,096 KB)
[v4] Fri, 4 Oct 2024 15:00:24 UTC (3,387 KB)