15 Nov
arXiv:2411.09686v1 Announce Type: cross Abstract: Several statistical models for regression of a function $F$ on $\mathbb{R}^d$ avoid the statistical and computational curse of dimensionality, for example by imposing and exploiting geometric assumptions on the distribution of the data (e.g. that its support is low-dimensional), strong smoothness assumptions on $F$, or a special structure of $F$. Among the latter, compositional models, which assume $F = f \circ g$ with $g$ mapping to $\mathbb{R}^r$, $r \ll d$, have been studied and include classical single- and multi-index models as well as recent work on neural networks. While the case where $g$ is linear is rather well-understood, much less…
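A minimal sketch (not from the paper) of the compositional structure $F = f \circ g$ described in the abstract, in the well-understood case where $g$ is linear, as in a classical multi-index model; the matrix `A`, the link function `f`, and the noise level are illustrative assumptions.

```python
import numpy as np

# Sketch of a compositional model F = f ∘ g with g: R^d -> R^r and r << d.
# Here g is linear (multi-index case); A and f are illustrative choices only.
rng = np.random.default_rng(0)

d, r = 50, 2                                    # ambient dimension d, latent dimension r << d
A = rng.standard_normal((r, d)) / np.sqrt(d)    # defines the linear map g(x) = A x

def g(x):
    """Low-dimensional linear feature map g: R^d -> R^r."""
    return A @ x

def f(z):
    """Smooth link function on R^r (illustrative choice)."""
    return np.sin(z[0]) + z[1] ** 2

def F(x):
    """Compositional regression function F = f ∘ g."""
    return f(g(x))

# Draw a sample from the regression model y = F(x) + noise.
n = 1000
X = rng.standard_normal((n, d))
y = np.array([F(x) for x in X]) + 0.1 * rng.standard_normal(n)
```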