We develop a quantitative approximation theory for shallow neural networks using tools from time-frequency analysis. Working in weighted modulation spaces $M^{p,q}_m(\mathbf{R}^{d})$, we prove dimension-independent approximation rates in Sobolev norms $W^{n,r}(\Omega)$ for networks whose units combine standard activations with localized time-frequency windows. Our main result shows that for $f \in M^{p,q}_m(\mathbf{R}^{d})$ and any bounded domain $\Omega \subset \mathbf{R}^{d}$ one can achieve \[ \|f - f_N\|_{W^{n,r}(\Omega)} \lesssim N^{-1/2}\,\|f\|_{M^{p,q}_m(\mathbf{R}^{d})}, \] with explicit control of all constants. We further obtain global approximation theorems on $\mathbf{R}^{d}$ using weighted modulation dictionaries, and derive consequences for the Feichtinger algebra, Fourier--Lebesgue spaces, and Barron spaces. Numerical experiments in one and two dimensions confirm that modulation-based networks achieve substantially better Sobolev approximation than standard ReLU networks, consistent with the theoretical estimates.
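For concreteness, a minimal sketch of the kind of $N$-term approximant the rate refers to, assuming Gabor-type units built from a fixed window $g$ (the activation $\sigma$, the window $g$, and the parameters $(c_j, a_j, b_j, y_j, \omega_j)$ are illustrative placeholders, not the paper's exact parametrization):
\[
f_N(x) \;=\; \sum_{j=1}^{N} c_j\, \sigma\!\bigl(\langle a_j, x\rangle + b_j\bigr)\, g(x - y_j)\, e^{2\pi i \langle \omega_j, x\rangle},
\]
where $g(x - y_j)\, e^{2\pi i \langle \omega_j, x\rangle} = M_{\omega_j} T_{y_j} g(x)$ is a time-frequency shift of the window. The $N^{-1/2}$ rate is characteristic of Maurey--Jones--Barron-type arguments, in which $f$ is represented as an integral over such a dictionary and $f_N$ is obtained by sampling $N$ of its terms.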