It is an elementary fact in the scientific literature that the Lipschitz norm of the realization function of a feedforward fully-connected rectified linear unit (ReLU) artificial neural network (ANN) can, up to a multiplicative constant, be bounded from above by sums of powers of the norm of the ANN parameter vector. Roughly speaking, in this work we reveal, in the case of shallow ANNs, that the converse inequality also holds. More formally, we prove that the norm of the equivalence class of ANN parameter vectors with the same realization function is, up to a multiplicative constant, bounded from above by a sum of powers of the Lipschitz norm of the ANN realization function (with the exponents $1/2$ and $1$). Moreover, we prove that this upper bound only holds when employing the Lipschitz norm; it holds neither for Hölder norms nor for Sobolev-Slobodeckij norms. Furthermore, we prove that this upper bound only holds for the sum of powers of the Lipschitz norm with the exponents $1/2$ and $1$ and does not hold for the Lipschitz norm alone.
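For orientation, the two bounds can be sketched schematically as follows; the notation is illustrative and not taken from the abstract: $\theta$ denotes an ANN parameter vector, $\mathcal{R}_\theta$ its realization function, $[\theta]$ the equivalence class of parameter vectors with the same realization function, $\|\cdot\|_{\mathrm{Lip}}$ the Lipschitz norm, and $c$, $p_1, \dots, p_k$ unspecified constants and exponents.
\[
  \|\mathcal{R}_\theta\|_{\mathrm{Lip}}
  \;\le\;
  c \sum_{i=1}^{k} \|\theta\|^{p_i}
  \qquad \text{(known upper bound),}
\]
\[
  \bigl\| [\theta] \bigr\|
  \;\le\;
  c \Bigl( \|\mathcal{R}_\theta\|_{\mathrm{Lip}}^{1/2} + \|\mathcal{R}_\theta\|_{\mathrm{Lip}} \Bigr)
  \qquad \text{(converse bound established here for shallow ANNs).}
\]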