We consider a class of statistical inverse problems involving the estimation of a regression operator from a Polish space to a separable Hilbert space, where the target lies in a vector-valued reproducing kernel Hilbert space induced by an operator-valued kernel. To address the associated ill-posedness, we analyze regularized stochastic gradient descent (SGD) algorithms in both online and finite-horizon settings. The former uses polynomially decaying step sizes and regularization parameters, while the latter adopts fixed values. Under suitable structural and distributional assumptions, we establish dimension-independent bounds for prediction and estimation errors. The resulting convergence rates are near-optimal in expectation, and we also derive high-probability estimates that imply almost sure convergence. Our analysis introduces a general technique for obtaining high-probability guarantees in infinite-dimensional settings. Possible extensions to broader kernel classes and encoder-decoder structures are briefly discussed.
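To fix ideas, the online iteration analyzed here typically takes the following schematic form; this is a sketch under the assumption that the algorithm follows the standard regularized kernel SGD template, with $K_x$ the operator-valued kernel section at $x$, and the constants $\gamma_0, \lambda_0, a, b$ purely illustrative:
\[
f_{t+1} = f_t - \gamma_t \Bigl( K_{x_t}\bigl(f_t(x_t) - y_t\bigr) + \lambda_t f_t \Bigr),
\qquad \gamma_t = \gamma_0\, t^{-a}, \quad \lambda_t = \lambda_0\, t^{-b},
\]
where $(x_t, y_t)$ is the sample observed at step $t$. In the finite-horizon variant, $\gamma_t$ and $\lambda_t$ are instead held constant at values chosen as functions of the total number of iterations.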