This paper presents a comprehensive analysis of hyperparameter estimation within the empirical Bayes framework (EBF) for sparse learning. By studying the influence of hyperpriors on the solution of the EBF, we establish a theoretical connection between the choice of hyperprior and both the sparsity and the local optimality of the resulting solutions. We show that certain strictly increasing hyperpriors, such as the half-Laplace and the half-generalized Gaussian with power in $(0,1)$, effectively promote sparsity and improve solution stability with respect to measurement noise. Based on this analysis, we adopt a proximal alternating linearized minimization (PALM) algorithm with convergence guarantees for both convex and concave hyperpriors. Extensive numerical tests on two-dimensional image deblurring problems demonstrate that introducing appropriate hyperpriors significantly promotes the sparsity of the solution and enhances restoration accuracy. Furthermore, we illustrate the influence of the noise level and the ill-posedness of the inverse problem on EBF solutions.