Equivariant Imaging (EI) regularization has become the de-facto technique for unsupervised training of deep imaging networks, without any need for ground-truth data. Observing that the current EI-based unsupervised training paradigm suffers from significant computational redundancy, leading to inefficiency in high-dimensional applications, we propose a sketched EI regularization that leverages randomized sketching techniques for acceleration. We apply our sketched EI regularization to develop an accelerated deep internal learning framework, which can be applied efficiently for test-time network adaptation. Additionally, for network adaptation tasks, we propose a parameter-efficient approach that accelerates both EI and Sketched-EI by optimizing only the normalization layers. Our numerical studies on X-ray CT and multicoil magnetic resonance image reconstruction demonstrate that our approach achieves significant computational acceleration over the standard EI counterpart, especially in test-time training tasks.
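The core idea can be illustrated with a minimal numerical sketch. Here a regularized linear pseudo-inverse stands in for the reconstruction network, a cyclic shift stands in for the group transform, and a Gaussian matrix `S` plays the role of the randomized sketch; all names, shapes, and operator choices below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative): signal, measurement, and sketch sizes.
n, m, k = 64, 32, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)   # linear forward operator
S = rng.standard_normal((k, m)) / np.sqrt(k)   # random sketching matrix

def f(y, Aop):
    # Stand-in "network": Tikhonov-regularized pseudo-inverse reconstruction.
    return Aop.T @ np.linalg.solve(Aop @ Aop.T + 1e-2 * np.eye(Aop.shape[0]), y)

def T(x):
    # Group action: a cyclic shift, a simple invariance transform.
    return np.roll(x, 5)

y = A @ rng.standard_normal(n)

# Standard EI consistency: reconstructing the re-measured transformed estimate
# should recover the transformed estimate itself.
x1 = f(y, A)
x2 = T(x1)
ei_loss = np.mean((f(A @ x2, A) - x2) ** 2)

# Sketched EI: the same consistency term, but enforced through the cheaper
# sketched operator S @ A (dimension k instead of m).
SA = S @ A
sketched_ei_loss = np.mean((f(SA @ x2, SA) - x2) ** 2)
```

The sketched loss touches only a `k`-dimensional measurement space per step, which is where the computational saving over standard EI would come from; in practice the sketch would be redrawn across training iterations.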