Recently, Mamba-based super-resolution (SR) methods have demonstrated the ability to capture global receptive fields with linear complexity, addressing the quadratic computational cost of Transformer-based SR approaches. However, existing Mamba-based methods lack fine-grained transitions across different modeling scales, which limits the efficiency of feature representation. In this paper, we propose T-PMambaSR, a lightweight SR framework that integrates window-based self-attention with Progressive Mamba. By enabling interactions among receptive fields of different scales, our method establishes a fine-grained modeling paradigm that progressively enhances feature representation with linear complexity. Furthermore, we introduce an Adaptive High-Frequency Refinement Module (AHFRM) to recover the high-frequency details lost during Transformer and Mamba processing. Extensive experiments demonstrate that T-PMambaSR progressively enlarges the model's receptive field and strengthens its expressiveness, yielding better performance than recent Transformer- and Mamba-based methods at lower computational cost. Our code will be released upon acceptance.