The precise one-shot characterisation of operational tasks in classical and quantum information theory relies on different forms of smooth entropic quantities. A particularly important connection is the one between the hypothesis testing relative entropy and the smooth max-relative entropy, which together govern many operational settings. We first strengthen this connection into a type of equivalence: we show that the hypothesis testing relative entropy is equivalent to a variant of the smooth max-relative entropy based on the information spectrum divergence, which can alternatively be understood as a measured smooth max-relative entropy. Furthermore, we improve a fundamental lemma due to Datta and Renner that connects the different variants of the smooth max-relative entropy, introducing a modified proof technique based on matrix geometric means and a tightened gentle measurement lemma. We use these connections and tools to strictly improve on previously known one-shot bounds and duality relations between the smooth max-relative entropy and the hypothesis testing relative entropy, establishing provably tight bounds between them. Finally, we use these results to refine other divergence inequalities, in particular sharpening bounds that connect the max-relative entropy with Rényi divergences.
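For reference, a brief sketch of the two central quantities in the standard conventions (the choice of smoothing ball $B^{\varepsilon}(\rho)$, e.g. in purified or trace distance, is a convention the paper itself fixes and is only assumed here):

\begin{align}
  D_H^{\varepsilon}(\rho\|\sigma) &:= -\log \min\bigl\{ \operatorname{Tr}[\Lambda\sigma] \,:\, 0 \le \Lambda \le \mathbb{1},\ \operatorname{Tr}[\Lambda\rho] \ge 1-\varepsilon \bigr\}, \\
  D_{\max}(\rho\|\sigma) &:= \log \min\bigl\{ \lambda \,:\, \rho \le \lambda\sigma \bigr\}, \qquad
  D_{\max}^{\varepsilon}(\rho\|\sigma) := \min_{\tilde\rho \in B^{\varepsilon}(\rho)} D_{\max}(\tilde\rho\|\sigma).
\end{align}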