We study the statistical complexity of private linear regression under an unknown, potentially ill-conditioned covariate distribution. Somewhat surprisingly, under privacy constraints the intrinsic complexity is \emph{not} captured by the usual covariance matrix but rather by its $L_1$ analogues. Building on this insight, we establish minimax convergence rates for both the central and local privacy models and introduce an Information-Weighted Regression method that attains the optimal rates. As an application, for private linear contextual bandits, we propose an efficient algorithm that achieves rate-optimal regret bounds of order $\sqrt{T}+\frac{1}{\alpha}$ and $\sqrt{T}/\alpha$ under the joint and local $\alpha$-privacy models, respectively. Notably, our results demonstrate that joint privacy comes at almost no additional cost, addressing the open problems posed by Azize and Basu (2024).