How can agents exchange information to learn from one another while protecting privacy? Healthcare centers collaborating on clinical trials must balance knowledge sharing with safeguarding sensitive patient data. We address this challenge by using differential privacy (DP) to control information leakage. Agents update belief statistics via log-linear rules, and calibrated DP noise gives them plausible deniability while preserving rigorous performance guarantees. We study two settings: distributed maximum likelihood estimation (MLE) with a finite set of private signals, and online learning from an intermittent stream of private signals. The noisy aggregation introduces a trade-off between rejecting low-quality states and accepting high-quality ones. The MLE setting applies naturally to binary hypothesis testing with formal statistical guarantees. Through simulations, we demonstrate differentially private, distributed survival analysis on real-world clinical trial data, evaluating treatment efficacy and the impact of biomedical indices on patient survival. Our methods enable privacy-preserving inference with greater efficiency and lower error rates than homomorphic encryption and first-order DP optimization approaches.
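To make the update rule concrete, the following is a minimal sketch of one round of DP log-linear belief exchange for binary hypothesis testing. It is an illustration under stated assumptions, not the paper's implementation: we assume the Laplace mechanism for the DP noise, a complete communication graph with uniform mixing weights, and a given sensitivity bound on the broadcast statistic; the function and parameter names (`dp_log_linear_update` and its arguments) are ours, introduced for exposition.

```python
import numpy as np

def dp_log_linear_update(log_beliefs, weights, log_llrs, epsilon, sensitivity, rng):
    """One round of noisy log-linear belief exchange (illustrative sketch).

    log_beliefs : (n,) log-belief ratios Lambda_i = log p_i(theta_1) / p_i(theta_0)
    weights     : (n, n) row-stochastic mixing matrix of the communication graph
    log_llrs    : (n,) private log-likelihood ratios of the agents' new signals
    epsilon     : per-round privacy budget; sensitivity bounds the shared statistic
    """
    # Each agent perturbs the statistic it broadcasts with Laplace noise
    # calibrated to sensitivity / epsilon (the standard Laplace mechanism);
    # a full treatment would also clip the statistic to enforce the bound.
    noise = rng.laplace(scale=sensitivity / epsilon, size=log_beliefs.shape)
    # Log-linear (geometric) aggregation of neighbors' noisy broadcasts,
    # followed by each agent's private likelihood update on its own signal.
    return weights @ (log_beliefs + noise) + log_llrs

# Toy run: 5 agents test N(1, 1) against N(0, 1); the true state has mean 1.
rng = np.random.default_rng(0)
n = 5
A = np.full((n, n), 1.0 / n)   # complete graph, uniform weights
lam = np.zeros(n)              # uniform priors
for _ in range(200):
    signals = rng.normal(loc=1.0, size=n)
    llrs = signals - 0.5       # log-likelihood ratio of N(1,1) vs N(0,1)
    lam = dp_log_linear_update(lam, A, llrs, epsilon=1.0, sensitivity=1.0, rng=rng)
print(lam > 0)                 # positive log-ratios favor the true hypothesis
```

The sketch shows the trade-off noted above: a smaller epsilon inflates the Laplace noise on every broadcast, slowing the concentration of the log-belief ratios and raising the chance of accepting a low-quality state, while a larger epsilon weakens the privacy guarantee.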