We present the Deep Copula Classifier (DCC), a class-conditional generative model that separates marginal estimation from dependence modeling using neural copula densities. DCC is interpretable, Bayes-consistent, and attains an excess risk of $O(n^{-r/(2r+d)})$ for $r$-smooth copulas. In a controlled two-class study with strong feature dependence ($|\rho|=0.995$), DCC learns Bayes-aligned decision regions. With oracle or pooled marginals, it nearly attains the best achievable performance (accuracy $\approx 0.971$; ROC-AUC $\approx 0.998$); as expected, per-class KDE marginals perform less well (accuracy $0.873$; ROC-AUC $0.957$; PR-AUC $0.966$). On the Pima Indians Diabetes dataset, calibrated DCC ($\tau=1$) achieves accuracy $0.879$, ROC-AUC $0.936$, and PR-AUC $0.870$, outperforming Logistic Regression, SVM (RBF), and Naive Bayes, and tying Logistic Regression for the lowest Expected Calibration Error (ECE). Random Forest is also competitive (accuracy $0.892$; ROC-AUC $0.933$; PR-AUC $0.880$). Directly modeling feature dependence yields strong, well-calibrated performance with a clear probabilistic interpretation, making DCC a practical, theoretically grounded alternative to independence-based classifiers.
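For concreteness, the separation of marginals from dependence can be written via the standard copula factorization (Sklar's theorem); the notation below ($c_k$, $F_{k,j}$, $f_{k,j}$, $\pi_k$) is illustrative, a sketch of the class-conditional construction rather than the source's exact formulation:
\[
f_k(x) \;=\; c_k\!\big(F_{k,1}(x_1),\,\ldots,\,F_{k,d}(x_d)\big)\,\prod_{j=1}^{d} f_{k,j}(x_j),
\qquad
\widehat{P}(Y=k \mid x) \;=\; \frac{\pi_k\,\widehat{f}_k(x)}{\sum_{k'} \pi_{k'}\,\widehat{f}_{k'}(x)},
\]
where the marginal densities $f_{k,j}$ (with CDFs $F_{k,j}$) are estimated separately from the copula density $c_k$, which the neural component models, and classification proceeds by the plug-in Bayes rule with class priors $\pi_k$.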