Correlated-Sequence Differential Privacy
Yifan Luo, Meng Zhang, Jin Xu, Junting Chen, Jianwei Huang
arXiv.org Artificial Intelligence
Abstract--Data streams collected from multiple sources are rarely independent. Values evolve over time and influence one another across sequences. These correlations improve prediction in healthcare, finance, and smart-city control, yet they violate the record-independence assumption built into most Differential Privacy (DP) mechanisms. To restore rigorous privacy guarantees without sacrificing utility, we introduce Correlated-Sequence Differential Privacy (CSDP), a framework specifically designed for preserving privacy in correlated sequential data. CSDP addresses two linked challenges: quantifying the extra information an attacker gains from joint temporal and cross-sequence links, and adding just enough noise to hide that information while keeping the data useful. We model multivariate streams as a Coupling Markov Chain, yielding a loose leakage bound expressed in a few spectral terms and revealing a counterintuitive result: stronger coupling can actually decrease worst-case leakage by dispersing perturbations across sequences. Guided by these bounds, we build the Freshness-Regulated Adaptive Noise (FRAN) mechanism--combining data aging, correlation-aware sensitivity scaling, and Laplace noise--that runs in linear time. Tests on two-sequence datasets show that CSDP improves the privacy-utility trade-off by approximately 50% over existing correlated-DP methods and by two orders of magnitude compared to the standard DP approach.
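The three ingredients the abstract names for FRAN (data aging, correlation-aware sensitivity scaling, Laplace noise) can be illustrated with a minimal sketch. This is not the paper's algorithm: the function name, the geometric `decay` weighting, and the scalar `corr_factor` (a stand-in for the paper's spectral coupling terms) are all illustrative assumptions; the actual mechanism calibrates noise from the Coupling Markov Chain bounds.

```python
import numpy as np

def fran_sketch(stream, epsilon, base_sensitivity, corr_factor,
                decay=0.9, rng=None):
    """Hedged sketch of a FRAN-style release (hypothetical parameters).

    - Data aging: older values are down-weighted by decay ** age,
      so stale entries contribute less and need less protection.
    - Correlation-aware sensitivity: the base sensitivity is inflated
      by corr_factor, a scalar stand-in for the spectral coupling terms.
    - Laplace noise calibrated to the scaled sensitivity and epsilon.

    Runs in O(n) for a length-n stream, matching the linear-time claim.
    """
    rng = np.random.default_rng() if rng is None else rng
    stream = np.asarray(stream, dtype=float)
    n = len(stream)
    ages = np.arange(n)[::-1]          # newest value has age 0
    aged = stream * decay ** ages      # freshness regulation
    scale = base_sensitivity * corr_factor / epsilon
    return aged + rng.laplace(0.0, scale, size=n)
```

With a large `epsilon` the noise scale shrinks toward zero and the release approaches the aged stream itself, which makes the freshness weighting easy to inspect in isolation.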
Nov-25-2025
- Country:
- Asia > China
- Guangdong Province > Shenzhen (0.06)
- Hong Kong (0.04)
- Hubei Province > Wuhan (0.04)
- Genre:
- Research Report (0.64)
- Industry:
- Health & Medicine (0.88)
- Information Technology > Security & Privacy (0.93)