New Statistical and Computational Results for Learning Junta Distributions
arXiv.org Artificial Intelligence
We study the problem of learning junta distributions on $\{0, 1\}^n$, where a distribution is a $k$-junta if its probability mass function depends on a subset of at most $k$ variables. We make two main contributions:
- We show that learning $k$-junta distributions is \emph{computationally} equivalent to learning $k$-parity functions with noise (LPN), a landmark problem in computational learning theory.
- We design an algorithm for learning junta distributions whose statistical complexity is optimal, up to polylogarithmic factors; computationally, our algorithm matches the complexity of previous (non-sample-optimal) algorithms.

Combined, our two contributions imply that our algorithm cannot be significantly improved, statistically or computationally, barring a breakthrough for LPN.
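The two objects the abstract relates, a $k$-junta distribution and an LPN sample, can be illustrated with a small sketch. This is not the paper's algorithm; all function names, the lookup-table representation, and the parameter choices below are illustrative assumptions.

```python
import random


def sample_junta(n, relevant, table, rng=random):
    """Draw one sample from a k-junta distribution on {0,1}^n.

    The pmf depends only on the coordinates in `relevant` (at most k of them);
    `table` maps each assignment of those coordinates to its probability.
    The remaining coordinates are independent uniform bits.
    """
    assignments = list(table.keys())
    weights = [table[a] for a in assignments]
    chosen = rng.choices(assignments, weights=weights, k=1)[0]
    x = [rng.randint(0, 1) for _ in range(n)]
    for idx, bit in zip(relevant, chosen):
        x[idx] = bit
    return tuple(x)


def lpn_sample(n, secret, eta, rng=random):
    """One LPN example: uniform x with its k-parity label flipped w.p. eta.

    `secret` is the set of coordinates defining the hidden k-parity.
    """
    x = [rng.randint(0, 1) for _ in range(n)]
    label = sum(x[i] for i in secret) % 2
    if rng.random() < eta:
        label ^= 1  # noisy label
    return x, label


# Example: a 2-junta on {0,1}^5 whose mass depends only on coordinates 1 and 3.
table = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
samples = [sample_junta(5, [1, 3], table) for _ in range(1000)]
```

The learner sees only such samples and must recover the relevant coordinates; the paper's equivalence says doing so for junta distributions is computationally as hard as recovering `secret` from `lpn_sample` outputs.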
Jul-15-2025