Large data limits and scaling laws for tSNE
This work considers large-data asymptotics for t-distributed stochastic neighbor embedding (tSNE), a widely used non-linear dimension reduction algorithm. We identify an appropriate continuum limit of the tSNE objective function, which can be viewed as a combination of a kernel-based repulsion and an asymptotically vanishing Laplacian-type regularizer. As a consequence, we show that embeddings of the original tSNE algorithm cannot have any consistent limit as $n \to \infty$. We propose a rescaled model which mitigates the asymptotic decay of the attractive energy, and which does have a consistent limit.
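For context, the abstract's "tSNE objective function" refers to the standard discrete energy from van der Maaten and Hinton's formulation, a Kullback–Leibler divergence between input-space affinities $p_{ij}$ and embedding-space affinities $q_{ij}$ built from a Student-t kernel (this is the standard definition, not notation taken from the paper itself):

```latex
% Standard tSNE objective: KL divergence between affinities P and Q
C(y_1,\dots,y_n) \;=\; \mathrm{KL}(P \,\|\, Q) \;=\; \sum_{i \neq j} p_{ij} \log \frac{p_{ij}}{q_{ij}},
\qquad
q_{ij} \;=\; \frac{\bigl(1 + \|y_i - y_j\|^2\bigr)^{-1}}{\sum_{k \neq l} \bigl(1 + \|y_k - y_l\|^2\bigr)^{-1}}.
```

Splitting the logarithm yields an attractive term $-\sum_{i \neq j} p_{ij} \log$ of the unnormalized kernel and a repulsive normalization term, which is the decomposition the abstract's "kernel-based repulsion" and "Laplacian-type regularizer" refer to in the continuum limit.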
Oct-16-2024