On the Sparsity of the Strong Lottery Ticket Hypothesis
Neural Information Processing Systems
Considerable research efforts have recently been made to show that a random neural network N contains subnetworks capable of accurately approximating any given neural network that is sufficiently smaller than N, without any training. This line of research, known as the Strong Lottery Ticket Hypothesis (SLTH), was originally motivated by the weaker Lottery Ticket Hypothesis, which states that a sufficiently large random neural network N contains sparse subnetworks that can be trained efficiently to achieve performance comparable to that of training the entire network N. Despite its original motivation, results on the SLTH have so far not provided any guarantee on the size of subnetworks. This limitation is due to the nature of the main technical tool leveraged by these results, the Random Subset Sum (RSS) Problem. Informally, the RSS Problem asks how large a random i.i.d. sample must be so that, with high probability, every target value in a given interval can be approximated, up to a prescribed accuracy, by the sum of some subset of the sample.
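The RSS phenomenon described above can be illustrated empirically. The sketch below (an illustration, not the paper's construction; the sample size and targets are arbitrary choices) draws i.i.d. uniform samples from [-1, 1] and brute-forces the subset whose sum lands closest to each target, showing that even a modest sample approximates every target to high precision:

```python
import itertools
import random

def best_subset_sum_error(samples, target):
    """Brute-force the subset of `samples` whose sum is closest to `target`;
    return the resulting absolute approximation error."""
    best = abs(target)  # the empty subset sums to 0
    for r in range(1, len(samples) + 1):
        for subset in itertools.combinations(samples, r):
            best = min(best, abs(sum(subset) - target))
    return best

random.seed(0)
n = 16  # sample size; RSS theory predicts error shrinking exponentially in n
samples = [random.uniform(-1, 1) for _ in range(n)]

# A few arbitrary targets in [-1, 1]; all are approximated very closely.
targets = (-0.8, -0.3, 0.25, 0.7)
errors = [best_subset_sum_error(samples, t) for t in targets]
print(max(errors))
```

With 2^16 subsets whose sums spread over a bounded interval, the worst-case gap to any target is tiny, matching the exponential-in-n precision that RSS-based SLTH proofs exploit.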