Pure Exploration in Infinitely-Armed Bandit Models with Fixed-Confidence
