Nearly Tight Bounds for the Continuum-Armed Bandit Problem
Kleinberg, Robert D.
In the multi-armed bandit problem, an online algorithm must choose from a set of strategies in a sequence of n trials so as to minimize the total cost of the chosen strategies. While nearly tight upper and lower bounds are known in the case when the strategy set is finite, much less is known when there is an infinite strategy set.