Multi-armed Bandits: Competing with Optimal Sequences

Yahoo! Research
Berkeley, CA