Biased Stochastic First-Order Methods for Conditional Stochastic Optimization and Applications in Meta Learning

Neural Information Processing Systems 

Our lower bound analysis shows that the sample complexities of BSGD cannot be improved for general convex and nonconvex objectives, except for smooth nonconvex objectives with a Lipschitz continuous gradient estimator.
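To make the setting concrete: conditional stochastic optimization minimizes F(x) = E_ξ[f_ξ(E_{η|ξ}[g_η(x; ξ)])], and BSGD replaces the inner expectation with a mini-batch average before differentiating, which makes the gradient estimate biased (the bias shrinks as the inner batch grows). The sketch below is a toy scalar illustration under assumed choices, not the paper's implementation: g(x, η) = x + η with η ~ N(0, 1) and f(y) = y⁴, so F(x) = E[ (x + η̄)⁴ ]-style surrogate of x⁴, minimized near x = 0.

```python
import numpy as np

def bsgd(x0, steps, lr, inner_m, rng):
    """BSGD-style iteration on a toy conditional stochastic objective.

    Illustrative assumptions (not from the paper):
      F(x) = E_xi[ f( E_eta[ g(x, eta) ] ) ], with
      g(x, eta) = x + eta,  eta ~ N(0, 1),  f(y) = y**4,
    so the true objective is x**4, minimized at x = 0.
    """
    x = x0
    for _ in range(steps):
        eta = rng.normal(0.0, 1.0, size=inner_m)
        g_hat = x + eta.mean()       # inner mini-batch estimate of E_eta[g(x, eta)]
        grad = 4.0 * g_hat ** 3      # chain rule f'(g_hat) * dg/dx; biased since f' is nonlinear
        x -= lr * grad               # plain SGD step on the biased estimate
    return x

rng = np.random.default_rng(0)
x_final = bsgd(x0=2.0, steps=2000, lr=0.01, inner_m=64, rng=rng)
```

Because f' is nonlinear, E[f'(ĝ)] ≠ f'(E[ĝ]); increasing `inner_m` trades extra inner samples for lower bias, which is exactly the trade-off the paper's sample-complexity bounds quantify.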
