Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization

  • Alekh Agarwal,
  • Peter Bartlett,
  • Pradeep Ravikumar,
  • Martin Wainwright

IEEE Transactions on Information Theory, Vol. 58

Relative to the large literature on upper bounds
on the complexity of convex optimization, less attention has
been paid to the fundamental hardness of these problems. Given the
extensive use of convex optimization in machine learning and
statistics, gaining an understanding of these complexity-theoretic
issues is important. In this paper, we study the complexity of
stochastic convex optimization in an oracle model of computation.
We introduce a new notion of discrepancy between functions,
and use it to reduce problems of stochastic convex optimization
to statistical parameter estimation, which can be lower bounded
using information-theoretic methods. Using this approach, we
improve upon known results and obtain tight minimax complexity
estimates for various function classes.
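The reduction described in the abstract follows a standard Fano-type template. As a hedged sketch in generic notation (the symbols below are illustrative, not the paper's exact statement): if one constructs a family of functions that are pairwise well separated under the discrepancy measure, then any method that optimizes accurately after a limited number of oracle queries also identifies which function it faced, and Fano's inequality converts that into a lower bound on the minimax optimization error.

```latex
% Hedged sketch of a Fano-type reduction (generic notation, not the
% paper's exact statement). Suppose f_1, \dots, f_M are pairwise
% 2\delta-separated under the discrepancy measure, so that achieving
% optimization error below \delta after T oracle queries determines
% the true index. Fano's inequality then gives
\epsilon^*(T) \;\ge\; \delta \left( 1 - \frac{T\, I_{\max} + \log 2}{\log M} \right),
% where \epsilon^*(T) is the minimax optimization error after T queries
% and I_{\max} bounds the mutual information revealed per oracle query.
```

Choosing the family size \(M\) and separation \(\delta\) to balance the two terms is what yields tight minimax complexity estimates for a given function class.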