Highly-Smooth Zero-th Order Online Optimization

By Vianney Perchet

Appears in collection: Schlumberger workshop - Computational and statistical trade-offs in learning

We consider online convex optimization with noisy zero-th order information, that is, noisy function evaluations at any desired point. We focus on problems with high degrees of smoothness, such as online logistic regression. We show that, as opposed to gradient-based algorithms, high-order smoothness may be used to improve estimation rates, with a precise dependence on the degree of smoothness and the dimension. In particular, we show that for infinitely differentiable functions we recover the same dependence on sample size as gradient-based algorithms, up to an extra dimension-dependent factor. This is done for convex and strongly convex functions, in constrained or global optimization, with either one-point or two-point noisy evaluations of the function. Joint work with F. Bach.
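
The estimators in the talk exploit higher-order smoothness through more elaborate perturbation schemes; as a baseline for comparison, the sketch below shows the classic two-point zeroth-order gradient estimate (in the style of Nesterov-Spokoiny) plugged into projected online gradient descent. The objective `noisy_f`, the unit-ball constraint set, and the step-size and perturbation parameters are illustrative assumptions, not the algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of given radius (illustrative constraint set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def two_point_gradient_estimate(f, x, delta, rng):
    """Two-point zeroth-order estimate: g = d/(2*delta) * (f(x+delta*u) - f(x-delta*u)) * u,
    with u uniform on the unit sphere; each call to f returns a noisy function value."""
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    return (d / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * u

def zeroth_order_ogd(f, d, T, eta=0.1, delta=0.05, rng=rng):
    """Projected online gradient descent driven only by noisy function evaluations."""
    x = np.zeros(d)
    avg = np.zeros(d)
    for t in range(1, T + 1):
        g = two_point_gradient_estimate(f, x, delta, rng)
        x = project_ball(x - (eta / np.sqrt(t)) * g)  # decreasing step size
        avg += (x - avg) / t  # running average of the iterates
    return avg

# Toy target (assumed for the demo): noisy evaluations of a smooth convex,
# logistic-type loss centered at x_star.
x_star = np.full(4, 0.3)
def noisy_f(x, noise=0.01):
    return np.log(1.0 + np.exp(np.sum((x - x_star) ** 2))) + noise * rng.standard_normal()

print(zeroth_order_ogd(noisy_f, d=4, T=5000))  # should land near x_star
```

This baseline uses only the first-order smoothness of the objective; the rates discussed in the talk come from averaging perturbations in a way that cancels higher-order Taylor terms when the function is smoother.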

Information about the video

  • Date of recording: 22/03/2016
  • Date of publication: 27/03/2016
  • Institution: IHES
  • Format: MP4
