Classical and quantum algorithms for nonconvex optimization

Date: 2021-10-14

Time: Thursday, October 14, 2021, 10:30am

Speaker: Chenyi Zhang (Tsinghua)

Address: S621

Nonconvex optimization is a central research topic in machine learning, mainly because the loss functions of many machine learning models (including neural networks) are typically nonconvex. However, finding a global optimum of a nonconvex function is hard in general (the running time can be exponential in the dimension n). Instead, many theoretical works focus on finding local optima.
The main obstacle is the ubiquity of saddle points, where optimization algorithms may get stuck. We propose a simple gradient-descent-based algorithm that, for a smooth function, outputs an approximate local minimum in O(log n) gradient descent iterations. Compared to the previous state-of-the-art algorithm, which uses O(log^4 n) iterations, our algorithm is polynomially better in terms of log n. Technically, our main contribution is an idea for implementing a Hessian power method using only gradients; it finds directions of negative curvature near saddle points, which indicate feasible directions for escaping them.
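
For intuition, the sketch below (not the speaker's actual algorithm; the function names and parameter choices are illustrative assumptions) shows how a Hessian power method can be run with gradient queries alone: a Hessian-vector product H(x)v is approximated by a finite difference of two gradients, and power iteration on I - eta*H amplifies the direction of most negative curvature.

import numpy as np

def hessian_vector_product(grad, x, v, r=1e-5):
    # H(x) @ v ~ (grad(x + r*v) - grad(x - r*v)) / (2*r) for unit-norm v,
    # so each Hessian-vector product costs only two gradient queries.
    return (grad(x + r * v) - grad(x - r * v)) / (2 * r)

def negative_curvature_direction(grad, x, dim, eta=0.1, iters=100, seed=0):
    # Power iteration on I - eta*H(x). For eta <= 1/L (L a bound on the
    # Hessian spectral norm), its top eigenvector is the eigenvector of
    # H(x) with the most negative eigenvalue.
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = v - eta * hessian_vector_product(grad, x, v)
        v /= np.linalg.norm(v)
    curvature = v @ hessian_vector_product(grad, x, v)  # Rayleigh quotient
    return v, curvature

# Example: f(x, y) = x^2 - y^2 has a saddle at the origin.
grad_f = lambda p: np.array([2 * p[0], -2 * p[1]])
v, curv = negative_curvature_direction(grad_f, np.zeros(2), dim=2)
# curv ~ -2 and v ~ (0, +-1): stepping along +-v escapes the saddle.

The point of the gradient-only construction is that no Hessian is ever formed or queried, so the per-iteration cost stays at that of ordinary gradient descent.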
For the same problem, we also propose a quantum algorithm that outputs an approximate local minimum using O(log^2 n) queries to a quantum oracle that provides only function values, with no gradient information.
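
For comparison, a classical algorithm restricted to the same value-only access would typically recover gradient information by finite differences, paying about 2n function-value queries per gradient estimate; a quantum oracle can be queried on superpositions of inputs, which is, roughly, what lets the quantum algorithm avoid this per-gradient overhead and reach a poly-logarithmic total query count. A minimal classical sketch of value-only gradient estimation (the helper name is hypothetical):

import numpy as np

def gradient_from_values(f, x, r=1e-6):
    # Classical zeroth-order gradient estimate: 2n function-value queries,
    # one central difference along each coordinate axis.
    n = x.size
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = r
        g[i] = (f(x + e) - f(x - e)) / (2 * r)
    return g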

Host: Dandan Xu
