Fall 2018 Joint CSC@USC/CommNetS-MHI Seminar Series
Abstract

When is solving a non-convex optimization problem easy? Despite significant research efforts to answer this question, most existing results are problem specific and cannot be applied even after simple changes in the objective function. In this talk, we provide theoretical insights into this question by answering two related questions: 1) Are all local optima of a given optimization problem globally optimal? 2) When can we compute a local optimum of a given non-convex constrained optimization problem efficiently? In the first part of the talk, motivated by the non-convex training problem of deep neural networks, we provide simple sufficient conditions under which any local optimum of a given highly composite optimization problem is globally optimal. Unlike many existing results in the literature, our sufficient condition applies to many non-convex optimization problems, such as the training problems of non-convex multi-linear neural networks and of non-linear neural networks with pyramidal structures.
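To make the landscape question concrete, the sketch below is a hypothetical illustration, not material from the talk: all data, dimensions, and variable names are assumptions. It shows the simplest instance of the multi-linear models mentioned above, a two-layer linear network whose squared-error training loss is non-convex in the weights jointly, even though results of the kind discussed in the talk establish that every local optimum of such models is globally optimal.

```python
import numpy as np

# Illustrative sketch of a "multi-linear" training problem:
# f(W1, W2) = ||W2 @ W1 @ X - Y||_F^2, non-convex in (W1, W2) jointly.
# All data and dimensions below are made up for demonstration.

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 50))    # inputs:  3 features, 50 samples
Y = rng.standard_normal((2, 50))    # targets: 2 outputs,  50 samples

def loss(W1, W2):
    """Squared error of the composed linear map x -> W2 @ W1 @ x."""
    return np.linalg.norm(W2 @ W1 @ X - Y, "fro") ** 2

# Build a near-optimal factorization: A = Y X^+ solves the convex
# least-squares problem in the product W2 @ W1; then split A into factors.
A = Y @ np.linalg.pinv(X)                      # 2 x 3
W1 = np.vstack([A, np.zeros((2, 3))])          # 4 x 3 (hidden width 4)
W2 = np.hstack([np.eye(2), np.zeros((2, 2))])  # 2 x 4

# (W1, W2) and (-W1, -W2) attain the same loss, but their midpoint
# (0, 0) has the strictly larger loss ||Y||_F^2, violating Jensen's
# inequality; hence f is non-convex in the weights jointly.
f_end = loss(W1, W2)
f_mid = loss(np.zeros_like(W1), np.zeros_like(W2))
print(f"loss at factors: {f_end:.3f}, loss at midpoint: {f_mid:.3f}")
```

The midpoint test is a standard way to certify non-convexity: convexity would force the loss at the average of two points to be at most the average of their losses, and the sign-flip symmetry of the factored model breaks exactly this condition.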
Biosketch

Meisam Razaviyayn is an assistant professor in the Department of Industrial and Systems Engineering at the University of Southern California. Prior to joining USC, he was a postdoctoral research fellow in the Department of Electrical Engineering at Stanford University, working with Professor David Tse. He received his PhD in Electrical Engineering with a minor in Computer Science from the University of Minnesota under the supervision of Professor Tom Luo. He obtained his MS degree in Mathematics under the supervision of Professor Gennady Lyubeznik. Meisam Razaviyayn is the recipient of the Signal Processing Society Young Author Best Paper Award in 2014 and was a finalist for the Best Paper Prize for Young Researchers in Continuous Optimization in 2013 and 2016.