|Cedric Josz||University of California, Berkeley|
|Yi Ouyang||Preferred Networks|
|Richard Zhang||University of California, Berkeley|
|Javad Lavaei||University of California, Berkeley|
|Somayeh Sojoudi||University of California, Berkeley|
We study the set of continuous functions that admit no spurious local optima (i.e., local minima that are not global minima), which we term global functions. These functions satisfy several powerful properties for analyzing nonconvex and nonsmooth optimization problems. For instance, they obey a theorem akin to the fundamental uniform limit theorem of analysis for continuous functions. Global functions are also endowed with useful properties regarding the composition of functions and change of variables. Using these new results, we show that the objective functions of a class of nondifferentiable, nonconvex optimization problems arising in tensor decomposition applications are global functions. This is the first result concerning nonconvex methods for nonsmooth objective functions. Our result provides a theoretical guarantee for the widely used $\ell_1$ norm to avoid outliers in nonconvex optimization.
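To make the notion of a global function concrete, the following sketch (ours, not from the paper) numerically checks a toy one-dimensional instance: $f(x) = |x^2 - 1|$, a nonsmooth, nonconvex $\ell_1$-type factorization residual. Its local minima at $x = \pm 1$ are both global, so no spurious local minimum should be detected. The grid-scan test and all names here are illustrative assumptions.

```python
import numpy as np

# Illustrative check (not the paper's method): f(x) = |x^2 - 1| is
# nonconvex and nonsmooth, yet every local minimum is global.
def f(x):
    return np.abs(x**2 - 1.0)

xs = np.linspace(-2.0, 2.0, 4001)   # grid includes x = -1 and x = 1 exactly
ys = f(xs)

# Interior grid points that are local minima (no smaller neighbor).
local = [i for i in range(1, len(xs) - 1)
         if ys[i] <= ys[i - 1] and ys[i] <= ys[i + 1]]

global_min = ys.min()

# A spurious local minimum would sit strictly above the global minimum.
spurious = [i for i in local if ys[i] > global_min + 1e-6]
print(len(spurious))  # 0: no spurious local minima on this grid
```

The check is, of course, only a finite-grid heuristic; the paper's contribution is to prove such properties analytically for a whole class of tensor-decomposition objectives.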