Solving Convex Min-Min Problems with Smoothness and Strong Convexity in One Group of Variables and Low Dimension in the Other
2021
The article deals with approaches to solving convex problems of the min-min type
that are smooth and strongly convex in only one of the two groups of variables. It is shown that
the proposed approaches, based on Vaidya's method, the fast gradient method, and the accelerated
gradient method with variance reduction, have linear convergence. It is proposed to use Vaidya's
method to solve the outer problem and the fast gradient method to solve the inner (smooth
and strongly convex) one. Because of its importance for applications in machine learning, the case
in which the objective function is the sum of a large number of functions is considered separately. In
this case, the accelerated gradient method with variance reduction is used in place of the fast
gradient method. Results of numerical experiments are presented that illustrate the
advantages of the proposed procedures for a logistic regression problem in which an a priori
distribution for one of the two groups of variables is available.
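The scheme described above can be sketched as follows. The paper uses Vaidya's cutting-plane method for the outer (low-dimensional) problem; for a compact illustration, this sketch substitutes plain gradient descent on g(y) = min_x f(x, y), with the gradient of g obtained via Danskin's theorem from the inner solution. The inner problem, smooth and strongly convex in x, is solved by Nesterov's fast gradient method. The quadratic objective f, the dimensions, and all step sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 2                            # x is the "large" group, y is low-dimensional
A = np.diag(rng.uniform(1.0, 5.0, n))   # SPD: f is smooth and strongly convex in x
B = 0.1 * rng.standard_normal((n, m))

# Illustrative objective: f(x, y) = 0.5 x^T A x - (B y)^T x + 0.5 ||y||^2
mu, L = np.linalg.eigvalsh(A)[[0, -1]]  # strong convexity and smoothness constants in x

def fgm(grad, x0, L, mu, iters=200):
    """Nesterov's fast gradient method for an L-smooth, mu-strongly convex objective."""
    x, z = x0.copy(), x0.copy()
    theta = (1 - np.sqrt(mu / L)) / (1 + np.sqrt(mu / L))
    for _ in range(iters):
        x_new = z - grad(z) / L         # gradient step from the extrapolated point
        z = x_new + theta * (x_new - x) # momentum extrapolation
        x = x_new
    return x

y = rng.standard_normal(m)
for _ in range(50):                     # outer loop on the low-dimensional variable y
    x_star = fgm(lambda x: A @ x - B @ y, np.zeros(n), L, mu)  # inner solve in x
    grad_y = -B.T @ x_star + y          # Danskin: gradient of g(y) = min_x f(x, y)
    y = y - grad_y                      # plain gradient step (step 1 suffices here)

print(np.linalg.norm(y))                # close to 0, the minimizer of g
```

Replacing the outer gradient descent with a cutting-plane method such as Vaidya's is what yields the complexity bounds of the paper, since the outer dimension m is small; the overall structure (inexact inner solves feeding the outer method) is the same.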