Fast Approximate Natural Gradient Descent in a Kronecker-Factored Eigenbasis

Authors:
Thomas George, Mila, Université de Montréal
César Laurent, Mila, Université de Montréal
Xavier Bouthillier, Université de Montréal
Nicolas Ballas, Facebook FAIR
Pascal Vincent, Facebook FAIR and Université de Montréal

Introduction:

Optimization algorithms that leverage gradient covariance information, such as variants of natural gradient descent (Amari, 1998), offer the prospect of yielding more effective descent directions. In the present work, the authors draw inspiration from both diagonal approximations and factored approximations such as KFAC to propose a novel approximation that is provably better than KFAC and amenable to cheap partial updates.

Abstract:

Optimization algorithms that leverage gradient covariance information, such as variants of natural gradient descent (Amari, 1998), offer the prospect of yielding more effective descent directions. For models with many parameters, the covariance matrix they are based on becomes gigantic, making them inapplicable in their original form. This has motivated research into both simple diagonal approximations and more sophisticated factored approximations such as KFAC (Heskes, 2000; Martens & Grosse, 2015; Grosse & Martens, 2016). In the present work we draw inspiration from both to propose a novel approximation that is provably better than KFAC and amenable to cheap partial updates. It consists in tracking a diagonal variance, not in parameter coordinates, but in a Kronecker-factored eigenbasis, in which the diagonal approximation is likely to be more effective. Experiments show improvements over KFAC in optimization speed for several deep network architectures.
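To make the abstract's core idea concrete, here is a minimal NumPy sketch for a single fully-connected layer. It is an illustration under stated assumptions, not the authors' implementation: the Kronecker factors `A` (activation covariance) and `B` (backpropagated-gradient covariance) are random placeholders, and the variance estimate `s` uses a single gradient sample where a running average would be tracked in practice.

```python
import numpy as np

rng = np.random.default_rng(0)
in_dim, out_dim = 4, 3

# KFAC approximates a layer's Fisher block as a Kronecker product B (x) A.
# Here A and B are synthetic symmetric positive-definite stand-ins.
A = rng.standard_normal((in_dim, in_dim)); A = A @ A.T + np.eye(in_dim)
B = rng.standard_normal((out_dim, out_dim)); B = B @ B.T + np.eye(out_dim)

# Eigenbases of the two small Kronecker factors (refreshed only occasionally).
dA, UA = np.linalg.eigh(A)
dB, UB = np.linalg.eigh(B)

# The idea sketched in the abstract: rather than scaling by the Kronecker
# product of eigenvalues, track a diagonal variance of the gradient expressed
# in the Kronecker-factored eigenbasis; this diagonal is cheap to update.
G = rng.standard_normal((out_dim, in_dim))   # gradient of the weight matrix
G_eig = UB.T @ G @ UA                        # rotate gradient into the eigenbasis
s = G_eig ** 2                               # per-coordinate second moment (one sample here)

# Preconditioned step: divide by the tracked variance plus damping, rotate back.
damping = 1e-3
step = UB @ (G_eig / (s + damping)) @ UA.T
```

Because `A` and `B` stay fixed between refreshes while `s` is updated every step, the diagonal rescaling adapts cheaply without recomputing eigendecompositions.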
