On the Convergence Analysis of Cubic Regularized Symmetric Rank-1 Quasi-Newton Method and the Incremental Version in the Application of Large-Scale Problems
By: Huiming Chen, Wong-Hing Lam, Shing-Chow Chan
Format: | Article
---|---
Published: | IEEE, 2019-01-01
Description
This paper provides a detailed study of the convergence properties of the cubic regularized symmetric rank-1 (SR1) method (CuREG-SR1) proposed in [2]. The main advantage of incorporating the cubic regularization technique into SR1 is that it alleviates the problem of an indefinite approximation matrix arising in SR1. However, its convergence under the line search framework is less studied. Here, we first show that CuREG-SR1 converges to a first-order critical point. Moreover, we give a novel result: provided the uniformly independent assumption holds, the difference between the approximate Hessian generated by CuREG-SR1 and the true Hessian is bounded. In addition, we show that for a problem of dimension d, CuREG-SR1 generates q-d superlinear steps every q iterations. We also propose a novel incremental CuREG-SR1 (ICuREG-SR1) algorithm to adapt SR1 and CuREG-SR1 efficiently to large-scale problems. The basic idea is to incorporate an incremental optimization scheme, which progressively updates information for an objective function consisting of a sum of individual functions, a structure frequently encountered in large-scale machine learning. Numerical experiments on several machine learning problems show that the proposed algorithm offers superior performance in terms of gradient magnitude compared with the other conventional algorithms tested.
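For context, the indefiniteness issue mentioned above stems from the classical SR1 update itself, which, unlike BFGS, does not preserve positive definiteness. The sketch below shows the standard textbook SR1 rank-1 correction with its usual skip safeguard; it is not the paper's CuREG-SR1 variant (which additionally solves a cubic-regularized subproblem), and all names are illustrative.

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """Classical SR1 Hessian-approximation update.

    B : current symmetric Hessian approximation (n x n)
    s : step,                s = x_{k+1} - x_k
    y : gradient difference, y = grad f(x_{k+1}) - grad f(x_k)

    The update is skipped when the denominator is tiny, the
    standard safeguard against numerical blow-up; note the
    resulting B may still be indefinite, which is the issue
    cubic regularization is meant to alleviate.
    """
    r = y - B @ s                        # residual of the secant condition
    denom = r @ s
    if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B                         # skip the update
    return B + np.outer(r, r) / denom   # symmetric rank-1 correction

# Quadratic test problem f(x) = 0.5 x^T A x, whose true Hessian is A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.eye(2)                            # initial approximation
x0, x1 = np.array([1.0, 0.0]), np.array([0.5, 0.3])
s = x1 - x0
y = A @ x1 - A @ x0                      # gradient difference equals A s here
B = sr1_update(B, s, y)
# The updated B satisfies the secant condition B s = y exactly.
```

A single SR1 update enforces the secant condition along the most recent step, which is what makes the bounded Hessian-approximation error result (under the uniformly independent assumption) plausible after sufficiently many well-spread steps.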