Open Access



A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization

Wujie Hu1, Gonglin Yuan1, *, Hongtruong Pham2

1 College of Mathematics and Information Science, Guangxi University, Nanning, China.
2 Thai Nguyen University of Economics and Business Administration, Thai Nguyen, Vietnam.

* Corresponding Author: Gonglin Yuan.

Computers, Materials & Continua 2020, 62(2), 787-800.


It is well known that Newton and quasi-Newton algorithms are effective for small- and medium-scale smooth problems because they make full use of the gradient information of the objective function, but they fail on nonsmooth problems. Bundle methods, built on the concept of a 'bundle', successfully address both smooth and nonsmooth complex problems; regrettably, they are effective only for small and medium optimization models, since they must store and update the bundle of parameter information. The conjugate gradient algorithm, by contrast, is effective for both large-scale smooth and nonsmooth optimization models owing to its simplicity: it uses only objective-function information, combined with the Moreau-Yosida regularization technique. Thus, a modified three-term conjugate gradient algorithm is proposed; its search direction satisfies a sufficient descent property and a trust region character. The algorithm possesses global convergence under mild assumptions, and numerical tests show it is more efficient than similar optimization algorithms.
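The abstract's approach can be illustrated with a minimal sketch: smooth the nonsmooth term via its Moreau-Yosida envelope (whose gradient is expressed through the proximal operator), then minimize with a three-term conjugate gradient direction. The specific three-term formula of the paper is not given here; the sketch below uses the well-known Zhang-Zhou-Li three-term PRP form, which guarantees the sufficient descent identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2. The test problem (a least-squares term plus the envelope of the l1-norm) and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def envelope_l1(x, lam):
    """Moreau-Yosida envelope of ||.||_1: a smooth, Huber-type function."""
    a = np.abs(x)
    return np.sum(np.where(a <= lam, x ** 2 / (2 * lam), a - lam / 2))

def obj(x, A, b, lam):
    """Smoothed objective: 0.5*||Ax-b||^2 + envelope of ||x||_1."""
    r = A @ x - b
    return 0.5 * r @ r + envelope_l1(x, lam)

def grad(x, A, b, lam):
    """Gradient of the smoothed objective.
    The envelope's gradient is (x - prox_{lam*||.||_1}(x)) / lam."""
    return A.T @ (A @ x - b) + (x - soft_threshold(x, lam)) / lam

def three_term_cg(A, b, lam=1e-2, tol=1e-6, max_iter=1000):
    """Three-term CG (Zhang-Zhou-Li PRP form) on the smoothed objective."""
    x = np.zeros(A.shape[1])
    g = grad(x, A, b, lam)
    d = -g.copy()
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Armijo backtracking line search (d is a descent direction).
        alpha, f0, slope = 1.0, obj(x, A, b, lam), g @ d
        while obj(x + alpha * d, A, b, lam) > f0 + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new, A, b, lam)
        y = g_new - g
        # Three-term direction: d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k,
        # which gives g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 (sufficient descent).
        beta = (g_new @ y) / gnorm2
        theta = (g_new @ d) / gnorm2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.5, 1.0]
    b = A @ x_true
    x = three_term_cg(A, b)
    print("final gradient norm:", np.linalg.norm(grad(x, A, b, 1e-2)))
```

Because the envelope is smooth and its gradient is cheap (one proximal evaluation), the method stores only a few vectors per iteration, which is what makes the conjugate gradient approach attractive at large scale.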


Cite This Article

W. Hu, G. Yuan and H. Pham, "A modified three-term conjugate gradient algorithm for large-scale nonsmooth convex optimization," Computers, Materials & Continua, vol. 62, no. 2, pp. 787–800, 2020.


This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.