TY - EJOU
AU - Hu, Wujie
AU - Yuan, Gonglin
AU - Pham, Hongtruong
TI - A Modified Three-Term Conjugate Gradient Algorithm for Large-Scale Nonsmooth Convex Optimization
T2 - Computers, Materials & Continua
PY - 2020
VL - 62
IS - 2
SN - 1546-2226
AB - It is well known that Newton and quasi-Newton algorithms are effective for small-
and medium-scale smooth problems because they make full use of the gradient
information of the objective function, but they fail to solve nonsmooth problems. The
bundle method successfully addresses both smooth and nonsmooth complex problems,
but it is regrettably effective only for small and medium optimization models,
since it must store and update the bundle information of the parameters. The
conjugate gradient algorithm is effective for both large-scale smooth and nonsmooth
optimization models owing to its simplicity: it uses only the objective function's
information together with the Moreau-Yosida regularization technique. Thus, a modified
three-term conjugate gradient algorithm is proposed; it has a sufficient descent
property and a trust region character. Moreover, it possesses global convergence under
mild assumptions, and numerical tests show that it is more efficient than similar
optimization algorithms.
KW - Conjugate gradient
KW - large-scale
KW - trust region
KW - global convergence
DO - 10.32604/cmc.2020.02993