1
An Accelerated Convex Optimization Algorithm with Line Search and Applications in Machine Learning. MATHEMATICS 2022. [DOI: 10.3390/math10091491]
Abstract
In this paper, we introduce a new line search technique and employ it to construct a novel accelerated forward–backward algorithm for convex minimization problems in which the objective is the sum of two convex functions, one of them smooth, on a real Hilbert space. We establish weak convergence of the proposed algorithm to a solution without assuming Lipschitz continuity of the gradient of the smooth function. Furthermore, we evaluate its performance on classification problems over various data sets and compare it with other line search algorithms. In these experiments, the proposed algorithm outperforms the other line search algorithms.
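The abstract does not state the iteration explicitly; below is a minimal sketch of a generic accelerated forward–backward method with a backtracking line search in the spirit described above. The momentum rule, the sufficient-decrease test, and all parameter names are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def accelerated_fb(f, grad_f, prox_g, x0, sigma=1.0, rho=0.5, iters=100):
    """Sketch: minimize f(x) + g(x), f smooth convex, g convex.
    The step size is found by backtracking, so no global Lipschitz
    constant for grad f is required."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        # extrapolation (momentum) step; the (k-1)/(k+2) rule is an
        # assumption, not the paper's exact acceleration scheme
        y = x + (k - 1.0) / (k + 2.0) * (x - x_prev)
        gamma = sigma
        while True:
            # forward (gradient) step, then backward (proximal) step
            z = prox_g(y - gamma * grad_f(y), gamma)
            # standard sufficient-decrease backtracking test
            if f(z) <= f(y) + grad_f(y) @ (z - y) + np.dot(z - y, z - y) / (2 * gamma):
                break
            gamma *= rho  # shrink the trial step and retry
        x_prev, x = x, z
    return x

# hypothetical usage: lasso, f = 0.5*||Ax - b||^2, g = lam*||x||_1
A = np.random.randn(20, 5); b = np.random.randn(20); lam = 0.1
soft = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - lam * g, 0.0)
x_hat = accelerated_fb(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                       lambda x: A.T @ (A @ x - b), soft, np.zeros(5))
```

Because the step size comes from backtracking, the sketch never needs a global Lipschitz constant for the gradient, which mirrors the assumption the paper drops.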
2
A New Forward–Backward Algorithm with Line Search and Inertial Techniques for Convex Minimization Problems with Applications. MATHEMATICS 2021. [DOI: 10.3390/math9131562]
Abstract
Over the past few decades, various algorithms have been proposed for convex minimization problems in the form of the sum of two lower semicontinuous convex functions. The convergence of these algorithms was typically guaranteed under an L-Lipschitz condition on the gradient of the objective function. In recent years, inertial techniques have been widely used to accelerate the convergence of such algorithms. In this work, we introduce a new forward–backward splitting algorithm that combines a new line search with an inertial technique to solve convex minimization problems of this form. Weak convergence of the proposed method is established without assuming L-Lipschitz continuity of the gradient of the objective function, and a complexity theorem is given. As applications, we applied the algorithm to data classification and image restoration problems, evaluated its performance with various evaluation tools, and compared it with other algorithms. In these experiments, the proposed algorithm performed better than the other algorithms from the literature.
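For illustration, the following sketch shows how an inertial step can be combined with a step-size line search of the kind common in this literature (a gradient-continuity test rather than a sufficient-decrease test). The fixed inertial parameter and the specific test are assumptions, not the paper's exact rules.

```python
import numpy as np

def inertial_fb(grad_f, prox_g, x0, sigma=1.0, delta=0.5,
                rho=0.5, alpha=0.3, iters=100):
    """Sketch of an inertial forward-backward iteration:
        y_k     = x_k + alpha * (x_k - x_{k-1})        (inertial step)
        x_{k+1} = prox_{gamma, g}(y_k - gamma * grad_f(y_k)),
    with gamma chosen by backtracking until
        gamma * ||grad_f(x_{k+1}) - grad_f(y_k)|| <= delta * ||x_{k+1} - y_k||,
    a test that avoids assuming grad f is globally Lipschitz."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + alpha * (x - x_prev)  # inertial extrapolation
        gamma = sigma
        z = prox_g(y - gamma * grad_f(y), gamma)
        # backtrack until the gradient-continuity test is met
        while gamma * np.linalg.norm(grad_f(z) - grad_f(y)) > delta * np.linalg.norm(z - y):
            gamma *= rho
            z = prox_g(y - gamma * grad_f(y), gamma)
        x_prev, x = x, z
    return x
```

The inertial term reuses the previous iterate to extrapolate before the proximal-gradient step; keeping the inertial parameter fixed here is a simplification of the adaptive rules such papers typically analyze.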