TITLE:
Almost Sure Convergence of Proximal Stochastic Accelerated Gradient Methods
AUTHORS:
Xin Xiang, Haoming Xia
KEYWORDS:
Proximal Stochastic Accelerated Method, Almost Sure Convergence, Composite Optimization, Non-Smooth Optimization, Stochastic Optimization, Accelerated Gradient Method
JOURNAL NAME:
Journal of Applied Mathematics and Physics, Vol. 12, No. 4, April 30, 2024
ABSTRACT: Proximal gradient descent and its accelerated variant are effective methods for minimizing the sum of a smooth and a non-smooth function. When the smooth function can be written as a sum of multiple component functions, the stochastic proximal gradient method performs well; however, its accelerated version remains largely unexplored. This paper proposes a proximal stochastic accelerated gradient (PSAG) method for problems combining smooth and non-smooth components, where the smooth part is the average of multiple block sums. At the same time, most existing convergence analyses hold only in expectation. To this end, under some mild conditions, we establish almost sure convergence with unbiased gradient estimation in the non-smooth setting. Moreover, we show that the minimum of the squared gradient mapping norm converges to zero with probability one.
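The abstract does not spell out the iteration, so the following is a minimal Python sketch of one plausible PSAG-style loop under stated assumptions: Nesterov-style extrapolation for acceleration, uniform sampling of one block to form an unbiased gradient estimate, and soft-thresholding as the proximal operator of an L1 penalty. The function name `psag_sketch` and the step/momentum parameters are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def psag_sketch(grads, prox, x0, step, momentum, n_iters, rng=None):
    """Hypothetical proximal stochastic accelerated gradient loop.

    grads    : per-block gradient functions f_i'(x); a uniformly sampled
               block gives an unbiased estimate of the full gradient.
    prox     : proximal operator of the non-smooth part, prox(v, step).
    momentum : extrapolation weight (Nesterov-style acceleration).
    """
    rng = rng or np.random.default_rng(0)
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(n_iters):
        # Extrapolation (acceleration) step.
        y = x + momentum * (x - x_prev)
        # Unbiased stochastic gradient from one randomly chosen block.
        i = rng.integers(len(grads))
        g = grads[i](y)
        # Proximal step handles the non-smooth part.
        x_prev, x = x, prox(y - step * g, step)
    return x

# Toy usage: lasso-type objective (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1.
rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((50, 10)), rng.standard_normal(50), 0.1
grads = [lambda x, a=a, bi=bi: a * (a @ x - bi) for a, bi in zip(A, b)]
x_hat = psag_sketch(grads, lambda v, t: soft_threshold(v, lam * t),
                    np.zeros(10), step=0.05, momentum=0.9, n_iters=2000)
```

In this sketch the quantity tracked by the paper's almost sure result would be the gradient mapping at each iterate, i.e. (y - prox(y - step * grad, step)) / step, whose squared norm is claimed to have minimum converging to zero with probability one.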