Open Access

ARTICLE


Distributed Stochastic Optimization with Compression for Non-Strongly Convex Objectives

Xuanjie Li, Yuedong Xu*

School of Information Science and Technology, Fudan University, Shanghai, China

* Corresponding Author: Yuedong Xu

Computer Modeling in Engineering & Sciences 2024, 139(1), 459-481. https://doi.org/10.32604/cmes.2023.043247

Abstract

We investigate the distributed optimization problem, in which a network of nodes cooperates to minimize a global objective that is a finite sum of their locally stored functions. Because nodes exchange optimization parameters over the wireless network, large-scale training models can create communication bottlenecks that slow down training. To address this issue, CHOCO-SGD was proposed; it compresses the exchanged information to arbitrary precision without reducing the convergence rate for strongly convex objective functions. However, many convex functions of practical interest are not strongly convex (e.g., logistic regression and Lasso), which raises the question of whether the algorithm also applies to non-strongly convex functions. In this paper, we provide the first theoretical analysis of the convergence rate of CHOCO-SGD on non-strongly convex objectives. We derive a sufficient condition, which bounds the fidelity of compression, to guarantee convergence. Moreover, our analysis demonstrates that, within this fidelity threshold, the algorithm significantly reduces the transmission burden while retaining the same order of convergence rate as its uncompressed counterpart. Numerical experiments validate the theoretical findings, demonstrating that CHOCO-SGD improves communication efficiency while keeping the same order of convergence rate. The experiments also show that the algorithm fails to converge under low compression fidelity and in time-varying topologies. Overall, our study offers valuable insights into the applicability of CHOCO-SGD to non-strongly convex objectives and provides practical guidelines for researchers seeking to use the algorithm in real-world scenarios.
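For concreteness, the following is a minimal NumPy sketch of the CHOCO-SGD update loop summarized in the abstract: each node takes a local stochastic gradient step, broadcasts a compressed difference to update the public copies of the iterates, and then performs a gossip correction toward its neighbors. The logistic-regression objective (convex but not strongly convex), the ring topology, the top-k sparsification compressor, and all step sizes are illustrative assumptions for this centralized simulation, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: n nodes jointly minimize the logistic loss
# f(x) = (1/n) * sum_i f_i(x), which is convex but not strongly convex.
n, d, m = 8, 10, 50                       # nodes, dimension, samples per node
A = [rng.standard_normal((m, d)) for _ in range(n)]
b = [rng.choice([-1.0, 1.0], size=m) for _ in range(n)]

def stoch_grad(i, x):
    """Stochastic logistic-loss gradient at node i from one random sample."""
    k = rng.integers(m)
    a, y = A[i][k], b[i][k]
    return -y * a / (1.0 + np.exp(y * (a @ x)))

def top_k(v, k=2):
    """Top-k sparsification: keep only the k largest-magnitude coordinates."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Ring topology with uniform gossip weights (doubly stochastic mixing matrix W).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1 / 3

eta, gamma, T = 0.1, 0.4, 2000            # SGD step size, consensus step size, iterations
x = np.zeros((n, d))                      # private iterates x_i
x_hat = np.zeros((n, d))                  # public (compressed) copies x̂_i

for t in range(T):
    # 1) Local stochastic gradient step on each node.
    for i in range(n):
        x[i] -= eta * stoch_grad(i, x[i])
    # 2) Each node compresses the change of its public copy and broadcasts it;
    #    all nodes apply the received compressed updates to the public copies.
    q = np.array([top_k(x[i] - x_hat[i]) for i in range(n)])
    x_hat += q
    # 3) Gossip correction toward neighbors' public copies.
    x += gamma * (W @ x_hat - x_hat)

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
```

Only the sparse vectors q are ever communicated, which is the source of the bandwidth savings; the compression fidelity of top_k (here k=2 out of d=10 coordinates) plays the role of the fidelity parameter that the paper's sufficient condition constrains.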

Keywords


Cite This Article

APA Style
Li, X., & Xu, Y. (2024). Distributed stochastic optimization with compression for non-strongly convex objectives. Computer Modeling in Engineering & Sciences, 139(1), 459-481. https://doi.org/10.32604/cmes.2023.043247
Vancouver Style
Li X, Xu Y. Distributed stochastic optimization with compression for non-strongly convex objectives. Comput Model Eng Sci. 2024;139(1):459-481. https://doi.org/10.32604/cmes.2023.043247
IEEE Style
X. Li and Y. Xu, "Distributed Stochastic Optimization with Compression for Non-Strongly Convex Objectives," Comput. Model. Eng. Sci., vol. 139, no. 1, pp. 459-481, 2024. https://doi.org/10.32604/cmes.2023.043247



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.