Cannelli, Loris and Facchinei, Francisco and Kungurtsev, Vyacheslav and Scutari, Gesualdo
Cannelli, L., Facchinei, F., Kungurtsev, V., & Scutari, G. (2019). Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization: Model and Convergence. Mathematical Programming, 1–34.
Abstract
We propose a novel asynchronous parallel algorithmic framework for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The proposed framework hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of modern computational architectures and asynchronous implementations more faithfully than current state-of-the-art models. Key features of the proposed framework are: i) it accommodates inconsistent reads, meaning that components of the vector variables may be written by some cores while being simultaneously read by others; ii) it covers in a unified way several different specific solution methods; and iii) it accommodates a variety of possible parallel computing architectures. Almost sure convergence to stationary solutions is proved. Numerical results, reported in the companion paper [5], on both convex and nonconvex problems show that our method can consistently outperform existing parallel asynchronous algorithms.
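To make the abstract's description concrete, the following is a minimal illustrative sketch, not the authors' implementation, of one scheme of the kind the framework covers: several workers repeatedly pick a block of a shared vector, read the current iterate without locking (so reads may be inconsistent), minimize a convex surrogate of the objective restricted to that block (here, a linearization of the smooth part plus a proximal term, solved in closed form by soft-thresholding), and write the block back. The toy problem, block rule, step size, and surrogate choice are assumptions made for illustration; Python threads only interleave under the GIL, which still suffices to illustrate lock-free, possibly inconsistent reads.

import threading
import numpy as np

n, m = 200, 100
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = rng.standard_normal(m)
lam, alpha = 0.01, 0.1                   # l1 weight, nonconvex-penalty weight
gamma = 0.5 / (np.linalg.norm(A, 2) ** 2 + 2 * alpha)   # conservative step size

x = np.zeros(n)                          # shared iterate, read and written without locks
block_size = 10
num_workers, iters_per_worker = 4, 2000

def grad_f(z):
    # gradient of the smooth nonconvex part
    # f(z) = 0.5 * ||A z - b||^2 + alpha * sum_i z_i^2 / (1 + z_i^2)
    return A.T @ (A @ z - b) + alpha * 2 * z / (1 + z ** 2) ** 2

def soft_threshold(z, t):
    # closed-form minimizer of the block surrogate with an l1 regularizer
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(iters_per_worker):
        blk = int(local_rng.integers(0, n // block_size)) * block_size
        idx = slice(blk, blk + block_size)
        z = x.copy()                     # lock-free read: may mix old and new components
        g = grad_f(z)[idx]
        # minimize the convex surrogate (linearization + proximal term) on this block
        x[idx] = soft_threshold(z[idx] - gamma * g, gamma * lam)

threads = [threading.Thread(target=worker, args=(s,)) for s in range(num_workers)]
for t in threads:
    t.start()
for t in threads:
    t.join()

obj = (0.5 * np.linalg.norm(A @ x - b) ** 2
       + alpha * np.sum(x ** 2 / (1 + x ** 2))
       + lam * np.abs(x).sum())
print(f"objective: {obj:.4f}, nonzeros: {int(np.count_nonzero(x))}")

This sketch corresponds to one fixed choice of surrogate (proximal-linear) and a uniform random block rule; the paper's framework allows more general surrogates, block selection rules, and delay models, and establishes almost sure convergence under its probabilistic model rather than for this particular toy setup.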
Citation
@article{cannelli2019,
  title   = {Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization: Model and Convergence},
  author  = {Cannelli, Loris and Facchinei, Francisco and Kungurtsev, Vyacheslav and Scutari, Gesualdo},
  journal = {Mathematical Programming},
  year    = {2019},
  pages   = {1--34},
  url     = {http://www.optimization-online.org/DB_FILE/2017/01/5823.pdf}
}