Elastic Consistency: A Practical Consistency Model for Distributed Stochastic Gradient Descent
The result's identifiers
Result code in IS VaVaI
RIV/68407700:21230/21:00351168 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F21%3A00351168)
Result on the web
https://ojs.aaai.org/index.php/AAAI/article/view/17092
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Elastic Consistency: A Practical Consistency Model for Distributed Stochastic Gradient Descent
Original language description
One key element behind the recent progress of machine learning has been the ability to train machine learning models in large-scale distributed shared-memory and message-passing environments. Most of these models are trained using variants of stochastic gradient descent (SGD)-based optimization, but most methods involve some type of consistency relaxation relative to sequential SGD, in order to mitigate its large communication or synchronization costs at scale. In this paper, we introduce a general consistency condition covering communication-reduced and asynchronous distributed SGD implementations. Our framework, called elastic consistency, decouples the system-specific aspects of the implementation from the SGD convergence requirements, giving a general way to obtain convergence bounds for a wide variety of distributed SGD methods used in practice. Elastic consistency can be used to re-derive or improve several previous convergence bounds in message-passing and shared-memory settings, but also to analyze new models and distribution schemes. As a direct application, we propose and analyze a new synchronization-avoiding scheduling scheme for distributed SGD, and show that it can be used to efficiently train deep convolutional models for image classification.
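To make the kind of consistency relaxation described above concrete, the following is a minimal illustrative sketch, not the paper's algorithm or code: a worker repeatedly computes an SGD-style update on a possibly stale "view" of the model, while the quantity an elastic-consistency-style analysis cares about, the gap between the view and the true model, is tracked and stays bounded under bounded staleness. The quadratic objective, the bounded-staleness rule, and all constants (dim, lr, max_staleness) are assumptions made here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, steps, lr, max_staleness = 10, 200, 0.05, 3   # illustrative constants, not from the paper

# Simple strongly convex quadratic f(x) = 0.5 * x^T A x, so grad f(x) = A x.
A = rng.standard_normal((dim, dim))
A = A @ A.T / dim + np.eye(dim)

def grad(z):
    return A @ z

x = rng.standard_normal(dim)      # the "true" shared model
history = [x.copy()]              # past iterates, used to build stale views
max_gap = 0.0

for t in range(steps):
    # A worker reads a possibly stale snapshot of the model (bounded staleness).
    staleness = rng.integers(0, min(max_staleness, len(history)))
    view = history[-1 - staleness]
    # The view-vs-model gap is the quantity that must stay bounded for a
    # relaxed-consistency convergence argument of this kind to go through.
    max_gap = max(max_gap, float(np.linalg.norm(x - view)))
    # The gradient is computed on the stale view but applied to the true model.
    x = x - lr * grad(view)
    history.append(x.copy())

print(f"final loss   : {0.5 * x @ A @ x:.4e}")
print(f"max view gap : {max_gap:.4e}")
```

Swapping the stale-read rule for another relaxation (e.g. sparsified or delayed communication) changes only how the view is formed, not the bounded-gap requirement; this is the sense in which, per the abstract, system-specific details are decoupled from the convergence analysis.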
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
—
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Publication year
2021
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
Proceedings of the AAAI Conference on Artificial Intelligence
ISBN
978-1-57735-866-4
ISSN
2159-5399
e-ISSN
2374-3468
Number of pages
9
Pages from-to
9037-9045
Publisher name
Association for the Advancement of Artificial Intelligence (AAAI)
Place of publication
Palo Alto, California
Event location
Virtual
Event date
Feb 2, 2021
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
000681269800070