[Submitted on 7 Dec 2020 (v1), last revised 15 Jun 2021 (this version, v2)]

Title: Parallel Training of Deep Networks with Local Updates

Authors: Michael Laskin, Luke Metz, Seth Nabarro, Mark Saroufim, Badreddine Noune, Carlo Luschi, Jascha Sohl-Dickstein, Pieter Abbeel
Abstract: Deep learning models trained on large data sets have been widely successful in both vision and language domains. As state-of-the-art deep learning architectures have continued to grow in parameter count, so have the compute budgets and times required to train them, increasing the need for compute-efficient methods that parallelize training. Two common approaches to parallelize the training of deep networks have been data and model parallelism. While useful, data and model parallelism suffer from diminishing returns in terms of compute efficiency for large batch sizes. In this paper, we investigate how to continue scaling compute efficiently beyond the point of diminishing returns for large batches through local parallelism, a framework which parallelizes training of individual layers in deep networks by replacing global backpropagation with truncated layer-wise backpropagation. Local parallelism enables fully asynchronous layer-wise parallelism with a low memory footprint, and requires little communication overhead compared with model parallelism. We show results in both vision and language domains across a diverse set of architectures, and find that local parallelism is particularly effective in the high-compute regime.
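As a rough illustration of the idea summarized above, the sketch below shows layer-wise local updates in PyTorch: each block is trained against its own auxiliary loss, and activations are detached between blocks so that no global backward pass is needed. This is a minimal single-process sketch under assumed names (LocalBlock, aux_head, local_update_step are hypothetical), not the paper's asynchronous multi-device implementation.

import torch
import torch.nn as nn

class LocalBlock(nn.Module):
    """One block trained with its own local (auxiliary) objective."""
    def __init__(self, in_dim, out_dim, num_classes):
        super().__init__()
        self.layer = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.aux_head = nn.Linear(out_dim, num_classes)  # local classifier

    def forward(self, x):
        return self.layer(x)

blocks = [LocalBlock(784, 256, 10), LocalBlock(256, 256, 10), LocalBlock(256, 256, 10)]
optims = [torch.optim.SGD(b.parameters(), lr=0.1) for b in blocks]
criterion = nn.CrossEntropyLoss()

def local_update_step(x, y):
    """One training step: each block runs its own truncated backward pass."""
    h = x
    for block, opt in zip(blocks, optims):
        h = block(h)                                  # forward through this block only
        loss = criterion(block.aux_head(h), y)        # local loss for this block
        opt.zero_grad()
        loss.backward()                               # gradients stop at this block
        opt.step()
        h = h.detach()                                # cut the global gradient path
    return loss.item()

# usage: x = torch.randn(32, 784); y = torch.randint(0, 10, (32,)); local_update_step(x, y)

Because the blocks exchange only (detached) activations and never gradients, each block's update in this loop could in principle run on its own device as soon as its input arrives, which is the layer-wise parallelism the abstract describes.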
Comments: The first two authors (Michael Laskin and Luke Metz) contributed equally; order was determined by a coin flip.
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Neural and Evolutionary Computing (cs.NE)
Cite as: arXiv:2012.03837 [cs.LG]
  (or arXiv:2012.03837v2 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2012.03837
arXiv-issued DOI via DataCite

Submission history

From: Michael Laskin
[v1] Mon, 7 Dec 2020 16:38:45 UTC (2,666 KB)
[v2] Tue, 15 Jun 2021 14:50:45 UTC (2,666 KB)