Distributed coordinate descent method for learning with big data

Peter Richtárik, Martin Takáč

Research output: Contribution to journal › Article › peer-review

55 Scopus citations

Abstract

In this paper we develop and analyze Hydra: HYbriD cooRdinAte descent method for solving loss minimization problems with big data. We initially partition the coordinates (features) and assign each partition to a different node of a cluster. At every iteration, each node picks a random subset of the coordinates from those it owns, independently from the other computers, and in parallel computes and applies updates to the selected coordinates based on a simple closed-form formula. We give bounds on the number of iterations sufficient to approximately solve the problem with high probability, and show how it depends on the data and on the partitioning. We perform numerical experiments with a LASSO instance described by a 3TB matrix.
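The abstract describes a partition-then-sample scheme: coordinates are split across nodes, and each node repeatedly updates a random subset of its own coordinates via a closed-form formula. The following is a minimal single-process sketch of that scheme for the LASSO objective (1/2)‖Ax − b‖² + λ‖x‖₁, where the closed-form update is soft-thresholding. The function and parameter names (`hydra_sketch`, `n_nodes`, `tau`) are illustrative, not from the paper, and the sketch applies updates serially, omitting the parallelism-aware step-size correction that the paper's analysis provides.

```python
import numpy as np

def soft_threshold(rho, lam):
    # Closed-form solution of the one-dimensional LASSO subproblem.
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def hydra_sketch(A, b, lam, n_nodes=4, tau=2, n_iters=200, seed=0):
    """Serial simulation of a Hydra-style distributed coordinate descent.

    Coordinates (features) are statically partitioned across `n_nodes`
    simulated nodes; every iteration, each node picks `tau` of its own
    coordinates at random and updates them with the soft-thresholding
    formula. All names here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = np.zeros(d)
    # Static partition of the coordinate indices across the nodes.
    partitions = np.array_split(rng.permutation(d), n_nodes)
    col_sq = (A ** 2).sum(axis=0)      # ||A_j||^2 for each column j
    residual = b - A @ x               # maintained incrementally below
    for _ in range(n_iters):
        for part in partitions:        # in Hydra these loops run in parallel
            chosen = rng.choice(part, size=min(tau, len(part)), replace=False)
            for j in chosen:
                # rho_j = A_j^T (b - A x + A_j x_j)
                rho = A[:, j] @ residual + col_sq[j] * x[j]
                new_xj = soft_threshold(rho, lam) / col_sq[j]
                residual -= A[:, j] * (new_xj - x[j])
                x[j] = new_xj
    return x
```

In a real deployment each partition's inner loop would execute on its own cluster node, which is what makes the per-iteration work embarrassingly parallel across partitions.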

Original language: English (US)
Journal: Journal of Machine Learning Research
Volume: 17
State: Published - Feb 1 2016

Keywords

  • Boosting
  • Distributed algorithms
  • Parallel coordinate descent
  • Stochastic methods

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

