Scaling up machine learning: parallel and distributed approaches

Bibliographic Details
Other Authors: Bekkerman, Ron (Editor), Bilenko, Mikhail 1978- (Editor), Langford, John 1975- (Editor)
Format: Electronic eBook
Language: English
Published: Cambridge: Cambridge University Press, 2012
Online Access: BSB01; FHN01; Full text
Summary: This book presents an integrated collection of representative approaches for scaling up machine learning and data mining methods on parallel and distributed computing platforms. Demand for parallelizing learning algorithms is highly task-specific: in some settings it is driven by enormous dataset sizes, in others by model complexity or by real-time performance requirements. Making task-appropriate algorithm and platform choices for large-scale machine learning requires understanding the benefits, trade-offs, and constraints of the available options. Solutions presented in the book cover a range of parallelization platforms, from FPGAs and GPUs to multi-core systems and commodity clusters; concurrent programming frameworks, including CUDA, MPI, MapReduce, and DryadLINQ; and learning settings (supervised, unsupervised, semi-supervised, and online learning). Extensive coverage of the parallelization of boosted trees, SVMs, spectral clustering, belief propagation, and other popular learning algorithms, together with deep dives into several applications, makes the book equally useful for researchers, students, and practitioners.
Item Description: Title from publisher's bibliographic system (viewed on 05 Oct 2015)
Physical Description: 1 online resource (xvi, 475 pages)
ISBN: 9781139042918
DOI: 10.1017/CBO9781139042918
