Explore a conference talk from OSDI '14 that introduces a parameter server framework for distributed machine learning. Learn how the framework manages asynchronous data communication between nodes and supports flexible consistency models, elastic scalability, and continuous fault tolerance. Discover how this approach distributes both data and workloads over worker nodes while keeping globally shared parameters on server nodes. Examine experimental results demonstrating the framework's scalability on petabytes of real data with billions of examples and parameters, covering problems from sparse logistic regression to latent Dirichlet allocation and distributed sketching.
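To make the architecture concrete, below is a minimal Python sketch of the parameter-server pattern the talk describes: worker nodes hold data shards and asynchronously push gradient updates to a server node that holds the globally shared parameters. The `ParameterServer` class, the `push`/`pull` method names, and the toy least-squares task are illustrative assumptions for this sketch, not the framework's actual API.

```python
import threading
import random

class ParameterServer:
    """Holds the globally shared parameter vector; applies updates under a lock."""
    def __init__(self, dim):
        self.w = [0.0] * dim
        self.lock = threading.Lock()

    def pull(self):
        # Workers read a snapshot of the current model.
        with self.lock:
            return list(self.w)

    def push(self, grad, lr=0.1):
        # Updates from different workers may arrive in any order (asynchronous).
        with self.lock:
            for i, g in enumerate(grad):
                self.w[i] -= lr * g

def worker(server, shard, steps=500):
    """Each worker trains on its own data shard (toy least-squares objective)."""
    for _ in range(steps):
        w = server.pull()                       # fetch current parameters
        x, y = random.choice(shard)             # sample one local example
        err = sum(wi * xi for wi, xi in zip(w, x)) - y
        grad = [err * xi for xi in x]           # gradient of squared loss
        server.push(grad)                       # send update asynchronously

if __name__ == "__main__":
    random.seed(0)
    true_w = [2.0, -3.0]
    xs = [[random.random(), random.random()] for _ in range(400)]
    data = [(x, sum(t * xi for t, xi in zip(true_w, x))) for x in xs]

    server = ParameterServer(dim=2)
    shards = [data[i::4] for i in range(4)]     # partition data over 4 workers
    threads = [threading.Thread(target=worker, args=(server, s)) for s in shards]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("learned:", [round(v, 2) for v in server.w], "target:", true_w)
```

In the real system, workers tolerate stale snapshots rather than locking on every read; the framework's flexible consistency models (e.g., bounded delay) let users trade off how stale a pulled model may be against synchronization cost.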