Distributed & Stochastic Optimization for Machine Learning (Spring 2017)

  • Introduction [slides] (introductory material on SVMs)

  • Basics: convexity, duality [slides]

  • Setup instructions for Python: python3, numpy, numba, scikit-learn, IPython notebook / Jupyter.

  • Practical session 1: [GitHub link]

  • Gradient descent, proximal gradient descent, stochastic gradient (SG) [slides] (a proximal gradient sketch follows this list)

  • Incremental methods, coordinate ascent [slides]
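
The proximal gradient session above pairs naturally with a small worked example. Below is a minimal sketch of proximal gradient descent (ISTA) for the lasso, using only numpy from the setup list; the step size rule, regularization strength, and synthetic data are illustrative assumptions, not course material.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(X, y, lam, n_iters=500):
    """Minimize (1/2n) ||Xw - y||^2 + lam * ||w||_1 by proximal gradient descent."""
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n       # Lipschitz constant of the smooth part
    step = 1.0 / L
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n        # gradient of the smooth quadratic loss
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Illustrative usage on synthetic data (all values are assumptions).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(200)
w_hat = proximal_gradient_lasso(X, y, lam=0.1)
```

The 1/L step size comes from the Lipschitz constant of the smooth term; a backtracking line search or FISTA-style acceleration would be natural refinements.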

Project: Choose one of the projects below, or suggest one by email.

  • Implement the SDCA algorithm to estimate Support Vector Machines and the LASSO. Test the algorithm on datasets of your choice and compare it with a batch (primal) proximal gradient descent approach.

  • Implement COCOA on a single machine for logistic regression, and benchmark it against naive alternatives on a large dataset. (The algorithm can be distributed across several machines using Spark, but this is not required here: a single-machine skeleton running on multiple cores is enough.)

  • Implement and discuss the efficiency of three incremental gradient algorithms (SVRG, SAGA, MISO) and benchmark them against batch gradient descent on a sparse logistic regression problem (see the SVRG sketch below).
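
For the third project, here is a minimal single-machine sketch of SVRG for l2-regularized logistic regression. Dense numpy arrays are used for brevity; a real benchmark on a sparse problem would use scipy.sparse matrices. The step size, epoch length, regularization strength, and synthetic data are illustrative assumptions.

```python
import numpy as np

def logistic_grad(w, X, y, lam):
    """Full gradient of (1/n) sum_i log(1 + exp(-y_i x_i.w)) + (lam/2) ||w||^2."""
    coeff = -y / (1.0 + np.exp(y * (X @ w)))   # per-sample loss derivative
    return X.T @ coeff / X.shape[0] + lam * w

def svrg(X, y, lam, step=0.1, n_epochs=20):
    """SVRG: variance-reduced stochastic gradient with periodic full-gradient snapshots."""
    n, d = X.shape
    w = np.zeros(d)
    rng = np.random.default_rng(0)
    for _ in range(n_epochs):
        w_snap = w.copy()
        full_grad = logistic_grad(w_snap, X, y, lam)       # anchor gradient
        for i in rng.integers(n, size=n):                  # one inner pass
            xi, yi = X[i], y[i]
            gi = -yi * xi / (1.0 + np.exp(yi * (xi @ w))) + lam * w
            gi_snap = -yi * xi / (1.0 + np.exp(yi * (xi @ w_snap))) + lam * w_snap
            w = w - step * (gi - gi_snap + full_grad)      # variance-reduced step
    return w

# Illustrative usage on synthetic data (all values are assumptions).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 20))
y = np.sign(X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(1000))
w_hat = svrg(X, y, lam=1e-3)
```

The inner update subtracts the stochastic gradient evaluated at the snapshot and adds back the full gradient, so each step is unbiased with variance that shrinks as the iterates approach the optimum; this is what lets SVRG use a constant step size where plain SG needs a decreasing one.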