
A Modern Guide to Hyperparameter Optimization - presentation hosted by GSB.

Location: MSB Hörsaal 5701.EG.001, Boltzmannstr. 11

Date: Wed 18 December at 14:00

Speaker: Richard Liaw, UC Berkeley RISELab

Title: A Modern Guide to Hyperparameter Optimization

Abstract:
Modern deep learning model performance is very dependent on the choice of model hyperparameters, and the tuning process is a major bottleneck in the machine learning pipeline. The talk will first motivate the need for advancements in hyperparameter tuning methods. The talk will then overview standard methods for hyperparameter tuning: grid search, random search, and bayesian optimization. Then, we will motivate and discuss cutting edge methods for hyperparameter tuning: multi-fidelity bayesian optimization, successive halving algorithms (HyperBand), and population-based training. The talk will then present a overview of Tune, a scalable hyperparameter tuning system from the UC Berkeley RISELab, and demonstrate about how users can leverage cutting edge hyperparameter tuning methods implemented in Tune to quickly improve the performance of standard deep learning models. Tune is completely open source at tune.io.

Note: Please come with questions, especially if you have used Ray or Ray.Tune before!

Bio: Richard Liaw is a PhD student in the Computer Science Department at UC Berkeley, advised by Joseph Gonzalez, Ion Stoica, and Ken Goldberg. He is a member of the RISELab, the AUTOLAB, and the Berkeley AI Research Lab. He works primarily on Ray, a distributed execution engine (mainly for Python). Email: rliaw@berkeley.edu

This presentation is hosted by the Graduate School of Bioengineering. Please contact John LaMaster if you have any questions.