ROOM 1. AI-specific tools and techniques

Tutorial: Distributed HPO and Uncertainty

Room 1:

Monday September 18, 2023

02:00 - 17:30 hours

Instructor:

MARIA PANTOJA, California Polytechnic State University, USA

Program:

  • Brief organization of the tutorial
  • Introduction and motivation: Hyperparameter Optimization (HPO) and Uncertainty in Deep Learning. 1.5 hours (slides)
    • HPO tools
    • Applications
    • Uncertainty tools
    • How to measure Uncertainty
  • Hands-on Tutorial on HPO. 1.5 hours (Jupyter Notebook/Google Colab)
  • Hands-on Tutorial on Uncertainty Estimation for Image Classification. 1.5 hours (Jupyter Notebook/Google Colab)

 

Information

AI-specific tools and techniques: Distributed Hyperparameter Optimization and Uncertainty Evaluation for Deep Learning

 

Training and validation of Deep Learning (DL) models are very computationally intensive. Traditionally, researchers design and train a DL model by setting different values for its tunable configuration parameters (hyperparameters) and then using gradient descent to iteratively optimize the model's weights. Unfortunately, when configuring a neural network model for a machine learning application, there is often no clear-cut way to select good default values for the hyperparameters, including the structure of the model (number of layers, number of neurons per layer, type of each layer, and others). Another problem in DL is that a model classifies raw input data according to the patterns learned from its training set: most models assume that the input data distribution is identical between training and test time, but in reality it often is not. The objective of uncertainty estimation for DL is to provide not just a single prediction but a distribution over predictions, which can potentially be used to answer the question "Does the model know what it doesn't know?".
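To make the first task concrete, below is a minimal sketch of hyperparameter optimization via random search. The tutorial does not prescribe a particular tool or dataset, so scikit-learn's RandomizedSearchCV, an MLPClassifier, and a synthetic dataset are used here purely as illustrative assumptions, not as the material of the hands-on session.

# Minimal HPO sketch (illustrative assumptions: scikit-learn, synthetic data).
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier

# Small synthetic dataset so the example runs quickly on any laptop.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Search space: the structure of the model (layers, neurons per layer) and the
# optimizer settings are exactly the kind of hyperparameters discussed above.
param_distributions = {
    "hidden_layer_sizes": [(32,), (64,), (64, 32), (128, 64)],
    "learning_rate_init": [1e-4, 1e-3, 1e-2],
    "alpha": [1e-5, 1e-4, 1e-3],  # L2 regularization strength
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_distributions,
    n_iter=10,   # number of randomly sampled configurations
    cv=3,        # 3-fold cross-validation per configuration
    n_jobs=-1,   # evaluate configurations in parallel
    random_state=0,
)
search.fit(X_train, y_train)

print("best hyperparameters:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_test, y_test))

Because each sampled configuration is evaluated independently (here via n_jobs=-1), this kind of search parallelizes naturally, which is what makes distributing HPO across machines attractive.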

This tutorial presents an overview of two common tasks performed in DL: first, how to tune hyperparameters, and second, how to evaluate the uncertainty of the model. The two tasks may seem unrelated, but they can be used together to improve the robustness of a model efficiently.
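As an illustration of the second task, below is a minimal sketch of uncertainty estimation with a small ensemble of networks, where the predictive entropy of the averaged class probabilities flags inputs the model is unsure about. The dataset, library, and ensemble size are assumptions chosen for brevity; the hands-on session uses its own notebook on image classification.

# Minimal uncertainty sketch (illustrative assumptions: ensemble of sklearn MLPs).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Train several identical networks from different random initializations.
ensemble = [
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=seed).fit(X, y)
    for seed in range(5)
]

def predictive_distribution(x):
    # Average the members' class probabilities: the result is a distribution
    # over predictions rather than a single hard label.
    probs = np.stack([m.predict_proba(x) for m in ensemble])  # (members, samples, classes)
    return probs.mean(axis=0)

def predictive_entropy(x):
    # Higher entropy = the ensemble is less certain about this input.
    p = predictive_distribution(x)
    return -(p * np.log(p + 1e-12)).sum(axis=1)

# Compare an in-distribution input with an out-of-distribution (noise) input;
# the entropy is typically higher for inputs unlike anything seen in training.
print("entropy, in-distribution:", predictive_entropy(X[:1]))
print("entropy, random noise:   ", predictive_entropy(np.random.randn(1, 20) * 10))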

 

Students' prerequisites

Audience: the intended audience is intermediate machine learning students. If attendees need a basic tutorial on deep learning, the first two classes of MIT 6.S191 are a great resource (http://introtodeeplearning.com/), but any other basic tutorial will suffice. The tutorial can be delivered in English or Spanish (one of the TAs also speaks Portuguese), depending on the audience's preference.

  • Special conditions for accessing the tutorial: the rooms are not equipped with computers, only Internet access, so students must use their own equipment.

Attendees will require a laptop computer that can view and run Jupyter notebooks, specifically Google Colab (google.colab.com). Tablets can sometimes run Colab pages, but it is definitely tricky to do so on a smartphone. No software installation is needed: everything will be in the notebook.


 

Instructor(s):
Maria Pantoja