Exercises

This page lists the exercises in Machine Learning Crash Course.

Most of the Programming Exercises use the California housing dataset.

In March 2020, this course began using Programming Exercises coded with tf.keras. If you'd prefer the legacy Estimators Programming Exercises, you can find them on GitHub.

Framing

Descending into ML

Reducing Loss

First Steps with TensorFlow

Training and Test Sets

Validation

Feature Crosses

Regularization for Simplicity

Classification

Regularization for Sparsity

Intro to Neural Nets

Training Neural Nets

Multi-Class Neural Nets

Fairness

Static vs. Dynamic Training

Static vs. Dynamic Inference

Data Dependencies
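As a rough illustration of what the earliest exercises above ("Descending into ML" and "Reducing Loss") cover, the sketch below fits a linear model by batch gradient descent on mean squared error. It is a minimal plain-Python example on tiny synthetic data, not the actual course notebook, which uses tf.keras and the California housing dataset.

```python
# Minimal sketch of the idea behind the "Descending into ML" and
# "Reducing Loss" exercises: fit y = w*x + b by gradient descent
# on mean squared error. Synthetic data, not the housing dataset.

def fit_linear(xs, ys, lr=0.01, steps=2000):
    """Fit w and b minimizing MSE via batch gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from y = 3x + 1, so the fit should recover w ≈ 3, b ≈ 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))  # w ≈ 3, b ≈ 1
```

The real exercises build the same loop with a tf.keras model and optimizer instead of hand-written gradients, but the loss-minimization idea is identical.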

The exercises above can also be filtered by type: Programming, Check Your Understanding, and Playground.