Discussions
You can download the discussion materials here. We will try to upload each discussion's materials before its session, so links for future discussions below may not work yet.
Discussion 1
tl;dr: We will learn to use PyTorch, the framework we will use to implement deep neural networks in this course.
Notebook: 00_pytorch.ipynb
Suggested Readings: -
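As a preview of the PyTorch basics the notebook covers, here is a minimal, illustrative sketch of tensor creation and manipulation (not code from the notebook itself):

```python
import torch

# Create a 2x3 tensor and inspect its shape and dtype.
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)

# Elementwise and reduction operations mirror NumPy.
y = x * 2.0
total = y.sum()        # a scalar tensor

# Convert a scalar tensor back to a Python number.
value = total.item()   # 2 * (0+1+2+3+4+5) = 30.0
```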
Discussion 2
tl;dr: We will use PyTorch's automatic differentiation to train some simple models.
Notebook: 02_gradient_descent.ipynb
Suggested Readings: -
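The core idea of the notebook, using autograd to drive gradient descent, can be sketched as follows (a toy one-parameter model with an arbitrary learning rate, chosen for illustration):

```python
import torch

# Fit w in y = w * x by gradient descent, using autograd for the gradients.
x = torch.linspace(-1, 1, 20)
y_true = 3.0 * x

w = torch.zeros(1, requires_grad=True)
lr = 0.5
for _ in range(100):
    loss = ((w * x - y_true) ** 2).mean()
    loss.backward()              # populates w.grad
    with torch.no_grad():
        w -= lr * w.grad         # gradient step on the leaf tensor
    w.grad.zero_()               # clear the gradient for the next step
```

After training, `w` recovers the true slope of 3.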
Discussion 3
tl;dr: We will practice running notebooks on the shared compute cluster.
Notebook: 03_scc.ipynb
Suggested Readings: -
Discussion 4
tl;dr: We will build deep neural networks and compare training them with good and bad initialization strategies.
Notebook: 04_neural_network.ipynb
Suggested Readings: -
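The good-vs-bad initialization comparison can be previewed with a sketch like the one below: a deep ReLU MLP whose activations collapse toward zero under a too-small initialization but keep their scale under a Kaiming-style one (the layer sizes and depth are arbitrary choices for the example):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_mlp(init_std):
    # A deep stack of Linear+ReLU layers with a fixed-std normal init.
    layers = []
    for _ in range(10):
        lin = nn.Linear(256, 256)
        nn.init.normal_(lin.weight, std=init_std)
        nn.init.zeros_(lin.bias)
        layers += [lin, nn.ReLU()]
    return nn.Sequential(*layers)

x = torch.randn(32, 256)
small = make_mlp(0.01)(x)                  # activations shrink toward zero
kaiming = make_mlp((2.0 / 256) ** 0.5)(x)  # std = sqrt(2/fan_in) keeps scale
```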
Discussion 5
tl;dr: We will compare deep convolutional networks with and without residual connections.
Notebook: 05_deep.ipynb
Suggested Readings: TBD
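A residual connection can be sketched as follows: the block computes x + F(x), so gradients can flow through the identity path even in very deep stacks (channel counts and kernel sizes here are illustrative, not the notebook's):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    # A minimal residual block: output = ReLU(x + F(x)).
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))

x = torch.randn(1, 8, 16, 16)
out = ResidualBlock(8)(x)   # same shape in and out
```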
Discussion 6
tl;dr: We will test regularization strategies and measure their impact on our models.
Notebook: 06_regularization.ipynb
Suggested Readings: TBD
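Two common regularization strategies of the kind the notebook tests, dropout and weight decay, can be sketched in a few lines (the model and hyperparameters are arbitrary examples):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Dropout regularizes by randomly zeroing activations during training;
# weight decay (L2 regularization) is applied through the optimizer.
model = nn.Sequential(
    nn.Linear(10, 50), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(50, 1)
)
opt = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)

x = torch.randn(4, 10)
model.train()            # dropout active: outputs vary from run to run
train_out = model(x)
model.eval()             # dropout disabled: outputs are deterministic
eval_out = model(x)
```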
Discussion 7
tl;dr: We will train LSTM models on simple string manipulation tasks.
Notebook: 07_rnn.ipynb
Suggested Readings: TBD
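Feeding a string to an LSTM, as the notebook's tasks require, can be sketched by one-hot encoding characters and running `nn.LSTM` over the sequence (the alphabet and hidden size are illustrative choices):

```python
import torch
import torch.nn as nn

# Encode a string as a sequence of one-hot character vectors.
alphabet = "abcdefghijklmnopqrstuvwxyz"

def one_hot(s):
    idx = torch.tensor([alphabet.index(c) for c in s])
    return nn.functional.one_hot(idx, len(alphabet)).float()

lstm = nn.LSTM(input_size=26, hidden_size=32, batch_first=True)
x = one_hot("hello").unsqueeze(0)   # shape (1, 5, 26): batch, time, features
out, (h, c) = lstm(x)               # out: per-step hidden states (1, 5, 32)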
Discussion 8
tl;dr: We will train attention-based models on simple string manipulation tasks.
Notebook: 08_attention.ipynb
Suggested Readings: TBD
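The core operation behind attention-based models, scaled dot-product attention, can be sketched from scratch (the batch and dimension sizes are arbitrary):

```python
import math
import torch

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = torch.softmax(scores, dim=-1)   # distribution over keys
    return weights @ v, weights

q = torch.randn(1, 4, 8)   # (batch, query positions, dim)
k = torch.randn(1, 6, 8)   # (batch, key positions, dim)
v = torch.randn(1, 6, 8)
out, w = attention(q, k, v)
# Each row of w is a probability distribution over the 6 key positions.
```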
Discussion 9
tl;dr: We will walk through a basic training workflow to train a small LLM.
Notebook: 09_llms.ipynb
Suggested Readings: TBD
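The essential shape of an LLM training loop, next-token prediction with cross-entropy, can be sketched with a toy character-level model (the bigram architecture and hyperparameters are illustrative, far smaller than anything in the notebook):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
text = "hello world"
vocab = sorted(set(text))
stoi = {c: i for i, c in enumerate(vocab)}
data = torch.tensor([stoi[c] for c in text])

# A tiny next-character model: embedding -> linear over the vocabulary.
model = nn.Sequential(nn.Embedding(len(vocab), 16), nn.Linear(16, len(vocab)))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

# Inputs are characters, targets are the characters that follow them.
inputs, targets = data[:-1], data[1:]
with torch.no_grad():
    start = nn.functional.cross_entropy(model(inputs), targets).item()

for _ in range(50):
    logits = model(inputs)                          # (seq_len, vocab)
    loss = nn.functional.cross_entropy(logits, targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
final = loss.item()   # training reduces the next-character loss
```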
Discussion 10
tl;dr: We will construct adversarial examples and train a small GAN.
Notebook: 10_adversarial.ipynb
Suggested Readings: TBD
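One standard way to construct an adversarial example, the fast gradient sign method (FGSM), can be sketched on a toy classifier (the linear model and epsilon are arbitrary choices for the example):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# FGSM perturbs the input along the sign of the loss gradient,
# increasing the loss with a bounded per-coordinate change.
model = nn.Linear(10, 2)
x = torch.randn(1, 10, requires_grad=True)
label = torch.tensor([0])

loss = nn.functional.cross_entropy(model(x), label)
loss.backward()                    # populates x.grad

eps = 0.1
x_adv = x + eps * x.grad.sign()    # adversarial input within an L-inf ball
```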