Discussions
You can download the discussions here. We will try to upload discussion material before each corresponding session, so links for discussions that have not yet taken place may be broken.
Discussion 1
tl;dr: We will learn to use PyTorch, the framework we will be using for implementing deep neural networks.
Notebook: discussion_1.ipynb
Suggested Readings: TBD
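The notebook itself isn't reproduced here, but a minimal sketch of the kind of tensor basics it introduces (the specific values and shapes below are illustrative, not taken from the course materials):

```python
import torch

# Create tensors and do basic elementwise and matrix operations.
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)
y = torch.ones(2, 3)

z = x + y        # elementwise addition, shape (2, 3)
m = x @ y.T      # matrix multiplication, shape (2, 2)
print(z.shape, m.shape)
```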
Discussion 2
tl;dr: We will use PyTorch's automatic differentiation to fit some simple models.
Notebook: discussion_2.ipynb
Suggested Readings: TBD
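As a taste of the idea (a sketch only; the notebook's actual models and data will differ), autograd lets you fit a model by computing gradients of a loss without deriving them by hand:

```python
import torch

# Fit y = w*x + b to synthetic data with autograd and plain gradient descent.
torch.manual_seed(0)
x = torch.linspace(-1, 1, 100)
y = 3.0 * x + 0.5 + 0.05 * torch.randn(100)   # true w=3.0, b=0.5, plus noise

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1
for _ in range(500):
    loss = ((w * x + b - y) ** 2).mean()
    loss.backward()                 # autograd fills in w.grad and b.grad
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
print(w.item(), b.item())           # close to the true 3.0 and 0.5
```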
Discussion 3
tl;dr: We will practice running notebooks on the shared compute cluster.
Notebook: discussion_3.ipynb
Suggested Readings: TBD
Discussion 4
tl;dr: We will build deep neural networks and compare training them with good and bad initialization strategies.
Notebook: discussion_4.ipynb
Suggested Readings: TBD
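A small sketch of why initialization matters (illustrative numbers only, not the notebook's experiment): with weights drawn at the wrong scale, activations in a deep stack shrink toward zero, while a roughly Xavier-scaled initialization keeps them alive:

```python
import torch

# Push activations through a deep tanh stack and watch their scale.
def final_activation_std(init_std, depth=20, width=512):
    torch.manual_seed(0)
    x = torch.randn(256, width)
    for _ in range(depth):
        w = torch.randn(width, width) * init_std
        x = torch.tanh(x @ w)
    return x.std().item()

vanishing = final_activation_std(0.01)              # too small: activations die out
healthy = final_activation_std((1 / 512) ** 0.5)    # ~Xavier scale: stays moderate
print(vanishing, healthy)
```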
Discussion 5
tl;dr: We will compare deep convolutional networks with and without residual connections.
Notebook: discussion_5.ipynb
Suggested Readings: TBD
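The core effect can be sketched in a few lines (a toy setup, not the notebook's architecture): with an identity skip path, gradients still reach the input of a very deep stack, whereas a plain stack of the same depth lets them vanish:

```python
import torch

# Compare gradient flow through a deep stack with and without skip connections.
def input_grad_norm(residual):
    torch.manual_seed(0)
    x = torch.randn(32, 64, requires_grad=True)
    h = x
    for _ in range(50):
        w = torch.randn(64, 64) * 0.01         # deliberately small weights
        out = torch.tanh(h @ w)
        h = h + out if residual else out       # skip connection or not
    h.sum().backward()
    return x.grad.norm().item()

plain = input_grad_norm(False)   # gradient vanishes through 50 layers
skip = input_grad_norm(True)     # identity path keeps the gradient alive
print(plain, skip)
```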
Discussion 6
tl;dr: We will test regularization strategies and measure their impact on our models.
Notebook: discussion_6.ipynb
Suggested Readings: TBD
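One of the regularizers you are likely to meet is dropout; a minimal sketch of its train/eval behavior (not the notebook's experiment):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
drop = nn.Dropout(p=0.5)
x = torch.ones(1000)

drop.train()          # training mode: zeroes ~half the units, rescales by 1/(1-p)
y_train = drop(x)
drop.eval()           # eval mode: identity, no randomness
y_eval = drop(x)

print((y_train == 0).float().mean().item())   # fraction of dropped units, near 0.5
print(torch.equal(y_eval, x))                 # True
```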
Discussion 7
tl;dr: We will train LSTM models on simple string manipulation tasks.
Notebook: discussion_7.ipynb
Suggested Readings: TBD
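Before training anything, it helps to see the plumbing: a string becomes integer tokens, tokens become embeddings, and an LSTM produces a hidden state per step. A shape-only sketch (the vocabulary and sizes are illustrative):

```python
import torch
import torch.nn as nn

# Encode a string as integer tokens and run it through an LSTM.
vocab = sorted(set("hello world"))
stoi = {ch: i for i, ch in enumerate(vocab)}
tokens = torch.tensor([stoi[ch] for ch in "hello world"])   # (seq_len,)

emb = nn.Embedding(len(vocab), 16)
lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
head = nn.Linear(32, len(vocab))          # per-step next-character logits

x = emb(tokens).unsqueeze(0)              # (1, seq_len, 16)
out, (h, c) = lstm(x)                     # out: (1, seq_len, 32)
logits = head(out)                        # (1, seq_len, vocab_size)
print(out.shape, logits.shape)
```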
Discussion 8
tl;dr: We will train attention-based models on simple string manipulation tasks.
Notebook: discussion_8.ipynb
Suggested Readings: TBD
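The building block behind these models is scaled dot-product attention; a minimal from-scratch sketch (random tensors stand in for real queries, keys, and values):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # attn_ij = softmax over j of (q_i . k_j) / sqrt(d)
    d = q.shape[-1]
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    attn = F.softmax(scores, dim=-1)
    return attn @ v, attn

torch.manual_seed(0)
q = torch.randn(1, 5, 8)    # (batch, seq, dim)
k = torch.randn(1, 5, 8)
v = torch.randn(1, 5, 8)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape)            # (1, 5, 8)
print(attn.sum(-1))         # each row of attention weights sums to 1
```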
Discussion 9
tl;dr: We will walk through a basic training workflow to train a small language model.
Notebook: discussion_9.ipynb
Suggested Readings: TBD
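At its core, the workflow is next-token prediction with cross-entropy. A deliberately tiny sketch (the model and corpus are toys, nothing like the notebook's actual setup):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# The core of a (tiny) language-model training loop: predict each next token.
text = "hello world hello world "
vocab = sorted(set(text))
stoi = {c: i for i, c in enumerate(vocab)}
data = torch.tensor([stoi[c] for c in text])

torch.manual_seed(0)
model = nn.Sequential(nn.Embedding(len(vocab), 32), nn.Linear(32, len(vocab)))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(200):
    x, y = data[:-1], data[1:]        # inputs and shifted next-token targets
    logits = model(x)                 # (seq, vocab_size)
    loss = F.cross_entropy(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(loss.item())   # well below the uniform baseline of log(vocab_size)
```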
Discussion 10
tl;dr: We will construct adversarial examples and train a small GAN.
Notebook: discussion_10.ipynb
Suggested Readings: TBD
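One classic way to construct an adversarial example is the fast gradient sign method (FGSM): perturb the input in the direction that increases the loss. A minimal sketch on a toy linear "model" (all numbers illustrative):

```python
import torch
import torch.nn.functional as F

# FGSM: nudge the input one step in the sign of the loss gradient.
torch.manual_seed(0)
w = torch.randn(10, 5)                  # a fixed toy "model": logits = x @ w
x = torch.randn(1, 10)
target = torch.tensor([0])

x.requires_grad_(True)
loss = F.cross_entropy(x @ w, target)
loss.backward()
x_adv = x + 0.5 * x.grad.sign()         # one FGSM step with epsilon = 0.5

adv_loss = F.cross_entropy(x_adv @ w, target)
print(loss.item(), adv_loss.item())     # the adversarial loss is higher
```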