Applied Deep Learning with PyTorch
Go to main | Course Page
Section 1: Introduction to PyTorch
- Tensors
- Dataset Class and Dataloader
- Autograd
- Optimizer
- Model Class
- Saving Models
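The topics above (tensors, autograd, the optimizer, and saving) can be sketched in a few lines. This is a minimal illustration, not the course's actual code; the file name `checkpoint.pt` is arbitrary:

```python
import torch

# Tensors: the basic data structure
x = torch.tensor([[1.0, 2.0], [3.0, 4.0]])

# Autograd: track operations and compute gradients automatically
w = torch.tensor(2.0, requires_grad=True)
y = (w * x).sum()
y.backward()           # dy/dw = sum of x = 10.0

# Optimizer: update parameters using the stored gradients
opt = torch.optim.SGD([w], lr=0.1)
opt.step()             # w <- 2.0 - 0.1 * 10.0 = 1.0
opt.zero_grad()

# Saving: persist tensors / state dicts to disk
torch.save({"w": w.detach()}, "checkpoint.pt")
```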
Section 2: Setup and Helper Functions for PyTorch (Code Demo)
- Setup Environment
- Data for Classification and Regression

- Helper Class for DataLoader
- Helper Functions for Classification and Regression
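A DataLoader helper of the kind this section describes typically wraps raw arrays in a `Dataset`. The class below is a hypothetical sketch (the course's own helper may differ); note the target dtype switch between classification (`long`) and regression (`float32`):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ArrayDataset(Dataset):
    """Wrap feature/target arrays so a DataLoader can batch them."""
    def __init__(self, X, y, task="classification"):
        self.X = torch.as_tensor(X, dtype=torch.float32)
        # classification targets must be integer class indices;
        # regression targets stay floating point
        dtype = torch.long if task == "classification" else torch.float32
        self.y = torch.as_tensor(y, dtype=dtype)

    def __len__(self):
        return len(self.X)

    def __getitem__(self, idx):
        return self.X[idx], self.y[idx]

# Usage: 100 samples, 4 features, 3 classes
X = torch.randn(100, 4)
y = torch.randint(0, 3, (100,))
loader = DataLoader(ArrayDataset(X, y), batch_size=32, shuffle=True)
xb, yb = next(iter(loader))
```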
Section 3: Logistic Regression in PyTorch (Code Demo)
- Initialize Dataloader
- Define Model Architecture and Initialize
- Loss and Optimizer
- Model Training
- Model Evaluation
- Save Model
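The full pipeline this section walks through (model definition, loss and optimizer, training, evaluation, saving) fits in one short sketch. The toy data and hyperparameters below are illustrative assumptions, not the course's:

```python
import torch
import torch.nn as nn

class LogisticRegression(nn.Module):
    def __init__(self, n_features, n_classes):
        super().__init__()
        self.linear = nn.Linear(n_features, n_classes)

    def forward(self, x):
        return self.linear(x)  # raw logits; CrossEntropyLoss applies softmax

torch.manual_seed(0)
X = torch.randn(200, 4)
y = (X[:, 0] > 0).long()       # toy linearly separable labels

model = LogisticRegression(n_features=4, n_classes=2)
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.5)

# Training: full-batch gradient descent for simplicity
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Evaluation and saving
acc = (model(X).argmax(dim=1) == y).float().mean()
torch.save(model.state_dict(), "logreg.pt")
```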
Section 4: 2-Layer Neural Network for Classification in PyTorch (Code Demo)
- Initialize Dataloader
- Define Model Architecture and Initialize
- Loss, Optimizer, and Model Training
Section 5: 5-Layer Neural Network for Regression (Week 1 Assignment) in PyTorch (Code Demo)
- Initialize Dataloader
- Define Model Architecture and Initialize
- Loss and Optimizer
- Model Training
- Model Evaluation and Save Model
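A 5-layer regression network of the kind this assignment covers can be sketched with `nn.Sequential`. The layer sizes, toy target, and optimizer settings here are assumptions for illustration only:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical 5-layer MLP for regression (layer widths are illustrative)
model = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 8), nn.ReLU(),
    nn.Linear(8, 1),               # single continuous output
)

X = torch.randn(256, 4)
y = X.sum(dim=1, keepdim=True)     # toy regression target

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Save the trained weights
torch.save(model.state_dict(), "mlp_regression.pt")
```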
Section 6: Regularization
- Introduction to Regularization
- L1 and L2 Regularization
- Why Regularization Works
- Dropout Regularization
- Inverted Dropout
- Early Stopping
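Inverted dropout, listed above, zeroes each unit with probability p during training and rescales the survivors by 1/(1-p), so the expected activation is unchanged and no rescaling is needed at test time. A minimal sketch of that idea (the function name is mine, not a course API):

```python
import torch

def inverted_dropout(x, p=0.5, training=True):
    """Zero each unit with probability p and rescale survivors by 1/(1-p),
    so E[output] == x and the network needs no scaling at test time."""
    if not training or p == 0.0:
        return x
    mask = (torch.rand_like(x) > p).float()
    return x * mask / (1.0 - p)

x = torch.ones(100000)
out = inverted_dropout(x, p=0.3)   # mean stays ~1.0 despite 30% of units dropped
```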
Section 7: Regularization in PyTorch (Code Demo)
- Batch Normalization
- Regularization - Dropout
- L1 Regularization
- L2 Regularization
- ElasticNet Regularization
- Regularization - Early Stopping
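The techniques above map onto PyTorch as follows: batch normalization and dropout are layers, L2 is usually the optimizer's `weight_decay`, and L1 (or ElasticNet, L1 plus L2) is added to the loss by hand. A minimal sketch, with an illustrative architecture and penalty strengths:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 16),
    nn.BatchNorm1d(16),   # batch normalization layer
    nn.ReLU(),
    nn.Dropout(p=0.5),    # dropout (PyTorch uses inverted dropout internally)
    nn.Linear(16, 2),
)

# L2 regularization: built into the optimizer via weight_decay
opt = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

# L1 penalty added to the loss by hand; combining it with weight_decay
# above gives an ElasticNet-style penalty
def l1_penalty(model, lam=1e-4):
    return lam * sum(p.abs().sum() for p in model.parameters())

x = torch.randn(8, 4)
y = torch.randint(0, 2, (8,))
loss = nn.CrossEntropyLoss()(model(x), y) + l1_penalty(model)
loss.backward()
```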
Section 8: Optimization
- Stochastic Gradient Descent
- Stochastic Gradient Descent With Momentum
- Adaptive Gradient Algorithm (AdaGrad)
- Root Mean Squared Propagation (RMSProp)
- Adaptive Moment Estimation (Adam)
- How to Choose - Checklist
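All of the optimizers above are available in `torch.optim`. The sketch below runs each on the same toy quadratic so their updates can be compared; the learning rates are illustrative choices, not recommendations:

```python
import torch

def minimize(opt_factory, steps=200):
    """Minimize f(w) = (w - 3)^2 starting from w = 0."""
    w = torch.zeros(1, requires_grad=True)
    opt = opt_factory([w])
    for _ in range(steps):
        opt.zero_grad()
        loss = (w - 3.0).pow(2).sum()
        loss.backward()
        opt.step()
    return w.item()

results = {
    "SGD":      minimize(lambda p: torch.optim.SGD(p, lr=0.1)),
    "Momentum": minimize(lambda p: torch.optim.SGD(p, lr=0.1, momentum=0.9)),
    "AdaGrad":  minimize(lambda p: torch.optim.Adagrad(p, lr=0.5)),
    "RMSProp":  minimize(lambda p: torch.optim.RMSprop(p, lr=0.05)),
    "Adam":     minimize(lambda p: torch.optim.Adam(p, lr=0.1)),
}
# each value should approach the minimizer w = 3.0
```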
Section 9: Optimization in PyTorch (Code Demo)
- Code Implementation of Optimization