NYU Deep Learning Week 2 – Lecture: Stochastic Gradient Descent and Backpropagation
The video was published under a Creative Commons Attribution license (reuse allowed) and is reposted for educational purposes. Source: https://youtu.be/d9vdh3b787Y Course website: http://bit.ly/pDL-home
0:00:00 – Week 2 – Lecture
LECTURE Part A: http://bit.ly/pDL-en-02-1 We start by understanding what parameterized models are and then discuss what a loss function is. We then look at gradient-based methods and how they are used in the backpropagation algorithm of a traditional neural network. We conclude this section by learning how to implement a neural network in PyTorch, followed by a discussion of a more generalized form of backpropagation (a minimal sketch follows below).
0:00:29 – Gradient Descent Optimization Algorithm
0:17:16 – Advantages of SGD, Backpropagation for Traditional Neural Nets
0:38:08 – PyTorch Implementation of a Neural Network and a Generalized Backprop Algorithm
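As a companion to Part A, here is a minimal sketch of what a PyTorch network with an SGD training loop might look like. The layer sizes, learning rate, and dummy data below are illustrative assumptions, not details taken from the lecture:

```python
import torch
import torch.nn as nn

# A small fully connected network: input -> hidden -> output.
# Sizes and hyperparameters are illustrative, not from the lecture.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Dummy batch: 64 random inputs with random class labels.
x = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))

for step in range(100):
    optimizer.zero_grad()        # clear accumulated gradients
    loss = loss_fn(model(x), y)  # forward pass + loss
    loss.backward()              # backpropagation: compute gradients
    optimizer.step()             # SGD update of the parameters
```

In true stochastic gradient descent, each update would use a single sample (or a small mini-batch drawn from a dataset) rather than the same fixed batch; the fixed batch here just keeps the sketch self-contained.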
LECTURE Part B: http://bit.ly/pDL-en-02-2 We begin with a concrete example of backpropagation and discuss the dimensions of Jacobian matrices. We then look at various basic neural net modules and compute their gradients, followed by a brief discussion of softmax and logsoftmax. The other topic of discussion in this part is practical tricks for backpropagation (a short log-softmax sketch follows below).
0:49:49 – Basic Modules – LogSoftMax
1:05:53 – Practical Tricks for Backpropagation
1:21:31 – Computing Gradients for NN Modules and Practical Tricks for Backpropagation
Source: https://www.youtube.com/watch?v=y-4GOWLns_I
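To accompany Part B, a small sketch of log-softmax as a differentiable module, using the standard max-subtraction trick for numerical stability (the exact presentation in the lecture may differ; this is one common formulation):

```python
import torch

def log_softmax(x, dim=-1):
    # Numerically stable log-softmax: subtracting the row max before
    # exponentiating keeps exp() from overflowing; softmax = exp(log_softmax).
    shifted = x - x.max(dim=dim, keepdim=True).values
    return shifted - shifted.exp().sum(dim=dim, keepdim=True).log()

x = torch.randn(4, 5, requires_grad=True)
out = log_softmax(x)

# Agrees with PyTorch's built-in implementation.
print(torch.allclose(out, torch.log_softmax(x, dim=-1)))  # True

# Backpropagate a scalar through it. For each row, the gradient of
# sum_j log_softmax(x)_j with respect to x is 1 - n * softmax(x),
# where n is the row length (here 5).
out.sum().backward()
print(x.grad.shape)  # torch.Size([4, 5])
```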