NYU Deep Learning Week 11– Lecture: PyTorch activation and loss functions
The video was published under the Creative Commons Attribution license (reuse allowed) and is reposted for educational purposes.
Source: https://youtu.be/bj1fh3BvqSU
Course website: http://bit.ly/pDL-home
0:00:00 – Week 11 – Lecture
LECTURE Part A: In this section, we discussed the common activation functions in PyTorch. In particular, we compared activations with kinks versus smooth activations: the former are preferred in deep neural networks, because the latter can suffer from the vanishing gradient problem. We then learned about the common loss functions in PyTorch.
0:00:15 – Activation Functions
0:14:21 – Q&A on activations
0:33:10 – Loss Functions (until AdaptiveLogSoftMax)
LECTURE Part B: In this section, we continued with loss functions, in particular margin-based losses and their applications. We then discussed how to design a good loss function for EBMs, along with examples of well-known EBM loss functions. We paid particular attention to margin-based loss functions and explained the idea of the "most offending incorrect answer".
0:53:27 – Loss Functions (until CosineEmbeddingLoss)
1:08:23 – Loss Functions and Loss Functions for Energy-Based Models
1:23:18 – Loss Functions for Energy-Based Models
...
https://www.youtube.com/watch?v=W3dh1DOKS5g
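To make the Part A point concrete, here is a minimal PyTorch sketch (not from the lecture itself) contrasting a kinked activation (ReLU) with a smooth, saturating one (sigmoid). Far from the origin the sigmoid's gradient nearly vanishes, while ReLU passes the gradient through unchanged:

```python
import torch

# Probe both activations at a point far from the origin.
x = torch.tensor([5.0], requires_grad=True)

# ReLU has a kink at 0 but slope 1 everywhere on the positive side.
torch.relu(x).backward()
relu_grad = x.grad.item()        # exactly 1.0

# Sigmoid is smooth but saturates: its derivative s(x)*(1 - s(x))
# shrinks toward 0 for large |x|, causing vanishing gradients in
# deep stacks of such activations.
x.grad = None
torch.sigmoid(x).backward()
sigmoid_grad = x.grad.item()     # ~0.0066

print(relu_grad, sigmoid_grad)
```

Stacking many saturating activations multiplies these tiny factors, which is why the lecture prefers kinked activations in deep networks.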
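For Part B, the "most offending incorrect answer" idea can be sketched as a simple margin (hinge) loss for an EBM. This is an illustrative toy implementation, not code from the lecture; the function name and the energy values are made up. It pushes the correct answer's energy down and the lowest-energy wrong answer's energy up until they are separated by a margin:

```python
import torch

def ebm_hinge_loss(energies, correct_idx, margin=1.0):
    """Hinge loss over a vector of per-answer energies (hypothetical helper).

    The "most offending incorrect answer" is the wrong answer the model
    assigns the LOWEST energy to, i.e. the hardest negative.
    """
    e_correct = energies[correct_idx]
    # Mask out the correct answer, then find the most offending one.
    mask = torch.ones_like(energies, dtype=torch.bool)
    mask[correct_idx] = False
    e_offender = energies[mask].min()
    # Loss is zero once e_offender exceeds e_correct by at least `margin`.
    return torch.clamp(e_correct - e_offender + margin, min=0.0)

energies = torch.tensor([0.2, 1.5, 0.4, 2.0])  # toy energies per answer
loss = ebm_hinge_loss(energies, correct_idx=0)
print(loss.item())  # 0.8 = max(0, 0.2 - 0.4 + 1.0)
```

Only the correct answer and the single hardest negative contribute a gradient, which is the characteristic behavior of margin-based EBM losses discussed in the lecture.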