- Imperative Verbs Overload in ChatGPT? Try This Possible Fix! (Maple Grove Productions, 19:46, shared 24/11/2023)
- XLNet: Generalized Autoregressive Pretraining for Language Understanding (Yannic Kilcher, 30:05, shared 03/07/2019)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Yannic Kilcher, 40:12, shared 30/01/2019)
- Transformer Architecture Explained | Attention Is All You Need | Foundation of BERT, GPT-3, RoBERTa (Deep Learning Explainer, 55:15, shared 07/09/2020)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Paper Explained) (Deep Learning Explainer, 50:21, shared 19/10/2020)
- Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning (Paper Explained) (Deep Learning Explainer, 26:55, shared 04/08/2020)
- Revealing Dark Secrets of BERT (Analysis of BERT's Attention Heads) - Paper Explained (Deep Learning Explainer, 48:04, shared 28/06/2020)
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (paper explained) (Deep Learning Explainer, 53:59, shared 15/03/2020)
- GPT-3's New AI is a NIMBY! NIMBY Conversation Simulator 2020 #YIMBY #HousingForAll (The Fourth Industrial Revolution, 28:55, shared 25/07/2020)
- How far can we scale up? Deep Learning's Diminishing Returns (Article Review) (Yannic Kilcher, 20:26, shared 02/10/2021)