Intro to Dense Vectors for NLP and Vision
James Briggs
There is perhaps no greater contributor to the success of modern Natural Language Processing (NLP) technology than vector representations of language. The meteoric rise of NLP in the early 2010s was ignited by the introduction of word2vec by a team led by Tomáš Mikolov in 2013.
Word2vec is one of the earliest and most iconic examples of dense vector representations of text. But since the days of word2vec, methods for representing language have advanced at ludicrous speed.
This video explores why we use dense vectors - and some of the best approaches to building dense vectors available today.
🌲 Pinecone article: https://www.pinecone.io/learn/dense-vector-embeddings-nlp/
🤖 70% Discount on the NLP With Transformers in Python course: https://bit.ly/3DFvvY5
🎉 Subscribe for Article and Video Updates! https://jamescalam.medium.com/subscribe https://medium.com/@jamescalam/membership
👾 Discord: https://discord.gg/c5QtDB9RAP
00:00 Intro
01:50 Why Dense Vectors?
03:55 Word2vec and Representing Meaning
08:40 Sentence Transformers
09:58 Sentence Transformers in Python
15:08 Question-Answering
18:18 DPR in Python
29:55 Vision Transformers
33:22 OpenAI's CLIP in Python
42:49 Review and What's Next
...
https://www.youtube.com/watch?v=bVZJ_O_-0RE
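The chapters above all revolve around comparing dense vectors for similarity. As a minimal sketch (not code from the video - the toy 4-d vectors are invented for illustration; real sentence embeddings are typically hundreds of dimensions), here is the cosine similarity comparison that underpins dense-vector search:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two dense vectors: 1.0 = same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d "embeddings" (hypothetical values, purely for illustration).
cat = np.array([0.9, 0.1, 0.0, 0.2])
kitten = np.array([0.85, 0.15, 0.05, 0.25])  # semantically close to "cat"
car = np.array([0.0, 0.9, 0.8, 0.1])         # semantically distant

print(cosine_similarity(cat, kitten))  # high similarity
print(cosine_similarity(cat, car))     # low similarity
```

Models like sentence transformers, DPR, and CLIP each produce dense vectors that are compared in essentially this way, just in much higher-dimensional spaces.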