Intro to Sentence Embeddings with Transformers
James Briggs
Transformers have wholly rebuilt the landscape of natural language processing (NLP). Before transformers, we had passable translation and language classification thanks to recurrent neural nets (RNNs), but their language comprehension was limited, they made many minor mistakes, and coherence over larger chunks of text was practically impossible.
Since the introduction of the first transformer model in the 2017 paper 'Attention Is All You Need', NLP has moved from RNNs to models like BERT and GPT. These new models can answer questions, write articles (maybe GPT-3 wrote this), enable incredibly intuitive semantic search - and much more.
In this video, we will explore how these embeddings have been adapted and applied to a range of semantic similarity applications by using a new breed of transformers called 'sentence transformers'.
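As a rough illustration of the idea behind semantic similarity, a sentence transformer maps each sentence to a fixed-size vector, and similarity is typically scored with cosine similarity between those vectors. The sketch below uses toy vectors standing in for real sentence embeddings (which a library such as `sentence-transformers` would produce); the vectors and function names here are illustrative assumptions, not part of any specific library.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors: 1.0 = identical direction."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-d vectors standing in for sentence embeddings
# (real sentence-transformer embeddings are typically 384-1024 dimensions).
emb_query = [0.9, 0.1, 0.0, 0.2]
emb_similar = [0.8, 0.2, 0.1, 0.3]   # pretend: a paraphrase of the query
emb_unrelated = [0.0, 0.1, 0.9, 0.0]  # pretend: an off-topic sentence

# Semantic search = rank candidates by cosine similarity to the query.
print(cosine_similarity(emb_query, emb_similar))    # high score
print(cosine_similarity(emb_query, emb_unrelated))  # low score
```

In a real pipeline the vectors would come from a model's encode step, and the ranking over many candidates is usually served by a vector index rather than a Python loop.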
Pinecone article: https://www.pinecone.io/learn/sentence-embeddings/
Vectors in ML: https://www.youtube.com/playlist?list=PLIUOU7oqGTLgz-BI8bNMVGwQxIMuQddJO
70% Discount on the NLP With Transformers in Python course: https://bit.ly/3DFvvY5
Subscribe for Article and Video Updates! https://jamescalam.medium.com/subscribe https://medium.com/@jamescalam/membership
Discord: https://discord.gg/c5QtDB9RAP ... https://www.youtube.com/watch?v=WS1uVMGhlWQ