Chat with OpenAI in LangChain - #5
James Briggs
With the advent of OpenAI's ChatGPT API endpoint (ChatCompletion), LangChain quickly added support for the new endpoint. Unlike previous LLM endpoints, the chat endpoint takes multiple inputs and so has its own unique set of objects and methods.
OpenAI's ChatCompletion endpoint consumes three types of input:
- System message – acts as an initial prompt to "set up" the behavior of the chat completion.
- Human messages – human prompts (both current and past) that are fed into the model.
- AI messages – past AI responses to the human prompts.
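As a minimal sketch, these three message types map onto the role-tagged format the raw ChatCompletion endpoint consumes (plain Python, no LangChain needed; the message contents here are illustrative):

```python
# The ChatCompletion endpoint takes a list of role-tagged messages
# rather than a single prompt string. The roles map to the three
# message types above: "system", "user" (human), and "assistant" (AI).
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi AI, how are you today?"},
    {"role": "assistant", "content": "I'm great, thank you. How can I help?"},
    {"role": "user", "content": "I'd like to understand string theory."},
]

# With the openai client, this list would be sent along the lines of:
# openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
```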
The prior Completion endpoint, used for other models like OpenAI's text-davinci-003, accepted only a single input field, into which we wrote everything.
Because of this difference, we now have the LangChain ChatOpenAI object alongside several new prompt templates and "message" objects. We'll explore how these are used in this chapter.
Code notebook: https://github.com/pinecone-io/examples/blob/master/learn/generation/langchain/handbook/04-langchain-chat.ipynb
LangChain ebook: https://pinecone.io/learn/langchain/
AI Dev Studio: https://aurelio.ai/
Subscribe for Article and Video Updates! https://jamescalam.medium.com/subscribe https://medium.com/@jamescalam/membership
Discord: https://discord.gg/c5QtDB9RAP
00:00 LangChain's new Chat modules
02:09 New LangChain chat in Python
03:14 Using LangChain's ChatOpenAI object
04:36 Chat messages in LangChain
06:43 New chat prompt templates
09:05 LangChain human message prompt template
13:18 Using multiple chat prompt templates
17:42 F-strings vs. LangChain prompt templates
19:23 Where to use LangChain chat features?
#artificialintelligence #nlp #openai #deeplearning #langchain
https://www.youtube.com/watch?v=CnAgB3A5OlU