Using ConversationBufferMemory with ConversationChain #3116
Unanswered · PeppeMiller asked this question in Q&A · 0 replies
I'm using gpt-3.5-turbo and need a SequentialChain where the first item in the chain (ChainA) adds a new "System" message (instructions for GPT to follow) to the message list. The second item in the chain then calls GPT to run the chat, passing in the full chat history with that System message included.
The correct way to do this seems to be to use ConversationBufferMemory, have my ChainA add the new System message to the list, and then have ConversationChain handle the conversation. The issue I run into is that ConversationChain inserts the entire conversation into the "history" variable in the prompt template and sends it as a single OpenAI "message" object. However, I need it to be multiple OpenAI message objects (as ChatPromptTemplate produces), so that GPT respects the role of each message.
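For concreteness, here is roughly the difference I mean, sketched in plain Python outside LangChain (the helper name and payload shapes are my own illustration, not LangChain internals):

```python
# Hypothetical illustration of the two payload shapes (names made up).

history = [
    {"role": "system", "content": "Answer tersely."},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]

def flatten_history(messages):
    """Roughly what happens today: the whole history is rendered
    into one string via the prompt template's "history" variable."""
    return "\n".join(f'{m["role"]}: {m["content"]}' for m in messages)

# What gets sent now: ONE OpenAI message object containing the
# entire conversation as flattened text.
flattened_payload = [
    {"role": "user",
     "content": flatten_history(history) + "\nuser: What's 2+2?"}
]

# What I need instead: each turn kept as its own message object
# (like ChatPromptTemplate produces), so the model sees the roles.
structured_payload = history + [{"role": "user", "content": "What's 2+2?"}]

print(len(flattened_payload))   # 1
print(len(structured_payload))  # 4
```

With the flattened form, the System instructions arrive as ordinary text inside a user message, which is why GPT doesn't reliably respect them.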
It appears that there is no way to combine ConversationBufferMemory with OpenAI's multiple message API. Is that the case, or am I missing something? Is this on the roadmap? I'm happy to pitch in to work on this.
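To make the desired flow concrete, here is a plain-Python sketch of what I'm after (no LangChain; `chain_a` and `chat_step` are hypothetical stand-ins I made up for this question, not real APIs):

```python
# Sketch of the two-step chain I want, with the stateful message list
# playing the role of ConversationBufferMemory.

messages = [
    {"role": "user", "content": "Book me a table."},
]

def chain_a(msgs):
    """Step 1 (ChainA): inject a new System message with instructions."""
    msgs.append({
        "role": "system",
        "content": "You are a booking assistant. Confirm details first.",
    })
    return msgs

def chat_step(msgs):
    """Step 2: would call the chat completions API with the full history
    as separate role-tagged message objects, e.g.
    openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=msgs).
    Stubbed here so the sketch is self-contained."""
    return {"role": "assistant",
            "content": f"(reply based on {len(msgs)} messages)"}

messages = chain_a(messages)
reply = chat_step(messages)
messages.append(reply)
```

The key point is that the System message added in step 1 reaches the API as its own `{"role": "system", ...}` object rather than being flattened into a single history string.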