Chat Memory | History
Build Chat Applications with OpenAI and LangChain / Conversation Buffer Memory: Configuring the Chain
By adding chat memory, will we spend more tokens when asking a new question, since the previous questions and answers will be passed to the LLM?
1 answer (0 marked as helpful)
Hey Rui,
That's correct: with each new prompt you also pass the full conversation so far, so every question consumes more and more tokens. That's why more efficient alternatives such as Conversation Summary Memory, which condenses the history instead of replaying it verbatim, have been designed.
Enjoy the rest of the course!
Kind regards,
365 Hristina
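To make the token growth concrete, here is a minimal conceptual sketch in plain Python (no LangChain dependency; the class name and methods are illustrative, loosely mirroring how a buffer memory behaves): every turn is stored verbatim, so each new prompt is longer than the last.

```python
# Conceptual sketch: a buffer-style memory keeps every turn verbatim,
# so the prompt sent to the LLM grows with each new question.

class BufferMemory:
    """Illustrative stand-in for a conversation buffer memory."""

    def __init__(self):
        self.turns = []

    def save_context(self, user, ai):
        # Store both sides of the exchange verbatim.
        self.turns.append(f"Human: {user}")
        self.turns.append(f"AI: {ai}")

    def build_prompt(self, new_question):
        # The full history is replayed in front of every new question.
        return "\n".join(self.turns + [f"Human: {new_question}"])


memory = BufferMemory()
prompt_sizes = []
for i in range(3):
    prompt = memory.build_prompt(f"Question {i}?")
    prompt_sizes.append(len(prompt.split()))  # rough word count as a token proxy
    memory.save_context(f"Question {i}?", f"Answer {i}.")

print(prompt_sizes)  # each prompt is longer than the previous one
```

A summary-style memory avoids this growth by replacing the verbatim history with a short, periodically updated summary, so the prompt size stays roughly constant.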