4 Memory Types of LangChain to Enhance the Performance of LLMs

Soner Yıldırım
6 min read · Jun 25, 2023

LangChain extends the abilities of large language models.

Photo by Brett Jordan on Unsplash

In an earlier article, I explained the memory component of LangChain and how it can be seamlessly integrated within a chain, along with a large language model (LLM).

In a nutshell, the memory component stores the messages and extracts them into a variable, allowing the underlying model in a chain to remember previous interactions. Being stateful and remembering previous messages is crucially important for some applications (e.g. chatbots).

In this article, we’ll go through examples to learn the following types of memory components of LangChain:

  • ConversationBufferMemory
  • ConversationBufferWindowMemory
  • ConversationTokenBufferMemory
  • ConversationSummaryMemory

Setting up the API key

We’ll use OpenAI’s ChatGPT as our LLM, so we need to set up an API key.

import os
import openai

from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())
openai.api_key = os.environ['OPENAI_API_KEY']

For this code to work and set up the API key, you need to create an environment variable named OPENAI_API_KEY, which holds the…
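For reference, the load_dotenv call above reads the variable from a file named .env in the working directory; a typical one (with a placeholder value, not a real key) looks like:

```shell
# .env — keep this file out of version control
OPENAI_API_KEY=sk-...your-key-here...
```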
