Optimizing Your Communication: The Importance of Monitoring Message History

Introduction

In the ever-evolving world of chatbot applications, maintaining message history can be essential for delivering context-aware responses that enhance user experiences. In this article, we will use Python and LangChain to explore two scenarios that highlight the importance of message history tracking and how it can improve chatbot interactions.

ConversationChain

By default, LangChain's ConversationChain has a simple type of memory that remembers all previous inputs/outputs and adds them to the context that is passed. This can be considered a type of short-term memory. Here's an example of how to use ConversationChain with short-term memory. As always, remember to set the OPENAI_API_KEY environment variable with your API token before running this code. Remember to install the required packages with the following command: pip install langchain==0.1.4 deeplake==3.9.27 openai==0.27.8 tiktoken.
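For example, the key can be set from within Python before creating the model. The snippet below is a minimal sketch; the placeholder value is not a real key and must be replaced with your own token.

import os

# Make the key available to the OpenAI client for the current process.
# Replace the placeholder with your own API token.
os.environ["OPENAI_API_KEY"] = "<YOUR-OPENAI-API-KEY>"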

from langchain.llms import OpenAI
from langchain.chains import ConversationChain

# Initialize the LLM with deterministic output (temperature=0)
llm = OpenAI(model_name="gpt-3.5-turbo-instruct", temperature=0)
# ConversationChain remembers all previous turns by default
conversation = ConversationChain(llm=llm, verbose=True)

output = conversation.predict(input="Hi there!")

print(output)
The sample code.
> Entering new ConversationChain chain...

Prompt after formatting:

The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:

Human: Hi there!
AI:

> Finished chain.

Hi there! It's nice to meet you. How can I help you today?
The output.

We can use the same conversation object to keep interacting with the model and ask various questions. The following block asks three questions; however, we only print the output of the last call, which also shows the conversation history.

output = conversation.predict(input="In what scenarios extra memory should be used?")
output = conversation.predict(input="There are various types of memory in Langchain. When to use which type?")
output = conversation.predict(input="Do you remember what was our first message?")

print(output)
The sample code.
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Hi there!
AI:  Hi there! It's nice to meet you. How can I help you today?
Human: In what scenarios extra memory should be used?
AI:  Extra memory should be used when you need to store more data than the amount of memory your device has available. For example, if you are running a program that requires a lot of data to be stored, you may need to add extra memory to your device in order to run the program efficiently.
Human: There are various types of memory in Langchain. When to use which type?
AI:  Different types of memory in Langchain are used for different purposes. For example, RAM is used for short-term storage of data, while ROM is used for long-term storage of data. Flash memory is used for storing data that needs to be accessed quickly, while EEPROM is used for storing data that needs to be retained even when the power is turned off. Depending on the type of data you need to store, you should choose the appropriate type of memory.
Human: Do you remember what was our first message?
AI:

> Finished chain.

Yes, our first message was "Hi there!"
The output.

As you can see from the “Current conversation” section of the output, the model has access to all the previous messages. It can also remember the initial message after three questions.

The ConversationChain is a powerful tool that leverages past messages to produce fitting replies, resulting in comprehensive and knowledgeable outputs. This extra memory is invaluable when chatbots have to remember lots of details, especially when users ask for complicated information or engage in complex chats. By implementing the ConversationChain, users can enjoy seamless interactions with chatbots, ultimately enhancing their overall experience.

ConversationBufferMemory

The ConversationChain uses the ConversationBufferMemory class by default to provide a history of messages. This memory saves the previous conversation in the form of variables. The class accepts the return_messages argument, which is helpful when dealing with chat models. This is how the ConversationChain keeps context under the hood.

from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(return_messages=True)
memory.save_context({"input": "hi there!"}, {"output": "Hi there! It's nice to meet you. How can I help you today?"})

print( memory.load_memory_variables({}) )
The sample code.
{'history': [HumanMessage(content='hi there!', additional_kwargs={}, example=False),
  AIMessage(content="Hi there! It's nice to meet you. How can I help you today?", additional_kwargs={}, example=False)]}
The output.

Alternatively, the code from the previous section is equivalent to the following, where the memory is passed explicitly. The chain automatically calls the memory's .save_context() method after each interaction.

from langchain.chains import ConversationChain

conversation = ConversationChain(
    llm=llm, 
    verbose=True, 
    memory=ConversationBufferMemory()
)
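
Because the chain calls .save_context() for us, the buffer is filled automatically after every turn. As a quick check, the following sketch reuses the conversation object defined above and inspects its memory directly (the exact response text will vary):

# Run one turn, then look at what the memory stored automatically.
conversation.predict(input="Hi there!")
print(conversation.memory.load_memory_variables({}))
# Expected shape: {'history': "Human: Hi there!\nAI: ..."}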

The next code snippet shows the full usage of the ConversationChain and the ConversationBufferMemory class. It is another basic example of how the chatbot keeps track of the conversation history, allowing it to generate context-aware responses.

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder, SystemMessagePromptTemplate, HumanMessagePromptTemplate

prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("The following is a friendly conversation between a human and an AI."),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
])

memory = ConversationBufferMemory(return_messages=True)
conversation = ConversationChain(memory=memory, prompt=prompt, llm=llm)


print( conversation.predict(input="Tell me a joke about elephants") )
print( conversation.predict(input="Who is the author of the Harry Potter series?") )
print( conversation.predict(input="What was the joke you told me earlier?") )
The sample code.
AI: What did the elephant say to the naked man? "How do you breathe through that tiny thing?

AI: The author of the Harry Potter series is J.K. Rowling

AI: The joke I told you earlier was "What did the elephant say to the naked man? \'How do you breathe through that tiny thing?
The output.

Here we used the MessagesPlaceholder class to create a placeholder for the conversation history in a chat model prompt. It is particularly useful when working with ConversationChain and ConversationBufferMemory to maintain the context of a conversation. MessagesPlaceholder takes a variable name as an argument, which must match the key under which the memory exposes the conversation history ("history" by default). We will cover it in more detail later.
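
To make the placeholder's role more concrete, here is a minimal, standalone sketch that formats the prompt with a hand-written history instead of a memory object; the messages are invented purely for illustration:

from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder, HumanMessagePromptTemplate
from langchain.schema import HumanMessage, AIMessage

# The placeholder is replaced by whatever messages are passed under the
# "history" key, which is exactly what ConversationBufferMemory provides.
prompt = ChatPromptTemplate.from_messages([
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
])

formatted = prompt.format_messages(
    history=[
        HumanMessage(content="Tell me a joke about elephants"),
        AIMessage(content="What did the elephant say to the naked man? ..."),
    ],
    input="What was the joke you told me earlier?",
)
print(formatted)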

In the next scenario, a user interacts with a chatbot to find information about a specific topic, in this case the history of the Internet.

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder, SystemMessagePromptTemplate, HumanMessagePromptTemplate

prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template("The following is a friendly conversation between a human and an AI."),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}")
])

memory = ConversationBufferMemory(return_messages=True)
conversation = ConversationChain(memory=memory, prompt=prompt, llm=llm, verbose=True)

If we start with a general question:

user_message = "Tell me about the history of the Internet."
response = conversation(user_message)
print(response)
The sample code.
> Entering new ConversationChain chain...
Prompt after formatting:
System: The following is a friendly conversation between a human and an AI.
Human: Tell me about the history of the Internet.

> Finished chain.
{'input': 'Tell me about the history of the Internet.', 'history': [HumanMessage(content='Tell me about the history of the Internet.', additional_kwargs={}, example=False), AIMessage(content='\n\nAI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.', additional_kwargs={}, example=False)], 'response': '\n\nAI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.'}
The output.

Here is the second query.

# User sends another message
user_message = "Who are some important figures in its development?"
response = conversation(user_message)
print(response)  # Chatbot responds with names of important figures, recalling the previous topic
> Entering new ConversationChain chain...
Prompt after formatting:
System: The following is a friendly conversation between a human and an AI.
Human: Tell me about the history of the Internet.
AI: 

AI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.
Human: Who are some important figures in its development?

> Finished chain.
{'input': 'Who are some important figures in its development?', 'history': [HumanMessage(content='Tell me about the history of the Internet.', additional_kwargs={}, example=False), AIMessage(content='\n\nAI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.', additional_kwargs={}, example=False), HumanMessage(content='Who are some important figures in its development?', additional_kwargs={}, example=False), AIMessage(content='\nAI:\n\nSome of the most important figures in the development of the Internet include Vint Cerf and Bob Kahn, who developed the TCP/IP protocol, Tim Berners-Lee, who developed the World Wide Web, and Marc Andreessen, who developed the first web browser.', additional_kwargs={}, example=False)], 'response': '\nAI:\n\nSome of the most important figures in the development of the Internet include Vint Cerf and Bob Kahn, who developed the TCP/IP protocol, Tim Berners-Lee, who developed the World Wide Web, and Marc Andreessen, who developed the first web browser.'}

The last query showcases how ConversationBufferMemory enables the chatbot to recall previous messages and provide more accurate, context-aware responses to the user's questions.

user_message = "What did Tim Berners-Lee contribute?"
response = conversation(user_message)
print(response)
> Entering new ConversationChain chain...
Prompt after formatting:
System: The following is a friendly conversation between a human and an AI.
Human: Tell me about the history of the Internet.
AI: 

AI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.
Human: Who are some important figures in its development?
AI: 
AI:

Some of the most important figures in the development of the Internet include Vint Cerf and Bob Kahn, who developed the TCP/IP protocol, Tim Berners-Lee, who developed the World Wide Web, and Marc Andreessen, who developed the first web browser.
Human: What did Tim Berners-Lee contribute?

> Finished chain.
{'input': 'What did Tim Berners-Lee contribute?', 'history': [HumanMessage(content='Tell me about the history of the Internet.', additional_kwargs={}, example=False), AIMessage(content='\n\nAI: The Internet has a long and complex history. It began in the 1960s as a project of the United States Department of Defense, which wanted to create a network of computers that could communicate with each other in the event of a nuclear attack. This network eventually evolved into the modern Internet, which is now used by billions of people around the world.', additional_kwargs={}, example=False), HumanMessage(content='Who are some important figures in its development?', additional_kwargs={}, example=False), AIMessage(content='\nAI:\n\nSome of the most important figures in the development of the Internet include Vint Cerf and Bob Kahn, who developed the TCP/IP protocol, Tim Berners-Lee, who developed the World Wide Web, and Marc Andreessen, who developed the first web browser.', additional_kwargs={}, example=False), HumanMessage(content='What did Tim Berners-Lee contribute?', additional_kwargs={}, example=False), AIMessage(content='\nAI: \n\nTim Berners-Lee is credited with inventing the World Wide Web, which is the system of interlinked documents and other resources that make up the Internet. He developed the Hypertext Transfer Protocol (HTTP) and the Hypertext Markup Language (HTML), which are the two main technologies used to create and display webpages. He also developed the first web browser, which allowed users to access the web.', additional_kwargs={}, example=False)], 'response': '\nAI: \n\nTim Berners-Lee is credited with inventing the World Wide Web, which is the system of interlinked documents and other resources that make up the Internet. He developed the Hypertext Transfer Protocol (HTTP) and the Hypertext Markup Language (HTML), which are the two main technologies used to create and display webpages. He also developed the first web browser, which allowed users to access the web.'}

In the upcoming lessons, we will cover several more types of conversational memory, such as:

→ ConversationBufferMemory, which is the most straightforward, then

→ ConversationBufferWindowMemory, which maintains a memory window that keeps a limited number of past interactions based on the specified window size (see the short preview after this list),

→ and the most complex variant, ConversationSummaryMemory, which holds a summary of previous conversations.
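
As a small preview of the windowed variant, the following sketch keeps only the single most recent exchange by setting k=1; the example inputs and outputs are made up for illustration:

from langchain.memory import ConversationBufferWindowMemory

# Keep only the last k=1 interaction in the context.
window_memory = ConversationBufferWindowMemory(k=1)
window_memory.save_context({"input": "Hi there!"}, {"output": "Hello! How can I help you?"})
window_memory.save_context({"input": "Tell me a joke."}, {"output": "Why don't scientists trust atoms? They make up everything."})

# Only the most recent exchange is returned; the first one has been dropped.
print(window_memory.load_memory_variables({}))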

Conclusion

Keeping track of message history in chatbot interactions yields several benefits. Firstly, the chatbot gains a stronger sense of context from previous interactions, improving the accuracy and relevance of its responses. Secondly, the recorded history serves as a valuable resource for troubleshooting, tracing the sequence of events to identify potential issues. Thirdly, effective monitoring systems that include log tracking can trigger notifications based on alert conditions, aiding in the early detection of conversation anomalies. Lastly, monitoring message history provides a means to evaluate the chatbot's performance over time, paving the way for necessary adjustments and enhancements.

While monitoring message history can offer numerous advantages, there are also some trade-offs to consider. Storing extensive message history can lead to significant memory and storage usage, potentially impacting the overall system performance. Additionally, maintaining conversation history might present privacy issues, particularly when sensitive or personally identifiable information is involved. Therefore, it is crucial to manage such data with utmost responsibility and in compliance with the relevant data protection regulations.
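
One simple mitigation, when the history is no longer needed or contains sensitive details, is to drop the buffered messages explicitly. A minimal sketch, assuming the memory object from the examples above:

# Discard everything stored in the buffer, e.g. at the end of a session.
memory.clear()
print(memory.load_memory_variables({}))  # -> {'history': []}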

To sum up, monitoring message history in LangChain is crucial for providing context-aware, accurate, and engaging AI-driven conversations. It also offers valuable information for troubleshooting, alerting, and performance evaluation. However, it's essential to be mindful of the trade-offs, such as memory and storage consumption and privacy concerns.

In the next lesson, we’ll see the different memory classes that LangChain has and when to use them.

You can find the code of this lesson in this online Notebook.