Summary Buffer Memory
In [1]:
from langchain.memory import CassandraChatMessageHistory
from langchain.memory import ConversationSummaryBufferMemory
from langchain.chains import ConversationChain
In [2]:
from cqlsession import getCQLSession, getCQLKeyspace
astraSession = getCQLSession()
astraKeyspace = getCQLKeyspace()
In [3]:
message_history = CassandraChatMessageHistory(
    session_id='summarized-conversation-4321',
    session=astraSession,
    keyspace='langchain',
    ttl_seconds=3600,
)
In [4]:
message_history.clear()
Below is the logic to instantiate the LLM of choice. We keep it in the notebook for clarity.
In [5]:
from llm_choice import suggestLLMProvider

llmProvider = suggestLLMProvider()
# (Alternatively set llmProvider to 'VertexAI', 'OpenAI' ... manually if you have credentials)

if llmProvider == 'VertexAI':
    from langchain.llms import VertexAI
    llm = VertexAI()
    print('LLM from VertexAI')
elif llmProvider == 'OpenAI':
    from langchain.llms import OpenAI
    llm = OpenAI()
    print('LLM from OpenAI')
else:
    raise ValueError('Unknown LLM provider.')
LLM from VertexAI
In [6]:
memory = ConversationSummaryBufferMemory(
    llm=llm,
    chat_memory=message_history,
    max_token_limit=180,
)
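`ConversationSummaryBufferMemory` keeps the most recent exchanges verbatim and, once the buffer grows past `max_token_limit`, folds the overflow into a running summary. The following is a minimal pure-Python sketch of that idea only; the crude word-count "tokenizer" and string-concatenating "summarizer" are stand-ins for the real tokenizer and LLM call, and none of these names come from LangChain itself:

```python
# Conceptual sketch of the summary-buffer idea (NOT LangChain's actual
# implementation): recent messages stay verbatim under a token budget,
# and overflowing messages are folded into a running summary.

def count_tokens(text):
    # Crude stand-in for a real tokenizer: one token per whitespace-separated word.
    return len(text.split())

def prune(buffer, summary, max_tokens, summarize):
    """Move the oldest messages into the summary until the buffer fits."""
    while sum(count_tokens(m) for m in buffer) > max_tokens:
        oldest = buffer.pop(0)
        summary = summarize(summary, oldest)
    return buffer, summary

# A trivial "summarizer" for demonstration; the real one calls the LLM.
def naive_summarize(summary, msg):
    return (summary + ' ' + msg).strip()

buffer = [
    'Human: Tell me about William Tell, the hero',
    'AI: William Tell is a legendary hero of Switzerland.',
]
buffer, summary = prune(buffer, '', max_tokens=10, summarize=naive_summarize)
# The oldest message no longer fits the budget, so it is absorbed
# into the summary while the latest exchange stays verbatim.
```

With `max_token_limit=180` and only a few short exchanges, the real buffer below never overflows, which is why `moving_summary_buffer` stays empty throughout this demo.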
In [7]:
summaryConversation = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True,
)
In [8]:
summaryConversation.predict(input='Tell me about William Tell, the hero')
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Tell me about William Tell, the hero
AI:

> Finished chain.
Out[8]:
'William Tell is a legendary hero of Switzerland.'
In [9]:
summaryConversation.predict(
    input="Any relation to the apple from Cupertino?"
)
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Tell me about William Tell, the hero
AI: William Tell is a legendary hero of Switzerland.
Human: Any relation to the apple from Cupertino?
AI:

> Finished chain.
Out[9]:
'No, the Apple Inc. is not named after William Tell.'
In [10]:
memory.moving_summary_buffer
Out[10]:
''
In [11]:
summaryConversation.predict(
    input="I heard the two apples may be in fact the very same fruit."
)
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Tell me about William Tell, the hero
AI: William Tell is a legendary hero of Switzerland.
Human: Any relation to the apple from Cupertino?
AI: No, the Apple Inc. is not named after William Tell.
Human: I heard the two apples may be in fact the very same fruit.
AI:

> Finished chain.
Out[11]:
'I do not know.'
In [12]:
memory.moving_summary_buffer
Out[12]:
''
In [13]:
print(memory.predict_new_summary(
    memory.chat_memory.messages,
    memory.moving_summary_buffer,
))
The human asks about William Tell, the hero of Switzerland. The AI says that William Tell is not related to the Apple Inc. The human then asks if the two apples may be in fact the very same fruit. The AI says that it does not know.
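`predict_new_summary` asks the LLM to condense the accumulated messages, seeded with any existing summary (empty so far), without mutating the stored memory. A minimal sketch of what such a progressive-summarization prompt might look like; the wording and the `build_summary_prompt` helper are assumptions for illustration, not LangChain's exact template:

```python
# Sketch of a progressive-summarization prompt builder. The prompt text
# below is an assumed shape for illustration, not LangChain's template.
def build_summary_prompt(current_summary, new_lines):
    return (
        'Progressively summarize the lines of conversation provided, '
        'adding onto the previous summary and returning a new summary.\n\n'
        'Current summary:\n'
        f'{current_summary}\n\n'
        'New lines of conversation:\n'
        f'{new_lines}\n\n'
        'New summary:'
    )

prompt = build_summary_prompt(
    '',  # empty, as in the cell above: nothing has been summarized yet
    'Human: Tell me about William Tell, the hero\n'
    'AI: William Tell is a legendary hero of Switzerland.',
)
```

Each refinement feeds the prior summary plus only the new lines back to the LLM, so the summarization cost stays bounded even as the full conversation grows.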
In [14]:
summaryConversation.predict(
    input='Ok, enough with apples. Let\'s talk starfish now.'
)
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Tell me about William Tell, the hero
AI: William Tell is a legendary hero of Switzerland.
Human: Any relation to the apple from Cupertino?
AI: No, the Apple Inc. is not named after William Tell.
Human: I heard the two apples may be in fact the very same fruit.
AI: I do not know.
Human: Ok, enough with apples. Let's talk starfish now.
AI:

> Finished chain.
Out[14]:
'Starfish are marine echinoderms.'
In [15]:
memory.moving_summary_buffer
Out[15]:
''
In [16]:
print(memory.predict_new_summary(
    memory.chat_memory.messages,
    memory.moving_summary_buffer,
))
The human asks about William Tell, the hero. The AI says that William Tell is a legendary hero of Switzerland. The human asks if there is any relation to the apple from Cupertino. The AI says that the Apple Inc. is not named after William Tell. The human says that they heard the two apples may be in fact the very same fruit. The AI says that they do not know. The human says that they are done talking about apples and wants to talk about starfish now. The AI says that starfish are marine echinoderms.