Context
Context provides product analytics for AI chatbots.
Context helps you understand how users are interacting with your AI chat products, so you can gain critical insights, optimise poor experiences, and minimise brand risks.
In this guide, we will show you how to integrate LangChain with Context.
Installation and Setup
$ pip install context-python --upgrade
Getting API Credentials
To get your Context API token:
- Go to the settings page within your Context account (https://go.getcontext.ai/settings).
- Generate a new API Token.
- Store this token somewhere secure.
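One common way to store the token securely is as an environment variable that your application reads at startup. A minimal sketch of this pattern (the variable name CONTEXT_API_TOKEN matches the examples later in this guide; exporting the token under that name is an assumption about your setup):

```python
import os

# Read the Context API token from the environment rather than hard-coding it.
# CONTEXT_API_TOKEN is the variable name used throughout this guide's examples.
token = os.environ.get("CONTEXT_API_TOKEN", "")
if not token:
    # Fail loudly at startup instead of at the first API call.
    print("CONTEXT_API_TOKEN is not set; Context API calls will fail.")
```

Using `os.environ.get` with a fallback lets you surface a clear error message early instead of a `KeyError` deep inside your application.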
Setup Context
To use the ContextCallbackHandler, import the handler from LangChain and instantiate it with your Context API token. Ensure you have installed the context-python package before using the handler.
import os
from langchain.callbacks import ContextCallbackHandler
token = os.environ["CONTEXT_API_TOKEN"]
context_callback = ContextCallbackHandler(token)
API Reference:
- ContextCallbackHandler from langchain.callbacks
Usage
Using the Context callback within a Chat Model
The Context callback handler can be used to directly record transcripts between users and AI assistants.
Example
import os
from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    SystemMessage,
    HumanMessage,
)
from langchain.callbacks import ContextCallbackHandler
token = os.environ["CONTEXT_API_TOKEN"]
chat = ChatOpenAI(
    headers={"user_id": "123"},
    temperature=0,
    callbacks=[ContextCallbackHandler(token)],
)
messages = [
    SystemMessage(
        content="You are a helpful assistant that translates English to French."
    ),
    HumanMessage(content="I love programming."),
]
print(chat(messages))
API Reference:
- ChatOpenAI from langchain.chat_models
- ContextCallbackHandler from langchain.callbacks
Using the Context callback within Chains
The Context callback handler can also be used to record the inputs and outputs of chains. Note that intermediate steps of the chain are not recorded; only the starting inputs and final outputs are.
Note: Ensure that you pass the same ContextCallbackHandler instance to both the chat model and the chain, rather than creating a separate handler for each.
Wrong:
chat = ChatOpenAI(temperature=0.9, callbacks=[ContextCallbackHandler(token)])
chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[ContextCallbackHandler(token)])
Correct:
callback = ContextCallbackHandler(token)
chat = ChatOpenAI(temperature=0.9, callbacks=[callback])
chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[callback])
Example
import os
from langchain.chat_models import ChatOpenAI
from langchain import LLMChain
from langchain.prompts import PromptTemplate
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.callbacks import ContextCallbackHandler
token = os.environ["CONTEXT_API_TOKEN"]
human_message_prompt = HumanMessagePromptTemplate(
    prompt=PromptTemplate(
        template="What is a good name for a company that makes {product}?",
        input_variables=["product"],
    )
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])
callback = ContextCallbackHandler(token)
chat = ChatOpenAI(temperature=0.9, callbacks=[callback])
chain = LLMChain(llm=chat, prompt=chat_prompt_template, callbacks=[callback])
print(chain.run("colorful socks"))
API Reference:
- ChatOpenAI from langchain.chat_models
- PromptTemplate from langchain.prompts
- ContextCallbackHandler from langchain.callbacks