Infino - LangChain LLM Monitoring Example
This example shows how to track the following while calling OpenAI models via LangChain and Infino:
- prompt input,
- response from ChatGPT or any other LangChain model,
- latency,
- errors,
- number of tokens consumed.
# Install necessary dependencies.
pip install infinopy
pip install matplotlib
pip install langchain
import datetime as dt
from infinopy import InfinoClient
import json
from langchain.llms import OpenAI
from langchain.callbacks import InfinoCallbackHandler
import matplotlib.pyplot as plt
import matplotlib.dates as md
import os
import time
Start Infino server, initialize the Infino client
# Start server using the Infino docker image.
docker run --rm --detach --name infino-example -p 3000:3000 infinohq/infino:latest
# Create Infino client.
client = InfinoClient()
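Optionally, verify that the container is accepting connections before sending any traffic. This is a minimal sketch using only the Python standard library; the host and port match the docker command above.
import socket
# Sanity check: confirm the Infino container is listening on localhost:3000.
with socket.create_connection(("localhost", 3000), timeout=5):
    print("Infino server is reachable on port 3000")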
Read the questions dataset
# These are a subset of questions from Stanford's QA dataset -
# https://rajpurkar.github.io/SQuAD-explorer/
data = """In what country is Normandy located?
When were the Normans in Normandy?
From which countries did the Norse originate?
Who was the Norse leader?
What century did the Normans first gain their separate identity?
Who gave their name to Normandy in the 1000's and 1100's
What is France a region of?
Who did King Charles III swear fealty to?
When did the Frankish identity emerge?
Who was the duke in the battle of Hastings?
Who ruled the duchy of Normandy
What religion were the Normans
What type of major impact did the Norman dynasty have on modern Europe?
Who was famed for their Christian spirit?
Who assimilted the Roman language?
Who ruled the country of Normandy?
What principality did William the conquerer found?
What is the original meaning of the word Norman?
When was the Latin version of the word Norman first recorded?
What name comes from the English words Normans/Normanz?"""
questions = data.split("\n")
LangChain OpenAI Q&A; Publish metrics and logs to Infino
# Set your key here.
# os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"
# Create the Infino callback handler. It logs latency, errors, token usage, prompts, and prompt responses to Infino.
handler = InfinoCallbackHandler(
    model_id="test_openai", model_version="0.1", verbose=False
)
# Create LLM.
llm = OpenAI(temperature=0.1)
# Number of questions to ask the OpenAI model. We limit it to a small number here to keep API costs low while running this demo.
num_questions = 10
questions = questions[0:num_questions]
for question in questions:
    print(question)
    # Send the question to the OpenAI API, with the Infino callback attached.
    llm_result = llm.generate([question], callbacks=[handler])
    print(llm_result)
In what country is Normandy located?
generations=[[Generation(text='\n\nNormandy is located in France.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 16, 'completion_tokens': 9, 'prompt_tokens': 7}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('8de21639-acec-4bd1-a12d-8124de1e20da'))
When were the Normans in Normandy?
generations=[[Generation(text='\n\nThe Normans first settled in Normandy in the late 9th century.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 24, 'completion_tokens': 16, 'prompt_tokens': 8}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('cf81fc86-250b-4e6e-9d92-2df3bebb019a'))
From which countries did the Norse originate?
generations=[[Generation(text='\n\nThe Norse originated from Scandinavia, which includes modern-day Norway, Sweden, and Denmark.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 29, 'completion_tokens': 21, 'prompt_tokens': 8}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('50f42f5e-b4a4-411a-a049-f92cb573a74f'))
Who was the Norse leader?
generations=[[Generation(text='\n\nThe most famous Norse leader was the legendary Viking king Ragnar Lodbrok. He is believed to have lived in the 9th century and is renowned for his exploits in England and France.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 45, 'completion_tokens': 39, 'prompt_tokens': 6}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('e32f31cb-ddc9-4863-8e6e-cb7a281a0ada'))
What century did the Normans first gain their separate identity?
generations=[[Generation(text='\n\nThe Normans first gained their separate identity in the 11th century.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 28, 'completion_tokens': 16, 'prompt_tokens': 12}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('da9d8f73-b3b3-4bc5-8495-da8b11462a51'))
Who gave their name to Normandy in the 1000's and 1100's
generations=[[Generation(text='\n\nThe Normans, a people from northern France, gave their name to Normandy in the 1000s and 1100s. The Normans were descended from Viking settlers who had come to the region in the late 800s.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 58, 'completion_tokens': 45, 'prompt_tokens': 13}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('bb5829bf-b6a6-4429-adfa-414ac5be46e5'))
What is France a region of?
generations=[[Generation(text='\n\nFrance is a region of Europe.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 16, 'completion_tokens': 9, 'prompt_tokens': 7}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('6943880b-b4e4-4c74-9ca1-8c03c10f7e9c'))
Who did King Charles III swear fealty to?
generations=[[Generation(text='\n\nKing Charles III swore fealty to Pope Innocent III.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 23, 'completion_tokens': 13, 'prompt_tokens': 10}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('c91fd663-09e6-4d00-b746-4c7fd96f9ceb'))
When did the Frankish identity emerge?
generations=[[Generation(text='\n\nThe Frankish identity began to emerge in the late 5th century, when the Franks began to expand their power and influence in the region. The Franks were a Germanic tribe that had migrated to the area from the east and had established a kingdom in what is now modern-day France. The Franks were eventually able to establish a powerful kingdom that lasted until the 10th century.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 86, 'completion_tokens': 78, 'prompt_tokens': 8}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('23f86775-e592-4cb8-baa3-46ebe74305b2'))
Who was the duke in the battle of Hastings?
generations=[[Generation(text='\n\nThe Duke of Normandy, William the Conqueror, was the leader of the Norman forces at the Battle of Hastings in 1066.', generation_info={'finish_reason': 'stop', 'logprobs': None})]] llm_output={'token_usage': {'total_tokens': 39, 'completion_tokens': 28, 'prompt_tokens': 11}, 'model_name': 'text-davinci-003'} run=RunInfo(run_id=UUID('ad5b7984-8758-4d95-a5eb-ee56e0218f6b'))
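Instead of passing the handler on every call, you can also attach it when the LLM is constructed so that every subsequent call is logged automatically. A minimal sketch reusing the handler defined above:
# Alternative: attach the Infino handler at construction time so all calls are logged.
llm_with_callback = OpenAI(temperature=0.1, callbacks=[handler])
llm_result = llm_with_callback.generate(["Who ruled the duchy of Normandy"])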
Create Metric Charts
We now use matplotlib to create graphs of latency, errors and tokens consumed.
# Helper function to create a graph using matplotlib.
def plot(data, title):
    data = json.loads(data)
    # Extract x and y values from the data.
    timestamps = [item["time"] for item in data]
    dates = [dt.datetime.fromtimestamp(ts) for ts in timestamps]
    y = [item["value"] for item in data]
    plt.rcParams["figure.figsize"] = [6, 4]
    plt.subplots_adjust(bottom=0.2)
    plt.xticks(rotation=25)
    ax = plt.gca()
    xfmt = md.DateFormatter("%Y-%m-%d %H:%M:%S")
    ax.xaxis.set_major_formatter(xfmt)
    # Create the plot.
    plt.plot(dates, y)
    # Set labels and title.
    plt.xlabel("Time")
    plt.ylabel("Value")
    plt.title(title)
    plt.show()
response = client.search_ts("__name__", "latency", 0, int(time.time()))
plot(response.text, "Latency")
response = client.search_ts("__name__", "error", 0, int(time.time()))
plot(response.text, "Errors")
response = client.search_ts("__name__", "prompt_tokens", 0, int(time.time()))
plot(response.text, "Prompt Tokens")
response = client.search_ts("__name__", "completion_tokens", 0, int(time.time()))
plot(response.text, "Completion Tokens")
response = client.search_ts("__name__", "total_tokens", 0, int(time.time()))
plot(response.text, "Total Tokens")
![png](_infino_files/output_9_0.png)
![png](_infino_files/output_9_1.png)
![png](_infino_files/output_9_2.png)
![png](_infino_files/output_9_3.png)
![png](_infino_files/output_9_4.png)
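The queries above cover the full range from timestamp 0 to the current time. In practice you will usually restrict the window; the sketch below reuses the same search_ts signature to plot only the last hour of latency data.
# Plot only the most recent hour of latency data.
end_time = int(time.time())
start_time = end_time - 3600  # one hour ago
response = client.search_ts("__name__", "latency", start_time, end_time)
plot(response.text, "Latency (last hour)")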
Full-text query on prompts or prompt responses
# Search for a particular prompt text.
query = "normandy"
response = client.search_log(query, 0, int(time.time()))
print("Results for", query, ":", response.text)
print("===")
query = "king charles III"
response = client.search_log("king charles III", 0, int(time.time()))
print("Results for", query, ":", response.text)
Results for normandy : [{"time":1686821979,"fields":{"prompt":"In what country is Normandy located?"},"text":"In what country is Normandy located?"},{"time":1686821982,"fields":{"prompt_response":"\n\nNormandy is located in France."},"text":"\n\nNormandy is located in France."},{"time":1686821984,"fields":{"prompt_response":"\n\nThe Normans first settled in Normandy in the late 9th century."},"text":"\n\nThe Normans first settled in Normandy in the late 9th century."},{"time":1686821993,"fields":{"prompt":"Who gave their name to Normandy in the 1000's and 1100's"},"text":"Who gave their name to Normandy in the 1000's and 1100's"},{"time":1686821997,"fields":{"prompt_response":"\n\nThe Normans, a people from northern France, gave their name to Normandy in the 1000s and 1100s. The Normans were descended from Viking settlers who had come to the region in the late 800s."},"text":"\n\nThe Normans, a people from northern France, gave their name to Normandy in the 1000s and 1100s. The Normans were descended from Viking settlers who had come to the region in the late 800s."}]
===
Results for king charles III : [{"time":1686821998,"fields":{"prompt":"Who did King Charles III swear fealty to?"},"text":"Who did King Charles III swear fealty to?"},{"time":1686822000,"fields":{"prompt_response":"\n\nKing Charles III swore fealty to Pope Innocent III."},"text":"\n\nKing Charles III swore fealty to Pope Innocent III."}]
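The response body is JSON, as shown above, so you can also parse it and iterate over the matches rather than printing the raw text. A small sketch based on the time and text fields in the output above:
# Parse the raw JSON response and print each match with a human-readable timestamp.
matches = json.loads(response.text)
for match in matches:
    when = dt.datetime.fromtimestamp(match["time"])
    print(when, "-", match["text"].strip())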
Stop Infino server
docker rm -f infino-example
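If you prefer to manage the container from Python, the docker SDK (installed as a dependency of infinopy) can perform the same cleanup. A sketch, assuming the container name used above:
import docker
# Stop and remove the demo Infino container from Python instead of the shell.
docker_client = docker.from_env()
docker_client.containers.get("infino-example").remove(force=True)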