Log, Trace, and Monitor
When building apps or agents with Langchain, a single user request often fans out into multiple API calls (embeddings, completions, tool calls). Out of the box, these calls are not linked together when you want to analyse them. With Portkey, all requests stemming from a single user request are logged and traced under a common trace ID, giving you full visibility into each user interaction.
This notebook is a step-by-step guide to logging, tracing, and monitoring Langchain LLM calls with Portkey in your Langchain app.
First, let's import Portkey, OpenAI, and the agent tools.
import os
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
Paste your OpenAI API key below. (You can find it here)
os.environ["OPENAI_API_KEY"] = "..."
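To see how the tracing described above works in practice, here is a minimal sketch of pointing `ChatOpenAI` at the Portkey gateway. It assumes you have a Portkey API key available in the `PORTKEY_API_KEY` environment variable; the `trace_id` value is illustrative and can be any string you use to group the calls belonging to one user request.

```python
# A minimal sketch: route ChatOpenAI through the Portkey gateway so calls are
# logged and grouped under a common trace ID. Assumes PORTKEY_API_KEY is set;
# the trace_id below is just an example value.
portkey_headers = createHeaders(
    api_key=os.environ["PORTKEY_API_KEY"],  # your Portkey API key
    provider="openai",                      # which upstream provider to route to
    trace_id="user-request-123",            # common ID for all calls in this request
)

llm = ChatOpenAI(
    base_url=PORTKEY_GATEWAY_URL,   # send requests via Portkey instead of api.openai.com
    default_headers=portkey_headers # attach Portkey auth + tracing headers
)
```

With this configuration, every completion the agent makes through `llm` shows up in the Portkey dashboard tagged with the same trace ID, which is what lets you reconstruct a full user interaction later.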