AI Observability with LangSmith
LangSmith is an observability service and platform that integrates easily with LangChain. We use LangSmith as an optional dependency in the LangChain Essentials Course, and we recommend using it beyond this course for general development with LangChain, so it is worth getting familiar with.
Setting up LangSmith
LangSmith does require an API key, but it comes with a generous free tier. You can sign up for an account and get your API key here.
When using LangSmith, we need to set our environment variables and provide our API key like so:
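A minimal sketch of that setup, assuming the standard LangSmith environment variables (the placeholder key must be replaced with your own; the project name matches the one used later in this walkthrough):

```python
import os

# Enable LangSmith tracing for all LangChain calls.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# Use your own key here; "ls-..." is a placeholder.
os.environ["LANGCHAIN_API_KEY"] = os.getenv("LANGCHAIN_API_KEY", "ls-...")
# Optional: group traces under a named project in the LangSmith UI.
os.environ["LANGCHAIN_PROJECT"] = "aurelioai-langchain-course-langsmith-openai"
```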
In most cases, this is all we need to start seeing logs and traces in the LangSmith UI. By default, LangChain will trace LLM calls, chains, etc. We'll take a quick example of this below.
Default Tracing
As mentioned, LangSmith traces a lot of data without us needing to do anything. Let's see how that looks. We'll start by initializing our LLM. Again, this will need an API key.
Let's invoke our LLM and then see what happens in the LangSmith UI.
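For example (the prompt is arbitrary; because tracing is enabled via the environment variables, the call is logged without any extra code):

```python
import os
from langchain_openai import ChatOpenAI

# Placeholder key; a real OPENAI_API_KEY is needed for the call to succeed.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")

llm = ChatOpenAI(model="gpt-4o-mini")

# This single call is enough to produce a trace in the LangSmith UI.
response = llm.invoke("Hello! What can LangSmith help me observe?")
print(response.content)
```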
After this, we should see a new project (aurelioai-langchain-course-langsmith-openai) created in the LangSmith UI. Inside that project, we should see the trace from our LLM call:
By default, LangSmith will capture plenty — however, it won't capture functions from outside of LangChain. Let's see how we can trace those.
Tracing Non-LangChain Code
LangSmith can trace functions that are not part of LangChain; all we need to do is add the `@traceable` decorator. Let's try this for a few simple functions.
Let's run these a few times and see what happens.
Those traces should now be visible in the LangSmith UI, again under the same project:
The UI surfaces several pieces of metadata for each run: the run name, its inputs and outputs, whether the run raised an error, its start time, and its latency. Inside the UI, we can also filter for specific runs, like so:
We can do various other things within the UI, but we'll leave that for you to explore.
Finally, we can modify our traceable names if we want to make them more readable inside the UI. For example:
Let's run this a few times and see what we get in LangSmith.
Let's filter for the `Chitchat Maker` traceable and see our results.
We can see our runs and their related metadata! That's it for this introduction to LangSmith. As we work through the course, we will (optionally) refer to LangSmith to investigate our runs.