- DATE:
- AUTHOR: The LangChain Team
Introducing End-to-End OpenTelemetry Support in LangSmith
LangSmith now supports full end-to-end OpenTelemetry (OTel) tracing for LangChain and LangGraph applications, giving you seamless observability from your application code all the way to the LangSmith platform.
Why It Matters
With OTel support, you get:
Unified observability: See your entire application stack in one place
Distributed tracing: Track requests across microservices
Interoperability: Integrate with existing observability tools via OpenTelemetry standards
Getting Started
Install OpenTelemetry support:
pip install "langsmith[otel]" langchain
Enable OpenTelemetry by setting environment variables:
export LANGSMITH_OTEL_ENABLED=true
export LANGSMITH_TRACING=true
export LANGSMITH_ENDPOINT=https://api.smith.langchain.com
export LANGSMITH_API_KEY=<your_langsmith_api_key>
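If you prefer configuring everything from code rather than the shell, the same variables can also be set programmatically. A minimal sketch (the API key value is a placeholder you must replace with your own; set these before importing your LangChain modules so tracing picks them up):

```python
import os

# Mirror the shell configuration above. These must be set before
# LangChain/LangSmith modules are imported.
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGSMITH_API_KEY"] = "<your_langsmith_api_key>"  # placeholder
```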
Use tracing in your LangChain app:
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
model = ChatOpenAI()
chain = prompt | model

result = chain.invoke({"topic": "programming"})
print(result.content)
Try It Today
Get started with OpenTelemetry in LangSmith today, and check out our docs for more details: https://docs.smith.langchain.com/observability/how_to_guides/trace_langchain_with_otel