Changelog

DATE:
March 27, 2025
AUTHOR:
The LangChain Team
Universal token counting callback for LangChain Python

We've added a new callback/context manager for LangChain Python that tracks token usage, with support for:

  • Cached tokens

  • Multi-modal token counting

  • All major LangChain chat models

Docs: https://python.langchain.com/docs/how_to/chat_token_usage_tracking/#using-callbacks
