🤖 LangGraph Agent Protocol + LangGraph Studio for local execution

DATE:
AUTHOR: The LangChain Team

We’ve taken a big step toward our vision of a multi-agent future by making it easier to connect and integrate agents, regardless of how they’re built. We've shipped the following:

Agent Protocol: A Standard for Agent Communication

We've open-sourced a framework-agnostic interface for agents to communicate. This enables seamless interaction between LangGraph agents and agents built on other frameworks. The protocol covers APIs for runs, threads, and long-term memory, which are key components of reliable agent deployment.

Learn more in our blog: https://blog.langchain.dev/agent-protocol-interoperability-for-llm-agents/

Read the docs: https://github.com/langchain-ai/agent-protocol
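
As a rough illustration of what the protocol standardizes, a client could talk to any compliant server over plain HTTP by creating a thread and then starting a run on it. The base URL, endpoint paths, and payload fields in this sketch are illustrative assumptions; the agent-protocol docs define the authoritative schema.

```python
# Sketch of a client calling an Agent Protocol-style server over HTTP.
# The base URL, endpoint paths, and payload fields are assumptions for
# illustration; see the agent-protocol spec for the exact schema.
import requests

BASE_URL = "http://localhost:2024"  # hypothetical server implementing the protocol

# Create a thread to hold conversation state across runs.
thread = requests.post(f"{BASE_URL}/threads", json={}).json()

# Start a run on that thread, passing input for the agent to act on.
run = requests.post(
    f"{BASE_URL}/threads/{thread['thread_id']}/runs",
    json={
        "assistant_id": "my-agent",  # illustrative agent identifier
        "input": {"messages": [{"role": "user", "content": "Hello!"}]},
    },
).json()

print(run)
```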

LangGraph Studio Now Runs Locally

LangGraph Studio can now be installed as a Python package, running entirely in your local environment—no Docker required.

Debug and iterate faster with local execution, cross-platform support, and tighter feedback loops. Plus, Studio now connects to any server implementing the Agent Protocol.

See how to install and use it: https://langchain-ai.github.io/langgraph/how-tos/local-studio/
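
For orientation, here is a minimal sketch of a graph module that a locally running dev server could serve and Studio could visualize. The file name, node names, and the langgraph.json entry mentioned in the comments are illustrative; the linked how-to is the authoritative guide to installation and launch.

```python
# my_agent.py -- a minimal graph for local development.
# Assumes the `langgraph` package is installed and that a langgraph.json in the
# project root points at this module (e.g. "./my_agent.py:graph"); module and
# node names here are illustrative.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    question: str
    answer: str

def respond(state: State) -> dict:
    # Placeholder node; a real agent would call a model and tools here.
    return {"answer": f"You asked: {state['question']}"}

builder = StateGraph(State)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)

# Exported graph object that the local server loads and Studio visualizes.
graph = builder.compile()
```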

Watch a video tutorial:

Integrations with AutoGen, CrewAI, and More

We now have a new guide that shows how to integrate agents from other frameworks into LangGraph as sub-agents. This lets you build powerful multi-agent systems by embedding agents from other frameworks directly into your LangGraph workflows. You then get the benefit of LangGraph’s scalable infrastructure—task queues, persistence layers, and memory support—even for non-LangGraph agents.
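
As a rough sketch of the pattern, an agent built with another framework can be wrapped in an ordinary LangGraph node, so persistence and memory apply to it like any other node. The `run_external_agent` helper below is a hypothetical stand-in for an actual AutoGen or CrewAI invocation; see the guide for concrete examples.

```python
# Embedding a non-LangGraph agent as a node inside a LangGraph workflow.
# `run_external_agent` is a hypothetical stand-in for an AutoGen/CrewAI call.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    task: str
    result: str

def run_external_agent(task: str) -> str:
    # Hypothetical: replace with the other framework's invocation, e.g.
    # an AutoGen conversation or a CrewAI crew kickoff.
    return f"external agent handled: {task}"

def sub_agent_node(state: State) -> dict:
    # Because the sub-agent runs inside a LangGraph node, it participates in
    # LangGraph's task queue, persistence, and memory like any native node.
    return {"result": run_external_agent(state["task"])}

builder = StateGraph(State)
builder.add_node("sub_agent", sub_agent_node)
builder.add_edge(START, "sub_agent")
builder.add_edge("sub_agent", END)
graph = builder.compile()
```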
