LangGraph is a powerful library for building stateful, multi-actor applications with LLMs. By modeling LLM agent workflows as graphs, LangGraph enables developers to create flexible and interactive agent and multi-agent workflows.
In this article, we'll walk through building a simple agent using LangGraph, with OpenAI and TavilySearch as the core components. Both require API keys: TavilySearch offers free credits, while OpenAI requires you to add funds.
Getting Started with LangGraph
To start, clone the repository:
git clone https://github.com/CyprianTinasheAarons/basic-bot-langgraph
Set up the project environment:
cd basic-bot-langgraph
python -m venv .venv
Activate the virtual environment:
- On Windows:
.venv\Scripts\activate
- On macOS and Linux:
source .venv/bin/activate
Install dependencies:
pip install -r requirements.txt
Set up environment variables:
- Copy .env.example to .env.
- Fill in your API keys in the .env file.
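The code later reads OPENAI_API_KEY explicitly, and TavilySearchResults picks up TAVILY_API_KEY from the environment, so a filled-in .env looks something like this (placeholder values; your .env.example may list additional variables):

```
OPENAI_API_KEY=your-openai-key
TAVILY_API_KEY=your-tavily-key
```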
Key Libraries and Configurations
Here are the key libraries required for the project:
import os
from typing import Literal
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, StateGraph, MessagesState
from langgraph.prebuilt import ToolNode
from dotenv import load_dotenv
from langchain_community.tools.tavily_search import TavilySearchResults
from loguru import logger
Configure Logging
We use Loguru to handle logging in this project:
# Configure loguru
logger.add("bot.log", rotation="10 MB")
Load Environment Variables
load_dotenv()
logger.info("Environment variables loaded")
Defining Tools and Nodes
Define the web search tool:
web_search = TavilySearchResults(max_results=2)
Define a function to perform a web search:
def search(query: str):
    """Call to surf the web."""
    logger.debug(f"Performing web search for query: {query}")
    return web_search.invoke({"query": query})
Define the tools list and create the tool node:
tools = [search]
tool_node = ToolNode(tools)
logger.info("Tool node created")
Initializing the LLM
Here, we initialize an LLM using OpenAI's GPT-4o:
llm = ChatOpenAI(model="gpt-4o", temperature=0, api_key=os.getenv("OPENAI_API_KEY"))
logger.info("LLM model initialized")
Bind the tools to the LLM:
llm_with_tools = llm.bind_tools(tools)
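Binding works by describing each tool to the model as a JSON schema built from the function's name, docstring, and parameters, so the model can emit structured tool_calls. A rough, dependency-free sketch of that conversion (to_tool_schema is a simplified stand-in; the real implementation also infers parameter types and handles Pydantic models):

```python
import inspect

def to_tool_schema(fn):
    """Build a minimal OpenAI-style tool schema from a Python function."""
    params = inspect.signature(fn).parameters
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                # Simplification: treat every parameter as a required string
                "properties": {name: {"type": "string"} for name in params},
                "required": list(params),
            },
        },
    }

def search(query: str):
    """Call to surf the web."""
    ...

schema = to_tool_schema(search)
print(schema["function"]["name"])
```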
Defining Workflow Logic
Define a function that determines the conversation flow:
def should_continue(state: MessagesState) -> Literal["tools", END]:
    messages = state['messages']
    last_message = messages[-1]
    if last_message.tool_calls:
        logger.debug("Routing to tools node")
        return "tools"
    logger.debug("Ending conversation")
    return END
Define the function that calls the LLM:
def call_model(state: MessagesState):
    messages = state['messages']
    logger.debug(f"Calling LLM with {len(messages)} messages")
    response = llm_with_tools.invoke(messages)
    return {"messages": [response]}
Building the Graph
Initialize the StateGraph:
graph = StateGraph(MessagesState)
logger.info("StateGraph initialized")
Add nodes to the graph:
graph.add_node("weatherbot", call_model)
graph.add_node("tools", tool_node)
logger.info("Nodes added to the graph")
Set the entry point:
graph.add_edge(START, "weatherbot")
Add conditional edges:
graph.add_conditional_edges("weatherbot", should_continue)
Add a normal edge from tools back to weatherbot:
graph.add_edge("tools", "weatherbot")
logger.info("Graph edges configured")
Compile the graph:
app = graph.compile()
logger.info("Graph compiled successfully")
Saving Graph Visualization
try:
    graph_png = app.get_graph().draw_mermaid_png()
    with open("graph.png", "wb") as f:
        f.write(graph_png)
    logger.info("Graph visualization saved as graph.png")
except Exception as e:
    logger.error(f"Failed to save graph visualization: {e}")
Running the Agent
Invoke the app with a sample query:
logger.info("Invoking the app with a sample query")
final_state = app.invoke(
{"messages": [HumanMessage(content="what is the weather in sf")]},
config={"configurable": {"thread_id": 42}}
)
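The compiled graph is executing the same loop you could write by hand: call the model, route via should_continue, run the tools when the model asks for them, then loop back until END. This dependency-free sketch replays that loop with stubs (fake_model and fake_tools are stand-ins for the real LLM and Tavily calls):

```python
END = "__end__"

def fake_model(messages):
    """Stub LLM: asks for a search once, then answers from the tool result."""
    if not any(m.get("role") == "tool" for m in messages):
        return {"role": "ai", "tool_calls": [{"name": "search", "args": {"query": "sf weather"}}]}
    return {"role": "ai", "content": "It is sunny in SF.", "tool_calls": []}

def fake_tools(message):
    """Stub tool node: one canned result per tool call."""
    return [{"role": "tool", "content": "sunny, 65F"} for _ in message["tool_calls"]]

def run_agent(user_text):
    messages = [{"role": "human", "content": user_text}]
    while True:
        response = fake_model(messages)        # the "weatherbot" node
        messages.append(response)
        if not response["tool_calls"]:         # should_continue -> END
            return messages
        messages.extend(fake_tools(response))  # the "tools" node, then loop back

final = run_agent("what is the weather in sf")
print(final[-1]["content"])
```

The conditional edge plus the tools-to-weatherbot edge is what turns this into a loop rather than a single model call.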
Usage
Run the bot using:
python bot.py
Conclusion
Congratulations on building your first agent using LangGraph! This is just the beginning of what you can achieve with LangGraph, LangChain, and LLMs. Stay tuned for more projects and tutorials exploring the exciting intersection of AI, LLMs, and agent workflows.
Feel free to follow me on Twitter for more updates and projects. Also, check out my website here.