Creating Multi-Agent Applications with LangGraph and Web Scrapers
Chapter 1: Introduction to LangGraph and Multi-Agent Systems
In this guide, I will outline the process of constructing a LangGraph, executing function calls, and designing a web scraper that can extract information from any website. Additionally, you will learn how to query your data and receive insightful responses.
This guide is particularly beneficial for newcomers to LangGraph who wish to quickly understand how LangGraph, function calls, and web scraping tools operate together. You will also learn how to create a chatbot and integrate the provided code into your own projects.
Before we dive in, if you find this topic engaging and want to support my work, please clap for this article 50 times; it would greatly help me. Also, feel free to follow me on Medium and subscribe for free to receive my latest updates. What topics would you like to see me cover next? Let's get started!
Disclaimer: This article is intended solely for educational purposes. We do not advocate for scraping websites that prohibit such activities in their terms and conditions.
To begin, you'll need to install several Python libraries: langchain_community, langchain-openai, langchain_core, langgraph, and streamlit. If you haven't done so yet, list them in a requirements.txt and run:
pip install -r requirements.txt
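For reference, a requirements.txt covering the libraries above might look like this (versions left unpinned here; pin them as your project requires):

langchain_community
langchain-openai
langchain_core
langgraph
streamlit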
Once that's complete, let's move to the Python file we will be working with.
The libraries include:
- langchain_community: This contains various third-party integrations.
- langchain_openai: This provides the OpenAI chat model integration (ChatOpenAI) used to answer the questions we pose.
- langchain_core: This holds the fundamental abstractions that have become standard, along with the LangChain Expression Language, which facilitates the composition of these components.
- LangGraph: A library for building stateful, multi-agent applications with LLMs, built on top of LangChain.
- Streamlit: A tool that allows you to convert Python scripts into web applications swiftly, without needing JavaScript or HTML knowledge.
import os
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI
from langchain.tools import tool
from langchain_core.messages import HumanMessage, SystemMessage
import streamlit as st
from langchain_core.runnables import Runnable
import operator
from typing import Annotated, Sequence, TypedDict, List
from langchain_core.messages import BaseMessage
from langchain_community.document_loaders import WebBaseLoader
from langgraph.graph import StateGraph, END
Next, we will set the title of the Streamlit web application to "LangGraph + Function Call + Amazon Scraper 💾". We also create a dropdown in the sidebar for users to select a model from the options: "gpt-4-turbo-preview", "gpt-3.5-turbo", and "gpt-3.5-turbo-instruct".
st.title("LangGraph + Function Call + Amazon Scraper 💾")
OPENAI_MODEL = st.sidebar.selectbox(
    "Select Model",
    ["gpt-4-turbo-preview", "gpt-3.5-turbo", "gpt-3.5-turbo-instruct"]
)
A text input field is created in the sidebar for users to enter their OpenAI API Key, with the input type set to "password" to keep it hidden. If a key is provided, it will be set as an environment variable.
api_key = st.sidebar.text_input("Enter your OpenAI API Key", type="password")
if api_key:
    os.environ["OPENAI_API_KEY"] = api_key
A text field for user input is also created, along with a button labeled "Run Workflow". If clicked, it triggers the defined workflow.
user_input = st.text_input("Enter your input here:")
if st.button("Run Workflow"):
    with st.spinner("Running Workflow..."):
        ...  # the compiled graph is invoked here; see the sketch at the end of the article
We now define a function called create_agent, which takes a language model, a list of tools, and a system prompt. This function generates a prompt template and uses the create_openai_tools_agent API to construct the agent.
def create_agent(llm: ChatOpenAI, tools: list, system_prompt: str):
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", system_prompt),
            MessagesPlaceholder(variable_name="messages"),
            MessagesPlaceholder(variable_name="agent_scratchpad"),
        ]
    )
    agent = create_openai_tools_agent(llm, tools, prompt)
    return AgentExecutor(agent=agent, tools=tools)
Next, we define a function for the supervisor, setting a system prompt that guides the supervisor's actions and includes a placeholder for the agent list; a sketch completing the function follows the snippet.
def create_supervisor(llm: ChatOpenAI, agents: list[str]):
    system_prompt = (
        "You are the supervisor over the following agents: {agents}."
        " You are responsible for assigning tasks to each agent as requested by the user."
        " Each agent executes tasks according to their roles and responds with their results and status."
        " Please review the information and answer with the name of the agent to which the task should be assigned next."
        " Answer 'FINISH' if you are satisfied that you have fulfilled the user's request."
    )
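The snippet above stops at the system prompt. Here is a minimal sketch of how the rest of create_supervisor might look, following the common LangGraph supervisor pattern and using the JsonOutputFunctionsParser imported earlier; the function name "route" is an assumption for illustration, not necessarily the author's exact code:

    # Force the model to call a function whose single argument is the
    # name of the next agent (or FINISH), so routing is deterministic.
    options = ["FINISH"] + agents
    function_def = {
        "name": "route",
        "description": "Select the next agent to act.",
        "parameters": {
            "type": "object",
            "properties": {"next": {"enum": options}},
            "required": ["next"],
        },
    }
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", system_prompt),
            MessagesPlaceholder(variable_name="messages"),
            ("system", "Given the conversation above, who should act next? Select one of: {options}"),
        ]
    ).partial(options=str(options), agents=", ".join(agents))
    # JsonOutputFunctionsParser turns the forced function call into {"next": "<name>"}.
    return prompt | llm.bind_functions(functions=[function_def], function_call="route") | JsonOutputFunctionsParser()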
Video 1: LangGraph + Function Call + Web Scraper = Multi-Agent Application - YouTube
This video demonstrates how to combine LangGraph with function calls and web scraping to create a multi-agent application that can efficiently gather and analyze data.
Following this, we define a researcher tool that accepts a list of URLs and uses WebBaseLoader to scrape the pages for their content.
@tool("Scrape the web")
def researcher(urls: List[str]) -> str:
"""Use requests and bs4 to scrape the provided web pages for detailed information."""
loader = WebBaseLoader(urls)
docs = loader.load()
return "nn".join(
[
f'n{doc.page_content}n'
for doc in docs
]
)
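Before wiring the tool into an agent, you can sanity-check it directly; the URL below is purely illustrative:

# Quick manual test (hypothetical URL); tools accept a dict keyed by argument name.
print(researcher.invoke({"urls": ["https://example.com"]}))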
Next, we will define the analyze function, which takes a string as input and simulates a conversation with a language model to analyze market trends and suggest winning products for Amazon.
@tool("Market Analyser")
def analyze(content: str) -> str:
"""Market Analyser"""
chat = ChatOpenAI()
messages = [
SystemMessage(
content="You are a market analyst specializing in e-commerce trends, tasked with identifying a winning product to sell on Amazon."),
HumanMessage(content=content),
]
response = chat(messages)
return response.content
We also create an expert function that interacts with an OpenAI chatbot to provide insights regarding dropshipping.
@tool("DropShipping_expert")
def expert(content: str) -> str:
"""Execute a trade"""
chat = ChatOpenAI()
messages = [
SystemMessage(
content="Act as an experienced DropShipping assistant. Your task is to identify a winning product."),
HumanMessage(content=content),
]
response = chat(messages)
return response.content
Now, we instantiate the chat model the agents will share and define three AI agents that will be responsible for identifying and selling profitable products on Amazon:
- Scraper Agent: Gathers data from Amazon by scraping product information.
- Analyzer Agent: Analyzes the scraped data to identify potential winning products.
- Expert Agent: Provides expert analysis to decide if a product is worth selling.
# Shared chat model for all agents, using the model selected in the sidebar.
llm = ChatOpenAI(model=OPENAI_MODEL)

def scraper_agent() -> Runnable:
    prompt = "You are an Amazon scraper."
    return create_agent(llm, [researcher], prompt)

def analyzer_agent() -> Runnable:
    prompt = "You are analyzing data scraped from Amazon. Help find a winning product."
    return create_agent(llm, [analyze], prompt)

def expert_agent() -> Runnable:
    prompt = "You are a Buyer. Help me decide whether to start selling this product or not."
    return create_agent(llm, [expert], prompt)
We define agent states and a structured representation using a typed dictionary for managing the workflow:
class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], operator.add]
    next: str
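The operator.add annotation tells LangGraph how to merge each node's partial update into the shared state: message lists are concatenated, while next is simply overwritten. A tiny illustration of the merge semantics (the values are hypothetical):

# Current state and an incoming node update:
state = {"messages": [HumanMessage(content="scraped data...")], "next": "RESEARCHER"}
update = {"messages": [HumanMessage(content="analysis...")], "next": "ANALYZER"}

# LangGraph applies the reducer per field:
merged = {
    "messages": state["messages"] + update["messages"],  # operator.add: lists concatenate
    "next": update["next"],                              # no reducer: value is replaced
}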
Video 2: LangGraph + Function Call + YahooFinance = Multi-Agent Application - YouTube
In this video, we explore how to leverage LangGraph and function calls with Yahoo Finance data to create a powerful multi-agent application.
Finally, we establish the workflow that manages communication between the different agents and executes tasks based on user input. The code below references node-name constants (RESEARCHER, ANALYZER, EXPERT, SUPERVISOR) and node functions (scraper_node, analyzer_node, expert_node, supervisor_node); a sketch of how these might be defined appears first.
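Here is a minimal sketch of those pieces, assuming the standard LangGraph supervisor recipe: the constants are plain strings, each agent node wraps its AgentExecutor so the result is appended to the shared message list, and the supervisor node is the router built by create_supervisor.

import functools

RESEARCHER = "RESEARCHER"
ANALYZER = "ANALYZER"
EXPERT = "EXPERT"
SUPERVISOR = "SUPERVISOR"

def agent_node(state: AgentState, agent: Runnable, name: str) -> dict:
    # Run the agent on the current state and report its output back
    # as a message attributed to that agent.
    result = agent.invoke(state)
    return {"messages": [HumanMessage(content=result["output"], name=name)]}

scraper_node = functools.partial(agent_node, agent=scraper_agent(), name=RESEARCHER)
analyzer_node = functools.partial(agent_node, agent=analyzer_agent(), name=ANALYZER)
expert_node = functools.partial(agent_node, agent=expert_agent(), name=EXPERT)
supervisor_node = create_supervisor(llm, [RESEARCHER, ANALYZER, EXPERT])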
workflow = StateGraph(AgentState)
workflow.add_node(RESEARCHER, scraper_node)
workflow.add_node(ANALYZER, analyzer_node)
workflow.add_node(EXPERT, expert_node)
workflow.add_node(SUPERVISOR, supervisor_node)
workflow.add_edge(RESEARCHER, SUPERVISOR)
workflow.add_edge(ANALYZER, SUPERVISOR)
workflow.add_edge(EXPERT, SUPERVISOR)
workflow.add_conditional_edges(
    SUPERVISOR,
    lambda x: x["next"],
    {
        RESEARCHER: RESEARCHER,
        ANALYZER: ANALYZER,
        EXPERT: EXPERT,
        "FINISH": END,
    },
)
workflow.set_entry_point(SUPERVISOR)
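The snippet ends here, before the graph is compiled and run. A minimal sketch of the final step, which would sit inside the st.spinner block from earlier and stream each agent's contribution to the page:

graph = workflow.compile()

# Inside the "Run Workflow" button handler:
for step in graph.stream({"messages": [HumanMessage(content=user_input)]}):
    # Each step maps the node that just ran to its partial state update.
    for node_name, output in step.items():
        if node_name != "__end__":
            st.write(f"Output from {node_name}:")
            st.write(output)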
To conclude, we have developed a sophisticated chatbot capable of scraping any website and addressing user inquiries using LangGraph, function calls, and a web scraper. I hope this tutorial has been beneficial; stay tuned for more content in this series. I look forward to seeing what you will create based on the knowledge gained here.
Thank you for reading, and keep spreading the love! Cheers!
🧙‍♂️ I am an AI application expert! If you're seeking a Generative AI Engineer, feel free to reach out or book a 1-on-1 consulting session with me.
Don't hesitate to check out my other articles!