Exploring LangChain's Quickstart (4) - Dynamically Select the Tools (Agent)

In this series, we explore the ‘Quickstart’ section of the LangChain documentation.

Previously, we developed chains that operated on predefined steps. In this article, we explore how the LLM chooses the right tool based on the user's input, introducing the concept of an agent.

In a previous post, we built a retriever from LangChain’s documentation.
Let’s revisit that setup:

import os
from langchain_openai import OpenAIEmbeddings
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.vectorstores import FAISS
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Set the API key as an environment variable
with open('.openai') as f:
    os.environ['OPENAI_API_KEY'] = f.read().strip()

# Load web page content
loader = WebBaseLoader("https://python.langchain.com/docs/get_started/introduction")
docs = loader.load()

# Load embeddings
embeddings = OpenAIEmbeddings()

# Split documents
text_splitter = RecursiveCharacterTextSplitter()
documents = text_splitter.split_documents(docs)

# Vectorize documents and create a vector store
vector = FAISS.from_documents(documents, embeddings)

# Create a retriever
retriever = vector.as_retriever()

In this article, we create an agent that uses the retriever when the user's question concerns LangChain and performs an internet search for other topics. The LLM decides which tool to use based on the query.
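Conceptually, the agent's tool selection works like a router: the LLM reads each tool's name and description and picks the one that best fits the query. The following pure-Python sketch illustrates the idea, with naive keyword matching standing in for the LLM's decision; it is not how the agent is implemented internally.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    func: Callable[[str], str]

# Two stub tools standing in for the retriever and the web search
tools = [
    Tool("langchain_search", "Questions about LangChain",
         lambda q: f"docs hit for: {q}"),
    Tool("web_search", "Everything else",
         lambda q: f"web hit for: {q}"),
]

def route(query: str) -> Tool:
    """Naive stand-in for the LLM's choice: keyword match on the query."""
    for tool in tools:
        if tool.name.split("_")[0] in query.lower():
            return tool
    return tools[-1]  # default to web search

print(route("What is LangChain?").name)   # langchain_search
print(route("Weather in Tokyo?").name)    # web_search
```

The real agent replaces `route` with an LLM call that sees the tool descriptions, which is why a clear, specific description matters.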

We use create_retriever_tool to create a document searching tool. We provide it with the retriever, a tool name, and a detailed description to aid the LLM in its decision-making process.

from langchain.tools.retriever import create_retriever_tool

retriever_tool = create_retriever_tool(
    retriever,
    "langchain_search",
    "A search tool for LangChain-related queries. Use this for questions about LangChain!"
)

Next, we create a tool for internet searches using Tavily.

  1. Create an account at Tavily. The free plan allows 1,000 API calls per month.
  2. Get an API key; it starts with tvly-.
  3. Save the API key in a .tavily file in your working directory.

Set the saved API key as the environment variable TAVILY_API_KEY.

with open('.tavily') as f:
    os.environ['TAVILY_API_KEY'] = f.read().strip()
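Both key files follow the same pattern: read a file and export its contents as an environment variable. A small helper (the function name load_key is my own) keeps this logic in one place:

```python
import os

def load_key(path: str, env_var: str) -> None:
    """Read an API key from a file and export it as an environment variable."""
    with open(path) as f:
        os.environ[env_var] = f.read().strip()

# Usage:
# load_key('.openai', 'OPENAI_API_KEY')
# load_key('.tavily', 'TAVILY_API_KEY')
```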

We use TavilySearchResults to perform internet searches.

from langchain_community.tools.tavily_search import TavilySearchResults

search = TavilySearchResults()
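As the logs further down show, TavilySearchResults returns a list of dicts, each with a url and a content snippet. If you want to post-process the raw results, e.g. to collect the source URLs, a small helper suffices; the sample data here is fabricated for illustration:

```python
# Shape of a TavilySearchResults response: a list of {'url', 'content'} dicts.
# The sample data below is made up for illustration.
sample_results = [
    {"url": "https://example.com/a", "content": "Partly cloudy, 21.2°C ..."},
    {"url": "https://example.com/b", "content": "Monthly forecast ..."},
]

def source_urls(results: list[dict]) -> list[str]:
    """Extract the source URLs from a Tavily-style result list."""
    return [r["url"] for r in results]

print(source_urls(sample_results))
# ['https://example.com/a', 'https://example.com/b']
```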

Finally, we create an agent that utilizes the tools we’ve set up.

To use templates from LangChain Hub, install langchainhub.

pip install langchainhub

Here’s how to create the agent.

from langchain_openai import ChatOpenAI
from langchain import hub
from langchain.agents import create_openai_functions_agent
from langchain.agents import AgentExecutor

# List of tools to use
tools = [retriever_tool, search]

# Retrieve template from LangChain Hub
template = hub.pull("hwchase17/openai-functions-agent")

# Create the agent
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
agent = create_openai_functions_agent(llm, tools, template)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

Use the agent_executor's invoke method to ask the agent questions.

agent_executor.invoke({"input": "What is LangChain?"})
  • Log

    > Entering new AgentExecutor chain...
    
    Invoking: `langchain_search` with `{'query': 'LangChain'}`
    
    
    Introduction | 🦜️🔗 LangChain
    
    Skip to main contentLangChain v0.2 is coming soon! Preview the new docs here.ComponentsIntegrationsGuidesAPI ReferenceMorePeopleVersioningContributingTemplatesCookbooksTutorialsYouTube🦜️🔗LangSmithLangSmith DocsLangServe GitHubTemplates GitHubTemplates HubLangChain HubJS/TS Docs💬SearchGet startedIntroductionQuickstartInstallationUse casesQ&A with RAGExtracting structured outputChatbotsTool use and agentsQuery analysisQ&A over SQL + CSVMoreExpression LanguageGet startedRunnable interfacePrimitivesAdvantages of LCELStreamingAdd message history (memory)MoreEcosystem🦜🛠️ LangSmith🦜🕸️LangGraph🦜️🏓 LangServeSecurityGet startedOn this pageIntroductionLangChain is a framework for developing applications powered by large language models (LLMs).LangChain simplifies every stage of the LLM application lifecycle:Development: Build your applications using LangChain's open-source building blocks and components. Hit the ground running using third-party integrations and Templates.Productionization: Use LangSmith to inspect, monitor and evaluate your chains, so that you can continuously optimize and deploy with confidence.Deployment: Turn any chain into an API with LangServe.Concretely, the framework consists of the following open-source libraries:langchain-core: Base abstractions and LangChain Expression Language.langchain-community: Third party integrations.Partner packages (e.g. 
langchain-openai, langchain-anthropic, etc.): Some integrations have been further split into their own lightweight packages that only depend on langchain-core.langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.langgraph: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.langserve: Deploy LangChain chains as REST APIs.The broader ecosystem includes:LangSmith: A developer platform that lets you debug, test, evaluate, and monitor LLM applications and seamlessly integrates with LangChain.Get started​We recommend following our Quickstart guide to familiarize yourself with the framework by building your first LangChain application.See here for instructions on how to install LangChain, set up your environment, and start building.noteThese docs focus on the Python LangChain library. Head here for docs on the JavaScript LangChain library.Use cases​If you're looking to build something specific or are more of a hands-on learner, check out our use-cases.
    
    They're walkthroughs and techniques for common end-to-end tasks, such as:Question answering with RAGExtracting structured outputChatbotsand more!Expression Language​LangChain Expression Language (LCEL) is the foundation of many of LangChain's components, and is a declarative way to compose chains. LCEL was designed from day 1 to support putting prototypes in production, with no code changes, from the simplest “prompt + LLM” chain to the most complex chains.Get started: LCEL and its benefitsRunnable interface: The standard interface for LCEL objectsPrimitives: More on the primitives LCEL includesand more!Ecosystem​🦜🛠️ LangSmith​Trace and evaluate your language model applications and intelligent agents to help you move from prototype to production.🦜🕸️ LangGraph​Build stateful, multi-actor applications with LLMs, built on top of (and intended to be used with) LangChain primitives.🦜🏓 LangServe​Deploy LangChain runnables and chains as REST APIs.Security​Read up on our Security best practices to make sure you're developing safely with LangChain.Additional resources​Components​LangChain provides standard, extendable interfaces and integrations for many different components, including:Integrations​LangChain is part of a rich ecosystem of tools that integrate with our framework and build on top of it. 
Check out our growing list of integrations.Guides​Best practices for developing with LangChain.API reference​Head to the reference section for full documentation of all classes and methods in the LangChain and LangChain Experimental Python packages.Contributing​Check out the developer's guide for guidelines on contributing and help getting your dev environment set up.Help us out by providing feedback on this documentation page:NextIntroductionGet startedUse casesExpression LanguageEcosystem🦜🛠️ LangSmith🦜🕸️ LangGraph🦜🏓 LangServeSecurityAdditional resourcesComponentsIntegrationsGuidesAPI referenceContributingCommunityDiscordTwitterGitHubPythonJS/TSMoreHomepageBlogYouTubeCopyright © 2024 LangChain, Inc.LangChain is a framework for developing applications powered by large language models (LLMs). It simplifies every stage of the LLM application lifecycle, including development, productionization, and deployment. The framework consists of open-source libraries such as langchain-core, langchain-community, langchain, langgraph, and langserve. LangChain also includes partner packages for integrations with platforms like OpenAI and Anthropic. Additionally, the ecosystem includes LangSmith for debugging and monitoring LLM applications, LangGraph for building stateful applications with LLMs, and LangServe for deploying LangChain chains as REST APIs.
    
    > Finished chain.
    
  • Result

    {'input': 'What is LangChain?',
     'output': 'LangChain is a framework for developing applications powered by large language models (LLMs). It simplifies every stage of the LLM application lifecycle, including development, productionization, and deployment. The framework consists of open-source libraries such as langchain-core, langchain-community, langchain, langgraph, and langserve. LangChain also includes partner packages for integrations with platforms like OpenAI and Anthropic. Additionally, the ecosystem includes LangSmith for debugging and monitoring LLM applications, LangGraph for building stateful applications with LLMs, and LangServe for deploying LangChain chains as REST APIs.'}
    
agent_executor.invoke({"input": "How is the weather today in Tokyo?"})
  • Log

    > Entering new AgentExecutor chain...
    
    Invoking: `tavily_search_results_json` with `{'query': 'weather in Tokyo today'}`
    
    
    [{'url': 'https://www.weatherapi.com/', 'content': "{'location': {'name': 'Tokyo', 'region': 'Tokyo', 'country': 'Japan', 'lat': 35.69, 'lon': 139.69, 'tz_id': 'Asia/Tokyo', 'localtime_epoch': 1715507659, 'localtime': '2024-05-12 18:54'}, 'current': {'last_updated_epoch': 1715507100, 'last_updated': '2024-05-12 18:45', 'temp_c': 21.2, 'temp_f': 70.1, 'is_day': 0, 'condition': {'text': 'Partly Cloudy', 'icon': '//cdn.weatherapi.com/weather/64x64/night/116.png', 'code': 1003}, 'wind_mph': 23.3, 'wind_kph': 37.4, 'wind_degree': 189, 'wind_dir': 'S', 'pressure_mb': 1014.0, 'pressure_in': 29.94, 'precip_mm': 0.0, 'precip_in': 0.0, 'humidity': 65, 'cloud': 30, 'feelslike_c': 21.2, 'feelslike_f': 70.1, 'vis_km': 10.0, 'vis_miles': 6.0, 'uv': 6.0, 'gust_mph': 30.3, 'gust_kph': 48.8}}"}, {'url': 'https://www.accuweather.com/en/jp/tokyo/226396/december-weather/226396', 'content': 'Get the monthly weather forecast for Tokyo, Tokyo, Japan, including daily high/low, historical averages, to help you plan ahead.'}, {'url': 'https://world-weather.info/forecast/japan/tokyo/december-2024/', 'content': 'Extended weather forecast in Tokyo. Hourly Week 10 days 14 days 30 days Year. Detailed ⚡ Tokyo Weather Forecast for December 2024 - day/night 🌡️ temperatures, precipitations - World-Weather.info.'}, {'url': 'https://www.weather25.com/asia/japan/tokyo?page=month&month=December', 'content': "Tokyo weather in December 2024. The temperatures in Tokyo in December are quite cold with temperatures between 41°F and 53°F, warm clothes are a must. You can expect about 3 to 8 days of rain in Tokyo during the month of December. It's a good idea to bring along your umbrella so that you don't get caught in poor weather."}, {'url': 'https://www.weathertab.com/en/d/e/12/japan/tokyo-to/tokyo/', 'content': 'Our Tokyo, Tōkyō-to Daily Weather Forecast for December 2024, developed from a specialized dynamic long-range model, provides precise daily temperature and rainfall predictions. 
This model, distinct from standard statistical or climatological approaches, is the result of over 50 years of dedicated private research, offering a clearer and more ...'}]The weather in Tokyo today is partly cloudy with a temperature of 21.2°C (70.1°F). The wind speed is 37.4 km/h coming from the south. The humidity is at 65% with a visibility of 10.0 km.
    
    > Finished chain.
    
  • Result

    {'input': 'How is the weather today in Tokyo?',
     'output': 'The weather in Tokyo today is partly cloudy with a temperature of 21.2°C (70.1°F). The wind speed is 37.4 km/h coming from the south. The humidity is at 65% with a visibility of 10.0 km.'}
    

Finally, here's how to incorporate chat history so the agent can take earlier turns into account.

from langchain_core.messages import HumanMessage, AIMessage

chat_history = [
    HumanMessage(content="What is LCEL?"),
    AIMessage(content="LangChain Expression Language.")
]
agent_executor.invoke({
    "chat_history": chat_history,
    "input": "Please tell me more about it."
})
  • Log

    > Entering new AgentExecutor chain...
    
    Invoking: `langchain_search` with `{'query': 'LCEL'}`
    
    
    They're walkthroughs and techniques for common end-to-end tasks, such as:Question answering with RAGExtracting structured outputChatbotsand more!Expression Language​LangChain Expression Language (LCEL) is the foundation of many of LangChain's components, and is a declarative way to compose chains. LCEL was designed from day 1 to support putting prototypes in production, with no code changes, from the simplest “prompt + LLM” chain to the most complex chains.Get started: LCEL and its benefitsRunnable interface: The standard interface for LCEL objectsPrimitives: More on the primitives LCEL includesand more!Ecosystem​🦜🛠️ LangSmith​Trace and evaluate your language model applications and intelligent agents to help you move from prototype to production.🦜🕸️ LangGraph​Build stateful, multi-actor applications with LLMs, built on top of (and intended to be used with) LangChain primitives.🦜🏓 LangServe​Deploy LangChain runnables and chains as REST APIs.Security​Read up on our Security best practices to make sure you're developing safely with LangChain.Additional resources​Components​LangChain provides standard, extendable interfaces and integrations for many different components, including:Integrations​LangChain is part of a rich ecosystem of tools that integrate with our framework and build on top of it. 
Check out our growing list of integrations.Guides​Best practices for developing with LangChain.API reference​Head to the reference section for full documentation of all classes and methods in the LangChain and LangChain Experimental Python packages.Contributing​Check out the developer's guide for guidelines on contributing and help getting your dev environment set up.Help us out by providing feedback on this documentation page:NextIntroductionGet startedUse casesExpression LanguageEcosystem🦜🛠️ LangSmith🦜🕸️ LangGraph🦜🏓 LangServeSecurityAdditional resourcesComponentsIntegrationsGuidesAPI referenceContributingCommunityDiscordTwitterGitHubPythonJS/TSMoreHomepageBlogYouTubeCopyright © 2024 LangChain, Inc.
    
    Skip to main contentLangChain v0.2 is coming soon! Preview the new docs here.ComponentsIntegrationsGuidesAPI ReferenceMorePeopleVersioningContributingTemplatesCookbooksTutorialsYouTube🦜️🔗LangSmithLangSmith DocsLangServe GitHubTemplates GitHubTemplates HubLangChain HubJS/TS Docs💬SearchGet startedIntroductionQuickstartInstallationUse casesQ&A with RAGExtracting structured outputChatbotsTool use and agentsQuery analysisQ&A over SQL + CSVMoreExpression LanguageGet startedRunnable interfacePrimitivesAdvantages of LCELStreamingAdd message history (memory)MoreEcosystem🦜🛠️ LangSmith🦜🕸️LangGraph🦜️🏓 LangServeSecurityGet startedOn this pageIntroductionLangChain is a framework for developing applications powered by large language models (LLMs).LangChain simplifies every stage of the LLM application lifecycle:Development: Build your applications using LangChain's open-source building blocks and components. Hit the ground running using third-party integrations and Templates.Productionization: Use LangSmith to inspect, monitor and evaluate your chains, so that you can continuously optimize and deploy with confidence.Deployment: Turn any chain into an API with LangServe.Concretely, the framework consists of the following open-source libraries:langchain-core: Base abstractions and LangChain Expression Language.langchain-community: Third party integrations.Partner packages (e.g. 
langchain-openai, langchain-anthropic, etc.): Some integrations have been further split into their own lightweight packages that only depend on langchain-core.langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.langgraph: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.langserve: Deploy LangChain chains as REST APIs.The broader ecosystem includes:LangSmith: A developer platform that lets you debug, test, evaluate, and monitor LLM applications and seamlessly integrates with LangChain.Get started​We recommend following our Quickstart guide to familiarize yourself with the framework by building your first LangChain application.See here for instructions on how to install LangChain, set up your environment, and start building.noteThese docs focus on the Python LangChain library. Head here for docs on the JavaScript LangChain library.Use cases​If you're looking to build something specific or are more of a hands-on learner, check out our use-cases.
    
    Introduction | 🦜️🔗 LangChainLangChain Expression Language (LCEL) is the foundation of many of LangChain's components and is a declarative way to compose chains. It was designed to support putting prototypes into production without code changes, from simple chains to complex ones. LCEL allows for the creation of end-to-end tasks such as question answering with RAG, extracting structured output, chatbots, and more. It provides a standard interface for LCEL objects, includes primitives, and is part of the LangChain ecosystem which includes LangSmith, LangGraph, LangServe, and more.
    
    > Finished chain.
    
  • Result

    {'chat_history': [HumanMessage(content='What is LCEL?'),
      AIMessage(content='LangChain Expression Language.')],
     'input': 'Please tell me more about it.',
     'output': "LangChain Expression Language (LCEL) is the foundation of many of LangChain's components and is a declarative way to compose chains. It was designed to support putting prototypes into production without code changes, from simple chains to complex ones. LCEL allows for the creation of end-to-end tasks such as question answering with RAG, extracting structured output, chatbots, and more. It provides a standard interface for LCEL objects, includes primitives, and is part of the LangChain ecosystem which includes LangSmith, LangGraph, LangServe, and more."}
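In a real conversation you would keep appending to chat_history after each turn so the agent always sees the full dialogue. A minimal loop might look like this; a stub (fake_invoke, my own name) replaces agent_executor so the sketch runs without an API key:

```python
# Stub standing in for agent_executor.invoke; a real call hits the OpenAI API.
def fake_invoke(payload: dict) -> dict:
    return {**payload, "output": f"answer to: {payload['input']}"}

chat_history: list[tuple[str, str]] = []  # (role, content) pairs

for question in ["What is LCEL?", "Please tell me more about it."]:
    result = fake_invoke({"chat_history": chat_history, "input": question})
    # Record both sides of the turn, mirroring HumanMessage/AIMessage
    chat_history.append(("human", question))
    chat_history.append(("ai", result["output"]))

print(len(chat_history))  # 4
```

With the real agent, you would append HumanMessage and AIMessage objects instead of plain tuples.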
    

Related Content