kioku-space
As a personal memory space.
This series dives into how to use LangChain, based on the LangChain Quickstart guide.
In this post, we’ll explore how to deploy LangChain agents as a REST API using LangServe.
Recap of the Previous Post
In our last post, we created an agent that combines tools for answering LangChain-related queries with internet search capabilities.
Here’s a recap of the code we used:
import os
from langchain import hub
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain.
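A minimal sketch of what serving such an agent with LangServe can look like (agent_executor is assumed to be the AgentExecutor built in the recap; the /agent path and port are only illustrative):

from fastapi import FastAPI
from langserve import add_routes

# agent_executor is assumed to be the AgentExecutor from the recap above
app = FastAPI(title="LangChain Agent Server")
add_routes(app, agent_executor, path="/agent")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="localhost", port=8000)

With the server running, the agent can be called over HTTP via the routes that add_routes registers under /agent, such as the /agent/invoke endpoint and the built-in playground at /agent/playground.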
In this series, we explore the ‘Quickstart’ section of the LangChain documentation.
Previously, we developed chains that operated on predefined steps. In this article, we explore how the LLM chooses the right tools for processing based on user input, introducing the concept of an agent.
A Recap
In a previous post, we built a retriever from LangChain’s documentation.
Let’s revisit that setup:
import os
from langchain_openai import OpenAIEmbeddings
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.
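Where this article heads, in rough sketch form: the retriever becomes one tool among several, and the LLM decides which tool to call. (The retriever is assumed to be the one from the recap; the Tavily search tool, the tool name, and the hub prompt are illustrative choices, not necessarily the article's final code.)

from langchain import hub
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain.tools.retriever import create_retriever_tool
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_openai import ChatOpenAI

# Wrap the documentation retriever from the recap as a tool the agent can call
retriever_tool = create_retriever_tool(
    retriever,
    "langchain_docs_search",
    "Search for information in the LangChain documentation.",
)
search = TavilySearchResults()  # internet search tool; requires TAVILY_API_KEY
tools = [retriever_tool, search]

# The LLM picks the right tool based on the user's input
llm = ChatOpenAI()
prompt = hub.pull("hwchase17/openai-functions-agent")
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "What is LangChain Expression Language?"})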
In this post, we continue our journey through LangChain’s Quickstart guide, exploring how to enhance your chains by integrating conversation history.
Recap of Our Progress
Here’s what we’ve set up so far:
retriever: Retrieves a list of relevant documents based on the input text.
document_chain: Generates LLM responses using the user’s questions and the list of documents.
create_retrieval_chain: Combines retriever and document_chain to answer queries by referencing documents.

import os
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.
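As a sketch of the history-aware version this post works toward (the retriever is assumed to be the one from the recap; the prompt wording and the example chat_history are illustrative assumptions):

from langchain.chains import create_history_aware_retriever, create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.messages import AIMessage, HumanMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()

# Rewrite the follow-up question into a standalone search query using the history
history_prompt = ChatPromptTemplate.from_messages([
    MessagesPlaceholder(variable_name="chat_history"),
    ("user", "{input}"),
    ("user", "Given the conversation above, generate a search query for the question."),
])
history_aware_retriever = create_history_aware_retriever(llm, retriever, history_prompt)

# Answer with the retrieved documents while keeping the history in the prompt
answer_prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer the user's questions based on the context below:\n\n{context}"),
    MessagesPlaceholder(variable_name="chat_history"),
    ("user", "{input}"),
])
document_chain = create_stuff_documents_chain(llm, answer_prompt)
retrieval_chain = create_retrieval_chain(history_aware_retriever, document_chain)

# Example conversation history passed in on each call
chat_history = [
    HumanMessage(content="Can LangChain work with my own documents?"),
    AIMessage(content="Yes, it can."),
]
retrieval_chain.invoke({"chat_history": chat_history, "input": "Tell me how."})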
In this series, we’ll explore the ‘Quickstart’ section of the LangChain documentation.
In this article, we discuss how to expand LLM knowledge using information on the internet.
Below are the sections of code from the previous article that we’ll reuse here.
import os
from langchain_openai import ChatOpenAI

# Set the OPENAI_API_KEY environment variable from a local file
with open('.openai') as f:
    os.environ['OPENAI_API_KEY'] = f.read().strip()

# Load the large language model
llm = ChatOpenAI()
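Reusing the llm from the recap above, a rough sketch of the direction looks like this (the URL, the FAISS vector store, and the prompt text are illustrative assumptions; FAISS additionally requires the faiss-cpu package):

from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.document_loaders import WebBaseLoader
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import OpenAIEmbeddings

# Load a page from the internet and split it into chunks
docs = WebBaseLoader("https://python.langchain.com/docs/get_started/introduction").load()
chunks = RecursiveCharacterTextSplitter().split_documents(docs)

# Embed the chunks and build a retriever over the resulting vector store
vector_store = FAISS.from_documents(chunks, OpenAIEmbeddings())
retriever = vector_store.as_retriever()

# Answer questions by stuffing the retrieved documents into the prompt
prompt = ChatPromptTemplate.from_template(
    "Answer the question based only on the context below.\n\n{context}\n\nQuestion: {input}"
)
document_chain = create_stuff_documents_chain(llm, prompt)
retrieval_chain = create_retrieval_chain(retriever, document_chain)
retrieval_chain.invoke({"input": "What is LangChain?"})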
In this series, we’ll explore the ‘Quickstart’ section of the LangChain documentation.
In this article, we focus on LLMs, prompt templates, and chains.
1. Installation
To get started, install langchain and its OpenAI extension, langchain-openai:
pip install langchain
pip install langchain-openai

We are using the following versions. Note that LangChain often introduces breaking changes, so be careful with different versions:
$ pip list | grep langchain
langchain           0.1.17
langchain-community 0.0.37
langchain-core      0.
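As a small preview of how those pieces fit together, a prompt template, the LLM, and an output parser can be chained with the pipe operator (the prompt text here is just an example):

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# A prompt template, the LLM, and an output parser combined into one chain
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant who explains technical topics concisely."),
    ("user", "{input}"),
])
llm = ChatOpenAI()
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"input": "What is LangChain?"}))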
Continuing from the previous “Hello World” program, this session dives into building a memo app with the following functionalities:
A text box for entering a memo.
A shortcut (Control+S) for saving the contents of the text box to a file.
Displaying the file contents in the text box upon app launch.

1. Setting Up the Text Box
Let’s get started by building on our previous program. First, we create the root window and place a frame inside it.
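Putting the three functionalities together, a minimal sketch of the finished app might look like this (the memo.txt file name and the widget layout are assumptions; the post builds it up step by step):

import os
import tkinter as tk

MEMO_FILE = "memo.txt"  # assumed file name for the saved memo

root = tk.Tk()
root.title("Memo")

# A frame inside the root window holds the text box
frame = tk.Frame(root)
frame.pack(fill=tk.BOTH, expand=True)
text = tk.Text(frame)
text.pack(fill=tk.BOTH, expand=True)

# Display the saved contents when the app launches
if os.path.exists(MEMO_FILE):
    with open(MEMO_FILE, encoding="utf-8") as f:
        text.insert("1.0", f.read())

# Save the text box contents to the file with Control+S
def save(event=None):
    with open(MEMO_FILE, "w", encoding="utf-8") as f:
        f.write(text.get("1.0", tk.END))

root.bind("<Control-s>", save)
root.mainloop()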