LangChain Java - a framework for developing applications with LLMs


https://github.com/HamaWhiteGG/langchain-java

1. What is this?

This is the Java language implementation of LangChain.

Large language models (LLMs) are emerging as a transformative technology, enabling developers to build applications that they previously could not. But using these LLMs in isolation is often not enough to create a truly powerful app - the real power comes when you can combine them with other sources of computation or knowledge.

This library is aimed at assisting in the development of those types of applications.

Looking for the Python version? Check out LangChain.

2. Quickstart Guide

This tutorial gives you a quick walkthrough of building an end-to-end language model application with LangChain.

View the Quickstart Guide on the LangChain official website.

2.1 Maven Repository

Prerequisites for building:

  • Java 17 or later
  • Unix-like environment (we use Linux, Mac OS X)
  • Maven (we recommend version 3.8.6 and require at least 3.5.4)
<dependency>
    <groupId>io.github.hamawhitegg</groupId>
    <artifactId>langchain-core</artifactId>
    <version>0.1.6</version>
</dependency>

2.2 Environment Setup

Using LangChain will usually require integrations with one or more model providers, data stores, APIs, etc.
For this example, we will be using OpenAI’s APIs.

We will first need to set the OPENAI_API_KEY environment variable.

export OPENAI_API_KEY=xxx

# If a proxy is needed, set the OPENAI_PROXY environment variable.
export OPENAI_PROXY=http://host:port

If you want to set the API key and proxy dynamically, you can use the openaiApiKey and openaiProxy parameters when initializing the OpenAI class.

var llm = OpenAI.builder()
        .openaiApiKey("xxx")
        .openaiProxy("http://host:port")
        .build()
        .init();

For the full test code, see QuickStart.java.

2.3 LLMs: Get predictions from a language model

The most basic building block of LangChain is calling an LLM on some input. Let’s walk through a simple example of how to do this. For this purpose, let’s pretend we are building a service that generates a company name based on what the company makes.

var llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

String text = "What would be a good company name for a company that makes colorful socks?";
System.out.println(llm.call(text));
Feetful of Fun

2.4 Prompt Templates: Manage prompts for LLMs

Calling an LLM is a great first step, but it’s just the beginning. Normally when you use an LLM in an application, you are not sending user input directly to the LLM. Instead, you are probably taking user input and constructing a prompt, and then sending that to the LLM.

var prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

System.out.println(prompt.format(Map.of("product", "colorful socks")));
What is a good name for a company that makes colorful socks?
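The template syntax is named-placeholder substitution: each {variable} in the template is filled in from the supplied values. A minimal plain-Java sketch of the idea (this is an illustration, not the library's actual implementation, which also validates the declared input variables):

```java
import java.util.Map;

public class PromptSketch {
    // Replace each {name} placeholder in the template with its value from the map.
    public static String format(String template, Map<String, String> variables) {
        String result = template;
        for (Map.Entry<String, String> e : variables.entrySet()) {
            result = result.replace("{" + e.getKey() + "}", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String template = "What is a good name for a company that makes {product}?";
        System.out.println(format(template, Map.of("product", "colorful socks")));
        // prints: What is a good name for a company that makes colorful socks?
    }
}
```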

2.5 Chains: Combine LLMs and prompts in multi-step workflows

Up until now, we’ve worked with the PromptTemplate and LLM primitives by themselves. But of course, a real application is not just one primitive, but rather a combination of them.

A chain in LangChain is made up of links, which can be either primitives like LLMs or other chains.

2.5.1 LLM Chain

The most core type of chain is an LLMChain, which consists of a PromptTemplate and an LLM.

var llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

var prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

var chain = new LLMChain(llm, prompt);
System.out.println(chain.run("colorful socks"));
\n\nSocktastic!

2.5.2 SQL Chain
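Conceptually, an LLMChain does two things: it formats the prompt from the inputs, then passes the formatted string to the model. A minimal sketch of that composition, using a stubbed model so it runs offline (the Llm interface here is hypothetical, for illustration only):

```java
import java.util.Map;

public class LlmChainSketch {
    // Hypothetical one-method model interface, standing in for a real LLM client.
    interface Llm {
        String call(String prompt);
    }

    static String run(Llm llm, String template, Map<String, String> inputs) {
        // Step 1: fill the template's {placeholders} from the inputs.
        String prompt = template;
        for (Map.Entry<String, String> e : inputs.entrySet()) {
            prompt = prompt.replace("{" + e.getKey() + "}", e.getValue());
        }
        // Step 2: send the formatted prompt to the model.
        return llm.call(prompt);
    }

    public static void main(String[] args) {
        // Stub model that just echoes its prompt, so the example needs no API key.
        Llm stub = prompt -> "PROMPT WAS: " + prompt;
        System.out.println(run(stub,
                "What is a good name for a company that makes {product}?",
                Map.of("product", "colorful socks")));
    }
}
```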

This example demonstrates the use of the SQLDatabaseChain for answering questions over a database.

var database = SQLDatabase.fromUri("jdbc:mysql://127.0.0.1:3306/demo", "xxx", "xxx");

var llm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

var chain = SQLDatabaseChain.fromLLM(llm, database);
System.out.println(chain.run("How many students are there?"));
There are 6 students.
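Roughly, the chain asks the model to write a SQL query given the table schema and the question, executes that query against the database, then asks the model to phrase the raw result as an answer. A heavily simplified sketch of that flow with a stubbed model and database (all names here are illustrative, not the library's API):

```java
import java.util.function.Function;

public class SqlChainSketch {
    // Two-step flow: model writes SQL from schema + question, we execute it,
    // then the model turns the raw result into a natural-language answer.
    static String run(Function<String, String> model, Function<String, String> database,
                      String schema, String question) {
        String sql = model.apply("Schema:\n" + schema
                + "\nWrite one SQL query answering: " + question);
        String rows = database.apply(sql);  // execute the generated query
        return model.apply("Question: " + question + "\nSQL result: " + rows + "\nAnswer:");
    }

    public static void main(String[] args) {
        // Canned "model" and "database" so the sketch runs offline.
        Function<String, String> model = p ->
                p.startsWith("Schema:") ? "SELECT COUNT(*) FROM students" : "There are 6 students.";
        Function<String, String> database = sql -> "[(6,)]";
        System.out.println(run(model, database,
                "students(id, name)", "How many students are there?"));
        // prints: There are 6 students.
    }
}
```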

2.6 Agents: Dynamically Call Chains Based on User Input

So far, the chains we have looked at run in a predetermined order. Agents no longer do: they use an LLM to determine which actions to take and in what order. An action can either be using a tool and observing its output, or returning to the user.

When used correctly, agents can be extremely powerful. In this tutorial, we show you how to easily use agents through the simplest, highest level API.

Set the appropriate environment variables.

export SERPAPI_API_KEY=xxx

Now we can get started!

var llm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

// load some tools to use.
var tools = loadTools(List.of("serpapi", "llm-math"), llm);

// initialize an agent with the tools, the language model, and the type of agent
var agent = initializeAgent(tools, llm, AgentType.ZERO_SHOT_REACT_DESCRIPTION);

// let's test it out!
String text =
        "What was the high temperature in SF yesterday in Fahrenheit? What is that number raised to the .023 power?";
System.out.println(agent.run(text));
I need to find the temperature first, then use the calculator to raise it to the .023 power.

Action: Search
Action Input: "High temperature in SF yesterday"
Observation: San Francisco Weather History for the Previous 24 Hours ; 60 °F · 60 °F · 61 °F ...

Thought: I now have the temperature, so I can use the calculator to raise it to the .023 power.
Action: Calculator
Action Input: 60^.023
Observation: Answer: 1.09874643447

Thought: I now know the final answer
Final Answer: 1.09874643447

1.09874643447
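Under the hood, a ReAct-style agent runs a loop: the model proposes an action, the agent executes the named tool, appends the observation to the model's context, and repeats until the model emits a final answer. A heavily simplified sketch of that loop with a scripted "model" and stubbed tools (every name and format here is illustrative, not the library's API):

```java
import java.util.Map;
import java.util.function.Function;

public class AgentLoopSketch {
    // The model either picks a tool ("Action: name | input") or finishes
    // ("Final Answer: ..."). The loop feeds each observation back to it.
    static String run(Function<String, String> model,
                      Map<String, Function<String, String>> tools, String question) {
        String scratchpad = question;
        for (int step = 0; step < 5; step++) {  // cap iterations to avoid infinite loops
            String decision = model.apply(scratchpad);
            if (decision.startsWith("Final Answer:")) {
                return decision.substring("Final Answer:".length()).trim();
            }
            String[] parts = decision.substring("Action:".length()).split("\\|", 2);
            String observation = tools.get(parts[0].trim()).apply(parts[1].trim());
            scratchpad += "\nObservation: " + observation;  // feed the result back
        }
        return "Agent stopped after too many steps.";
    }

    static String demo() {
        // Scripted model: search first, then calculate, then answer with the last observation.
        Function<String, String> model = pad -> {
            long obs = pad.lines().filter(l -> l.startsWith("Observation:")).count();
            if (obs == 0) return "Action: Search | high temperature in SF yesterday";
            if (obs == 1) return "Action: Calculator | 60^.023";
            return "Final Answer: " + pad.lines().filter(l -> l.startsWith("Observation:"))
                    .reduce((a, b) -> b).orElse("Observation:")
                    .substring("Observation:".length()).trim();
        };
        Map<String, Function<String, String>> tools = Map.of(
                "Search", q -> "60",
                "Calculator", expr -> "1.0987");
        return run(model, tools,
                "What was the high temperature in SF yesterday, raised to the .023 power?");
    }

    public static void main(String[] args) {
        System.out.println(demo());
        // prints: 1.0987
    }
}
```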

3. Run Test Cases from Source

git clone https://github.com/HamaWhiteGG/langchain-java.git
cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn clean test
mvn clean test

4. Apply Spotless

cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn spotless:apply
mvn spotless:apply

5. Support

Don’t hesitate to ask!

Open an issue if you find a bug in langchain-java.

6. Fork and Contribute

This is an active open-source project. We are always open to people who want to use the system or contribute to it.

Contact me if you are looking for implementation tasks that fit your skills.
