JavaAI: LangChain4j Learning (Part 1): Integrating Spring Boot and Alibaba Tongyi Qianwen (DashScope)


Resources

LangChain4j official documentation

LangChain4j Community DashScope documentation

LangChain4j Maven repository

Alibaba Tongyi Qianwen (DashScope) API

Notes

Version used in this article: 1.0.0-beta2

Integrating without Spring Boot

Extra: if you are not using Spring Boot and only want to integrate LangChain4j with DashScope directly:

If you are using a langchain4j-dashscope version earlier than 1.0.0-alpha1:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-dashscope</artifactId>
    <version>${previous version here}</version>
</dependency>

If you are using langchain4j-dashscope 1.0.0-alpha1 or later:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-dashscope</artifactId>
    <version>${langchain4j.version}</version>
</dependency>
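
With only the plain dependency on the classpath, the model is constructed manually through its builder. Below is a minimal sketch assuming the 1.0.0-beta2 API, where QwenChatModel lives in dev.langchain4j.community.model.dashscope and ChatLanguageModel exposes a chat(String) convenience method; the DASHSCOPE_API_KEY environment variable name is only illustrative.

import dev.langchain4j.community.model.dashscope.QwenChatModel;
import dev.langchain4j.model.chat.ChatLanguageModel;

public class DashScopeQuickstart {

    public static void main(String[] args) {
        // Build the model directly; no Spring context is involved.
        ChatLanguageModel model = QwenChatModel.builder()
                .apiKey(System.getenv("DASHSCOPE_API_KEY")) // illustrative environment variable
                .modelName("qwen-plus")
                .build();

        // Send a single user message and print the reply.
        System.out.println(model.chat("Hello, please introduce yourself."));
    }
}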

Integrating the Spring Boot starter

The Spring Boot starters help create and configure language models, embedding models, embedding stores, and other core LangChain4j components. To use a Spring Boot starter, add the corresponding dependency to pom.xml.

pom.xml:

If you are using a langchain4j-dashscope version earlier than 1.0.0-alpha1:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-dashscope-spring-boot-starter</artifactId>
    <version>${previous version here}</version>
</dependency>

If you are using langchain4j-dashscope 1.0.0-alpha1 or later:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-dashscope-spring-boot-starter</artifactId>
    <version>${latest version here}</version>
</dependency>

Or manage the dependency versions with the BOM:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>${latest version here}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
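
When the BOM is imported this way, the langchain4j-community starter and module dependencies can be declared without an explicit <version> element, since the BOM pins their versions.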

Configuration

After adding the dependency, configure the model in application.properties:

langchain4j.community.dashscope.api-key=<API Key>
langchain4j.community.dashscope.model-name=<Model Name>

# Whether to log requests and responses
langchain4j.community.dashscope.log-requests=true
langchain4j.community.dashscope.log-responses=true
# Log level
logging.level.dev.langchain4j=DEBUG

Other related parameters can also be configured, for example the QwenChatModel parameters:

langchain4j.community.dashscope.temperature=0.7
langchain4j.community.dashscope.max-tokens=4096
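
With these properties in place, the starter is expected to build and register the Qwen chat model as a Spring bean, so it can be injected wherever it is needed. The following is a minimal sketch under that assumption; the /chat endpoint is illustrative, and the exact bean type exposed by your starter version should be verified.

import dev.langchain4j.model.chat.ChatLanguageModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {

    // Assumption: the DashScope starter registers the configured Qwen model
    // as a ChatLanguageModel bean, so it can be injected by its interface.
    private final ChatLanguageModel chatModel;

    public ChatController(ChatLanguageModel chatModel) {
        this.chatModel = chatModel;
    }

    @GetMapping("/chat")
    public String chat(@RequestParam String message) {
        // chat(String) is the convenience method on ChatLanguageModel in 1.0.0-beta releases.
        return chatModel.chat(message);
    }
}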

The official parameters are listed below.
langchain4j-community-dashscope provides four models:

QwenChatModel
QwenStreamingChatModel
QwenLanguageModel
QwenStreamingLanguageModel

The QwenChatModel parameters are as follows (the other models accept the same parameters):

Property          | Description                                                                                              | Default Value
baseUrl           | The URL to connect to. You can use HTTP or WebSocket to connect to DashScope.                           | Text Inference and Multi-Modal
apiKey            | The API Key.                                                                                             |
modelName         | The model to use.                                                                                        | qwen-plus
topP              | The probability threshold for kernel sampling, which controls the diversity of the texts generated by the model. The higher the top_p, the more diverse the generated texts, and vice versa. Value range: (0, 1.0]. We generally recommend altering this or temperature, but not both. |
topK              | The size of the sampled candidate set during the generation process.                                    |
enableSearch      | Whether the model uses Internet search results for reference when generating text.                      |
seed              | Setting the seed parameter makes the text generation process more deterministic, and is typically used to make results consistent. |
repetitionPenalty | Penalty for repetition in a continuous sequence during model generation. Increasing repetition_penalty reduces repetition in the generated text; 1.0 means no penalty. Value range: (0, +inf) |
temperature       | Sampling temperature that controls the diversity of the text generated by the model. The higher the temperature, the more diverse the generated text, and vice versa. Value range: [0, 2) |
stops             | With the stop parameter, the model automatically stops generating text when the output is about to contain the specified strings or token_ids. |
maxTokens         | The maximum number of tokens returned by this request.                                                  |
listeners         | Listeners that listen for requests, responses, and errors.                                              |
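
The same parameters can be set programmatically through the builder (using the same imports as the plain-Java sketch above). The sketch below assumes numeric parameter types that mirror the DashScope SDK (Float temperature, Double topP, Integer maxTokens); check the builder signatures of your version before relying on them.

// A sketch of tuning QwenChatModel with the parameters from the table above.
ChatLanguageModel tunedModel = QwenChatModel.builder()
        .apiKey(System.getenv("DASHSCOPE_API_KEY")) // illustrative environment variable
        .modelName("qwen-plus")
        .temperature(0.7f)   // assumed Float, range [0, 2)
        .topP(0.8)           // assumed Double, range (0, 1.0]
        .maxTokens(4096)
        .enableSearch(true)
        .build();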

Usage example

For example, QwenChatModel:

import dev.langchain4j.community.model.dashscope.QwenChatModel;
import dev.langchain4j.model.chat.ChatLanguageModel;

ChatLanguageModel qwenModel = QwenChatModel.builder()
        .apiKey("Your API key here")
        .modelName("qwen-max")
        .build();
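
Once built, qwenModel can be called directly. A minimal usage sketch, assuming the chat(String) convenience method available on ChatLanguageModel in 1.0.0-beta releases (the prompt text is illustrative):

// Send a single user message and print the model's reply.
String reply = qwenModel.chat("Introduce LangChain4j in one sentence.");
System.out.println(reply);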