How to Call Large Language Models with Langchain4j in Java
Preface

The previous article covered how to call a large model with OkHttpClient; this one shows how to do the same through a framework, Langchain4j. Stack used here: JDK 17, Langchain4j 1.5.0.
1. What is Langchain4j?

No need for a long introduction here; the official docs say it best: docs.langchain4j.dev/get-started…
2. Usage steps

Step 1: Add the dependencies
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.open.ai</groupId>
    <artifactId>AI-blog</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>dev.langchain4j</groupId>
                <artifactId>langchain4j</artifactId>
                <version>1.5.0</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-open-ai</artifactId>
            <version>1.5.0</version>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.18.36</version>
        </dependency>
        <dependency>
            <groupId>com.squareup.okhttp3</groupId>
            <artifactId>okhttp</artifactId>
            <version>4.12.0</version>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.83</version>
        </dependency>
    </dependencies>
</project>
```
Step 2: Write the code

Example code:
```java
package com.ai.demo;

import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.*;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

import java.io.IOException;

public class LangChain4jChatDemo {

    private static String url = "https://www.apiplus.online/v1";
    private static String token = "sk-aZ8OTonNtI7jo5bDQtwcbTsg0z4giYiH9Ax6wleLpRGlX4NP"; // replace with your actual API token

    public static void main(String[] args) throws IOException {
        noStreamChat();
        streamChat();
        // Streaming output is asynchronous, so keep the main thread alive
        System.in.read();
    }

    /**
     * Streaming output
     */
    public static void streamChat() {
        StreamingChatModel model = OpenAiStreamingChatModel.builder()
                .apiKey(token)
                .baseUrl(url)
                .modelName("gpt-4o-mini") // any model that speaks the OpenAI-compatible API works here
                .build();

        String userMessage = "Hello";
        model.chat(userMessage, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.println("Streamed chunk: " + partialResponse);
            }

            @Override
            public void onPartialThinking(PartialThinking partialThinking) {
                // reasoning ("thinking") output
                System.out.println("onPartialThinking: " + partialThinking);
            }

            @Override
            public void onPartialToolCall(PartialToolCall partialToolCall) {
                // tool/function-calling output
                System.out.println("onPartialToolCall: " + partialToolCall);
            }

            @Override
            public void onCompleteToolCall(CompleteToolCall completeToolCall) {
                // tool/function-calling output
                System.out.println("onCompleteToolCall: " + completeToolCall);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println("Complete AI response: " + completeResponse);
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }

    /**
     * Non-streaming
     */
    public static void noStreamChat() {
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(token)
                .modelName("gpt-4o-mini")
                .baseUrl(url)
                .build();

        String response = model.chat("Hello there");
        System.out.println(response);
    }
}
```
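Blocking on `System.in.read()` is a crude way to keep the JVM alive while the stream is in flight; a cleaner pattern is to complete a `CompletableFuture` from the `onCompleteResponse` callback (and `completeExceptionally` from `onError`) and wait on it with a timeout. The sketch below is illustrative, not part of Langchain4j: a background thread stands in for the async stream so the pattern runs without an API key.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class StreamAwaitDemo {

    // Block the caller until the async callback completes the future,
    // failing after a timeout instead of hanging forever.
    static String awaitResponse(CompletableFuture<String> done) throws Exception {
        return done.get(10, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws Exception {
        CompletableFuture<String> done = new CompletableFuture<>();

        // In the real handler this would be done.complete(...) inside
        // onCompleteResponse, and done.completeExceptionally(...) in onError.
        // Here a background thread simulates the stream finishing.
        new Thread(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(100);
            } catch (InterruptedException ignored) {
            }
            done.complete("final answer");
        }).start();

        System.out.println("complete: " + awaitResponse(done));
    }
}
```

With this pattern, `main` returns as soon as the response arrives instead of waiting for keyboard input.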
Summary

The above is a quick tour of calling GPT and other mainstream large models from Java with Langchain4j. Upcoming posts will dig into building small tools and features on top of this framework. GitHub repo: github.com/EroAI-Free/… Gitee repo: gitee.com/lixinjiuhao…