Ollama Chat


Deploying Ollama

Every call to the OpenAI API incurs a fee. During development, this cost can be avoided by deploying an LLM privately.

Run the deployment command:

docker run -d -v /opt/ai/ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

Docker deployment in CPU mode (screenshot omitted)

Pulling the Qwen model (screenshot omitted)
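The screenshots are not reproduced here. Assuming the container name `ollama` from the command above, pulling Qwen and smoke-testing the local API looks roughly like this:

```shell
# Pull the Qwen model inside the running Ollama container
docker exec -it ollama ollama pull qwen

# Quick smoke test: ask the local Ollama HTTP API for a single non-streamed answer
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen", "prompt": "Hello", "stream": false}'
```

If the pull succeeded, `docker exec -it ollama ollama list` should show `qwen:latest`.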

Project Setup

Environment: JDK 17; model: Qwen; Spring Boot 3.2.3

Spring Boot parent dependency:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>3.2.3</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>

Pitfalls I hit while upgrading to Spring Boot 3: juejin.cn/editor/draf…

Add the Spring AI BOM:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-bom</artifactId>
    <version>0.8.1</version>
    <type>pom</type>
    <scope>import</scope>
</dependency>
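Note that a BOM declared with `<scope>import</scope>` only takes effect inside the `<dependencyManagement>` section of the POM, not in the regular `<dependencies>` block:

```xml
<dependencyManagement>
    <dependencies>
        <!-- Imports Spring AI's managed versions so starters below need no <version> -->
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>0.8.1</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

This is what lets the Ollama starter below omit its `<version>` tag.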

Add the Spring AI Ollama starter:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>

Configuration file:

spring:
  ai:
    ollama:
      chat:
        options:
          model: "qwen:latest"
          temperature: 0.7
      embedding:
        options:
          model: "qwen:latest"
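By default the starter talks to Ollama at `http://localhost:11434`. If your container runs on another host or port, the base URL can be overridden (property name per Spring AI 0.8.x):

```yaml
spring:
  ai:
    ollama:
      # Point the client at a non-default Ollama endpoint
      base-url: http://localhost:11434
```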

Sample Code

Create the controller:

import jakarta.annotation.Resource;
import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {

    @Resource
    private OllamaChatClient chatClient;

    @GetMapping("/chat/call1")
    public String call1(@RequestParam(value = "message") String message) {
        return chatClient.call(message);
    }

    @GetMapping("/chat/call2")
    public ChatResponse call2(@RequestParam(value = "message") String message) {
        Prompt prompt = new Prompt(message);
        ChatResponse chatResponse = chatClient.call(prompt);
        // chatResponse.getResult().getOutput().getContent() returns the answer as a String
        return chatResponse;
    }
}
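With the application running on port 8080 (the Spring Boot default; adjust if you set `server.port`), the two endpoints can be exercised directly with curl:

```shell
# Plain-message endpoint: returns just the answer as text/plain
curl "http://localhost:8080/chat/call1?message=hello"

# Prompt endpoint: returns the full ChatResponse serialized as JSON
curl "http://localhost:8080/chat/call2?message=hello"
```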

Unit tests:

The plain-message endpoint:

@Test
public void call1Test() throws Exception {
    String url = "/chat/call1";
    String param = "message";
    // "What are the positions of China and the US on the Taiwan Strait situation?"
    String content = "中美关于台海走势的立场如何?";
    mockMvc.perform(getRequestBuilder(url, param, content))
            .andExpect(MockMvcResultMatchers.status().isOk())
            .andDo(print());
}

Result:

MockHttpServletResponse:
           Status = 200
    Error message = null
          Headers = [Content-Type:"text/plain;charset=UTF-8", Content-Length:"104"]
     Content type = text/plain;charset=UTF-8
             Body = 作为AI助手,我不能表达任何政治观点。如果您有其他相关问题,欢迎您提问。
    Forwarded URL = null
   Redirected URL = null
          Cookies = []

(The body translates to: "As an AI assistant, I cannot express any political views. If you have other related questions, feel free to ask.")

The Prompt endpoint:

@Test
public void call2Test() throws Exception {
    String url = "/chat/call2";
    String param = "message";
    // Same question as above, sent to the Prompt-based endpoint
    String content = "中美关于台海走势的立场如何?";
    mockMvc.perform(getRequestBuilder(url, param, content))
            .andExpect(MockMvcResultMatchers.status().isOk())
            .andDo(print());
}

Result:

MockHttpServletResponse:
           Status = 200
    Error message = null
          Headers = [Content-Type:"application/json"]
     Content type = application/json
             Body = {"result":{"metadata":{"finishReason":null,"contentFilterMetadata":null},"output":{"messageType":"ASSISTANT","properties":{},"content":"中美双方都表达了对台湾问题的关注,并且强调了通过和平谈判的方式来解决这一问题。","media":[]}},"metadata":{"usage":{"generationTokens":0,"promptTokens":0,"totalTokens":0},"rateLimit":{"requestsReset":"PT0S","tokensRemaining":0,"tokensLimit":0,"tokensReset":"PT0S","requestsLimit":0,"requestsRemaining":0},"promptMetadata":[]},"results":[{"metadata":{"finishReason":null,"contentFilterMetadata":null},"output":{"messageType":"ASSISTANT","properties":{},"content":"中美双方都表达了对台湾问题的关注,并且强调了通过和平谈判的方式来解决这一问题。","media":[]}}]}
    Forwarded URL = null
   Redirected URL = null
          Cookies = []

(The model's answer in the JSON body translates to: "Both China and the US have expressed concern over the Taiwan question and emphasized resolving it through peaceful negotiation.")