https://github.com/langchain4j/langchain4j-spring

Worked example: Building an AI Chatbot in Java With LangChain4j (CSDN blog)

Approach 1

Spring Boot Integration

Spring Boot Integration | LangChain4j

Spring Boot Starters

Maven

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    <version>1.4.0-beta10</version>
</dependency>

application.properties

langchain4j.open-ai.chat-model.api-key=${OPENAI_API_KEY}
langchain4j.open-ai.chat-model.model-name=gpt-4o
langchain4j.open-ai.chat-model.log-requests=true
langchain4j.open-ai.chat-model.log-responses=true
...

OpenAiChatModel

import dev.langchain4j.model.chat.ChatModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ChatController {

    private final ChatModel chatModel;

    public ChatController(ChatModel chatModel) {
        this.chatModel = chatModel;
    }

    @GetMapping("/chat")
    public String model(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        return chatModel.chat(message);
    }
}

AI Services | LangChain4j

AI Services
So far, we have been covering low-level components like ChatModel, ChatMessage, ChatMemory, etc. Working at this level is very flexible and gives you total freedom, but it also forces you to write a lot of boilerplate code. Since LLM-powered applications usually require not just a single component but multiple components working together (e.g., prompt templates, chat memory, LLMs, output parsers, RAG components: embedding models and stores) and often involve multiple interactions, orchestrating them all becomes even more cumbersome.


We want you to focus on business logic, not on low-level implementation details. Thus, there are currently two high-level concepts in LangChain4j that can help with that: AI Services and Chains.


Chains (legacy)
The concept of Chains originates from Python's LangChain (before the introduction of LCEL). The idea is to have a Chain for each common use case, like a chatbot, RAG, etc. Chains combine multiple low-level components and orchestrate interactions between them. The main problem with them is that they are too rigid if you need to customize something. LangChain4j has only two Chains implemented (ConversationalChain and ConversationalRetrievalChain), and we do not plan to add more at this moment.

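For reference, the legacy chain API looks like this — a minimal sketch assuming a configured `ChatLanguageModel` named `model` (the chain manages its own chat memory internally; builder method names may differ between versions):

```java
import dev.langchain4j.chain.ConversationalChain;

// The chain pairs the model with a default chat memory and orchestrates the exchange.
ConversationalChain chain = ConversationalChain.builder()
        .chatLanguageModel(model)
        .build();

String answer = chain.execute("Hello!");
```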

AI Services
We propose another solution called AI Services, tailored for Java. The idea is to hide the complexities of interacting with LLMs and other components behind a simple API.


This approach is very similar to Spring Data JPA or Retrofit: you declaratively define an interface with the desired API, and LangChain4j provides an object (proxy) that implements this interface. You can think of AI Service as a component of the service layer in your application. It provides AI services. Hence the name.


AI Services handle the most common operations:

  • Formatting inputs for the LLM
  • Parsing outputs from the LLM

They also support more advanced features:

  • Chat memory
  • Tools
  • RAG

AI Services can be used to build stateful chatbots that facilitate back-and-forth interactions, as well as to automate processes where each call to the LLM is isolated.

Let's take a look at the simplest possible AI Service. After that, we will explore more complex examples.


@UserMessage

interface Friend {

    @UserMessage("You are a good friend of mine. Answer using slang. {{it}}")
    String chat(String userMessage);
}

Friend friend = AiServices.create(Friend.class, model);

String answer = friend.chat("Hello"); // Hey! What's shakin'?

@SystemMessage

interface Friend {

    @SystemMessage("You are a good friend of mine. Answer using slang.")
    String chat(String userMessage);
}

Friend friend = AiServices.create(Friend.class, model);

String answer = friend.chat("Hello"); // Hey! What's up?
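Chat memory, listed above, plugs into the same `AiServices` builder. A minimal sketch, assuming a configured `ChatModel` named `model`; `MessageWindowChatMemory` is LangChain4j's built-in sliding-window memory, and in older 0.x versions the builder method is `chatLanguageModel(...)` instead of `chatModel(...)`:

```java
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.service.AiServices;

interface Assistant {

    String chat(String userMessage);
}

// Keep only the 10 most recent messages in the conversation window.
Assistant assistant = AiServices.builder(Assistant.class)
        .chatModel(model)
        .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
        .build();

assistant.chat("My name is Klaus");
String answer = assistant.chat("What is my name?"); // the memory supplies the earlier turn as context
```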

Approach 2

LangChain4j support in Spring Boot to build AI and LLM-powered applications. 

https://github.com/ThomasVitale/langchain4j-spring-boot

OpenAI

Gradle:

implementation 'io.thomasvitale.langchain4j:langchain4j-openai-spring-boot-starter:0.9.0'

Configuration:

langchain4j:
  open-ai:
    client:
      api-key: ${OPENAI_API_KEY}

Example:

import dev.langchain4j.model.chat.ChatLanguageModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ChatController {

    private final ChatLanguageModel chatLanguageModel;

    ChatController(ChatLanguageModel chatLanguageModel) {
        this.chatLanguageModel = chatLanguageModel;
    }

    @GetMapping("/ai/chat")
    String chat(@RequestParam(defaultValue = "What did Gandalf say to the Balrog?") String message) {
        return chatLanguageModel.generate(message);
    }
}

Maven (Ollama starter):
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama-spring-boot-starter</artifactId>
    <version>0.36.2</version>
</dependency>
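With this starter on the classpath, the Ollama chat model is configured through properties analogous to the OpenAI ones shown earlier. The property names and model name below are an assumption patterned on the OpenAI example and may differ between starter versions:

```
langchain4j.ollama.chat-model.base-url=http://localhost:11434
langchain4j.ollama.chat-model.model-name=llama3.1
```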

Source: langchain4j-spring-boot-docker-compose — OllamaDockerComposeConnectionDetailsFactory.java (ThomasVitale/langchain4j-spring-boot on GitHub)

package io.thomasvitale.langchain4j.docker.compose.service.connection.ollama;

import java.net.URI;

import org.springframework.boot.docker.compose.core.RunningService;
import org.springframework.boot.docker.compose.service.connection.DockerComposeConnectionDetailsFactory;
import org.springframework.boot.docker.compose.service.connection.DockerComposeConnectionSource;

import io.thomasvitale.langchain4j.autoconfigure.models.ollama.OllamaConnectionDetails;

/**
 * {@link DockerComposeConnectionDetailsFactory} to create {@link OllamaConnectionDetails}
 * for an {@code "ollama"} service.
 */
public class OllamaDockerComposeConnectionDetailsFactory
        extends DockerComposeConnectionDetailsFactory<OllamaConnectionDetails> {

    private static final String[] OLLAMA_CONTAINER_NAMES = { "docker.io/ollama/ollama", "ollama/ollama", "ollama" };

    private static final Integer OLLAMA_PORT = 11434;

    OllamaDockerComposeConnectionDetailsFactory() {
        super(OLLAMA_CONTAINER_NAMES);
    }

    @Override
    protected OllamaConnectionDetails getDockerComposeConnectionDetails(DockerComposeConnectionSource source) {
        return new OllamaDockerComposeConnectionDetails(source.getRunningService());
    }

    /**
     * {@link OllamaConnectionDetails} backed by a Docker {@link RunningService}.
     */
    private static final class OllamaDockerComposeConnectionDetails extends DockerComposeConnectionDetails
            implements OllamaConnectionDetails {

        private final URI url;

        private OllamaDockerComposeConnectionDetails(RunningService service) {
            super(service);
            this.url = URI.create("http://" + service.host() + ":" + service.ports().get(OLLAMA_PORT));
        }

        @Override
        public URI getUrl() {
            return url;
        }

    }

}
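The factory matches a running Compose service whose image is one of `OLLAMA_CONTAINER_NAMES` and builds the URL from the service host and the mapped container port 11434. A minimal `compose.yaml` that such a factory would pick up (the service name is illustrative):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      # Expose container port 11434; the factory resolves the dynamically
      # mapped host port via service.ports().get(OLLAMA_PORT).
      - "11434"
```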

References

https://github.com/langchain4j/langchain4j-spring

https://github.com/ThomasVitale/langchain4j-spring-boot

AI Services | LangChain4j
