Kafka | Spring Boot Kafka Integration Example


1. Installing ZooKeeper

1. Open https://zookeeper.apache.org/releases.html in your browser and download the latest ZooKeeper release; the download gives you an archive named apache-zookeeper-3.7.1-bin.tar.gz.


2. Extract the apache-zookeeper-3.7.1-bin.tar.gz archive to any folder (the author, Wang Xiaocheng, extracted it to D:\00-dev-env). The full path after extraction is D:\00-dev-env\apache-zookeeper-3.7.1-bin.


3. Go into the conf folder under apache-zookeeper-3.7.1-bin, make a copy of zoo_sample.cfg, and rename the copy to zoo.cfg. Open zoo.cfg and change the data directory:

dataDir=../data

ZooKeeper creates the data folder automatically at startup.

4. Go into the bin folder under apache-zookeeper-3.7.1-bin and double-click zkServer.cmd to start ZooKeeper.

ZooKeeper listens on port 2181 by default.

2. Installing Kafka

1. Open https://kafka.apache.org/downloads in your browser and download the latest Kafka release; the download gives you an archive named kafka_2.13-3.3.2.tgz.


Kafka ships binary packages built with two different versions of the Scala compiler; for Java users the difference is negligible. The official site recommends the 2.13 build, so that is the one I downloaded.

2. Extract the kafka_2.13-3.3.2.tgz archive to any folder (again D:\00-dev-env here), then rename the extracted folder to kafka. The full path becomes D:\00-dev-env\kafka.


3. Go into the config folder under D:\00-dev-env\kafka, open server.properties, and change the directory Kafka uses to store its log segments. Because server.properties is parsed as a Java properties file, backslashes act as escape characters, so it is safer to write the Windows path with forward slashes (or doubled backslashes):

log.dirs=D:/00-dev-env/kafka/kafka-logs

Kafka creates the kafka-logs folder automatically at startup.

4. Go into the bin\windows folder under D:\00-dev-env\kafka, open a cmd window in that directory, and run the following command to start Kafka:

kafka-server-start.bat D:\00-dev-env\kafka\config\server.properties

Once the command above has run, Kafka is started.
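
The Kafka broker listens on port 9092 by default. If you want to confirm that it is reachable before wiring up Spring Boot, a small standalone check with the Kafka AdminClient (part of the kafka-clients library, which spring-kafka also pulls in) could look like the sketch below; the class name is made up for illustration.

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

import java.util.Properties;

public class KafkaConnectionCheck {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumes the broker started above is listening on the default localhost:9092
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // Prints the names of the topics currently known to the broker
            System.out.println(admin.listTopics().names().get());
        }
    }
}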

Integrating Kafka

Project Dependencies

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
</dependencies>
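
The dependency list above does not declare versions, so it assumes they are managed by spring-boot-starter-parent (or an equivalent dependencyManagement section). The article also never shows the application entry point; a minimal one, assuming the base package com.example.demo used by the classes below, would look like this:

package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Standard Spring Boot entry point; component scanning picks up the
// Sender, Receiver and HelloController classes shown below.
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}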

Project Configuration

# Application port
server:
  port: 28099
# Topic
app:
  topic:
    foo: foo.t
# Logging configuration
logging:
  level:
    root: WARN
    org.springframework.web: INFO
    com.example.demo: DEBUG
# Kafka configuration
spring:
  kafka:
    consumer:
      group-id: foo
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.StringSerializer
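
Note that spring.kafka.bootstrap-servers is not set, so Spring Boot falls back to its default of localhost:9092, which matches the broker started above. The broker will also auto-create the foo.t topic on first use with default settings; if you would rather declare it explicitly, a configuration class along the lines of the following sketch (the class and package names are my own, and TopicBuilder requires spring-kafka 2.3 or later) lets Spring Boot's auto-configured KafkaAdmin create the topic on startup.

package com.example.demo.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    // KafkaAdmin (auto-configured by Spring Boot) creates this topic on
    // startup if it does not already exist; the name matches app.topic.foo.
    @Bean
    public NewTopic fooTopic() {
        return TopicBuilder.name("foo.t")
                .partitions(1)
                .replicas(1)
                .build();
    }
}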

Sending Messages

package com.example.demo.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class Sender {

    private static final Logger LOGGER = LoggerFactory.getLogger(Sender.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Value("${app.topic.foo}")
    private String topic;

    public void send(String message) {
        LOGGER.info("sending message='{}' to topic='{}'", message, topic);
        kafkaTemplate.send(topic, message);
    }
}
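
kafkaTemplate.send() is asynchronous, and the code above simply drops the result. If you want to log the broker's acknowledgement (or a failure), a method like the following could be added to the Sender class. This is a sketch assuming spring-kafka 2.x, where send() returns a ListenableFuture; in spring-kafka 3.x it returns a CompletableFuture, so the callback would be attached with whenComplete() instead.

    // Hypothetical variant of send() that logs the outcome (spring-kafka 2.x API).
    public void sendWithCallback(String message) {
        kafkaTemplate.send(topic, message).addCallback(
                // Invoked once the broker has acknowledged the record
                result -> LOGGER.info("sent to partition {} at offset {}",
                        result.getRecordMetadata().partition(),
                        result.getRecordMetadata().offset()),
                // Invoked if the send fails, e.g. because the broker is unreachable
                ex -> LOGGER.error("failed to send message='{}'", message, ex));
    }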

Receiving Messages

package com.example.demo.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.handler.annotation.Headers;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.stereotype.Service;

@Service
public class Receiver {

    private static final Logger LOGGER = LoggerFactory.getLogger(Receiver.class);

    @KafkaListener(topics = "${app.topic.foo}")
    public void receive(@Payload String message, @Headers MessageHeaders headers) {
        LOGGER.info("received message='{}'", message);
        headers.keySet().forEach(key -> LOGGER.info("{}: {}", key, headers.get(key)));
    }
}
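
If you only need a few specific headers, they can be injected individually instead of dumping the whole map. The listener method below is an alternative to receive() above, not an addition (two listeners in the same consumer group would split the partitions between them); it assumes the spring-kafka 2.x constant names in org.springframework.kafka.support.KafkaHeaders, some of which were renamed in 3.x, plus the org.springframework.messaging.handler.annotation.Header annotation.

    // Hypothetical alternative to receive() that injects individual headers.
    @KafkaListener(topics = "${app.topic.foo}")
    public void receiveWithHeaders(@Payload String message,
                                   @Header(KafkaHeaders.RECEIVED_TOPIC) String topic,
                                   @Header(KafkaHeaders.OFFSET) long offset) {
        LOGGER.info("received message='{}' from topic='{}' at offset {}", message, topic, offset);
    }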

Testing

The controller:

package com.example.demo.controller;

import com.example.demo.service.Sender;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/hello")
public class HelloController {

    @Autowired
    private Sender sender;

    @GetMapping("/say")
    public Object sayHello(String name) {
        sender.send(name);
        return String.format("Message sent. Content: %s", name);
    }
}

Open the test URL http://localhost:28099/hello/say?name=HelloWorld in your browser; the console then shows output like the following:

sending message='HelloWorld' to topic='foo.t'
received message='HelloWorld'
kafka_offset: 0
kafka_consumer: org.apache.kafka.clients.consumer.KafkaConsumer@4808c26b
kafka_timestampType: CREATE_TIME
kafka_receivedMessageKey: null
kafka_receivedPartitionId: 0
kafka_receivedTopic: foo.t
kafka_receivedTimestamp: 1599633164734

References