Problem Background
- This came out of a problem discussed in a team brainstorming session;
- The problem, roughly: a worker task pulls data from a Redis queue and writes it into a database. Without being allowed to use an MQ, how do we guarantee messages are consumed successfully?
Requirements Analysis
- Support the publish/subscribe pattern;
- Support re-consuming after a failed consumption attempt, so no message is lost;
- Support data persistence;
- Support message backlog.
Solution
Redis Stream: a perfect fit for the requirements above.
Technical Investigation
1. Does it support publish/subscribe?
First, producing messages to a Stream:
- XADD: produces a message (the * tells Redis to auto-generate a unique message ID);
- The ID format is timestamp-sequence number;
127.0.0.1:6379> XADD queue * time0 1640759542090
"1640761534319-0"
127.0.0.1:6379> XADD queue * time1 1640759542111
"1640761588906-0"
Stream publish/subscribe:
- XGROUP: creates a consumer group;
- XREADGROUP: starts a consumer in the specified group and pulls messages;
/** Create consumer group group1; 0-0 means start consuming from the beginning of the stream */
127.0.0.1:6379> XGROUP CREATE queue group1 0-0
OK
/** A consumer in group1 starts consuming; COUNT 10 pulls up to 10 messages per call; > means fetch only new, never-delivered messages */
127.0.0.1:6379> XREADGROUP GROUP group1 consumer COUNT 10 STREAMS queue >
1) 1) "queue"
2) 1) 1) "1640761534319-0"
2) 1) "time0"
2) "1640759542090"
2) 1) "1640761588906-0"
2) 1) "time1"
2) "1640759542111"
2. Does it support re-consumption on failure, without losing messages?
- After a consumer in a group finishes processing a message, it must issue an XACK command to tell Redis; Redis then marks the message as processed and removes it from the Pending Entries List;
/** Message 1640761534319-0 in group1 has been processed */
127.0.0.1:6379> XACK queue group1 1640761534319-0
(integer) 1
- If the client crashes before it can ACK, Redis keeps the message in the Pending Entries List, which can be inspected with the XPENDING command;
- Once the client recovers, it can use the Pending Entries List to re-pull the messages left unprocessed at crash time, so no message is lost.
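The recovery flow above can be sketched in the CLI. XPENDING's summary form reports the pending count, the smallest and largest pending IDs, and a per-consumer breakdown; XCLAIM (not used in the Java demo below, which re-reads the pending range via XRANGE instead) is the standard command for transferring an abandoned pending message to a live consumer. The ID and the consumer name consumer2 here are illustrative:

```
/** Summary of group1's Pending Entries List */
127.0.0.1:6379> XPENDING queue group1
/** Hand a message that has been idle for at least 60000 ms over to consumer2 */
127.0.0.1:6379> XCLAIM queue group1 consumer2 60000 1640761534319-0
```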
3. Does it support data persistence?
- Stream is a data type added in Redis 5.0; like the other Redis data types, its operations are written to RDB and AOF;
- Provided Redis's persistence policy is configured appropriately, the message data is persisted.
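As a concrete illustration, persistence is enabled in redis.conf. The following is a minimal sketch; the values are illustrative and should be tuned per deployment:

```
# redis.conf
# RDB: snapshot after 900 s if >= 1 key changed, or after 300 s if >= 10 keys changed
save 900 1
save 300 10
# AOF: log every write command, fsync once per second
appendonly yes
appendfsync everysec
```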
4. Does it support message backlog?
- When publishing to a Stream, a maximum queue length can be set, preventing a backlog from exhausting memory;
- With a maximum length set, once the stream exceeds it the oldest entries are evicted, keeping only the newest entries up to that fixed length;
- Messages are therefore still dropped past the cap, but if the cap is sized sensibly (it has to be evaluated against the actual workload), message loss can be avoided.
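Capping the stream is done at publish time with XADD's MAXLEN option, or on an existing stream with XTRIM. The ~ asks Redis for approximate trimming, which is cheaper than an exact trim; the cap of 10000 here is illustrative:

```
/** Keep roughly the newest 10000 entries; older entries are trimmed away */
127.0.0.1:6379> XADD queue MAXLEN ~ 10000 * time0 1640759542090
/** Apply the same cap to an existing stream */
127.0.0.1:6379> XTRIM queue MAXLEN ~ 10000
```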
Java Demo
1. Producer
import java.util.Collections;
import java.util.Map;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

import io.lettuce.core.RedisClient;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.sync.RedisStreamCommands;

/** Simulates the brainstorming scenario: writes 10 messages into the Redis stream */
public class StreamProducer {
    private static final Logger LOGGER = LogManager.getLogger(StreamProducer.class);

    public static void main(String[] args) throws Exception {
        // Create the connection
        RedisClient client = RedisClient.create("redis://localhost");
        StatefulRedisConnection<String, String> connection = client.connect();
        RedisStreamCommands<String, String> commands = connection.sync();
        // Produce messages
        for (int i = 0; i < 10; i++) {
            Map<String, String> msg = Collections.singletonMap("time" + i, String.valueOf(System.currentTimeMillis()));
            LOGGER.info(String.format("new msg=%s", msg));
            commands.xadd("my_stream", msg);
        }
        connection.close();
        client.shutdown();
    }
}
/** Log output */
15:49:49.213 [main] INFO example.StreamProducer - new msg={time0=1640764189213}
15:49:49.231 [main] INFO example.StreamProducer - new msg={time1=1640764189231}
15:49:49.232 [main] INFO example.StreamProducer - new msg={time2=1640764189232}
15:49:49.233 [main] INFO example.StreamProducer - new msg={time3=1640764189233}
15:49:49.233 [main] INFO example.StreamProducer - new msg={time4=1640764189233}
15:49:49.234 [main] INFO example.StreamProducer - new msg={time5=1640764189234}
15:49:49.235 [main] INFO example.StreamProducer - new msg={time6=1640764189234}
15:49:49.235 [main] INFO example.StreamProducer - new msg={time7=1640764189235}
15:49:49.236 [main] INFO example.StreamProducer - new msg={time8=1640764189236}
15:49:49.238 [main] INFO example.StreamProducer - new msg={time9=1640764189237}
/** Viewed from the Redis CLI */
127.0.0.1:6379> XRANGE my_stream - +
1) 1) "1640764189224-0"
2) 1) "time0"
2) "1640764189213"
2) 1) "1640764189232-0"
2) 1) "time1"
2) "1640764189231"
3) 1) "1640764189232-1"
2) 1) "time2"
2) "1640764189232"
4) 1) "1640764189233-0"
2) 1) "time3"
2) "1640764189233"
5) 1) "1640764189234-0"
2) 1) "time4"
2) "1640764189233"
6) 1) "1640764189234-1"
2) 1) "time5"
2) "1640764189234"
7) 1) "1640764189235-0"
2) 1) "time6"
2) "1640764189234"
8) 1) "1640764189236-0"
2) 1) "time7"
2) "1640764189235"
9) 1) "1640764189236-1"
2) 1) "time8"
2) "1640764189236"
10) 1) "1640764189238-0"
2) 1) "time9"
2) "1640764189237"
2. Consumer
import java.util.List;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

import io.lettuce.core.Consumer;
import io.lettuce.core.Limit;
import io.lettuce.core.Range;
import io.lettuce.core.RedisClient;
import io.lettuce.core.StreamMessage;
import io.lettuce.core.XReadArgs;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.sync.RedisStreamCommands;

public class StreamConsumer {
    private static final Logger LOGGER = LogManager.getLogger(StreamConsumer.class);

    /** Number of messages to fetch per read */
    private static final long GET_COUNT = 1;

    public static void main(String[] args) throws Exception {
        // Create the connection
        RedisClient client = RedisClient.create("redis://localhost");
        StatefulRedisConnection<String, String> connection = client.connect();
        RedisStreamCommands<String, String> commands = connection.sync();
        // Create the consumer group
        commands.xgroupCreate(XReadArgs.StreamOffset.from("my_stream", "0-0"), "my_group");
        // Simulate the service crashing mid-consumption
        for (int i = 0; i < 10; i++) {
            // Consume one message at a time, starting from the first never-delivered message
            List<StreamMessage<String, String>> messages = commands.xreadgroup(
                    Consumer.from("my_group", "my_stream"),
                    XReadArgs.Builder.count(GET_COUNT),
                    XReadArgs.StreamOffset.lastConsumed("my_stream"));
            // On the second message, break out without ACKing: the "service" goes down before the DB write
            if (i == 1) {
                LOGGER.info(String.format("Received %s, but not confirmed, service down!", messages));
                break;
            }
            // Consume normally, simulating a successful DB write
            consumption(commands, messages);
        }
        // Simulate restarting the service after the crash
        LOGGER.info("Service restarting...");
        // Before resuming, check for messages that were delivered but never ACKed
        List<Object> xpending = commands.xpending("my_stream", "my_group");
        LOGGER.info(String.format("Sent but no ACK %s", xpending));
        // Number of pending (delivered but un-ACKed) messages; 0 means none
        Long xpendingCount = (Long) xpending.get(0);
        if (xpendingCount != 0) {
            // Smallest and largest pending message IDs
            String pendingMsgMin = (String) xpending.get(1);
            String pendingMsgMax = (String) xpending.get(2);
            // Build the fetch range
            Range<String> range = Range.from(Range.Boundary.including(pendingMsgMin),
                    Range.Boundary.including(pendingMsgMax));
            // Fetch the pending messages
            List<StreamMessage<String, String>> xrange = commands.xrange("my_stream", range, Limit.from(GET_COUNT));
            LOGGER.info(String.format("List of pending messages %s", xrange));
            // Consume the pending messages, simulating a successful DB write
            // (a real service should XACK these after processing too; omitted here for brevity)
            for (StreamMessage<String, String> streamMessage : xrange) {
                LOGGER.info(String.format("Received pending message %s, Data added successfully!", streamMessage));
            }
        }
        // With the pending messages handled, resume consuming from where we left off
        while (true) {
            List<StreamMessage<String, String>> messages = commands.xreadgroup(
                    Consumer.from("my_group", "my_stream"),
                    XReadArgs.Builder.count(GET_COUNT),
                    XReadArgs.StreamOffset.lastConsumed("my_stream"));
            // Consume normally, simulating a successful DB write
            consumption(commands, messages);
            // Nothing left: stop
            if (messages.size() == 0) {
                break;
            }
        }
        // Drop the consumer group and clean up
        commands.xgroupDestroy("my_stream", "my_group");
        connection.close();
        client.shutdown();
    }

    /**
     * Consumes messages normally, simulating a successful DB write.
     *
     * @param commands sync Stream commands
     * @param messages messages to process
     */
    private static void consumption(RedisStreamCommands<String, String> commands,
                                    List<StreamMessage<String, String>> messages) {
        for (StreamMessage<String, String> message : messages) {
            LOGGER.info(String.format("Received %s, Data added successfully!", message));
            String lastSeenMessage = message.getId();
            // Manual ACK
            commands.xack("my_stream", "my_group", lastSeenMessage);
            LOGGER.info(String.format("Confirmed message ID %s", lastSeenMessage));
        }
    }
}
/** Log output */
15:53:04.884 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189224-0]{time0=1640764189213}, Data added successfully!
15:53:04.886 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189224-0
15:53:04.887 [main] INFO example.StreamConsumer - Received [StreamMessage[my_stream:1640764189232-0]{time1=1640764189231}], but not confirmed, service down!
15:53:04.887 [main] INFO example.StreamConsumer - Service restarting...
15:53:04.889 [main] INFO example.StreamConsumer - Sent but no ACK [1, 1640764189232-0, 1640764189232-0, [[my_stream, 1]]]
15:53:04.890 [main] INFO example.StreamConsumer - List of pending messages [StreamMessage[my_stream:1640764189232-0]{time1=1640764189231}]
15:53:04.890 [main] INFO example.StreamConsumer - Received pending message StreamMessage[my_stream:1640764189232-0]{time1=1640764189231}, Data added successfully!
15:53:04.891 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189232-1]{time2=1640764189232}, Data added successfully!
15:53:04.892 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189232-1
15:53:04.893 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189233-0]{time3=1640764189233}, Data added successfully!
15:53:04.893 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189233-0
15:53:04.894 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189234-0]{time4=1640764189233}, Data added successfully!
15:53:04.895 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189234-0
15:53:04.895 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189234-1]{time5=1640764189234}, Data added successfully!
15:53:04.896 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189234-1
15:53:04.897 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189235-0]{time6=1640764189234}, Data added successfully!
15:53:04.897 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189235-0
15:53:04.898 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189236-0]{time7=1640764189235}, Data added successfully!
15:53:04.899 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189236-0
15:53:04.900 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189236-1]{time8=1640764189236}, Data added successfully!
15:53:04.901 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189236-1
15:53:04.902 [main] INFO example.StreamConsumer - Received StreamMessage[my_stream:1640764189238-0]{time9=1640764189237}, Data added successfully!
15:53:04.902 [main] INFO example.StreamConsumer - Confirmed message ID 1640764189238-0