Flow implements many producer-consumer models and ships with convenient operators, among them one family of backpressure strategies where the newcomer preempts: the operator transformLatest (and its derived operators mapLatest and collectLatest). Curiously, a "transformOldest" does not seem to exist (or maybe it is just that I have not found it yet), and after some experimentation, it turned out to be harder to implement than I expected (at least for me).
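To make the "latest wins" family concrete, here is a minimal runnable sketch (the flowOf(1, 2, 3) input, the 100 ms delay, and the function name are my own illustration, not from any library): with collectLatest, each new upstream value cancels the block still processing the previous one.

```kotlin
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.collectLatest
import kotlinx.coroutines.flow.flowOf
import kotlinx.coroutines.runBlocking

fun collectLatestDemo(): List<Int> = runBlocking {
    val collected = mutableListOf<Int>()
    flowOf(1, 2, 3).collectLatest {
        delay(100) // A newer value cancels this block while it is suspended here.
        collected += it
    }
    collected
}

fun main() = println(collectLatestDemo()) // [3]: 1 and 2 were preempted mid-delay.
```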
The requirement scenario
Requesting data is a common business scenario: a user action (a tap, a swipe) meets the trigger condition and a request starts. Before starting, however, we must check whether the previous request has finished, to avoid duplicate work.
The check is simple to implement: keep a request flag, and mark it when the request starts and again when it completes.
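As an illustrative sketch of that flag (the class and method names are mine, not from any library), an AtomicBoolean with compareAndSet makes the check-and-set atomic, so a trigger that arrives while a request is in flight is simply ignored:

```kotlin
import java.util.concurrent.atomic.AtomicBoolean

class RequestGuard {
    private val inFlight = AtomicBoolean(false)

    // Runs [request] only if no request is currently in flight;
    // returns true if the request actually ran.
    fun tryRequest(request: () -> Unit): Boolean {
        if (!inFlight.compareAndSet(false, true)) return false
        try {
            request()
        } finally {
            inFlight.set(false) // Clear the flag when the request completes.
        }
        return true
    }
}
```

A plain Boolean field would also do on a single thread; compareAndSet just keeps the guard correct if triggers can race.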
The corresponding Flow model
Mapped onto Flow: the upstream emits the user's interaction events and filters out the ones that meet the trigger condition; the request logic runs at collection time.
flowOf(1, 2, 3, 4).filter {
    it < 3 // Filter conditions.
}.onEach {
    // Refreshing action.
}.launchIn(scope)
The request-flow model above is one step short of done: how do we shield new upstream events while a request is in flight?
Flow's operator set is rich, so picking a suitable model should not be hard. The problem can be framed as upstream/downstream "backpressure" (while the downstream is still busy, drop new upstream events), which immediately suggests the buffer operator (debounce and sample are clearly unsuitable).
A new model quickly followed:
flowOf(1, 2, 3, 4).filter {
    it < 3 // Filter conditions.
}.buffer(capacity = 0, onBufferOverflow = BufferOverflow.DROP_LATEST).onEach {
    // Refreshing action.
}.launchIn(scope)
Per my reading of the requirement, new upstream events arriving while the downstream is busy should not be put into a buffer at all but simply dropped, so the buffer capacity should be 0 (not 1, and certainly nothing larger), with BufferOverflow.DROP_LATEST as the overflow strategy.
The problem and the truth
Comparing results
Expected result: only [1] is collected; [2, 3, 4] are dropped.
Actual result: [1, 2] are collected; [3, 4] are dropped.
The actual behavior, hmm... looks a lot like the conflate operator, which is equivalent to buffer(capacity = 0, onBufferOverflow = BufferOverflow.DROP_OLDEST).
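The mismatch reproduces with a small runnable check (assumptions mine: a single-threaded runBlocking dispatcher, and a 100 ms delay standing in for the long-running request):

```kotlin
import kotlinx.coroutines.channels.BufferOverflow
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.*
import kotlinx.coroutines.runBlocking

fun bufferDemo(): List<Int> = runBlocking {
    flowOf(1, 2, 3, 4)
        .buffer(capacity = 0, onBufferOverflow = BufferOverflow.DROP_LATEST)
        .onEach { delay(100) } // A slow collector: each element takes 100 ms to "process".
        .toList()
}

fun main() = println(bufferDemo()) // [1, 2], not the hoped-for [1].
```

The first element is handed straight to the waiting collector, so the "zero-capacity" buffer is free to hold element 2 while element 1 is being processed.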
So, why is that? The truth about 0
public interface Channel<E> : SendChannel<E>, ReceiveChannel<E> {
    public companion object Factory {
        public const val UNLIMITED: Int = Int.MAX_VALUE
        public const val RENDEZVOUS: Int = 0
        public const val CONFLATED: Int = -1
        public const val BUFFERED: Int = -2
        internal const val OPTIONAL_CHANNEL = -3
    }
}
public fun <E> Channel(
    capacity: Int = RENDEZVOUS,
    onBufferOverflow: BufferOverflow = BufferOverflow.SUSPEND,
    onUndeliveredElement: ((E) -> Unit)? = null
): Channel<E> =
    when (capacity) {
        RENDEZVOUS -> { // actually, it is 0.
            if (onBufferOverflow == BufferOverflow.SUSPEND)
                RendezvousChannel(onUndeliveredElement) // an efficient implementation of rendezvous channel
            else
                ArrayChannel(1, onBufferOverflow, onUndeliveredElement) // support buffer overflow with buffered channel
        }
        CONFLATED -> {
            require(onBufferOverflow == BufferOverflow.SUSPEND) {
                "CONFLATED capacity cannot be used with non-default onBufferOverflow"
            }
            ConflatedChannel(onUndeliveredElement)
        }
        UNLIMITED -> LinkedListChannel(onUndeliveredElement) // ignores onBufferOverflow: it has buffer, but it never overflows
        BUFFERED -> ArrayChannel( // uses default capacity with SUSPEND
            if (onBufferOverflow == BufferOverflow.SUSPEND) CHANNEL_DEFAULT_CAPACITY else 1,
            onBufferOverflow, onUndeliveredElement
        )
        else -> {
            if (capacity == 1 && onBufferOverflow == BufferOverflow.DROP_OLDEST)
                ConflatedChannel(onUndeliveredElement) // conflated implementation is more efficient but appears to work in the same way
            else
                ArrayChannel(capacity, onBufferOverflow, onUndeliveredElement)
        }
    }
In the Channel factory function that creates the pipe, when capacity is 0 a RendezvousChannel is returned only if the overflow strategy is BufferOverflow.SUSPEND; otherwise it returns ArrayChannel(1, onBufferOverflow, onUndeliveredElement), whose capacity is 1.
The truth: the model collapses
So buffer(capacity = 0, onBufferOverflow = BufferOverflow.DROP_LATEST) is really equivalent to buffer(capacity = 1, onBufferOverflow = BufferOverflow.DROP_LATEST).
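The capacity-1 fallback can be observed directly on a Channel built with the same parameters; trySend and tryReceive never suspend, so no coroutine is needed (the probe function is my own illustration):

```kotlin
import kotlinx.coroutines.channels.BufferOverflow
import kotlinx.coroutines.channels.Channel

fun channelProbe(): Pair<Int?, Int?> {
    val channel = Channel<Int>(capacity = 0, onBufferOverflow = BufferOverflow.DROP_LATEST)
    channel.trySend(1) // Succeeds: the "zero-capacity" channel actually has room for one element.
    channel.trySend(2) // "Succeeds" too, but the element is dropped by DROP_LATEST.
    return channel.tryReceive().getOrNull() to channel.tryReceive().getOrNull()
}

fun main() = println(channelProbe()) // (1, null): 1 was buffered, 2 was dropped.
```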
The sandbag strategy
How to implement a buffer with a capacity of 0 that drops the latest?

- A flag instead of a buffer: not so easy.
- Transform the buffer: with a sandbag.
My understanding is that the buffer cannot be made completely seamless; it has at least one opening. So, plug it!
object Sandbag // Or any other unique marker object.

@Suppress("UNCHECKED_CAST")
private fun <T> Flow<T>.bufferWithNothing(): Flow<T> = transform<T, Any?> {
    emit(it)
    emit(Sandbag) // Followed by the sandbag.
}.buffer(1, BufferOverflow.DROP_LATEST).filter {
    it !== Sandbag
}.map {
    it as T
}
Here we have no choice but to create a Channel of capacity 1, and we fabricate a "sandbag" (yes, the flood-control kind). Every value to be emitted is bundled with a sandbag, which plugs the buffer's "opening" (provided the previous sandbag has already been dropped). When the downstream "flood" recedes, the sandbag itself is filtered out and discarded, and all of this stays transparent to both upstream and downstream.
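Putting it all together as a self-contained runnable check (same illustrative assumptions as before: a single-threaded runBlocking dispatcher and a 100 ms delay standing in for the request), only [1] should now survive:

```kotlin
import kotlinx.coroutines.channels.BufferOverflow
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.*
import kotlinx.coroutines.runBlocking

object Sandbag // Or any other unique marker object.

@Suppress("UNCHECKED_CAST")
private fun <T> Flow<T>.bufferWithNothing(): Flow<T> = transform<T, Any?> {
    emit(it)
    emit(Sandbag) // Plugs the single buffer slot behind each real value.
}.buffer(1, BufferOverflow.DROP_LATEST).filter {
    it !== Sandbag
}.map {
    it as T
}

fun sandbagDemo(): List<Int> = runBlocking {
    flowOf(1, 2, 3, 4)
        .bufferWithNothing()
        .onEach { delay(100) } // A slow collector, standing in for the request.
        .toList()
}

fun main() = println(sandbagDemo()) // [1]: the sandbag kept 2, 3 and 4 out of the buffer.
```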