Chromium Video: Giving the Video a Splash of Color


A Walkthrough of How Web Video Gets On Screen

1. Player Creation

Once the HTMLVideoElement object backing a video tag has been created, it creates a corresponding player object on demand to preload data as needed. Player creation mainly involves the following steps:

  • Initialize the MediaPlayer
  • Initialize a Surface and hand it to the MediaPlayer to receive the player's decoded output
  • Register a FrameAvailable listener on the Texture backing the Surface

This way, while a video plays, the player's decoded frames are delivered back to the engine in a steady stream through the Texture's FrameAvailable callback, and the engine draws these video frames to screen at the appropriate time. Below is the source-level flow in the Chromium engine for creating the player and the Surface:

media_player_renderer_client.Initialize
  |-->stream_texture_wrapper_impl.Initialize
    |-->PostTask -------------------------------------------|
                                                            |
stream_texture_wrapper_impl.InitializeOnMainThread <--------|
  |-->stream_texture_proxy.BindToTaskRunner
    |-->stream_texture_proxy.BindOnThread
      |-->stream_texture_host_android.BindToCurrentThread
        |-->gpu::mojom::StreamTexture.StartListening [ OnFrameAvailable ]
  |-->init_cb.Run
      |
      |
      |
  |-->media_player_renderer_client.OnStreamTextureWrapperInitialized
    |-->mojo_renderer_wrapper.Initialize
      |-->mojo_renderer.Initialize
        |-->InitializeRendererFromUrl
          |-->remote_renderer.Initialize------------------------|
                                                                |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
                                                                |
media_player_renderer.Initialize <------------------------------|
  |-->create mediaplayer
  |-->OnInitialized --------------------------------------------|
                                                                |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
                                                                |
mojo_renderer.OnInitialized <-----------------------------------|
  |-->init_cb_.Run
      |
      |
      |
  |-->media_player_renderer_client.OnRemoteRendererInitialized
    |-->renderer_extension_remote_.InitiateScopedSurfaceRequest--------------------|
                                                                                   |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
                                                                                   |
media_player_renderer.InitiateScopedSurfaceRequest<--------------------------------|    OnScopedSurfaceRequestCompleted
  |-->ScopedSurfaceRequestManager::GetInstance()->RegisterScopedSurfaceRequest---------------------------------------------|
  |-->std::move(callback).Run(surface_request_token_)------------------------------|                                       |
                                                                                   |                                       |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|                                       |
                                                                                   |                                       |
media_player_renderer_client.OnScopedSurfaceRequested<-----------------------------|                                       |
  |-->texture_wrapper_impl.ForwardStreamTextureForSurfaceRequest                                                           |
    |-->StreamTextureProxy.ForwardStreamTextureForSurfaceRequest [ stream_texture_factory.cc ]                             |
      |-->stream_texture_host_android.ForwardStreamTextureForSurfaceRequest                                                |
        |-->texture_remote_.ForwardForSurfaceRequest-------------------------------|                                       |
                                                                                   |                                       |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|                                       | 
                                                                                   |                                       |
stream_texture_android.ForwardForSurfaceRequest<-----------------------------------|                                       |
  |-->ScopedSurfaceRequestConduit::GetInstance()->ForwardSurfaceOwnerForSurfaceRequest                                     |
      |                                                                                                                    |
      | ScopedSurfaceRequestManager extends ScopedSurfaceRequestConduit                                                    |
      |                                                                                                                    |
  |-->ScopedSurfaceRequestManager.ForwardSurfaceOwnerForSurfaceRequest                                                     |
    |-->scoped_surface_request_manager.FulfillScopedSurfaceRequest                                                         |
      |-->texture_owner->CreateJavaSurface                                                                                 |
      |-->PostUiTask--------------------------------------------------|                                                    |
                                                                      |                                                    |
scoped_surface_request_manager.CompleteRequestOnUiThread<-------------|                                                    |
  |-->scoped_surface_request_manager.GetAndUnregisterInternal [Get Callback] ----------------------------------------------|
  |-->callback.Run
      |
      |
      |
  |-->media_player_renderer.OnScopedSurfaceRequestCompleted
    |-->media_player.SetVideoSurface

2. Constructing the Video Layer

When the player is created, the engine also creates a corresponding Layer for the video tag, used for compositing the final page content. Because the video tag is special, the engine creates a dedicated VideoLayer for it, implemented by VideoLayerImpl. When a video frame goes on screen, the engine performs a unified draw over all Layers and composites them into the final output. Here we focus only on VideoLayerImpl; its construction flow is as follows:

synchronous_compositor_proxy.BeginFrame
  |-->synchronous_layer_tree_frame_sink.BeginFrame
    |-->begin_frame_source.BeginFrame
      |-->begin_frame_source.FilterAndIssueBeginFrame
        |-->BeginFrameObserverBase.OnBeginFrame [begin_frame_source.cc]
          |-->BeginFrameObserverBase.OnBeginFrameDerivedImpl
              |
              | scheduler extends BeginFrameObserverBase
              |
          |-->scheduler.OnBeginFrameDerivedImpl
            |-->scheduler.BeginImplFrameSynchronous
              |-->scheduler.BeginImplFrame
                |-->ProcessScheduledActions
                  |-->SchedulerClient.ScheduledActionSendBeginMainFrame
                      |
                      | proxy_impl extends SchedulerClient
                      |
                  |-->proxy_impl.ScheduledActionSendBeginMainFrame
                    |-->PostTask --------------------------|
                                                           |
proxy_main.BeginMainFrame <--------------------------------|
  |-->PostTask ----------------------------------------|
                                                       |
proxy_main.NotifyReadyToCommitOnImpl <-----------------|
  |-->scheduler.NotifyReadyToCommit
    |-->scheduler.ProcessScheduledActions
      |-->SchedulerClient.ScheduledActionCommit
            |
            | proxy_impl extends SchedulerClient
            |
      |-->proxy_impl.ScheduledActionCommit
        |-->layer_tree_host_impl.FinishCommit
          |-->layer_tree_impl.PullPropertiesFrom
            |-->tree_synchronizer.SynchronizeTrees
              |-->tree_synchronizer.SynchronizeTreesInternal
                |-->tree_synchronizer.PushLayerList
                  |-->tree_synchronizer.ReuseOrCreateLayerImpl
                    |-->video_layer.CreateLayerImpl
                      |-->video_layer_impl.Create

3. Receiving Video Frames

After steps 1 and 2, the preparation for getting the video tag's content on screen is essentially complete. When we tap play on the video tag, the player starts decoding the video stream, exposes the decoded raw data through FrameAvailable, and at the same time notifies the engine that the video tag has valid content to draw. The specific flow is:

image_reader_gl_owner.OnFrameAvailable
  |-->StreamTexture.RunCallback [ stream_texture_android ]
    |-->StreamTexture.OnFrameAvailable
      |-->mojom::StreamTextureClient.OnFrameAvailable-------------------|
                                                                        |
~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
                                                                        |
stream_texture_host_android.OnFrameAvailable<---------------------------|
  |-->StreamTextureHost::Listener.OnFrameAvailable
      |
      | StreamTextureProxy extends StreamTextureHost::Listener
      |
  |-->StreamTextureProxy.OnFrameAvailable [ stream_texture_factory.cc ]
    |-->received_frame_cb_.OnFrameAvailable
        |
        | media_player_renderer_client sets this callback during initialization
        |
    |-->media_player_renderer_client.OnFrameAvailable
      |-->VideoRendererSink.PaintSingleFrame
          |
          | video_frame_compositor extends VideoRendererSink
          |
      |-->video_frame_compositor.PaintSingleFrame
        |-->VideoFrameProvider::Client.DidReceiveFrame
            |
            | VideoFrameProviderClientImpl extends VideoFrameProvider::Client
            |
        |-->video_frame_provider_client_impl.DidReceiveFrame
          |-->video_layer_impl.SetNeedsRedraw
            |-->layer_tree_impl.SetNeedsRedraw
              |-->layer_tree_host_impl.SetNeedsRedraw
                |-->LayerTreeHostImplClient.SetNeedsRedrawOnImplThread
                    |
                    |
                    |
                |-->proxy_impl.SetNeedsRedrawOnImplThread
                  |-->scheduler.SetNeedsRedraw
                    |-->scheduler_state_machine.SetNeedsRedraw
                      |-->needs_redraw_ == true
                      |-->ShouldDraw will return true to draw content

We can see that OnFrameAvailable ultimately sets needs_redraw_ in the SchedulerStateMachine to true. The SchedulerStateMachine decides whether the engine really needs to draw, which avoids wasted draws. The relevant method:

bool SchedulerStateMachine::ShouldDraw() const {
  ....

  return needs_redraw_;
}

After a series of checks in SchedulerStateMachine.ShouldDraw, needs_redraw_ ultimately decides whether content gets drawn. So beyond exposing the video content data, the OnFrameAvailable callback also tells the drawing state machine that there is valid content to draw.

4. Packing Video Frames into an OpenGL Texture

How Chromium VSync Triggers Drawing

The article above walks through exactly how Chromium draws to screen after receiving a VSync signal, so we won't repeat it here; we will focus only on the key on-screen drawing steps.

After WebView receives the VSync signal, a series of calls triggers WebView's onDraw lifecycle callback, which ultimately leads browser_view_renderer to call synchronous_compositor_proxy.DemandDrawHwAsync to draw the page content.

Upon receiving DemandDrawHw, the Render process traverses all Layers, draws them, and composites the layers into the final page content frame. Here we only look at how VideoLayerImpl is drawn; the call flow is:

synchronous_compositor_proxy.DemandDrawHwAsync
  |-->synchronous_compositor_proxy.DemandDrawHw
    |-->synchronous_layer_tree_frame_sink.DemandDrawHw
      |-->synchronous_layer_tree_frame_sink.InvokeComposite
          |
          | layer_tree_host_impl extends LayerTreeFrameSinkClient
          |
        |-->layer_tree_host_impl.OnDraw
          |-->LayerTreeHostImplClient.OnDrawForLayerTreeFrameSink
              |
              | proxy_impl extends LayerTreeHostImplClient
              |
          |-->proxy_impl.OnDrawForLayerTreeFrameSink
            |-->scheduler.OnDrawForLayerTreeFrameSink
              |-->scheduler.OnBeginImplFrameDeadline
                |-->scheduler.ProcessScheduledActions
                  |-->scheduler.DrawIfPossible
                    |-->SchedulerClient.ScheduledActionDrawIfPossible
                        |
                        | proxy_impl extends SchedulerClient
                        |
                    |-->proxy_impl.ScheduledActionDrawIfPossible
                      |-->proxy_impl.DrawInternal
                        |-->layer_tree_host_impl.PrepareToDraw
                          |-->layer_tree_host_impl.CalculateRenderPasses
                            |-->layer_impl.WillDraw
                                |
                                | video_layer_impl extends layer_impl
                                |
                            |-->video_layer_impl.WillDraw

Inside VideoLayerImpl.WillDraw, a series of calls packs the raw video content into an OpenGL texture. The flow is:

video_layer_impl.WillDraw
  |-->video_resource_updater.ObtainFrameResources
    |-->video_resource_updater.CreateExternalResourcesFromVideoFrame
      |-->video_resource_updater.CreateForHardwarePlanes
        |-->video_resource_updater.CopyHardwarePlane
          |-->video_frame.UpdateReleaseSyncToken
            |-->SyncTokenClient.GenerateSyncToken [video_frame.cc]
                |
                | wait_and_replace_sync_token_client extends SyncTokenClient
                |
            |-->wait_and_replace_sync_token_client.GenerateSyncToken
              |-->InterfaceBase.GenSyncTokenCHROMIUM
                  |
                  | raster_implementation extends InterfaceBase
                  |
              |-->raster_implementation.GenSyncTokenCHROMIUM
                |-->implementation_base.GenSyncToken
                  |-->GpuControl.EnsureWorkVisible
                      |
                      | command_buffer_proxy_impl extends GpuControl
                      |
                  |-->command_buffer_proxy_impl.EnsureWorkVisible
                    |-->gpu_channel_host.VerifyFlush
                      |-->gpu_channel_host.InternalFlush
                        |-->gpu_channel.FlushDeferredRequests -------------------|
                                                                                 |
                                            gpu_channel mojom                    |
                                                                                 |
gpu_channel.FlushDeferredRequests <----------------------------------------------|
  |
  |-->PostTask ------------------------------------------------------------------|
                                                                                 |
gpu_channel.ExecuteDeferredRequest <---------------------------------------------|
  |-->command_buffer_stub.ExecuteDeferredRequest
    |-->command_buffer_stub.OnAsyncFlush
      |-->command_buffer_service.Flush
        |-->raster_decoder.BeginDecoding
        |-->raster_decoder.DoCommands
          |-->raster_decoder.DoCommandsImpl
        |-->raster_decoder.EndDecoding

Here Chromium wraps each OpenGL command behind its own command encoding. The command that packs video data into an OpenGL texture is kCopySharedImageINTERNALImmediate, whose implementation lives in the raster_decoder_autogen header. The flow is:

raster_decoder.DoCommandsImpl
  |-->Command:kCopySharedImageINTERNALImmediate
    |-->raster_decoder_autogen.HandleCopySharedImageINTERNALImmediate
      |-->DoCopySharedImageINTERNAL
          |
          | raster_decoder implements
          |
      |-->raster_decoder.DoCopySharedImageINTERNAL
        |-->copy_shared_image_helper.CopySharedImage
          |-->SkCanvas.drawImageRect [ draws the video content onto the SkCanvas ]

5. Compositing Layers: Appending the Video Draw Region

After VideoLayerImpl.WillDraw packs the video frame into an OpenGL texture, the coordinates of the video tag are appended to the draw parameters for layer compositing:

proxy_impl.DrawInternal
  |-->layer_tree_host_impl.PrepareToDraw
    |-->layer_tree_host_impl.CalculateRenderPasses
      |-->layer_impl.AppendQuads
          |
          | video_layer_impl extends layer_impl
          |
      |-->video_layer_impl.AppendQuads

6. Syncing the Page Content Frame to the Browser Process

After the Render process composites the page's layers, it packages the drawing information and sends it to the Browser process. The data ultimately lands in SynchronousCompositor::FrameFuture, to be used by the DrawFunctor draw callbacks. The call flow is:

proxy_impl.DrawInternal
  |-->layer_tree_host_impl.DrawLayers
    |-->layer_tree_host_impl.GenerateCompositorFrame
      |-->LayerTreeFrameSink.SubmitCompositorFrame
          |
          | synchronous_layer_tree_frame_sink extends LayerTreeFrameSink
          |
      |-->synchronous_layer_tree_frame_sink.SubmitCompositorFrame
        |-->SynchronousLayerTreeFrameSinkClient.SubmitCompositorFrame
            |
            | synchronous_compositor_proxy extends SynchronousLayerTreeFrameSinkClient
            |
        |-->synchronous_compositor_proxy.SubmitCompositorFrame
          |-->hardware_draw_reply_.Run
              |
              |
              |
          |-->synchronous_compositor_proxy.SendDemandDrawHwAsyncReply
            |-->SynchronousCompositorControlHost.ReturnFrame -----------------------------------|
                                                                                                |
~~~~~~~~~~~~~~~~~~~~~~~~~~mojom~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~|
                                                                                                |
SynchronousCompositorControlHost.ReturnFrame<---------------------------------------------------|
  |-->synchronous_compositor_sync_call_bridge.ReceiveFrameOnIOThread
    |-->SynchronousCompositor::FrameFuture.SetFrame [ receives the page content frame data for drawing ]

The SynchronousCompositor::FrameFuture is registered ahead of time in SynchronousCompositorSyncCallBridge during the SynchronousCompositorHost.DemandDrawHwAsync call, and the object is also handed to RenderThreadManager. RenderThreadManager is the bridge between the engine and the Android DrawFunctor: it manages data across the DrawFunctor callbacks to make sure content is drawn correctly. The call flow is:

browser_view_renderer.OnDrawHardWare
  |-->synchronous_compositor_host.DemandDrawHwAsync [ return FrameFuture ]
  |-->CompositorFrameConsumer.SetFrameOnUI
      |
      | render_thread_manager extends CompositorFrameConsumer
      |
  |-->render_thread_manager.SetFrameOnUI

7. DrawFunctor & RenderThreadManager: Drawing to Screen

For OpenGL drawing on the Android side, we only care about the lifecycle callbacks the engine adapts to:

  • OnSync: prepare the draw data
  • DrawGL: perform the draw

RenderThreadManager prepares the draw data in OnSync. The flow is:

aw_draw_fn_impl.OnSync
  |-->render_thread_manager.CommitFrameOnRT
    |-->hardware_renderer.CommitFrame
      |-->render_thread_manager.PassFramesOnRT [ return child frames ]
      |-->child_frame_queue_.emplace_back child frame

During OnSync, HardwareRenderer takes the draw data out of RenderThreadManager. Once the data is taken, RenderThreadManager immediately clears its cache, so leftover data cannot cause drawing glitches:

ChildFrameQueue RenderThreadManager::PassFramesOnRT() {
  base::AutoLock lock(lock_);
  ChildFrameQueue returned_frames;
  returned_frames.swap(child_frames_);
  return returned_frames;
}

RenderThreadManager draws the content and presents it to screen in DrawGL. The flow is:

aw_draw_fn_impl.DrawGL
  |-->render_thread_manager.DrawOnRT
    |-->hardware_renderer.Draw
      |-->hardware_renderer.DrawAndSwap [ presents to screen ]

A Small Filter Example Built on the Rendering Pipeline

With the source walkthrough above, we now have a basic picture of how a page is rendered, how video makes it to the screen, and which drawing steps actually touch the video data. It follows that customizing the video's rendered output is entirely feasible: find the key call that draws the video and attach a color filter to the SkPaint before drawing, and the video changes color! Below is a sample that renders the video in black and white.

copy_shared_image_helper.cc

const SkScalar gray[20] = {
    0.21f, 0.72f, 0.07f, 0, 0,  // red
    0.21f, 0.72f, 0.07f, 0, 0,  // green
    0.21f, 0.72f, 0.07f, 0, 0,  // blue
    0,     0,     0,     1.0f, 0};  // alpha passes through unchanged
sk_sp<SkColorFilter> filter = SkColorFilters::Matrix(gray);
paint.setColorFilter(filter);