WebRTC Basics, Part 6: Android Video Rendering


WebRTC's rendering interface is VideoSink:

public interface VideoSink {
  /**
   * Implementations should call frame.retain() if they need to hold a reference to the frame after
   * this function returns. Each call to retain() should be followed by a call to frame.release()
   * when the reference is no longer needed.
   */
  @CalledByNative void onFrame(VideoFrame frame);
}
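The contract in the Javadoc matters: a sink that wants to keep the frame after onFrame() returns must call retain() first and release() later. Below is a minimal, self-contained sketch of that ref-counting contract; Frame and HoldingSink are simplified stand-ins for org.webrtc.VideoFrame and a VideoSink implementation, not the real classes.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class SinkContractDemo {
  // Hypothetical stand-in for VideoFrame's retain/release semantics;
  // the real frame wraps a native buffer that is freed at refcount 0.
  static class Frame {
    final AtomicInteger refCount = new AtomicInteger(1);
    void retain() { refCount.incrementAndGet(); }
    void release() {
      if (refCount.decrementAndGet() == 0) {
        System.out.println("buffer freed");
      }
    }
  }

  // A sink that keeps the frame past onFrame() must retain it first.
  static class HoldingSink {
    Frame held;
    void onFrame(Frame frame) {
      frame.retain();      // we keep a reference beyond this call
      if (held != null) {
        held.release();    // drop the previously held frame
      }
      held = frame;
    }
  }

  public static void main(String[] args) {
    HoldingSink sink = new HoldingSink();
    Frame f1 = new Frame();
    sink.onFrame(f1);
    f1.release();          // caller's reference; the sink still holds one
    Frame f2 = new Frame();
    sink.onFrame(f2);      // prints "buffer freed" for f1
    f2.release();
    sink.held.release();   // prints "buffer freed" for f2
  }
}
```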

The implementation lives in EglRenderer.

Before rendering, onFrame() takes handlerLock and frameLock. If the previous pendingFrame has not been rendered yet, it is released, and the incoming frame is assigned to pendingFrame.

Rendering then happens on the render thread by posting renderFrameOnRenderThread; the Handler here runs on a HandlerThread's Looper.

The code is as follows:

@Override
  public void onFrame(VideoFrame frame) {
    synchronized (statisticsLock) {
      ++framesReceived;
    }
    final boolean dropOldFrame;
    synchronized (handlerLock) {
      if (renderThreadHandler == null) {
        logD("Dropping frame - Not initialized or already released.");
        return;
      }
      synchronized (frameLock) {
        dropOldFrame = (pendingFrame != null);
        if (dropOldFrame) {
          pendingFrame.release();
        }
        pendingFrame = frame;
        pendingFrame.retain();
        renderThreadHandler.post(this::renderFrameOnRenderThread); // post to the render thread
      }
    }
    if (dropOldFrame) {
      synchronized (statisticsLock) {
        ++framesDropped;
      }
    }
  }
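The pattern above is a depth-one, "latest frame wins" queue: the producer overwrites any not-yet-rendered frame and counts it as dropped, so the render thread never falls behind. The same hand-off can be sketched in plain Java without the Android Handler or WebRTC types (LatestFrameSlot is a hypothetical name):

```java
import java.util.concurrent.atomic.AtomicReference;

// Depth-one frame slot: the producer replaces an unconsumed frame,
// the consumer always sees only the newest one.
public class LatestFrameSlot<T> {
  private final AtomicReference<T> pending = new AtomicReference<>();
  private int dropped = 0;

  // Producer side (like onFrame): returns true if an older pending
  // frame was dropped to make room for this one.
  public synchronized boolean offer(T frame) {
    boolean droppedOld = pending.getAndSet(frame) != null;
    if (droppedOld) {
      dropped++;
    }
    return droppedOld;
  }

  // Consumer side (like the fetch at the top of renderFrameOnRenderThread):
  // takes the newest frame, or null if nothing arrived.
  public T take() {
    return pending.getAndSet(null);
  }

  public synchronized int droppedCount() {
    return dropped;
  }
}
```

offer() mirrors the pendingFrame assignment in onFrame(), and take() mirrors the fetch-and-clear at the start of renderFrameOnRenderThread below.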

When rendering, pendingFrame is taken out; if eglBase is not ready (no OpenGL surface has been created yet), the current frame is dropped.

The scale factors are then computed from the layout aspect ratio and the frame's aspect ratio, set on a Matrix, and handed to VideoFrameDrawer.

Finally, eglBase.swapBuffers() swaps the data and displays it, which in effect swaps the Surface's back buffer with its display buffer.

The code is as follows:

/**
   * Renders and releases |pendingFrame|.
   */
  private void renderFrameOnRenderThread() {
    // Fetch and render |pendingFrame|.
    final VideoFrame frame;
    synchronized (frameLock) { // guard pendingFrame
      if (pendingFrame == null) {
        return;
      }
      frame = pendingFrame; // the frame stored by onFrame above
      pendingFrame = null;
    }
    if (eglBase == null || !eglBase.hasSurface()) { // OpenGL is not initialized; drop the current frame
      logD("Dropping frame - No surface");
      frame.release();
      return;
    }
    // Check if fps reduction is active.
    final boolean shouldRenderFrame;
    synchronized (fpsReductionLock) {
      if (minRenderPeriodNs == Long.MAX_VALUE) { // set to MAX_VALUE when fps <= 0
        // Rendering is paused.
        shouldRenderFrame = false;
      } else if (minRenderPeriodNs <= 0) {
        // FPS reduction is disabled.
        shouldRenderFrame = true;
      } else {
        final long currentTimeNs = System.nanoTime();
        if (currentTimeNs < nextFrameTimeNs) { // a frame arrived, but the next render deadline has not been reached yet
          logD("Skipping frame rendering - fps reduction is active.");
          shouldRenderFrame = false;
        } else { // render: fps > 0 and the current time has reached nextFrameTimeNs
          nextFrameTimeNs += minRenderPeriodNs;
          // The time for the next frame should always be in the future.
          nextFrameTimeNs = Math.max(nextFrameTimeNs, currentTimeNs);
          shouldRenderFrame = true; // this flag is checked again before drawing below
        }
      }
    }

    final long startTimeNs = System.nanoTime();

    final float frameAspectRatio = frame.getRotatedWidth() / (float) frame.getRotatedHeight();
    final float drawnAspectRatio;
    synchronized (layoutLock) {
      drawnAspectRatio = layoutAspectRatio != 0f ? layoutAspectRatio : frameAspectRatio; // pick the target aspect ratio
    }

    final float scaleX;
    final float scaleY;

    if (frameAspectRatio > drawnAspectRatio) {
      scaleX = drawnAspectRatio / frameAspectRatio;
      scaleY = 1f;
    } else {
      scaleX = 1f;
      scaleY = frameAspectRatio / drawnAspectRatio;
    }

    drawMatrix.reset();
    drawMatrix.preTranslate(0.5f, 0.5f); // move the pivot to the texture center (0.5, 0.5) so mirror/scale act around the center
    drawMatrix.preScale(mirrorHorizontally ? -1f : 1f, mirrorVertically ? -1f : 1f); // -1 mirrors the frame
    drawMatrix.preScale(scaleX, scaleY); // apply the scale factors computed above
    drawMatrix.preTranslate(-0.5f, -0.5f); // move the pivot back to the origin

    try {
      if (shouldRenderFrame) {
        GLES20.glClearColor(0 /* red */, 0 /* green */, 0 /* blue */, 0 /* alpha */);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        frameDrawer.drawFrame(frame, drawer, drawMatrix, 0 /* viewportX */, 0 /* viewportY */,
            eglBase.surfaceWidth(), eglBase.surfaceHeight()); // the main drawing work

        final long swapBuffersStartTimeNs = System.nanoTime();
        if (usePresentationTimeStamp) {
          eglBase.swapBuffers(frame.getTimestampNs());
        } else {
          eglBase.swapBuffers(); // swap buffers and display
        }

        final long currentTimeNs = System.nanoTime();
        synchronized (statisticsLock) {
          ++framesRendered;
          renderTimeNs += (currentTimeNs - startTimeNs);
          renderSwapBufferTimeNs += (currentTimeNs - swapBuffersStartTimeNs);
        }
      }

      notifyCallbacks(frame, shouldRenderFrame); // no frame listener is registered in our case; internally this draws the frame again into a bitmap and notifies listeners
    } catch (GlUtil.GlOutOfMemoryException e) {
      logE("Error while drawing frame", e);
      final ErrorCallback errorCallback = this.errorCallback;
      if (errorCallback != null) {
        errorCallback.onGlOutOfMemory();
      }
      // Attempt to free up some resources.
      drawer.release();
      frameDrawer.release();
      bitmapTextureFramebuffer.release();
      // Continue here on purpose and retry again for next frame. In worst case, this is a continous
      // problem and no more frames will be drawn.
    } finally {
      frame.release();
    }
  }
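The fps-reduction branch above can be isolated into a small plain-Java pacer. FramePacer is a hypothetical class mirroring the sentinel values EglRenderer uses: Long.MAX_VALUE pauses rendering, a non-positive period disables the cap, and otherwise a frame renders only once the deadline has passed.

```java
// Plain-Java sketch of the fps-reduction decision in renderFrameOnRenderThread.
public class FramePacer {
  private final long minRenderPeriodNs;
  private long nextFrameTimeNs;

  public FramePacer(long minRenderPeriodNs, long startTimeNs) {
    this.minRenderPeriodNs = minRenderPeriodNs;
    this.nextFrameTimeNs = startTimeNs;
  }

  // Decide whether a frame arriving at nowNs should be rendered.
  public boolean shouldRender(long nowNs) {
    if (minRenderPeriodNs == Long.MAX_VALUE) {
      return false; // rendering is paused
    }
    if (minRenderPeriodNs <= 0) {
      return true; // no fps cap, render everything
    }
    if (nowNs < nextFrameTimeNs) {
      return false; // too early, skip this frame
    }
    nextFrameTimeNs += minRenderPeriodNs;
    // Keep the next deadline in the future even after a long stall.
    nextFrameTimeNs = Math.max(nextFrameTimeNs, nowNs);
    return true;
  }
}
```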

The rendering step is essentially an OpenGL pass that draws the frame's VideoBuffer onto the Surface.
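As a closing worked example, the aspect-ratio fit and the translate/scale/translate pivot trick can be reproduced with plain arithmetic (AspectFit is a hypothetical helper, not part of WebRTC):

```java
// Sketch of the aspect-ratio fit and the center-pivot matrix trick above.
// Scaling is wrapped in translate(0.5, 0.5) ... translate(-0.5, -0.5) so it
// happens around the center (0.5, 0.5) of texture coordinates, not the origin.
public class AspectFit {
  // Returns {scaleX, scaleY}: the axis where the frame is "too wide"
  // relative to the layout gets scaled down, matching the code above.
  public static float[] fitScales(float frameAspect, float layoutAspect) {
    if (frameAspect > layoutAspect) {
      return new float[] {layoutAspect / frameAspect, 1f};
    }
    return new float[] {1f, frameAspect / layoutAspect};
  }

  // Apply a scale about the center (0.5, 0.5) to one point, mirroring the
  // effect of the drawMatrix chain on a texture coordinate.
  public static float[] scaleAboutCenter(float x, float y, float sx, float sy) {
    // translate(-0.5) -> scale -> translate(+0.5)
    return new float[] {(x - 0.5f) * sx + 0.5f, (y - 0.5f) * sy + 0.5f};
  }

  public static void main(String[] args) {
    // A 16:9 frame into a square layout: the x axis is scaled down.
    float[] s = fitScales(16f / 9f, 1f);
    System.out.println(s[0] + " x " + s[1]); // prints 0.5625 x 1.0
    // The center point stays fixed under the centered scale.
    float[] c = scaleAboutCenter(0.5f, 0.5f, s[0], s[1]);
    System.out.println(c[0] + ", " + c[1]); // prints 0.5, 0.5
  }
}
```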