
iOS Video Editing: Video Transitions

Video Transitions

A video transition, as the name suggests, is the passage from one video into another: by applying some image processing across the seam, the cut between the two clips becomes smoother and better suited to what the user wants.

First, a quick review of the video composition workflow:

  1. Obtain the video resources as AVAsset objects.
  2. Create a mutable composition, AVMutableComposition.
  3. Create an AVMutableVideoComposition. This class describes what should be edited in the video: it lets you set the render size and scale and the frame duration, and it manages the video composition instructions.
  4. Create a customVideoCompositorClass conforming to the AVVideoCompositing protocol. This class defines the properties and methods of the custom video compositor.
  5. Add the source data to the mutable composition as tracks, AVMutableCompositionTrack (usually two kinds: an audio track and a video track).
  6. Create layer instructions, AVMutableVideoCompositionLayerInstruction, used to manage how the video frames are applied and combined: that is, when each source clip appears and disappears in the final video, its size, and its animation.
  7. Create an export session, AVAssetExportSession, which renders a new video from the videoComposition and writes it out to a given file path.

Building the transition AVMutableVideoCompositionLayerInstruction

A transition first of all needs the frames of both videos it joins. So when building the composition's instructions, we add, for every seam, an extra instruction that spans the transition's time range:

- (void)buildTransitionComposition:(AVMutableComposition *)composition andVideoComposition:(AVMutableVideoComposition *)videoComposition {
    NSUInteger clipsCount = self.clips.count;
    CMTime nextClipStartTime = kCMTimeZero;

    /// The transition lasts 2 seconds
    CMTime transitionDuration = CMTimeMakeWithSeconds(2, 600);

    // Add two video tracks and two audio tracks.
    AVMutableCompositionTrack *compositionVideoTracks[2];
    AVMutableCompositionTrack *compositionAudioTracks[2];
    compositionVideoTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionVideoTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTracks[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTimeRange *timeRanges = alloca(sizeof(CMTimeRange) * clipsCount);
    CMTimeRange *transitionTimeRanges = alloca(sizeof(CMTimeRange) * clipsCount);

    // Place clips into alternating video & audio tracks in composition
    for (int i = 0; i < clipsCount; i++) {
        // Alternate between the two track pairs so neighboring clips can overlap
        NSUInteger alternatingIndex = i % 2;
        AVAsset *asset = [self.clips objectAtIndex:i];
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);

        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTracks[alternatingIndex] insertTimeRange:timeRange ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];

        AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTracks[alternatingIndex] insertTimeRange:timeRange ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];

        timeRanges[i] = CMTimeRangeMake(nextClipStartTime, timeRange.duration);
        /// Trim the clip's pass-through range to make room for the transitions
        if (i > 0) {
            timeRanges[i].start = CMTimeAdd(timeRanges[i].start, transitionDuration);
            timeRanges[i].duration = CMTimeSubtract(timeRanges[i].duration, transitionDuration);
        }
        if (i+1 < clipsCount) {
            timeRanges[i].duration = CMTimeSubtract(timeRanges[i].duration, transitionDuration);
        }

        /// The next asset starts before this one ends, overlapping by the transition duration
        nextClipStartTime = CMTimeAdd(nextClipStartTime, asset.duration);
        nextClipStartTime = CMTimeSubtract(nextClipStartTime, transitionDuration);

        /// The transition occupies the overlap
        if (i+1 < clipsCount) {
            transitionTimeRanges[i] = CMTimeRangeMake(nextClipStartTime, transitionDuration);
        }
    }

    NSMutableArray *instructions = [NSMutableArray array];
    for (int i = 0; i < clipsCount; i++) {
        NSUInteger alternatingIndex = i % 2;
        CustomVideoCompositionInstruction *videoInstruction = [[CustomVideoCompositionInstruction alloc] initTransitionWithSourceTrackIDs:@[@(compositionVideoTracks[alternatingIndex].trackID)] forTimeRange:timeRanges[i]];
        videoInstruction.trackID = compositionVideoTracks[alternatingIndex].trackID;
        [instructions addObject:videoInstruction];

        /// Transition between this clip and the next
        if (i+1 < clipsCount) {
            CustomVideoCompositionInstruction *transitionInstruction = [[CustomVideoCompositionInstruction alloc] initTransitionWithSourceTrackIDs:@[@(compositionVideoTracks[alternatingIndex].trackID), @(compositionVideoTracks[1 - alternatingIndex].trackID)] forTimeRange:transitionTimeRanges[i]];
            // Current clip's track -> foreground track while compositing
            transitionInstruction.foregroundTrackID = compositionVideoTracks[alternatingIndex].trackID;
            // Next clip's track -> background track while compositing
            transitionInstruction.backgroundTrackID = compositionVideoTracks[1 - alternatingIndex].trackID;
            [instructions addObject:transitionInstruction];
        }
    }
    }
    videoComposition.instructions = instructions;
}

Once these transition instructions are in place, the custom compositor's startVideoCompositionRequest: is called for every frame inside each instruction's time range, and the request carries the matching instruction. From the trackIDs held by that instruction we can fetch the source frames of the two clips involved, run our own image processing on them, and thereby produce the transition animation.

For the image processing itself you can use OpenGL or Metal, whichever suits your needs.

- (void)startVideoCompositionRequest:(nonnull AVAsynchronousVideoCompositionRequest *)request {
    @autoreleasepool {
        dispatch_async(_renderingQueue, ^{
            if (self.shouldCancelAllRequests) {
                [request finishCancelledRequest];
            } else {
                NSError *err = nil;
                // Get the next rendered pixel buffer (returned retained, +1)
                CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];

                if (resultPixels) {
                    // The resulting pixel buffer from the renderer is passed along to the request
                    [request finishWithComposedVideoFrame:resultPixels];
                    CVPixelBufferRelease(resultPixels);
                } else {
                    [request finishWithError:err];
                }
            }
        });
    }
}

- (CVPixelBufferRef)newRenderedPixelBufferForRequest:(AVAsynchronousVideoCompositionRequest *)request error:(NSError **)errOut {
    CVPixelBufferRef dstPixels = nil;
    CustomVideoCompositionInstruction *currentInstruction = (CustomVideoCompositionInstruction *)request.videoCompositionInstruction;

    // Progress of the current frame through the instruction's time range, 0..1
    float tweenFactor = factorForTimeInRange(request.compositionTime, request.videoCompositionInstruction.timeRange);

    // Pass-through: a single track, just return its frame
    if (currentInstruction.trackID) {
        dstPixels = [request sourceFrameByTrackID:currentInstruction.trackID];
        // Retain so this method always returns a +1 buffer, matching its "new" prefix
        CVPixelBufferRetain(dstPixels);
    } else {
        // Transition: fetch both source frames and blend them into a new buffer
        CVPixelBufferRef foregroundPixelBuffer = [request sourceFrameByTrackID:currentInstruction.foregroundTrackID];
        CVPixelBufferRef backgroundPixelBuffer = [request sourceFrameByTrackID:currentInstruction.backgroundTrackID];
        dstPixels = [self.renderContext newPixelBuffer];

        // OpenGL renderer
        //[self.oglRenderer renderPixelBuffer:dstPixels foregroundPixelBuffer:foregroundPixelBuffer backgroundPixelBuffer:backgroundPixelBuffer tweenFactor:tweenFactor];

        // Metal renderer
        [self.metalRenderer renderPixelBuffer:dstPixels foregroundPixelBuffer:foregroundPixelBuffer backgroundPixelBuffer:backgroundPixelBuffer tweenFactor:tweenFactor];
    }

    return dstPixels;
}

Below is a simple opacity (cross-fade) transition, to give a clear picture of the overall flow.


OpenGL Transition

shader

vertex shader

attribute vec4 position;
attribute vec2 texCoord;

varying vec2 texCoordVarying;

void main()
{
    gl_Position = position;
    texCoordVarying = texCoord;
}

fragment shader

uniform sampler2D Sampler;

precision mediump float;

varying highp vec2 texCoordVarying;

void main()
{
    vec3 color = texture2D(Sampler, texCoordVarying).rgb;
    gl_FragColor = vec4(color, 1.0);
}

Setting up the shader program

- (BOOL)loadShaders {
    GLuint vertShader, fragShader;
    NSString *vertShaderSource, *fragShaderSource;
    NSString *vertShaderPath = [[NSBundle mainBundle] pathForResource:@"transition.vs" ofType:nil];
    NSString *fragShaderPath = [[NSBundle mainBundle] pathForResource:@"transition.fs" ofType:nil];

    // Create the shader program.
    self.program = glCreateProgram();

    // Create and compile the vertex shader.
    vertShaderSource = [NSString stringWithContentsOfFile:vertShaderPath encoding:NSUTF8StringEncoding error:nil];
    if (![self compileShader:&vertShader type:GL_VERTEX_SHADER source:vertShaderSource]) {
        NSLog(@"Failed to compile vertex shader");
        return NO;
    }

    // Create and compile the fragment shader.
    fragShaderSource = [NSString stringWithContentsOfFile:fragShaderPath encoding:NSUTF8StringEncoding error:nil];
    if (![self compileShader:&fragShader type:GL_FRAGMENT_SHADER source:fragShaderSource]) {
        NSLog(@"Failed to compile fragment shader");
        return NO;
    }

    // Attach the vertex shader to the program.
    glAttachShader(self.program, vertShader);

    // Attach the fragment shader to the program.
    glAttachShader(self.program, fragShader);


    // Bind attribute locations. This needs to be done prior to linking.
    glBindAttribLocation(self.program, ATTRIB_VERTEX, "position");
    glBindAttribLocation(self.program, ATTRIB_TEXCOORD, "texCoord");

    // Link the program.
    if (![self linkProgram:self.program]) {
        NSLog(@"Failed to link program");
        if (vertShader) {
            glDeleteShader(vertShader);
            vertShader = 0;
        }
        if (fragShader) {
            glDeleteShader(fragShader);
            fragShader = 0;
        }
        if (_program) {
            glDeleteProgram(_program);
            _program = 0;
        }
        return NO;
    }

    // Get uniform locations.
    uniforms[UNIFORM] = glGetUniformLocation(_program, "Sampler");

    // Release vertex and fragment shaders.
    if (vertShader) {
        glDetachShader(_program, vertShader);
        glDeleteShader(vertShader);
    }
    if (fragShader) {
        glDetachShader(_program, fragShader);
        glDeleteShader(fragShader);
    }

    return YES;
}

- (BOOL)compileShader:(GLuint *)shader type:(GLenum)type source:(NSString *)sourceString
{
    if (sourceString == nil) {
        NSLog(@"Failed to compile shader: empty source string");
        return NO;
    }

    GLint status;
    const GLchar *source;
    source = (GLchar *)[sourceString UTF8String];

    *shader = glCreateShader(type);
    glShaderSource(*shader, 1, &source, NULL);
    glCompileShader(*shader);

#if defined(DEBUG)
    GLint logLength;
    glGetShaderiv(*shader, GL_INFO_LOG_LENGTH, &logLength);
    if (logLength > 0) {
        GLchar *log = (GLchar *)malloc(logLength);
        glGetShaderInfoLog(*shader, logLength, &logLength, log);
        NSLog(@"Shader compile log:\n%s", log);
        free(log);
    }
#endif

    glGetShaderiv(*shader, GL_COMPILE_STATUS, &status);
    if (status == 0) {
        glDeleteShader(*shader);
        return NO;
    }

    return YES;
}

- (BOOL)linkProgram:(GLuint)prog
{
    GLint status;
    glLinkProgram(prog);

#if defined(DEBUG)
    GLint logLength;
    glGetProgramiv(prog, GL_INFO_LOG_LENGTH, &logLength);
    if (logLength > 0) {
        GLchar *log = (GLchar *)malloc(logLength);
        glGetProgramInfoLog(prog, logLength, &logLength, log);
        NSLog(@"Program link log:\n%s", log);
        free(log);
    }
#endif

    glGetProgramiv(prog, GL_LINK_STATUS, &status);
    if (status == 0) {
        return NO;
    }

    return YES;
}

Rendering

- (void)renderPixelBuffer:(CVPixelBufferRef)destinationPixelBuffer
    foregroundPixelBuffer:(CVPixelBufferRef)foregroundPixelBuffer
    backgroundPixelBuffer:(CVPixelBufferRef)backgroundPixelBuffer
              tweenFactor:(float)tween {
    if (!foregroundPixelBuffer || !backgroundPixelBuffer) {
        return;
    }
    [EAGLContext setCurrentContext:self.currentContext];

    CVOpenGLESTextureRef foregroundTexture = [self textureForPixelBuffer:foregroundPixelBuffer];
    CVOpenGLESTextureRef backgroundTexture = [self textureForPixelBuffer:backgroundPixelBuffer];
    CVOpenGLESTextureRef destTexture = [self textureForPixelBuffer:destinationPixelBuffer];

    glUseProgram(self.program);

    glBindFramebuffer(GL_FRAMEBUFFER, self.offscreenBufferHandle);
    glViewport(0, 0, (int)CVPixelBufferGetWidth(destinationPixelBuffer), (int)CVPixelBufferGetHeight(destinationPixelBuffer));

    // First (foreground) texture
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(CVOpenGLESTextureGetTarget(foregroundTexture), CVOpenGLESTextureGetName(foregroundTexture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Second (background) texture
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(CVOpenGLESTextureGetTarget(backgroundTexture), CVOpenGLESTextureGetName(backgroundTexture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Attach the destination texture as a color attachment to the off screen frame buffer
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, CVOpenGLESTextureGetTarget(destTexture), CVOpenGLESTextureGetName(destTexture), 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
        return;
    }

    // clear
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Full-screen quad vertices
    GLfloat quadVertexData [] = {
        -1.0, 1.0,
        1.0, 1.0,
        -1.0, -1.0,
        1.0, -1.0,
    };

    // texture data varies from 0 -> 1, whereas vertex data varies from -1 -> 1
    GLfloat quadTextureData [] = {
        0.5 + quadVertexData[0]/2, 0.5 + quadVertexData[1]/2,
        0.5 + quadVertexData[2]/2, 0.5 + quadVertexData[3]/2,
        0.5 + quadVertexData[4]/2, 0.5 + quadVertexData[5]/2,
        0.5 + quadVertexData[6]/2, 0.5 + quadVertexData[7]/2,
    };

    // Sample the first texture (texture unit 0)
    glUniform1i(uniforms[UNIFORM], 0);

    // Set up vertex data
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, quadVertexData);
    glEnableVertexAttribArray(ATTRIB_VERTEX);

    glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, 0, 0, quadTextureData);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD);

    // Enable blending
    glEnable(GL_BLEND);

    // Source only: the first pass writes the foreground as-is
    glBlendFunc(GL_ONE, GL_ZERO);

    // Draw the foreground
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Sample the second texture (texture unit 1)
    glUniform1i(uniforms[UNIFORM], 1);

    // Set up vertex data again
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, quadVertexData);
    glEnableVertexAttribArray(ATTRIB_VERTEX);

    glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, 0, 0, quadTextureData);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD);

    // Blend the background over the foreground
    // GL_CONSTANT_ALPHA uses the alpha value set by glBlendColor
    glBlendColor(0, 0, 0, tween);
    glBlendFunc(GL_CONSTANT_ALPHA, GL_ONE_MINUS_CONSTANT_ALPHA);

    // Draw the background
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glFlush();

    // Periodic texture cache flush every frame
    CVOpenGLESTextureCacheFlush(self.videoTextureCache, 0);

    // Release the per-frame texture wrappers so they are not leaked
    CFRelease(foregroundTexture);
    CFRelease(backgroundTexture);
    CFRelease(destTexture);

    [EAGLContext setCurrentContext:nil];
}

- (CVOpenGLESTextureRef)textureForPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CVOpenGLESTextureRef texture = NULL;
    CVReturn err;

    if (!_videoTextureCache) {
        NSLog(@"No video texture cache");
        return texture;
    }

    // Periodic texture cache flush every frame
    CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);

    // CVOpenGLTextureCacheCreateTextureFromImage will create GL texture optimally from CVPixelBufferRef.
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       _videoTextureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RGBA,
                                                       (int)CVPixelBufferGetWidth(pixelBuffer),
                                                       (int)CVPixelBufferGetHeight(pixelBuffer),
                                                       GL_RGBA,
                                                       GL_UNSIGNED_BYTE,
                                                       0,
                                                       &texture);

    if (!texture || err) {
        NSLog(@"Error creating texture with CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }
    return texture;
}

Metal Transition

shader

#include <metal_stdlib>
#import "ShaderTypes.h"

using namespace metal;

typedef struct
{
    float4 clipSpacePosition [[ position ]]; // [[ position ]] marks this as the clip-space vertex position
    float2 textureCoordinate;
} RasterizerData;

vertex RasterizerData vertexShader(uint vertexID [[ vertex_id ]],
                                   constant Vertex *vertexArray [[ buffer(VertexInputIndexVertices) ]]) {
    RasterizerData out;
    out.clipSpacePosition = float4(vertexArray[vertexID].position, 0.0, 1.0);
    out.textureCoordinate = vertexArray[vertexID].textureCoordinate;
    return out;
}

fragment float4 samplingShader(RasterizerData input [[stage_in]],
                               texture2d<float> foregroundTexture [[ texture(FragmentTextureIndexForeground) ]],
                               texture2d<float> backgroundTexture [[ texture(FragmentTextureIndexbakcground) ]],
                               constant float &factor [[ buffer(FragmentInputIndexFactor) ]]) {
    constexpr sampler textureSampler (mag_filter::linear,
                                      min_filter::linear);
    float3 foregroundColor = foregroundTexture.sample(textureSampler, input.textureCoordinate).rgb;
    float3 backgroundColor = backgroundTexture.sample(textureSampler, input.textureCoordinate).rgb;

    float3 color = foregroundColor * (1 - factor) + backgroundColor * factor;
    return float4(color, 1.0);
}

Setting up the pipeline and vertices

- (void)setupPipeline {
    id<MTLLibrary> defaultLibrary = [self.device newDefaultLibrary];
    id<MTLFunction> vertexFunction = [defaultLibrary newFunctionWithName:@"vertexShader"];
    id<MTLFunction> fragmentFunction = [defaultLibrary newFunctionWithName:@"samplingShader"];

    MTLRenderPipelineDescriptor *pipelineDescriptor = [[MTLRenderPipelineDescriptor alloc] init];
    pipelineDescriptor.vertexFunction = vertexFunction;
    pipelineDescriptor.fragmentFunction = fragmentFunction;
    pipelineDescriptor.colorAttachments[0].pixelFormat = MTLPixelFormatBGRA8Unorm;

    self.pipelineState = [self.device newRenderPipelineStateWithDescriptor:pipelineDescriptor error:NULL];
    self.commandQueue = [self.device newCommandQueue];
}

- (void)setupVertex {
    Vertex quadVertices[] =
    {   // clip-space position (x, y)   texture coordinate (x, y)
        { {  1.0, -1.0 },  { 1.f, 1.f } },
        { { -1.0, -1.0 },  { 0.f, 1.f } },
        { { -1.0,  1.0 },  { 0.f, 0.f } },

        { {  1.0, -1.0 },  { 1.f, 1.f } },
        { { -1.0,  1.0 },  { 0.f, 0.f } },
        { {  1.0,  1.0 },  { 1.f, 0.f } },
    };
    self.vertices = [self.device newBufferWithBytes:quadVertices
                                             length:sizeof(quadVertices)
                                            options:MTLResourceStorageModeShared];
    self.numVertices = sizeof(quadVertices) / sizeof(Vertex);
}

Rendering

- (void)renderPixelBuffer:(CVPixelBufferRef)destinationPixelBuffer
    foregroundPixelBuffer:(CVPixelBufferRef)foregroundPixelBuffer
    backgroundPixelBuffer:(CVPixelBufferRef)backgroundPixelBuffer
              tweenFactor:(float)tween {
    id<MTLTexture> destinationTexture = [self textureWithCVPixelBuffer:destinationPixelBuffer];
    id<MTLTexture> foregroundTexture = [self textureWithCVPixelBuffer:foregroundPixelBuffer];
    id<MTLTexture> backgroundTexture = [self textureWithCVPixelBuffer:backgroundPixelBuffer];

    id<MTLCommandBuffer> commandBuffer = [self.commandQueue commandBuffer];
    MTLRenderPassDescriptor *renderDescriptor = [MTLRenderPassDescriptor renderPassDescriptor];
    renderDescriptor.colorAttachments[0].clearColor = MTLClearColorMake(0, 0, 0, 1);
    renderDescriptor.colorAttachments[0].storeAction = MTLStoreActionStore;
    renderDescriptor.colorAttachments[0].loadAction = MTLLoadActionClear;
    renderDescriptor.colorAttachments[0].texture = destinationTexture;

    id<MTLRenderCommandEncoder> renderEncoder = [commandBuffer renderCommandEncoderWithDescriptor:renderDescriptor];
    [renderEncoder setRenderPipelineState:self.pipelineState];
    [renderEncoder setVertexBuffer:self.vertices offset:0 atIndex:VertexInputIndexVertices];
    [renderEncoder setFragmentTexture:foregroundTexture atIndex:FragmentTextureIndexForeground];
    [renderEncoder setFragmentTexture:backgroundTexture atIndex:FragmentTextureIndexbakcground];
    [renderEncoder setFragmentBytes:&tween length:sizeof(tween) atIndex:FragmentInputIndexFactor];
    [renderEncoder drawPrimitives:MTLPrimitiveTypeTriangle
                      vertexStart:0
                      vertexCount:self.numVertices]; // draw the two triangles

    [renderEncoder endEncoding]; // finish encoding
    [commandBuffer commit];
    // Block until the GPU has written the destination pixel buffer, since the
    // caller hands that buffer straight back to the composition request
    [commandBuffer waitUntilCompleted];
}

- (id<MTLTexture>)textureWithCVPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    if (pixelBuffer == NULL) {
        return nil;
    }
    id<MTLTexture> texture = nil;
    CVMetalTextureRef metalTextureRef = NULL;
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    MTLPixelFormat pixelFormat = MTLPixelFormatBGRA8Unorm;
    CVReturn status = CVMetalTextureCacheCreateTextureFromImage(NULL,
                                                                _textureCache,
                                                                pixelBuffer,
                                                                NULL,
                                                                pixelFormat,
                                                                width,
                                                                height,
                                                                0,
                                                                &metalTextureRef);
    if (status == kCVReturnSuccess) {
        texture = CVMetalTextureGetTexture(metalTextureRef);
        CFRelease(metalTextureRef);
    }
    return texture;
}
