Qt + OpenGL video cropping and stitching: 4-grid and 9-grid layouts



1. Overview

1.1 Preface

In the previous article we covered how to crop, stitch, and rotate YUV420P video data on the CPU, but the drawbacks were obvious. First, the workload is heavy and the amount of code is large. Second, error tolerance is low: the heavy floating-point arithmetic introduces rounding errors during the data copies, so the Y and UV planes can end up misaligned and the rendered video tears or distorts. On top of that, all the YUV copying and computation is done by the CPU, which is inefficient.

In this article we take a better approach: we work directly with OpenGL vertex and texture coordinates, transforming the coordinates to decide where each video tile is drawn. This needs far less code, is much more robust, and is easy to understand.

1.2 Concepts

1.2.1 Vertex Arrays and Vertex Coordinates

Vertex arrays have been part of OpenGL since version 1.1. When drawing, they specify the coordinates of the data to be rendered, i.e. they describe a piece of geometry. To stay on topic we will not dig deeply into OpenGL itself; we only need to understand how vertex arrays are used in this article. Put simply, a vertex array is an array containing a series of coordinate points. OpenGL fundamentally draws meshes built from vertices: a geometric shape is obtained by connecting those vertices according to some rule. When the four corner points of a texture are mapped onto four vertices, OpenGL applies that texture to the quad formed by those four vertices.

Figure: vertex coordinates, texture coordinates, and how they correspond

As shown in the figure above, the first diagram is the vertex coordinate system and the second is the texture coordinate system. To make the math convenient for the GPU, OpenGL requires the coordinates to be normalized, so vertex x and y both lie in the range [-1, 1].

1.2.2 Texture Arrays and Texture Coordinates

First, texture coordinates lie in the range [0, 1], and their four corners correspond to the four vertex coordinates. Second, consider screen coordinates: the origin (0, 0) is at the top-left, X grows to the right and Y grows downward, which is upside down relative to the texture coordinate system. Looking at the Texture diagram above, if you flip it vertically so that Y points downward, it matches the screen coordinates, and, more importantly, after the flip its four corners line up one-to-one with the vertex coordinates: top-left to top-left, top-right to top-right, and so on. So we have to be clear about the correspondence between vertex and texture coordinates: when building the actual arrays, the texture coordinates must be flipped vertically so that they pair up with the vertex coordinates. The sketch below makes this correspondence concrete.
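As a minimal sketch (illustrative only, with hypothetical array names), these are the arrays for a single full-screen quad: the vertex corners span [-1, 1], and the texture coordinates use 1 - t so that, after the flip, each texture corner pairs with the matching vertex corner.

#include <qopengl.h>   // GLfloat

// Illustrative full-screen quad, counter-clockwise from the bottom-left corner.
static const GLfloat quadVertices[] = {
    -1.0f, -1.0f,   // bottom-left
     1.0f, -1.0f,   // bottom-right
     1.0f,  1.0f,   // top-right
    -1.0f,  1.0f,   // top-left
};

// Texture coordinates for the same corners, flipped vertically (t becomes 1 - t),
// because image data is stored top row first while the texture t axis points up.
static const GLfloat quadTexCoords[] = {
    0.0f, 1.0f,     // pairs with the bottom-left vertex
    1.0f, 1.0f,     // pairs with the bottom-right vertex
    1.0f, 0.0f,     // pairs with the top-right vertex
    0.0f, 0.0f,     // pairs with the top-left vertex
};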

2. Practice

Since our video source is in YUV420P format, it has to be rendered as YUV. Without further ado, here is the code. It can be dropped into a project and used directly, provided VLC is set up; it was written against Qt 5.12.0. A short usage sketch follows the full listing.

#pragma once
#include <cstdint>
#include <cstring>

// Holds one I420 (YUV420P) frame: a full-resolution Y plane followed by
// quarter-resolution U and V planes in one contiguous buffer.
class I420Image
{
public:
    I420Image(int w, int h)
        : width(w)
        , height(h)
    {
        // Y: w*h bytes, U: w*h/4 bytes, V: w*h/4 bytes.
        data = new uint8_t[w * h + w * h / 2];
        // Zero-fill so paintGL can detect a frame that VLC has not written yet.
        memset(data, 0, w * h + w * h / 2);
    }
    ~I420Image()
    {
        delete[] data;
    }

    int GetWidth() const { return width; }
    int GetHeight() const { return height; }
    uint8_t *GetY() const { return data; }
    uint8_t *GetU() const { return data + width * height; }
    uint8_t *GetV() const { return data + width * height + width * height / 4; }

public:
    int width = 0;
    int height = 0;
    uint8_t *data = nullptr;
};

I420Image is the class we use to store the video frames decoded by VLC.

#pragma once
#include <QOpenGLWidget>
#include <QOpenGLFunctions>
#include "I420Image.h"

struct libvlc_instance_t;
struct libvlc_media_player_t;

// QOpenGLWidget that decodes a video through libVLC and renders the
// I420 frames with an OpenGL shader.
class vlcOpenglTest : public QOpenGLWidget, public QOpenGLFunctions
{
    Q_OBJECT

public:
    explicit vlcOpenglTest(QWidget *parent = nullptr);
    ~vlcOpenglTest();

    void play();

    // libVLC video callbacks (called from VLC's decoding thread)
    static void *lock_cb(void *opaque, void **planes);
    static void unlock_cb(void *opaque, void *picture, void *const *planes);
    static void display_cb(void *opaque, void *picture);
    static unsigned setup_cb(void **opaque, char *chroma,
        unsigned *width, unsigned *height,
        unsigned *pitches,
        unsigned *lines);
    static void cleanup_cb(void *opaque);

protected:
    virtual void initializeGL() override;
    virtual void paintGL() override;

private:
    void InitShaders();

    I420Image *m_Front;                 // frame currently being rendered
    I420Image *m_Back;                  // frame currently being written by VLC
    GLuint program;
    GLuint tex_y, tex_u, tex_v;         // one texture per plane
    GLuint sampler_y, sampler_u, sampler_v;
    libvlc_instance_t *m_vlc;
    libvlc_media_player_t *m_vlcplayer;
};
#include "vlcOpenglTest.h"

#ifdef _WIN32
#include <basetsd.h>
typedef SSIZE_T ssize_t;
#endif
#include "vlc/vlc.h"
#include <QPainter>
#include <QOpenGLFunctions_2_0>

using namespace std;

static const char *vertexShader = "\
	#version 430 core\n \
	layout(location = 0) in vec4 vertexIn; \
	layout(location = 1) in vec2 textureIn; \
	out vec2 textureOut;  \
	void main(void)\
	{\
		gl_Position =vertexIn ;\
		textureOut = textureIn;\
	}";

static const char *fragmentShader = "\
#version 430 core\n \
in vec2 textureOut;\
out vec4 fragColor;\
uniform sampler2D tex_y;\
uniform sampler2D tex_u;\
uniform sampler2D tex_v;\
void main(void)\
{\
    vec3 yuv;\
    vec3 rgb;\
    yuv.x = texture(tex_y, textureOut).r;\
    yuv.y = texture(tex_u, textureOut).r - 0.5;\
    yuv.z = texture(tex_v, textureOut).r - 0.5;\
    /* BT.601 YUV -> RGB conversion (column-major mat3) */\
    rgb = mat3( 1,       1,         1,\
                0,       -0.39465,  2.03211,\
                1.13983, -0.58060,  0) * yuv;\
    fragColor = vec4(rgb, 1.0);\
}";


vlcOpenglTest::vlcOpenglTest(QWidget *parent) :
    QOpenGLWidget(parent),
    m_Front(NULL),
    m_Back(NULL),
    m_vlc(NULL),
    m_vlcplayer(NULL)
{
    setGeometry(0, 0, 500, 500);

    // Create the libVLC instance and player, and route decoded frames
    // through our callbacks instead of letting VLC open its own window.
    m_vlc = libvlc_new(0, NULL);
    m_vlcplayer = libvlc_media_player_new(m_vlc);
    libvlc_video_set_callbacks(m_vlcplayer, lock_cb, unlock_cb, display_cb, this);
    libvlc_video_set_format_callbacks(m_vlcplayer, setup_cb, cleanup_cb);
}
vlcOpenglTest::~vlcOpenglTest()
{
    // Stop playback and release the libVLC objects.
    if (m_vlcplayer)
    {
        libvlc_media_player_stop(m_vlcplayer);
        libvlc_media_player_release(m_vlcplayer);
    }
    libvlc_release(m_vlc);
}


void vlcOpenglTest::play()
{
    // MRLs use forward slashes; adjust the path to your own test file.
    QString path = "file:///D:/ky.mp4";
    libvlc_media_t *pmedia = libvlc_media_new_location(m_vlc, path.toUtf8().data());

    // These options only matter for network (e.g. RTSP) sources; harmless for a local file.
    libvlc_media_add_option(pmedia, ":rtsp-tcp=true");
    libvlc_media_add_option(pmedia, ":network-caching=300");

    libvlc_media_player_set_media(m_vlcplayer, pmedia);
    libvlc_media_player_play(m_vlcplayer);

    libvlc_media_release(pmedia);
}

// Called by VLC before decoding a frame: hand it the planes of the back buffer.
void *vlcOpenglTest::lock_cb(void *opaque, void **planes)
{
    vlcOpenglTest *pthis = static_cast<vlcOpenglTest*>(opaque);

    planes[0] = pthis->m_Back->GetY();
    planes[1] = pthis->m_Back->GetU();
    planes[2] = pthis->m_Back->GetV();

    return pthis->m_Back;
}

// Called when VLC has finished writing a frame: swap front and back buffers.
// Note: there is no locking here, so the render thread may briefly see a frame
// that is still being written; acceptable for a demo, use a mutex in production.
void vlcOpenglTest::unlock_cb(void *opaque, void *picture, void * const *planes)
{
    vlcOpenglTest *pthis = static_cast<vlcOpenglTest*>(opaque);

    I420Image *p = pthis->m_Front;
    pthis->m_Front = pthis->m_Back;
    pthis->m_Back = p;
}

// Called by VLC when a frame is ready to be shown: request a repaint.
// (A queued signal would be the safer way to reach the GUI thread.)
void vlcOpenglTest::display_cb(void *opaque, void *picture)
{
    vlcOpenglTest *pthis = static_cast<vlcOpenglTest*>(opaque);
    pthis->update();
}

// Called once by VLC with the video format: allocate the two frame buffers
// and tell VLC the plane pitches/lines for tightly packed I420 data.
unsigned vlcOpenglTest::setup_cb(void **opaque, char *chroma, unsigned *width, unsigned *height, unsigned *pitches, unsigned *lines)
{
    vlcOpenglTest *pthis = static_cast<vlcOpenglTest*>(*opaque);
    assert(pthis);

    // Force the I420 chroma so the planes match I420Image's layout.
    memcpy(chroma, "I420", 4);

    pthis->m_Front = new I420Image(*width, *height);
    pthis->m_Back = new I420Image(*width, *height);

    pitches[0] = *width;
    lines[0] = *height;

    pitches[1] = pitches[2] = *width / 2;
    lines[1] = lines[2] = *height / 2;

    return 1;   // one picture buffer
}

// Called by VLC on teardown: free the frame buffers allocated in setup_cb.
void vlcOpenglTest::cleanup_cb(void *opaque)
{
    vlcOpenglTest *pthis = static_cast<vlcOpenglTest*>(opaque);
    assert(pthis);
    if (pthis->m_Front)
    {
        delete pthis->m_Front;
        pthis->m_Front = nullptr;
    }
    if (pthis->m_Back)
    {
        delete pthis->m_Back;
        pthis->m_Back = nullptr;
    }
}

void vlcOpenglTest::initializeGL()
{
    initializeOpenGLFunctions();
    InitShaders();
}


void vlcOpenglTest::paintGL()
{
    // Clear the color buffer.
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT);

    if (m_Front)
    {
#ifdef QT_NO_DEBUG
        // In release builds, skip the frame if VLC has not filled it yet,
        // otherwise an uninitialized buffer gets rendered as garbage.
        if (*m_Front->GetY() == '\0')
        {
            qDebug() << "frame not filled yet, skip rendering";
            return;
        }
#endif
        int desW = m_Front->GetWidth();
        int desH = m_Front->GetHeight();

        // The U/V planes are desW/2 wide, which is not necessarily a multiple of 4,
        // so disable the default 4-byte row alignment before uploading.
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

        /* Y plane */
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, tex_y);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, desW, desH, 0, GL_RED, GL_UNSIGNED_BYTE, (GLvoid*)m_Front->GetY());
        glUniform1i(sampler_y, 0);

        /* U plane */
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, tex_u);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, desW / 2, desH / 2, 0, GL_RED, GL_UNSIGNED_BYTE, (GLvoid*)m_Front->GetU());
        glUniform1i(sampler_u, 1);

        /* V plane */
        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_2D, tex_v);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, desW / 2, desH / 2, 0, GL_RED, GL_UNSIGNED_BYTE, (GLvoid*)m_Front->GetV());
        glUniform1i(sampler_v, 2);

        // Full image: one quad of 4 vertices.
        glDrawArrays(GL_TRIANGLE_FAN, 0, 4);

        // 4-grid: draw one quad per region (enable the 4-grid coordinates in InitShaders first).
        //for (int i = 0; i < 4; i++)
        //{
        //    glDrawArrays(GL_TRIANGLE_FAN, 4 * i, 4);
        //}
        glFlush();
    }
}

void vlcOpenglTest::InitShaders()
{
    static const GLfloat vertexVertices[] =
    {
        // Full image
        -1.0f, -1.0f,
         1.0f, -1.0f,
         1.0f,  1.0f,
        -1.0f,  1.0f,

        // 4-grid
        // Each quad lists its vertices counter-clockwise.
        //-1.0f,  0.0f,        // top-left region
        // 0.0f,  0.0f,
        // 0.0f,  1.0f,
        //-1.0f,  1.0f,

        //0.2f, 0.2f,          // top-right region
        //1.0f, 0.2f,
        //1.0f, 1.0f,
        //0.2f, 1.0f,

        //-1.0f, -1.0f,        // bottom-left region
        // 0.0f, -1.0f,
        // 0.0f, -0.2f,
        //-1.0f, -0.2f,

        //0.2f, -1.0f,         // bottom-right region
        //1.0f, -1.0f,
        //1.0f,  0.0f,
        //0.2f,  0.0f,
    };

    static const GLfloat textureVertices[] = {
        // Full image
        0.0f, 1.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,

        // 4-grid
        // Take the 4 points counter-clockwise starting from the bottom-left, in the same
        // order as the vertices, then use 1 - t to flip vertically, otherwise the image is upside down.
        //0.0f, 1 - 0.5f,      // top-left region
        //0.5f, 1 - 0.5f,
        //0.5f, 1 - 1.0f,
        //0.0f, 1 - 1.0f,

        //0.5f, 1 - 0.5f,      // top-right region
        //1.0f, 1 - 0.5f,
        //1.0f, 1 - 1.0f,
        //0.5f, 1 - 1.0f,

        //0.0f, 1 - (0.5f - 0.5f),   // bottom-left region
        //0.5f, 1 - (0.5f - 0.5f),
        //0.5f, 1 - (1.0f - 0.5f),
        //0.0f, 1 - (1.0f - 0.5f),

        //0.5f, 1 - (0.5f - 0.5f),   // bottom-right region
        //1.0f, 1 - (0.5f - 0.5f),
        //1.0f, 1 - (1.0f - 0.5f),
        //0.5f, 1 - (1.0f - 0.5f),
    };

    GLint vertCompiled, fragCompiled, linked;
    GLint v, f;

    //Shader: step1
    v = glCreateShader(GL_VERTEX_SHADER);
    f = glCreateShader(GL_FRAGMENT_SHADER);

    //Shader: step2
    glShaderSource(v, 1, &vertexShader, NULL);
    glShaderSource(f, 1, &fragmentShader, NULL);

    //Shader: step3
    glCompileShader(v);
    glGetShaderiv(v, GL_COMPILE_STATUS, &vertCompiled);    //Debug

    glCompileShader(f);
    glGetShaderiv(f, GL_COMPILE_STATUS, &fragCompiled);    //Debug

    //Program: Step1
    program = glCreateProgram();
    //Program: Step2
    glAttachShader(program, v);
    glAttachShader(program, f);


    glVertexAttribPointer(0, 2, GL_FLOAT, 0, 0, vertexVertices);
    glEnableVertexAttribArray(0);

    glVertexAttribPointer(1, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(1);


    //Program: Step3
    glLinkProgram(program);
    //Debug
    glGetProgramiv(program, GL_LINK_STATUS, &linked);

    glUseProgram(program);

    //Get Uniform Variables Location
    sampler_y = glGetUniformLocation(program, "tex_y");
    sampler_u = glGetUniformLocation(program, "tex_u");
    sampler_v = glGetUniformLocation(program, "tex_v");

    //Init Texture
    glGenTextures(1, &tex_y);
    glBindTexture(GL_TEXTURE_2D, tex_y);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glGenTextures(1, &tex_u);
    glBindTexture(GL_TEXTURE_2D, tex_u);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glGenTextures(1, &tex_v);
    glBindTexture(GL_TEXTURE_2D, tex_v);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

}
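To try the widget, a minimal driver might look like the sketch below. This main.cpp is only an assumption about how the class above could be used in a standard Qt Widgets application; only the class name and play() come from the listing.

#include <QApplication>
#include "vlcOpenglTest.h"

// Minimal driver: create the widget, show it, and start playback.
int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    vlcOpenglTest player;
    player.show();
    player.play();   // plays the MRL hard-coded in play()

    return app.exec();
}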

The code only shows how to render a single video picture with OpenGL and how to split it into 4 pictures using the vertices; a 9-grid works the same way, you only have to compute the vertex coordinates. In InitShaders, each picture takes 4 vertices, so 4 pictures take 16 vertices and 9 pictures take 36; in a real application the method can therefore be changed to fill the arrays dynamically, as sketched below.
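As one way to do that, the sketch below generates the vertex and texture coordinate arrays for an N x N grid that reassembles the frame tile by tile, following the same counter-clockwise order and 1 - t flip used above. BuildGrid and its parameters are hypothetical names, not part of the listing.

#include <vector>
#include <qopengl.h>   // GLfloat

// Hypothetical helper: build counter-clockwise quads for an n x n grid.
// Tile (row, col) shows the matching crop of the frame; 'margin' leaves a gap
// between tiles, in normalized device coordinates.
static void BuildGrid(int n, float margin,
                      std::vector<GLfloat> &vertices,
                      std::vector<GLfloat> &texCoords)
{
    vertices.clear();
    texCoords.clear();
    const float cell = 2.0f / n;               // size of one cell in NDC
    for (int row = 0; row < n; ++row)          // row 0 is the bottom row
    {
        for (int col = 0; col < n; ++col)
        {
            // Vertex rectangle of this tile.
            const float x0 = -1.0f + col * cell + margin;
            const float x1 = -1.0f + (col + 1) * cell - margin;
            const float y0 = -1.0f + row * cell + margin;
            const float y1 = -1.0f + (row + 1) * cell - margin;

            // Texture rectangle of this tile, vertically flipped (t = 1 - ...).
            const float s0 = col / (float)n;
            const float s1 = (col + 1) / (float)n;
            const float t0 = 1.0f - row / (float)n;         // bottom edge of the tile
            const float t1 = 1.0f - (row + 1) / (float)n;   // top edge of the tile

            const GLfloat quad[8] = { x0, y0,  x1, y0,  x1, y1,  x0, y1 };
            const GLfloat tex[8]  = { s0, t0,  s1, t0,  s1, t1,  s0, t1 };

            vertices.insert(vertices.end(), quad, quad + 8);
            texCoords.insert(texCoords.end(), tex, tex + 8);
        }
    }
}

With n = 2 and margin = 0 this produces a seamless 4-grid; the commented-out coordinates in InitShaders are the same idea with a small margin added to some tiles. The vectors must stay alive while rendering, since glVertexAttribPointer is given client-side pointers, and paintGL would then call glDrawArrays(GL_TRIANGLE_FAN, 4 * i, 4) once per tile for i = 0 .. n*n - 1.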

For stitching, you simply map the texture region you want onto the vertex region where you want it to appear; you can also swap the display regions of two tiles. In short, you can lay the content out however you like; swapping two tiles, for example, is nothing more than swapping their texture-coordinate blocks, as in the sketch below.
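As a small follow-on sketch (hypothetical, building on BuildGrid above): swapping the content of two tiles only requires exchanging their 8 texture-coordinate floats; the vertex coordinates, i.e. where the tiles sit on screen, stay untouched.

#include <algorithm>
#include <vector>
#include <qopengl.h>

// Hypothetical helper: swap what two tiles display by exchanging their
// texture-coordinate blocks (4 points * 2 floats per tile).
static void SwapTiles(std::vector<GLfloat> &texCoords, int tileA, int tileB)
{
    std::swap_ranges(texCoords.begin() + 8 * tileA,
                     texCoords.begin() + 8 * tileA + 8,
                     texCoords.begin() + 8 * tileB);
}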


If this post helped you, please give it a like to show your support.