
OpenGL C++: adding image and text watermarks to video playback and recording

        I. Preface:

                GitHub repository: GitHub - wangyongyao1989/WyFFmpeg: basic audio/video implementations

        Articles in this series:

        1.   OpenGL Texture C++: previewing Camera video;

        2.   OpenGL Texture C++: Camera Filter effects;

        3.   OpenGL custom SurfaceView Texture C++: previewing Camera video;

        4.   OpenGL Texture C++: Camera Filter video recording;

        5.   OpenGL C++: adding image and text watermarks to video playback and recording;

    Demo:

        (Video: OpenGL video watermark recording)

 II. Prerequisites for adding image and text watermarks to video and recording with OpenGL C++:

  • Article 1 of this series, "OpenGL Texture C++: previewing Camera video", builds an OpenGL environment on top of GLSurfaceView and uploads the Camera's YUV data into the three textures (Y/U/V) sampled by the fragment shader to display the video image;
  • Article 2, "OpenGL Texture C++: Camera Filter effects", optimizes the render pass of article 1 by feeding in fragment shaders for different filter types, so the Camera preview can switch between filter effects.
  • Article 3, "OpenGL custom SurfaceView Texture C++: previewing Camera video", replaces GLSurfaceView with a custom GLSurfaceView-like class that creates a GLThread-based OpenGL environment, again uploading the Camera's YUV data into the three textures for display.

  • Article 4, "OpenGL Texture C++: Camera Filter video recording", builds on articles 1/2/3 and, using WindowSurface.java/EglCore.java/TextureMovieEncoder2.java/VideoEncoderCore.java from Google's open-source grafika project, creates a recording surface; the GL context is switched between the display and recording surfaces, each is rendered and swapped (swapBuffer), and finally VideoEncoderCore pulls the encoded data out of MediaCodec and writes it into MediaMuxer to produce an MP4 file.

  • This article (5) builds on articles 1/2/3/4: the image watermark is added as a separate shader program with its own image texture (for texture basics see my post LearnOpenGL之入门基础-CSDN博客), and the text watermark is added as yet another shader program on top of that; for OpenGL text rendering see my post LearnOpenGL之文字渲染_opengl文字渲染接口-CSDN博客.

        

III. Implementation flow:

        Given the prerequisites above, the feature is implemented in the following steps:

  •  Create a GLThread-based OpenGL environment;
  •  Draw the YUV data into OpenGL textures;
  •  Create the image texture and draw it into the glViewport;
  •  Render the text into the glViewport;
  •  Create the recording surface and, switching the GL context between it and the display surface, render and swap (swapBuffer) the video data to both;
  •  Pull the encoded data out of MediaCodec and write it into MediaMuxer to produce an MP4 file (a minimal sketch of this drain loop follows this list).
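
        The last step lives in VideoEncoderCore (borrowed from grafika) and is not re-listed in this article. For orientation only, a minimal sketch of such a MediaCodec-to-MediaMuxer drain loop might look like the following; the class and field names (EncoderDrainSketch, mVideoTrack, mMuxerStarted) are placeholders and this is not the project's actual implementation:

import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaMuxer;

// Sketch only: pull encoded frames out of MediaCodec and hand them to MediaMuxer.
class EncoderDrainSketch {
    private int mVideoTrack = -1;
    private boolean mMuxerStarted = false;

    void drainEncoder(MediaCodec encoder, MediaMuxer muxer, boolean endOfStream) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        if (endOfStream) {
            encoder.signalEndOfInputStream();      // only valid for Surface input
        }
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, 10000 /* us */);
            if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream) break;           // nothing encoded yet, keep rendering
            } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer can only be started once the codec reports its final format.
                mVideoTrack = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                mMuxerStarted = true;
            } else if (index >= 0) {
                ByteBuffer encodedData = encoder.getOutputBuffer(index);
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    info.size = 0;                 // codec config is already in the track format
                }
                if (info.size > 0 && mMuxerStarted) {
                    encodedData.position(info.offset);
                    encodedData.limit(info.offset + info.size);
                    muxer.writeSampleData(mVideoTrack, encodedData, info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
    }
}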

     

      1. Creating the OpenGL environment:

        Article 4 has already walked through how the OpenGL environment is created; refer to it for the details. The overall environment-creation code is:

void GLDrawTextVideoRender::OnSurfaceCreated() {
    m_EglCore = new EglCore(eglGetCurrentContext(), FLAG_RECORDABLE);
    if (!m_EglCore) {
        LOGE("new EglCore failed!");
        return;
    }

    LOGE("OnSurfaceCreated m_ANWindow:%p", m_ANWindow);

    m_WindowSurface = new WindowSurface(m_EglCore, m_ANWindow);
    if (!m_WindowSurface) {
        LOGE("new WindowSurface failed!");
        return;
    }
    m_WindowSurface->makeCurrent();
}

       2. Drawing the YUV data into OpenGL textures:

        Each plane of the YUV data passed in from outside is uploaded to its own texture. First the YUV shader program is compiled and the attribute/uniform locations are fetched from the vertex and fragment shaders; then the vertex and texture coordinates are bound through those locations; next the three Y/U/V textures are created and bound; finally the frame is rendered.

  •  Updating the incoming YUV data, code below:
void
GLDrawTextVideoRender::draw(uint8_t *buffer, size_t length, size_t width, size_t height,
                            float rotation) {
    draw_text_video_frame frame{};
    frame.width = width;
    frame.height = height;
    frame.stride_y = width;
    frame.stride_uv = width / 2;
    frame.y = buffer;
    frame.u = buffer + width * height;
    frame.v = buffer + width * height * 5 / 4;

    updateFrame(frame);
}


void GLDrawTextVideoRender::updateFrame(const draw_text_video_frame &frame) {
    m_sizeY = frame.width * frame.height;
    m_sizeU = frame.width * frame.height / 4;
    m_sizeV = frame.width * frame.height / 4;

    if (m_pDataY == nullptr || m_width != frame.width || m_height != frame.height) {
        m_pDataY = std::make_unique<uint8_t[]>(m_sizeY + m_sizeU + m_sizeV);
        m_pDataU = m_pDataY.get() + m_sizeY;
        m_pDataV = m_pDataU + m_sizeU;
    }

    m_width = frame.width;
    m_height = frame.height;

    if (m_width == frame.stride_y) {
        memcpy(m_pDataY.get(), frame.y, m_sizeY);
    } else {
        uint8_t *pSrcY = frame.y;
        uint8_t *pDstY = m_pDataY.get();

        for (int h = 0; h < m_height; h++) {
            memcpy(pDstY, pSrcY, m_width);

            pSrcY += frame.stride_y;
            pDstY += m_width;
        }
    }

    if (m_width / 2 == frame.stride_uv) {
        memcpy(m_pDataU, frame.u, m_sizeU);
        memcpy(m_pDataV, frame.v, m_sizeV);
    } else {
        uint8_t *pSrcU = frame.u;
        uint8_t *pSrcV = frame.v;
        uint8_t *pDstU = m_pDataU;
        uint8_t *pDstV = m_pDataV;

        for (int h = 0; h < m_height / 2; h++) {
            memcpy(pDstU, pSrcU, m_width / 2);
            memcpy(pDstV, pSrcV, m_width / 2);

            pDstU += m_width / 2;
            pDstV += m_width / 2;

            pSrcU += frame.stride_uv;
            pSrcV += frame.stride_uv;
        }
    }

    isDirty = true;
}
  •  Compiling the YUV shader program and fetching the attribute/uniform locations from the vertex and fragment shaders, code below:
int
GLDrawTextVideoRender::createYUVProgram() {

    //Create the YUV video shader program
    m_yuv_program = yuvGLShader->createProgram();
    m_vertexShader = yuvGLShader->vertexShader;
    m_pixelShader = yuvGLShader->fraShader;
    LOGI("GLDrawTextVideoRender createProgram m_yuv_program:%d", m_yuv_program);

    if (!m_yuv_program) {
        LOGE("Could not create program.");
        return 0;
    }

    //Get Uniform Variables Location
    m_yuv_vertexPos = (GLuint) glGetAttribLocation(m_yuv_program, "position");
    m_textureYLoc = glGetUniformLocation(m_yuv_program, "s_textureY");
    m_textureULoc = glGetUniformLocation(m_yuv_program, "s_textureU");
    m_textureVLoc = glGetUniformLocation(m_yuv_program, "s_textureV");
    m_yuv_textureCoordLoc = (GLuint) glGetAttribLocation(m_yuv_program, "texcoord");
    return m_yuv_program;
}
  •  Binding the vertex/texture coordinates and the three YUV sampler units through those locations, code below:
GLuint GLDrawTextVideoRender::useYUVProgram() {
    if (!m_yuv_program && !createYUVProgram()) {
        LOGE("Could not use program.");
        return 0;
    }
    glUseProgram(m_yuv_program);
    glVertexAttribPointer(m_yuv_vertexPos, 3, GL_FLOAT, GL_FALSE, 0, EGLTextVerticek);
    glEnableVertexAttribArray(m_yuv_vertexPos);

    glUniform1i(m_textureYLoc, 0);
    glUniform1i(m_textureULoc, 1);
    glUniform1i(m_textureVLoc, 2);
    glVertexAttribPointer(m_yuv_textureCoordLoc, 3, GL_FLOAT, GL_FALSE, 0, EGLTextTextureCoord);
    glEnableVertexAttribArray(m_yuv_textureCoordLoc);
    return m_yuv_program;
}
  •  Creating and binding the three YUV textures, code below:
bool GLDrawTextVideoRender::createYUVTextures() {
    auto widthY = (GLsizei) m_width;
    auto heightY = (GLsizei) m_height;

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &m_textureIdY);
    glBindTexture(GL_TEXTURE_2D, m_textureIdY);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthY, heightY, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdY) {
        LOGE("OpenGL Error Create Y texture");
        return false;
    }

    GLsizei widthU = (GLsizei) m_width / 2;
    GLsizei heightU = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE1);
    glGenTextures(1, &m_textureIdU);
    glBindTexture(GL_TEXTURE_2D, m_textureIdU);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthU, heightU, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdU) {
        LOGE("OpenGL Error Create U texture");
        return false;
    }

    GLsizei widthV = (GLsizei) m_width / 2;
    GLsizei heightV = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE2);
    glGenTextures(1, &m_textureIdV);
    glBindTexture(GL_TEXTURE_2D, m_textureIdV);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthV, heightV, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdV) {
        LOGE("OpenGL Error Create V texture");
        return false;
    }

    return true;
}
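
        A quick sanity check of the sizes involved: for a 1280x720 frame in the tightly packed I420 layout assumed by draw(), m_sizeY = 1280 * 720 = 921,600 bytes and m_sizeU = m_sizeV = 230,400 bytes; frame.u therefore starts at byte offset 921,600 and frame.v at 1,152,000 (width * height * 5 / 4), while the textures created above are 1280x720 for Y and 640x360 for U and V.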

      3. Creating the image texture and drawing it into the glViewport:

        The image watermark is implemented here with its own, separate shader program. Its vertex and texture coordinates are set up and bound independently (the watermark quad uses its own vertex array, EGLTextVerticek1, so it only covers part of the viewport), while the texture itself is created and bound in the same way as above.

  •  Compiling the shaders and locating the attribute/uniform positions, code below:
int
GLDrawTextVideoRender::createPicProgram() {
    //Create the image-watermark shader program
    m_pic_program = picGLShader->createProgram();
    if (!m_pic_program) {
        LOGE("Could not create m_pic_program.");
        return 0;
    }
    m_pic_vertexPos = (GLuint) glGetAttribLocation(m_pic_program, "position");
    m_pic_textureCoordLoc = (GLuint) glGetAttribLocation(m_pic_program, "texcoord");
    m_texturePicLoc = (GLuint) glGetUniformLocation(m_pic_program, "s_texturePic");
    return m_pic_program;
}
  •  Creating and binding the image texture:
void GLDrawTextVideoRender::creatPicTexture() {

    if (picData) {
        GLenum format = 0;
        if (picChannels == 1) {
            format = GL_RED;
        } else if (picChannels == 3) {
            format = GL_RGB;
        } else if (picChannels == 4) {
            format = GL_RGBA;
        }
        glGenTextures(1, &m_texturePicLoc);
        glBindTexture(GL_TEXTURE_2D, m_texturePicLoc);
        glTexImage2D(GL_TEXTURE_2D, 0, format, picWidth, picHeight, 0, format, GL_UNSIGNED_BYTE,
                     picData);
        glGenerateMipmap(GL_TEXTURE_2D);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        stbi_image_free(picData);
    } else {
        LOGE("creatPicTexture picData  =(null)");
        stbi_image_free(picData);
    }

    if (!m_texturePicLoc) {
        LOGE("creatPicTexture Error Create PIC texture");
    }

}

void GLDrawTextVideoRender::bindPicTexture() {
    if (m_texturePicLoc) {
        // bind Texture
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, m_texturePicLoc);
    }
}

  4. Rendering text into the glViewport:

        The post LearnOpenGL之文字渲染_opengl文字渲染接口-CSDN博客 already explains the whole process of rendering text into the glViewport in detail; refer to that article for the specifics.

  •  Loading the glyphs with FreeType, code below:
void GLDrawTextVideoRender::LoadFacesByASCII(const char *path) {
    LOGE("GLDrawTextVideoRender::LoadFacesByASCII path:%s", path);
    // FreeType
    FT_Library ft;
    // All functions return a value different than 0 whenever an error occurred
    if (FT_Init_FreeType(&ft))
        LOGE("ERROR::FREETYPE: Could not init FreeType Library");

    // Load font as face
    FT_Face face;
    if (FT_New_Face(ft, path, 0, &face))
        LOGE("ERROR::FREETYPE: Failed to load font");

    // Set size to load glyphs as
    FT_Set_Pixel_Sizes(face, 0, 48);

    // Disable byte-alignment restriction
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    // Load first 128 characters of ASCII set
    for (GLubyte c = 0; c < 128; c++) {
        // Load character glyph
        if (FT_Load_Char(face, c, FT_LOAD_RENDER)) {
            LOGE("ERROR::FREETYTPE: Failed to load Glyph");
            continue;
        }
        // Generate texture
        GLuint texture;
        glGenTextures(1, &texture);
        checkGlError("LoadFacesByASCII glGenTextures");
//        LOGE("fore c= %d",c);
//        LOGE("texture === %d",texture);

        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(
                GL_TEXTURE_2D,
                0,
                //GL_LUMINANCE is used here because of:
                // https://stackoverflow.com/questions/70285879/android12-opengles3-0-glteximage2d-0x502-error
                GL_LUMINANCE,
                face->glyph->bitmap.width,
                face->glyph->bitmap.rows,
                0,
                GL_LUMINANCE,
                GL_UNSIGNED_BYTE,
                face->glyph->bitmap.buffer
        );
        // Set texture options
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // Now store character for later use
        Character character = {
                texture,
                glm::ivec2(face->glyph->bitmap.width, face->glyph->bitmap.rows),
                glm::ivec2(face->glyph->bitmap_left, face->glyph->bitmap_top),
                static_cast<GLuint>(face->glyph->advance.x)
        };
        Characters.insert(std::pair<GLchar, Character>(c, character));
    }
    glBindTexture(GL_TEXTURE_2D, 0);
    checkGlError("glBindTexture(GL_TEXTURE_2D, 0);");

    // Destroy FreeType once we're finished
    FT_Done_Face(face);
    FT_Done_FreeType(ft);
    LOGE("FT_Done_FreeType");

}
  •  Compiling the text shader program and binding the text vertex data:
void GLDrawTextVideoRender::bindTextVertexData() {
    glm::vec2 viewport(m_backingWidth, m_backingHeight);
    glm::mat4 projection = glm::ortho(0.0f, static_cast<GLfloat>(m_backingWidth), 0.0f,
                                      static_cast<GLfloat>(m_backingHeight));
    textGLShader->use();
    textGLShader->setMat4("projection", projection);
    checkGlError("drawTextShader->setMat4(projection");


    // Configure VAO/VBO for texture quads
    glGenVertexArrays(1, &VAO);
    checkGlError("glGenVertexArrays(1, &VAO);");

    glGenBuffers(1, &VBO);
    glBindVertexArray(VAO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * 6 * 4, NULL, GL_DYNAMIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);
    checkGlError("glBindVertexArray");
}
  •  Rendering the text, code below:
void GLDrawTextVideoRender::RenderText(std::string text, GLfloat x, GLfloat y, GLfloat scale,
                                       glm::vec3 color, glm::vec2 viewport) {
    // Activate the corresponding render state
    textGLShader->use();
    checkGlError("drawTextShader->use()");
    textGLShader->setVec3("textColor", color.x, color.y, color.z);

    glActiveTexture(GL_TEXTURE0);
    checkGlError("glActiveTexture");
    glBindVertexArray(VAO);
    checkGlError("glBindVertexArray(VAO)");

    // Iterate over all characters in the text
    std::string::const_iterator c;

    LOGE("RenderText x:%f == y:%f", x, y);
    LOGE("RenderText viewportX:%f == viewportY:%f", viewport.x, viewport.y);

    for (c = text.begin(); c != text.end(); c++) {
        Character ch = Characters[*c];

        GLfloat xpos = x + ch.Bearing.x * scale;
        GLfloat ypos = y - (ch.Size.y - ch.Bearing.y) * scale;

        GLfloat w = ch.Size.x * scale;
        GLfloat h = ch.Size.y * scale;

//        LOGE("TextRenderSample::RenderText [xpos,ypos,w,h]=[%f, %f, %f, %f], ch.advance >> 6 = %d"
//                , xpos, ypos, w, h, ch.Advance >> 6);

        // Update the VBO for each character
        GLfloat vertices[6][4] = {
                {xpos,     ypos + h, 0.0, 0.0},
                {xpos,     ypos,     0.0, 1.0},
                {xpos + w, ypos,     1.0, 1.0},

                {xpos,     ypos + h, 0.0, 0.0},
                {xpos + w, ypos,     1.0, 1.0},
                {xpos + w, ypos + h, 1.0, 0.0}
        };
        // Render the glyph texture on a quad
        glBindTexture(GL_TEXTURE_2D, ch.TextureID);
        checkGlError("glBindTexture");
        // Update the contents of VBO memory
        glBindBuffer(GL_ARRAY_BUFFER, VBO);
        checkGlError("glBindBuffer");
        // Be sure to use glBufferSubData and not glBufferData
        glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(vertices), vertices);
        checkGlError("glBufferSubData");
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        checkGlError("glBindBuffer(GL_ARRAY_BUFFER");
        // Draw the quad
        glDrawArrays(GL_TRIANGLES, 0, 6);
        checkGlError("glDrawArrays(GL_TRIANGLES");

        // Advance the cursor to the origin of the next glyph; note the unit is 1/64 pixel,
        // so shift right by 6 to get the value in pixels (2^6 = 64)
        x += (ch.Advance >> 6) * scale;
    }
    glBindVertexArray(0);
    glBindTexture(GL_TEXTURE_2D, 0);
}

        5. Switching between the display/recording surfaces and recording:

                 Article 4 already explains rendering with glBlitFramebuffer in detail; refer to it for specifics. The render-and-record code is:

void GLDrawTextVideoRender::OnDrawFrame() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

    //Draw the YUV video textures
    if (!updateYUVTextures() || !useYUVProgram()) return;
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    //Draw the image-watermark texture
    bindPicTexture();
    usePicProgram();
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    //Draw the text watermark
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); //disable the byte-alignment restriction
    glEnable(GL_BLEND);
    //glEnable(GL_CULL_FACE);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    checkGlError("glBlendFunc");
    glm::vec2 viewport(m_backingWidth, m_backingHeight);
    RenderText("https://blog.csdn.net/wangyongyao1989", 300.0f, 500.0f, 1.0f, glm::vec3(0.8, 0.1f, 0.1f), viewport);


//    LOGE("OnDrawFrame thread:%ld", pthread_self());
    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2->frameAvailableSoon();
    }
    if (m_InputWindowSurface != nullptr) {
        m_InputWindowSurface->makeCurrentReadFrom(*m_WindowSurface);
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        checkGlError("before glBlitFramebuffer");
        glBlitFramebuffer(0, 0, m_backingWidth, m_backingHeight, offX, offY, off_right, off_bottom,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
        m_InputWindowSurface->swapBuffers();

    }

    //Switch back to m_WindowSurface
    m_WindowSurface->makeCurrent();
    m_WindowSurface->swapBuffers();

}

        

IV. Android implementation

        The full code for adding image and text watermarks to video, playing it back, and recording it on Android is listed below.

1. Vertex and fragment shaders:

        There are separate vertex/fragment shaders for the YUV video channel, the image watermark, and the text watermark:

  • YUV video vertex shader draw_text_video_vert.glsl:
#version 320 es

out vec2 v_texcoord;
in vec4 position;
in vec2 texcoord;

void main() {
    v_texcoord = texcoord;      //texture coordinate for sampling the YUV textures
    gl_Position =  position;
}

  • YUV video fragment shader draw_text_video_fragment.glsl:
#version 320 es

precision mediump float;

in vec2 v_texcoord;

uniform lowp sampler2D s_textureY;
uniform lowp sampler2D s_textureU;
uniform lowp sampler2D s_textureV;

out vec4 FragColor;

void main() {
     float y, u, v, r, g, b;
     y = texture(s_textureY, v_texcoord).r;
     u = texture(s_textureU, v_texcoord).r;
     v = texture(s_textureV, v_texcoord).r;
     u = u - 0.5;
     v = v - 0.5;
     r = y + 1.403 * v;
     g = y - 0.344 * u - 0.714 * v;
     b = y + 1.770 * u;
     FragColor = vec4(r, g, b, 1.0f);
}
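
        The per-pixel conversion above is, to within rounding of the coefficients, the standard BT.601 YUV-to-RGB transform with the chroma samples re-centered around zero:

$$R = Y + 1.403\,(V-0.5),\quad G = Y - 0.344\,(U-0.5) - 0.714\,(V-0.5),\quad B = Y + 1.770\,(U-0.5)$$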
  • The image watermark shares its vertex shader with the YUV video channel; its fragment shader draw_pic_fragment.glsl:
#version 320 es

precision mediump float;

in vec2 v_texcoord;
uniform lowp sampler2D s_texturePic;


out vec4 FragColor;

void main() {
     FragColor = texture(s_texturePic, v_texcoord);
}
  •  Text-watermark vertex shader gl_draw_text_vert.glsl:
#version 320 es

layout (location = 0) in vec4 vertex; // <vec2 pos, vec2 tex>

out vec2 TexCoords;
uniform mat4 projection;

void main()
{
    gl_Position = projection * vec4(vertex.xy, 0.0, 1.0);
    TexCoords = vertex.zw;
}
  • Text-watermark fragment shader gl_draw_text_fragment.glsl:
#version 320 es
precision mediump float;

out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D text;
uniform vec3 textColor;

void main()
{
    vec4 sampled = vec4(1.0, 1.0, 1.0, texture(text, TexCoords).r);

    FragColor = vec4(textColor, 1.0) * sampled;
}

        2. Java code on the Android side:

        The paths of the required vertex/fragment shaders, the FreeType font and the watermark image are passed down to the C++ layer.

  • GLDrawTextSurfaceView:
package com.wangyongyao.glplay.view;

import android.content.Context;
import android.graphics.Point;
import android.util.AttributeSet;
import android.util.Log;
import android.util.Size;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

import androidx.annotation.NonNull;

import com.wangyongyao.glplay.OpenGLPlayCallJni;
import com.wangyongyao.glplay.camera.Camera2Helper2;
import com.wangyongyao.glplay.camera.GLCamera2Listener;
import com.wangyongyao.glplay.utils.OpenGLPlayFileUtils;

/**
 * author : wangyongyao https://github.com/wangyongyao1989
 * Create Time : 2024/11/6 16:24
 * Descibe : MyyFFmpeg com.wangyongyao.glplay.view
 */
public class GLDrawTextSurfaceView extends SurfaceView implements SurfaceHolder.Callback, GLCamera2Listener {
    private static String TAG = GLDrawTextSurfaceView.class.getSimpleName();
    private OpenGLPlayCallJni mJniCall;
    private Context mContext;

    private int mWidth;
    private int mHeight;
    private Camera2Helper2 camera2Helper;
    private SurfaceHolder mHolder;
    private Surface mSurface;


    public GLDrawTextSurfaceView(Context context, OpenGLPlayCallJni jniCall) {
        super(context);
        mContext = context;
        mJniCall = jniCall;
        init();
    }

    public GLDrawTextSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        init();
    }


    @Override
    protected void onAttachedToWindow() {
        super.onAttachedToWindow();

    }

    private void init() {
        //Get the SurfaceHolder
        mHolder = getHolder();
        //Register the SurfaceHolder callback
        mHolder.addCallback(this);

        String fragPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "draw_text_video_play_frament.glsl");
        String vertexPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "draw_text_video_play_vert.glsl");
        String picFragPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "draw_pic_frament.glsl");
        mJniCall.glDrawTextSurfaceInit(0, vertexPath, fragPath, picFragPath);

        String textFragPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "gl_draw_text_fragment.glsl");
        String textVertexPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "gl_draw_text_vertex.glsl");
        String modelPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "arial.ttf");
        mJniCall.glTextSharderPath(textVertexPath, textFragPath, modelPath);

        String picPath = OpenGLPlayFileUtils.getModelFilePath(mContext
                , "yao.jpg");
        mJniCall.glDrawTextPicPath(picPath);

    }


    private void stopCameraPreview() {
        if (camera2Helper != null) {
            camera2Helper.stop();
        }
    }


    @Override
    public void surfaceCreated(@NonNull SurfaceHolder holder) {
        Log.e(TAG, "surfaceCreated");
        mSurface = holder.getSurface();
        if (mJniCall != null) {
            mJniCall.glDrawTextSurfaceCreated(mSurface, null);
        }
    }

    @Override
    public void surfaceChanged(@NonNull SurfaceHolder holder, int format, int width, int height) {
        Log.e(TAG, "onSurfaceChanged width:" + width + ",height" + height
                + "===surface:" + mSurface.toString());
        if (mJniCall != null) {
            mJniCall.glDrawTextSurfaceChanged(width, height);
        }
        mWidth = width;
        mHeight = height;
        startCameraPreview(width, height);
    }

    @Override
    public void surfaceDestroyed(@NonNull SurfaceHolder holder) {
        if (mJniCall != null) {
            mJniCall.glDrawTextSurfaceDestroy();
        }
    }


    private void startCameraPreview(int width, int height) {
        if (camera2Helper == null) {
            camera2Helper = new Camera2Helper2.Builder()
                    .cameraListener(this)
                    .specificCameraId(Camera2Helper2.CAMERA_ID_BACK)
                    .context(mContext)
                    .previewViewSize(new Point(width, height))
                    .rotation(90)
                    .build();
        }

        camera2Helper.start();
    }


    @Override
    public void onPreviewFrame(byte[] yuvData, int width, int height) {
//        Log.e(TAG, "onPreviewFrame" );
        if (mJniCall != null) {
            mJniCall.glDrawTextSurfaceDraw(yuvData, width, height, 90);
            mJniCall.glDrawTextSurfaceRender();
        }
    }

    @Override
    public void onCameraOpened(Size previewSize, int displayOrientation) {
        Log.e(TAG, "onCameraOpened previewSize:" + previewSize.toString()
                + "==displayOrientation:" + displayOrientation);
    }

    @Override
    public void onCameraClosed() {
        Log.e(TAG, "onCameraClosed:");

    }

    @Override
    public void onCameraError(Exception e) {
        Log.e(TAG, "onCameraError:" + e.toString());

    }

    public void destroyRender() {
        mJniCall.glDrawTextSurfaceDestroy();
        stopCameraPreview();
    }

    public void startRecord(String path) {
        mJniCall.glDrawTextSurfaceStartRecord(path);
    }

    public void stopRecord() {
        mJniCall.glDrawTextSurfaceStopRecord();
    }


}
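
        For context, a hypothetical host Activity (not part of the repository; the class name and record path below are placeholders) could wire the view and the JNI wrapper together roughly like this:

import android.app.Activity;
import android.os.Bundle;

import com.wangyongyao.glplay.OpenGLPlayCallJni;
import com.wangyongyao.glplay.view.GLDrawTextSurfaceView;

// Hypothetical host Activity, sketched only to show how the public API above is used.
public class DrawTextDemoActivity extends Activity {

    private OpenGLPlayCallJni mJniCall;
    private GLDrawTextSurfaceView mGLView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mJniCall = new OpenGLPlayCallJni();
        mGLView = new GLDrawTextSurfaceView(this, mJniCall);
        setContentView(mGLView);
    }

    // e.g. bound to start/stop record buttons
    public void startRecording() {
        mGLView.startRecord(getExternalFilesDir(null) + "/draw_text_watermark.mp4");
    }

    public void stopRecording() {
        mGLView.stopRecord();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        mGLView.destroyRender();
    }
}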
  •  The native method declarations and loading code in OpenGLPlayCallJni.java:
package com.wangyongyao.glplay;

import android.content.res.AssetManager;
import android.view.Surface;

/**
 * author : wangyongyao https://github.com/wangyongyao1989
 * Create Time : 2024/9/3 14:17
 * Descibe : MyyFFmpeg com.wangyongyao.openglplay
 */
public class OpenGLPlayCallJni {

    static {
        System.loadLibrary("glplay");
    }



    /*********************** GL: draw text over video and record ********************/
    public void glDrawTextSurfaceInit(int type, String vertexPath, String fragPath, String fragPath1) {
        native_draw_text_surface_init(type, vertexPath, fragPath, fragPath1);
    }

    public void glTextSharderPath(String vertexPath, String fragPath,String freeTypePath) {
        native_text_sharder_path(vertexPath, fragPath,freeTypePath);
    }

    public void glDrawTextPicPath(String vertexPath) {
        native_draw_text_pic_path(vertexPath);
    }

    public void glDrawTextSurfaceCreated(Surface surface, AssetManager assetManager) {
        native_draw_text_surface_created(surface, assetManager);
    }

    public void glDrawTextSurfaceChanged(int width, int height) {
        native_draw_text_surface_changed(width, height);
    }

    public void glDrawTextSurfaceRender() {
        native_draw_text_surface_render();
    }

    public void glDrawTextSurfaceDraw(byte[] data, int width, int height, int rotation) {
        native_draw_text_surface_draw(data, width, height, rotation);
    }

    public void glDrawTextSurfaceDestroy() {
        native_draw_text_surface_destroy();
    }

    public void glDrawTextSurfaceStartRecord(String recordPath) {
        native_draw_text_surface_start_record(recordPath);
    }

    public void glDrawTextSurfaceStopRecord() {
        native_draw_text_surface_stop_record();
    }


    private native void native_draw_text_surface_init(int type, String vertexPath, String fragPath, String fragPath1);

    private native void native_text_sharder_path(String vertexPath, String fragPath,String freeTypePath);

    private native void native_draw_text_pic_path(String picPath);

    private native void native_draw_text_surface_created(Surface surface, AssetManager assetManager);

    private native void native_draw_text_surface_changed(int width, int height);

    private native void native_draw_text_surface_render();

    private native void native_draw_text_surface_draw(byte[] data, int width, int height, int rotation);

    private native void native_draw_text_surface_destroy();

    private native void native_draw_text_surface_start_record(String recordPath);

    private native void native_draw_text_surface_stop_record();

}

        3. JNI layer code:

#include <jni.h>
#include <string>
#include <android/log.h>
#include "includeopengl/OpenglesFlashLight.h"
#include "includeopengl/OpenglesCameraPre.h"
#include "includeopengl/OpenGLShader.h"
#include "includeopengl/OpenGLCamera3D.h"
#include <android/native_window_jni.h>
#include <android/asset_manager_jni.h>
#include "GLDrawTextVideoRender.h"

#define LOG_TAG "wy"
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO,LOG_TAG,__VA_ARGS__)
#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, LOG_TAG, __VA_ARGS__)
#define LOGW(...) __android_log_print(ANDROID_LOG_WARN, LOG_TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR,LOG_TAG,__VA_ARGS__)
using namespace std;
//Package + class name string:
const char *java_call_jni_class = "com/wangyongyao/glplay/OpenGLPlayCallJni";

GLDrawTextVideoRender *gLDrawTextVideoRender;



/*********************** OpenGL SurfaceViewNew: display video and record *******************/
extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_init(JNIEnv *env, jobject thiz, jint type,
                               jstring vertex,
                               jstring frag) {
    const char *vertexPath = env->GetStringUTFChars(vertex, nullptr);
    const char *fragPath = env->GetStringUTFChars(frag, nullptr);
    if (eglsurfaceViewRender == nullptr)
        eglsurfaceViewRender = new EGLSurfaceViewVideoRender();

    eglsurfaceViewRender->setSharderPath(vertexPath, fragPath);

    env->ReleaseStringUTFChars(vertex, vertexPath);
    env->ReleaseStringUTFChars(frag, fragPath);
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_created(JNIEnv *env, jobject thiz,
                                  jobject surface,
                                  jobject assetManager) {
    if (eglsurfaceViewRender != nullptr) {
        ANativeWindow *window = surface ? ANativeWindow_fromSurface(env, surface) : nullptr;
        auto *aAssetManager = assetManager ? AAssetManager_fromJava(env, assetManager) : nullptr;
        eglsurfaceViewRender->surfaceCreated(window, aAssetManager);
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_changed(JNIEnv *env, jobject thiz,
                                  jint width,
                                  jint height) {
    if (eglsurfaceViewRender != nullptr) {
        eglsurfaceViewRender->surfaceChanged((size_t) width, (size_t) height);
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_render(JNIEnv *env, jobject thiz) {
    if (eglsurfaceViewRender != nullptr) {
        eglsurfaceViewRender->render();
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_draw(JNIEnv *env, jobject obj, jbyteArray data, jint width, jint height,
                               jint rotation) {
    jbyte *bufferPtr = env->GetByteArrayElements(data, nullptr);
    jsize arrayLength = env->GetArrayLength(data);

    if (eglsurfaceViewRender != nullptr) {

        eglsurfaceViewRender->draw((uint8_t *) bufferPtr, (size_t) arrayLength, (size_t) width,
                                   (size_t) height,
                                   rotation);
    }

    env->ReleaseByteArrayElements(data, bufferPtr, 0);
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_destroy(JNIEnv *env, jobject thiz) {
    if (eglsurfaceViewRender != nullptr)
        eglsurfaceViewRender->release();
}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_start_record(JNIEnv *env, jobject thiz, jstring recordPath) {

    if (eglsurfaceViewRender == nullptr) return;
    const char *path = env->GetStringUTFChars(recordPath, nullptr);
    eglsurfaceViewRender->startEncoder(path);
    env->ReleaseStringUTFChars(recordPath, path);

}

extern "C"
JNIEXPORT void JNICALL
cpp_surfaceview_new_video_stop_record(JNIEnv *env, jobject thiz) {
    if (eglsurfaceViewRender == nullptr) return;
    eglsurfaceViewRender->stopEncoder();
}

/*********************** OpenGL: draw watermark image and text over video and record *******************/
extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_init(JNIEnv *env, jobject thiz, jint type,
                           jstring vertex,
                           jstring frag, jstring frag1) {
    const char *vertexPath = env->GetStringUTFChars(vertex, nullptr);
    const char *fragPath = env->GetStringUTFChars(frag, nullptr);
    const char *fragPath1 = env->GetStringUTFChars(frag1, nullptr);

    if (gLDrawTextVideoRender == nullptr)
        gLDrawTextVideoRender = new GLDrawTextVideoRender();

    gLDrawTextVideoRender->setSharderPath(vertexPath, fragPath, fragPath1);

    env->ReleaseStringUTFChars(vertex, vertexPath);
    env->ReleaseStringUTFChars(frag, fragPath);
    env->ReleaseStringUTFChars(frag1, fragPath1);

}


extern "C"
JNIEXPORT void JNICALL
cpp_text_sharder_path(JNIEnv *env, jobject thiz,
                      jstring vertex,
                      jstring frag,
                      jstring freeType) {
    const char *vertexPath = env->GetStringUTFChars(vertex, nullptr);
    const char *fragPath = env->GetStringUTFChars(frag, nullptr);
    const char *freeTypePath = env->GetStringUTFChars(freeType, nullptr);

    if (gLDrawTextVideoRender == nullptr)
        gLDrawTextVideoRender = new GLDrawTextVideoRender();

    gLDrawTextVideoRender->setTextSharderPath(vertexPath, fragPath, freeTypePath);

    env->ReleaseStringUTFChars(vertex, vertexPath);
    env->ReleaseStringUTFChars(frag, fragPath);
    env->ReleaseStringUTFChars(freeType, freeTypePath);

}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_pic_path(JNIEnv *env, jobject thiz, jstring picPath) {
    const char *cPicPath = env->GetStringUTFChars(picPath, nullptr);
    if (gLDrawTextVideoRender == nullptr)
        gLDrawTextVideoRender = new GLDrawTextVideoRender();

    gLDrawTextVideoRender->setPicTextPath(cPicPath);

    env->ReleaseStringUTFChars(picPath, cPicPath);
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_created(JNIEnv *env, jobject thiz,
                              jobject surface,
                              jobject assetManager) {
    if (gLDrawTextVideoRender != nullptr) {
        ANativeWindow *window = surface ? ANativeWindow_fromSurface(env, surface) : nullptr;
        auto *aAssetManager = assetManager ? AAssetManager_fromJava(env, assetManager) : nullptr;
        gLDrawTextVideoRender->surfaceCreated(window, aAssetManager);
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_changed(JNIEnv *env, jobject thiz,
                              jint width,
                              jint height) {
    if (gLDrawTextVideoRender != nullptr) {
        gLDrawTextVideoRender->surfaceChanged((size_t) width, (size_t) height);
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_render(JNIEnv *env, jobject thiz) {
    if (gLDrawTextVideoRender != nullptr) {
        gLDrawTextVideoRender->render();
    }
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_draw(JNIEnv *env, jobject obj, jbyteArray data, jint width, jint height,
                           jint rotation) {
    jbyte *bufferPtr = env->GetByteArrayElements(data, nullptr);
    jsize arrayLength = env->GetArrayLength(data);

    if (gLDrawTextVideoRender != nullptr) {

        gLDrawTextVideoRender->draw((uint8_t *) bufferPtr, (size_t) arrayLength, (size_t) width,
                                    (size_t) height,
                                    rotation);
    }

    env->ReleaseByteArrayElements(data, bufferPtr, 0);
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_destroy(JNIEnv *env, jobject thiz) {
    if (gLDrawTextVideoRender != nullptr)
        gLDrawTextVideoRender->release();
}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_start_record(JNIEnv *env, jobject thiz, jstring recordPath) {

    if (gLDrawTextVideoRender == nullptr) return;
    const char *path = env->GetStringUTFChars(recordPath, nullptr);
    gLDrawTextVideoRender->startEncoder(path);
    env->ReleaseStringUTFChars(recordPath, path);

}

extern "C"
JNIEXPORT void JNICALL
cpp_draw_text_surface_stop_record(JNIEnv *env, jobject thiz) {
    if (gLDrawTextVideoRender == nullptr) return;
    gLDrawTextVideoRender->stopEncoder();
}


static const JNINativeMethod methods[] = {
  
        /*********************** OpenGL: draw watermark image and text over video and record *******************/

        {"native_draw_text_surface_init",               "(I"
                                                        "Ljava/lang/String;"
                                                        "Ljava/lang/String;"
                                                        "Ljava/lang/String;)V",  (void *) cpp_draw_text_surface_init},
        {"native_text_sharder_path",
                                                        "(Ljava/lang/String;"
                                                        "Ljava/lang/String;"
                                                        "Ljava/lang/String;)V",  (void *) cpp_text_sharder_path},

        {"native_draw_text_pic_path",                   "(Ljava/lang/String;)V", (void *) cpp_draw_text_pic_path},
        {"native_draw_text_surface_created",
                                                        "(Landroid/view/Surface;"
                                                        "Landroid/content/res"
                                                        "/AssetManager;)V",      (void *) cpp_draw_text_surface_created},

        {"native_draw_text_surface_changed",            "(II)V",                 (void *) cpp_draw_text_surface_changed},
        {"native_draw_text_surface_render",             "()V",                   (void *) cpp_draw_text_surface_render},
        {"native_draw_text_surface_draw",               "([BIII)V",              (void *) cpp_draw_text_surface_draw},
        {"native_draw_text_surface_destroy",            "()V",                   (void *) cpp_draw_text_surface_destroy},
        {"native_draw_text_surface_start_record",       "(Ljava/lang/String;)V", (void *) cpp_draw_text_surface_start_record},
        {"native_draw_text_surface_stop_record",        "()V",                   (void *) cpp_draw_text_surface_stop_record},

};

// JNI registration entry point
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved) {
    LOGD("动态注册");
    JNIEnv *env;
    if ((vm)->GetEnv((void **) &env, JNI_VERSION_1_6) != JNI_OK) {
        LOGD("动态注册GetEnv  fail");
        return JNI_ERR;
    }

    // 获取类引用
    jclass clazz = env->FindClass(java_call_jni_class);

    // Register the native methods
    jint regist_result = env->RegisterNatives(clazz, methods,
                                              sizeof(methods) / sizeof(methods[0]));
    if (regist_result) { // non-zero means registration failed
        LOGE("动态注册 fail regist_result = %d", regist_result);
    } else {
        LOGI("动态注册 success result = %d", regist_result);
    }
    return JNI_VERSION_1_6;
}

extern "C" void JNI_OnUnload(JavaVM *jvm, void *p) {
    JNIEnv *env = NULL;
    if (jvm->GetEnv((void **) (&env), JNI_VERSION_1_6) != JNI_OK) {
        return;
    }
    jclass clazz = env->FindClass(java_call_jni_class);
    if (env != NULL) {
        env->UnregisterNatives(clazz);
    }

}



        4. The GLDrawTextVideoRender header file:

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/11/6.
//

#ifndef MYYFFMPEG_GLDRAWTEXTVIDEORENDER_H
#define MYYFFMPEG_GLDRAWTEXTVIDEORENDER_H

#ifndef MYYFFMPEG_GLDrawTextVideoRender_H
#define MYYFFMPEG_GLDrawTextVideoRender_H

#endif //MYYFFMPEG_GLDrawTextVideoRender_H

#include <cstdint>
#include <memory>
#include <android/native_window.h>
#include <android/asset_manager.h>
#include "OpenGLShader.h"
#include <EGL/egl.h>
#include <GLES3/gl3.h>
#include "EglCore.h"
#include "WindowSurface.h"
#include "Looper.h"
#include "VideoEncoderCore.h"
#include "TextureMovieEncoder2.h"
#include <stb_image.h>

#include <map>
// FreeType
#include "../includeopengl/freetype_2_9_1/ft2build.h"
#include "../includeopengl/freetype_2_9_1/freetype/ftglyph.h"
#include FT_FREETYPE_H


/// Holds all state information relevant to a character as loaded using FreeType
struct Character {
    GLuint TextureID;   // ID handle of the glyph texture
    glm::ivec2 Size;    // Size of glyph
    glm::ivec2 Bearing;  // Offset from baseline to left/top of glyph
    GLuint Advance;    // Horizontal offset to advance to next glyph
};


enum {
    MSG_Draw_Text_SurfaceCreated,
    MSG_Draw_Text_SurfaceChanged,
    MSG_Draw_Text_DrawFrame,
    MSG_Draw_Text_SurfaceDestroyed,
};


struct draw_text_video_frame {
    size_t width;
    size_t height;
    size_t stride_y;
    size_t stride_uv;
    uint8_t *y;
    uint8_t *u;
    uint8_t *v;
};

// Vertices for a full screen quad.
static const float EGLTextVerticek[12] = {
        -1.f, 1.f, 0,
        -1.f, -1.f, 0,
        1.f, 1.f, 0,
        1.f, -1.f, 0
};

static const float EGLTextVerticek1[12] = {
        0.3f, -0.3f, 0,
        0.3f, -0.8f, 0,
        0.8f, -0.3f, 0,
        0.8f, -0.8f, 0
};

// Texture coordinates for mapping entire texture.
static const float EGLTextTextureCoord[12] = {
        0, 0, 0,
        0, 1, 0,
        1, 0, 0,
        1, 1, 0,
};

static const float EGLPicTextureCoord[12] = {
        0, 0, 0,
        0, 1, 0,
        1, 0, 0,
        1, 1, 0,
};

static const size_t BIT_RATE_DRAW_TEXT = 4000000;   // 4Mbps
static const size_t VIDEO_WIDTH_DRAW_TEXT = 1280;
static const size_t VIDEO_HEIGHT_DRAW_TEXT = 720;

class GLDrawTextVideoRender : public Looper {

public:
    GLDrawTextVideoRender();

    ~GLDrawTextVideoRender();

    void surfaceCreated(ANativeWindow *window, AAssetManager *assetManager);

    void surfaceChanged(size_t width, size_t height);

    void render();

    void updateFrame(const draw_text_video_frame &frame);

    void release();

    void draw(uint8_t *buffer, size_t length, size_t width, size_t height, float rotation);

    void setParameters(uint32_t params);

    uint32_t getParameters();

    bool
    setSharderPath(const char *vertexPath, const char *fragmentPath, const char *fragmentPath1);

    bool setTextSharderPath(const char *vertexPath, const char *fragmentPath,const char *freeTypePath);

    bool setPicTextPath(const char *picPath);

    bool setSharderStringPath(string vertexPath, string fragmentPath);

    void startEncoder(const char *recordPath);

    void stopEncoder();

private:

    void handleMessage(LooperMessage *msg);

    void OnSurfaceCreated();

    void OnSurfaceChanged(int w, int h);

    void OnDrawFrame();

    void OnSurfaceDestroyed();

    bool createYUVTextures();

    void creatPicTexture();     //create the image texture

    void bindTextVertexData();

    void RenderText(std::string text, GLfloat x, GLfloat y, GLfloat scale,
                    glm::vec3 color, glm::vec2 viewport);

    void LoadFacesByASCII(const char *path);

    bool updateYUVTextures();

    void bindPicTexture();

    void deleteYUVTextures();

    void deletePicTextures();

    int createYUVProgram();

    int createPicProgram();

    int createTextProgram();

    GLuint useYUVProgram();

    void usePicProgram();

    void printGLString(const char *name, GLenum s);

    void checkGlError(const char *op);


    void delete_program(GLuint &program);

    GLuint m_yuv_program = 0;
    GLuint m_pic_program = 0;
    GLuint m_text_program = 0;

    GLuint m_vertexShader = 0;
    GLuint m_pixelShader = 0;

    std::unique_ptr<uint8_t[]> m_pDataY = nullptr;

    uint8_t *m_pDataU = nullptr;
    uint8_t *m_pDataV = nullptr;

    __unused  size_t m_length = 0;
    size_t m_sizeY = 0;
    size_t m_sizeU = 0;
    size_t m_sizeV = 0;

    GLuint m_textureIdY = 0;
    GLuint m_textureIdU = 0;
    GLuint m_textureIdV = 0;

    GLuint m_yuv_vertexPos = 0;
    GLuint m_pic_vertexPos = 0;

    GLuint m_yuv_textureCoordLoc = 0;
    GLuint m_pic_textureCoordLoc = 0;

    GLint m_textureYLoc = 0;
    GLint m_textureULoc = 0;
    GLint m_textureVLoc = 0;
    GLuint m_texturePicLoc = 0;

    size_t m_width = 0;
    size_t m_height = 0;
    size_t m_backingWidth = 0;
    size_t m_backingHeight = 0;
    uint32_t m_params = 0;
    float m_rotation = 0;

    bool isDirty;

    OpenGLShader *yuvGLShader = nullptr;
    OpenGLShader *picGLShader = nullptr;
    OpenGLShader *textGLShader = nullptr;


    EGLDisplay display = nullptr;
    EGLSurface winsurface = nullptr;

    EglCore *m_EglCore = nullptr;
    WindowSurface *m_WindowSurface = nullptr;

    ANativeWindow *m_ANWindow = nullptr;

    VideoEncoderCore *m_VideoEncoderCore = nullptr;

    TextureMovieEncoder2 *m_TextureMovieEncoder2 = nullptr;
    WindowSurface *m_InputWindowSurface = nullptr;

    size_t offX;
    size_t offY;
    size_t off_right;
    size_t off_bottom;

    int picChannels;
    int picWidth, picHeight;
    unsigned char *picData;

    map<GLchar, Character> Characters;
    GLuint VAO = GL_NONE;
    GLuint VBO = GL_NONE;

    string freeTypePathString;

};


#endif //MYYFFMPEG_GLDRAWTEXTVIDEORENDER_H

        5. GLDrawTextVideoRender.cpp:

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/11/6.
//

#include "GLDrawTextVideoRender.h"
#include "OpenGLShader.h"

void GLDrawTextVideoRender::surfaceCreated(ANativeWindow *window, AAssetManager *assetManager) {
    m_ANWindow = window;
    postMessage(MSG_Draw_Text_SurfaceCreated, false);
}

void GLDrawTextVideoRender::surfaceChanged(size_t width, size_t height) {
    postMessage(MSG_Draw_Text_SurfaceChanged, width, height);
}

void GLDrawTextVideoRender::render() {
    postMessage(MSG_Draw_Text_DrawFrame, false);
}

void GLDrawTextVideoRender::release() {
    postMessage(MSG_Draw_Text_SurfaceDestroyed, false);
}

void GLDrawTextVideoRender::updateFrame(const draw_text_video_frame &frame) {
    m_sizeY = frame.width * frame.height;
    m_sizeU = frame.width * frame.height / 4;
    m_sizeV = frame.width * frame.height / 4;

    if (m_pDataY == nullptr || m_width != frame.width || m_height != frame.height) {
        m_pDataY = std::make_unique<uint8_t[]>(m_sizeY + m_sizeU + m_sizeV);
        m_pDataU = m_pDataY.get() + m_sizeY;
        m_pDataV = m_pDataU + m_sizeU;
    }

    m_width = frame.width;
    m_height = frame.height;

    if (m_width == frame.stride_y) {
        memcpy(m_pDataY.get(), frame.y, m_sizeY);
    } else {
        uint8_t *pSrcY = frame.y;
        uint8_t *pDstY = m_pDataY.get();

        for (int h = 0; h < m_height; h++) {
            memcpy(pDstY, pSrcY, m_width);

            pSrcY += frame.stride_y;
            pDstY += m_width;
        }
    }

    if (m_width / 2 == frame.stride_uv) {
        memcpy(m_pDataU, frame.u, m_sizeU);
        memcpy(m_pDataV, frame.v, m_sizeV);
    } else {
        uint8_t *pSrcU = frame.u;
        uint8_t *pSrcV = frame.v;
        uint8_t *pDstU = m_pDataU;
        uint8_t *pDstV = m_pDataV;

        for (int h = 0; h < m_height / 2; h++) {
            memcpy(pDstU, pSrcU, m_width / 2);
            memcpy(pDstV, pSrcV, m_width / 2);

            pDstU += m_width / 2;
            pDstV += m_width / 2;

            pSrcU += frame.stride_uv;
            pSrcV += frame.stride_uv;
        }
    }

    isDirty = true;
}

void
GLDrawTextVideoRender::draw(uint8_t *buffer, size_t length, size_t width, size_t height,
                            float rotation) {
    draw_text_video_frame frame{};
    frame.width = width;
    frame.height = height;
    frame.stride_y = width;
    frame.stride_uv = width / 2;
    frame.y = buffer;
    frame.u = buffer + width * height;
    frame.v = buffer + width * height * 5 / 4;

    updateFrame(frame);
}

void GLDrawTextVideoRender::setParameters(uint32_t params) {
    m_params = params;
}

uint32_t GLDrawTextVideoRender::getParameters() {
    return m_params;
}

bool GLDrawTextVideoRender::createYUVTextures() {
    auto widthY = (GLsizei) m_width;
    auto heightY = (GLsizei) m_height;

    glActiveTexture(GL_TEXTURE0);
    glGenTextures(1, &m_textureIdY);
    glBindTexture(GL_TEXTURE_2D, m_textureIdY);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthY, heightY, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdY) {
        LOGE("OpenGL Error Create Y texture");
        return false;
    }

    GLsizei widthU = (GLsizei) m_width / 2;
    GLsizei heightU = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE1);
    glGenTextures(1, &m_textureIdU);
    glBindTexture(GL_TEXTURE_2D, m_textureIdU);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthU, heightU, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdU) {
        LOGE("OpenGL Error Create U texture");
        return false;
    }

    GLsizei widthV = (GLsizei) m_width / 2;
    GLsizei heightV = (GLsizei) m_height / 2;

    glActiveTexture(GL_TEXTURE2);
    glGenTextures(1, &m_textureIdV);
    glBindTexture(GL_TEXTURE_2D, m_textureIdV);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, widthV, heightV, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE,
                 nullptr);

    if (!m_textureIdV) {
        LOGE("OpenGL Error Create V texture");
        return false;
    }

    return true;
}

bool GLDrawTextVideoRender::updateYUVTextures() {

    if (!m_textureIdY && !m_textureIdU && !m_textureIdV) return false;
//    LOGE("updateTextures m_textureIdY:%d,m_textureIdU:%d,m_textureIdV:%d,===isDirty:%d",
//         m_textureIdY,
//         m_textureIdU, m_textureIdV, isDirty);

    if (isDirty) {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, m_textureIdY);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width, (GLsizei) m_height, 0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataY.get());

        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, m_textureIdU);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width / 2, (GLsizei) m_height / 2,
                     0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataU);

        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_2D, m_textureIdV);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei) m_width / 2, (GLsizei) m_height / 2,
                     0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, m_pDataV);

        isDirty = false;
        return true;
    }

    return false;
}

void GLDrawTextVideoRender::bindPicTexture() {
    if (m_texturePicLoc) {
        // bind Texture
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, m_texturePicLoc);
    }
}

int
GLDrawTextVideoRender::createPicProgram() {
    //Create the image-watermark shader program
    m_pic_program = picGLShader->createProgram();
    if (!m_pic_program) {
        LOGE("Could not create m_pic_program.");
        return 0;
    }
    m_pic_vertexPos = (GLuint) glGetAttribLocation(m_pic_program, "position");
    m_pic_textureCoordLoc = (GLuint) glGetAttribLocation(m_pic_program, "texcoord");
    m_texturePicLoc = (GLuint) glGetUniformLocation(m_pic_program, "s_texturePic");
    return m_pic_program;
}

int
GLDrawTextVideoRender::createTextProgram() {
    //Create the text-watermark shader program
    m_text_program = textGLShader->createProgram();
    if (!m_text_program) {
        LOGE("Could not create m_text_program.");
        return 0;
    }
    return m_text_program;
}

int
GLDrawTextVideoRender::createYUVProgram() {

    //Create the YUV video shader program
    m_yuv_program = yuvGLShader->createProgram();
    m_vertexShader = yuvGLShader->vertexShader;
    m_pixelShader = yuvGLShader->fraShader;
    LOGI("GLDrawTextVideoRender createProgram m_yuv_program:%d", m_yuv_program);

    if (!m_yuv_program) {
        LOGE("Could not create program.");
        return 0;
    }

    //Get Uniform Variables Location
    m_yuv_vertexPos = (GLuint) glGetAttribLocation(m_yuv_program, "position");
    m_textureYLoc = glGetUniformLocation(m_yuv_program, "s_textureY");
    m_textureULoc = glGetUniformLocation(m_yuv_program, "s_textureU");
    m_textureVLoc = glGetUniformLocation(m_yuv_program, "s_textureV");
    m_yuv_textureCoordLoc = (GLuint) glGetAttribLocation(m_yuv_program, "texcoord");
    return m_yuv_program;
}


GLuint GLDrawTextVideoRender::useYUVProgram() {
    if (!m_yuv_program && !createYUVProgram()) {
        LOGE("Could not use program.");
        return 0;
    }
    glUseProgram(m_yuv_program);
    glVertexAttribPointer(m_yuv_vertexPos, 3, GL_FLOAT, GL_FALSE, 0, EGLTextVerticek);
    glEnableVertexAttribArray(m_yuv_vertexPos);

    glUniform1i(m_textureYLoc, 0);
    glUniform1i(m_textureULoc, 1);
    glUniform1i(m_textureVLoc, 2);
    glVertexAttribPointer(m_yuv_textureCoordLoc, 3, GL_FLOAT, GL_FALSE, 0, EGLTextTextureCoord);
    glEnableVertexAttribArray(m_yuv_textureCoordLoc);
    return m_yuv_program;
}


void GLDrawTextVideoRender::usePicProgram() {
    glUseProgram(m_pic_program);
    glVertexAttribPointer(m_pic_vertexPos, 3, GL_FLOAT, GL_FALSE, 0, EGLTextVerticek1);
    glEnableVertexAttribArray(m_pic_vertexPos);

    glUniform1i(m_texturePicLoc, 3);
    glVertexAttribPointer(m_pic_textureCoordLoc, 3, GL_FLOAT, GL_FALSE, 0, EGLTextTextureCoord);
    glEnableVertexAttribArray(m_pic_textureCoordLoc);
}


bool
GLDrawTextVideoRender::setSharderPath(const char *vertexPath, const char *fragmentPath,
                                      const char *fragmentPath1) {
    yuvGLShader->getSharderPath(vertexPath, fragmentPath);
    picGLShader->getSharderStringPath(vertexPath, fragmentPath1);
    return 0;
}

bool
GLDrawTextVideoRender::setTextSharderPath(const char *vertexPath, const char *fragmentPath,
                                          const char *freeTypePath) {
    textGLShader->getSharderPath(vertexPath, fragmentPath);
    freeTypePathString = freeTypePath;
    return 0;
}

bool
GLDrawTextVideoRender::setPicTextPath(const char *picPath) {
    LOGI("setPicTextPath picPath:%s", picPath);
    picData = stbi_load(picPath, &picWidth, &picHeight, &picChannels, 0);
    return 0;
}

bool GLDrawTextVideoRender::setSharderStringPath(string vertexPath, string fragmentPath) {
    yuvGLShader->getSharderStringPath(vertexPath, fragmentPath);
    return 0;
}

GLDrawTextVideoRender::GLDrawTextVideoRender() {
    yuvGLShader = new OpenGLShader();
    picGLShader = new OpenGLShader();
    textGLShader = new OpenGLShader();
}

GLDrawTextVideoRender::~GLDrawTextVideoRender() {
    deleteYUVTextures();
    deletePicTextures();
    delete_program(m_yuv_program);
    delete_program(m_pic_program);
    m_vertexShader = 0;
    m_pixelShader = 0;
    if (m_pDataY) {
        m_pDataY = nullptr;
    }
    if (m_pDataU) {
        // m_pDataU points into the buffer owned by m_pDataY; do not delete it separately
        m_pDataU = nullptr;
    }
    if (m_pDataV) {
        // m_pDataV likewise points into m_pDataY's buffer
        m_pDataV = nullptr;
    }

    if (yuvGLShader) {
        delete yuvGLShader;
        yuvGLShader = nullptr;
    }

    if (picGLShader) {
        delete picGLShader;
        picGLShader = nullptr;
    }

    if (textGLShader) {
        delete textGLShader;
        textGLShader = nullptr;
    }

    if (display) {
        display = nullptr;
    }

    if (winsurface) {
        winsurface = nullptr;
    }

    if (m_EglCore) {
        delete m_EglCore;
        m_EglCore = nullptr;
    }

    if (m_WindowSurface) {
        delete m_WindowSurface;
        m_WindowSurface = nullptr;
    }
    quit();

    if (picData) {
        picData = nullptr;
    }

    if (VAO) {
        glDeleteVertexArrays(1, &VAO);

    }
    if (VBO) {
        glDeleteBuffers(1, &VBO);
    }

    std::map<GLchar, Character>::const_iterator iter;
    for (iter = Characters.begin(); iter != Characters.end(); iter++) {
        glDeleteTextures(1, &Characters[iter->first].TextureID);
    }

}

void GLDrawTextVideoRender::delete_program(GLuint &program) {
    if (program) {
        glUseProgram(0);
        glDeleteProgram(program);
        program = 0;
    }
}

void GLDrawTextVideoRender::deleteYUVTextures() {
    if (m_textureIdY) {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, 0);
        glDeleteTextures(1, &m_textureIdY);

        m_textureIdY = 0;
    }

    if (m_textureIdU) {
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, 0);
        glDeleteTextures(1, &m_textureIdU);

        m_textureIdU = 0;
    }

    if (m_textureIdV) {
        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_2D, 0);
        glDeleteTextures(1, &m_textureIdV);
        m_textureIdV = 0;
    }
}

void GLDrawTextVideoRender::deletePicTextures() {
    if (m_texturePicLoc) {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, 0);
        glDeleteTextures(1, &m_texturePicLoc);
        m_texturePicLoc = 0;
    }
}


void GLDrawTextVideoRender::handleMessage(LooperMessage *msg) {
    Looper::handleMessage(msg);
    switch (msg->what) {
        case MSG_Draw_Text_SurfaceCreated: {
            OnSurfaceCreated();
        }
            break;
        case MSG_Draw_Text_SurfaceChanged:
            OnSurfaceChanged(msg->arg1, msg->arg2);
            break;
        case MSG_Draw_Text_DrawFrame:
            OnDrawFrame();
            break;
        case MSG_Draw_Text_SurfaceDestroyed:
            OnSurfaceDestroyed();
            break;
        default:
            break;
    }
}

void GLDrawTextVideoRender::OnSurfaceCreated() {
    m_EglCore = new EglCore(eglGetCurrentContext(), FLAG_RECORDABLE);
    if (!m_EglCore) {
        LOGE("new EglCore failed!");
        return;
    }

    LOGE("OnSurfaceCreated m_ANWindow:%p", m_ANWindow);

    m_WindowSurface = new WindowSurface(m_EglCore, m_ANWindow);
    if (!m_WindowSurface) {
        LOGE("new WindowSurface failed!");
        return;
    }
    m_WindowSurface->makeCurrent();
}

void GLDrawTextVideoRender::OnSurfaceChanged(int w, int h) {
    m_backingWidth = w;
    m_backingHeight = h;
    LOGE("OnSurfaceChanged m_backingWidth:%d,m_backingHeight:%d", m_backingWidth, m_backingHeight);
    float windowAspect = (float) m_backingHeight / (float) m_backingWidth;
    size_t outWidth, outHeight;
    if (VIDEO_HEIGHT_DRAW_TEXT > VIDEO_WIDTH_DRAW_TEXT * windowAspect) {
        // limited by narrow width; reduce height
        outWidth = VIDEO_WIDTH_DRAW_TEXT;
        outHeight = (int) (VIDEO_WIDTH_DRAW_TEXT * windowAspect);
    } else {
        // limited by short height; restrict width
        outHeight = VIDEO_HEIGHT_DRAW_TEXT;
        outWidth = (int) (VIDEO_HEIGHT_DRAW_TEXT / windowAspect);
    }
    LOGE(" outWidth:%d,outHeight:%d", outWidth, outHeight);

    offX = (VIDEO_WIDTH_DRAW_TEXT - outWidth) / 2;
    offY = (VIDEO_HEIGHT_DRAW_TEXT - outHeight) / 2;
    off_right = offX + outWidth;
    off_bottom = offY + outHeight;
    //Adjusting window 1920x1104 to +14,+0 1252x720
    LOGE("Adjusting window offX:%d,offY:%d,off_right:%d,off_bottom:%d", offX, offY, off_right,
         off_bottom);
    // Set OpenGL options
    glEnable(GL_CULL_FACE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    useYUVProgram();
    createPicProgram();
    createTextProgram();

    createYUVTextures();
    creatPicTexture();

    const char *freeTypePath = freeTypePathString.data();
    LoadFacesByASCII(freeTypePath);
    bindTextVertexData();
}
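
// Worked example of the letterbox math above (an added note; it assumes the encoder
// target VIDEO_WIDTH_DRAW_TEXT x VIDEO_HEIGHT_DRAW_TEXT is 1280x720, which matches the
// "Adjusting window 1920x1104 to +14,+0 1252x720" log example):
//   windowAspect = 1104 / 1920 = 0.575
//   720 > 1280 * 0.575 (= 736) is false  ->  outHeight = 720, outWidth = 720 / 0.575 = 1252
//   offX = (1280 - 1252) / 2 = 14, offY = (720 - 720) / 2 = 0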

void GLDrawTextVideoRender::OnDrawFrame() {
    // Set the clear color before clearing so the very first frame is cleared to white as intended.
    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Draw the YUV video frame textures.
    if (!updateYUVTextures() || !useYUVProgram()) return;
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    // Draw the picture-watermark texture.
    bindPicTexture();
    usePicProgram();
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Draw the text watermark.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // disable the byte-alignment restriction
    glEnable(GL_BLEND);
    //glEnable(GL_CULL_FACE);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    checkGlError("glBlendFunc");
    glm::vec2 viewport(m_backingWidth, m_backingHeight);
    RenderText("https://blog.csdn.net/wangyongyao1989", 300.0f, 500.0f, 1.0f,
               glm::vec3(0.8f, 0.1f, 0.1f), viewport);


//    LOGE("OnDrawFrame thread:%ld", pthread_self());
    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2->frameAvailableSoon();
    }
    if (m_InputWindowSurface != nullptr) {
        // Read from the on-screen surface and draw into the encoder's input surface,
        // blitting the frame into the letterboxed region computed in OnSurfaceChanged().
        m_InputWindowSurface->makeCurrentReadFrom(*m_WindowSurface);
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        checkGlError("before glBlitFramebuffer");
        glBlitFramebuffer(0, 0, m_backingWidth, m_backingHeight, offX, offY, off_right, off_bottom,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
        m_InputWindowSurface->swapBuffers();
    }

    // Switch back to the on-screen m_WindowSurface and present it.
    m_WindowSurface->makeCurrent();
    m_WindowSurface->swapBuffers();

}

void GLDrawTextVideoRender::OnSurfaceDestroyed() {
    m_vertexShader = 0;
    m_pixelShader = 0;

    if (m_pDataY) {
        m_pDataY = nullptr;
    }

    if (m_pDataU) {
        m_pDataU = nullptr;
    }

    if (m_pDataV) {
        m_pDataV = nullptr;
    }

    if (yuvGLShader) {
        delete yuvGLShader;
        yuvGLShader = nullptr;
    }

    if (display) {
        display = nullptr;
    }

    if (winsurface) {
        winsurface = nullptr;
    }

    if (m_EglCore) {
        delete m_EglCore;
        m_EglCore = nullptr;
    }

    if (m_WindowSurface) {
        delete m_WindowSurface;
        m_WindowSurface = nullptr;
    }

    quit();

}

void GLDrawTextVideoRender::printGLString(const char *name, GLenum s) {
    const char *v = (const char *) glGetString(s);
    LOGI("OpenGL %s = %s\n", name, v);
}

void GLDrawTextVideoRender::checkGlError(const char *op) {
    for (GLint error = glGetError(); error; error = glGetError()) {
        LOGI("after %s() glError (0x%x)\n", op, error);
    }
}

void GLDrawTextVideoRender::startEncoder(const char *recordPath) {
    LOGD("GLDrawTextVideoRender::startEncoder()");
    m_VideoEncoderCore = new VideoEncoderCore(VIDEO_WIDTH_DRAW_TEXT, VIDEO_HEIGHT_DRAW_TEXT,
                                              BIT_RATE_DRAW_TEXT, recordPath);
    m_InputWindowSurface = new WindowSurface(m_EglCore, m_VideoEncoderCore->getInputSurface());
    m_TextureMovieEncoder2 = new TextureMovieEncoder2(m_VideoEncoderCore);
}

void GLDrawTextVideoRender::stopEncoder() {
    LOGD("GLDrawTextVideoRender::stopEncoder()");
    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2->stopRecording();
    }
    if (m_VideoEncoderCore != nullptr) {
        m_VideoEncoderCore->release();
        m_VideoEncoderCore = nullptr;
    }
    if (m_InputWindowSurface != nullptr) {
        m_InputWindowSurface->release();
        m_InputWindowSurface = nullptr;
    }

    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2 = nullptr;
    }
}
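
// Usage sketch (an assumption about the calling side, which is not shown in this file):
// startEncoder()/stopEncoder() are expected to run on the GL thread that owns m_EglCore,
// because startEncoder() wraps the MediaCodec input surface in a WindowSurface bound to
// that EGL context. For example:
//     render->startEncoder("/sdcard/watermark_record.mp4"); // hypothetical output path
//     ... OnDrawFrame() then blits every rendered frame into m_InputWindowSurface ...
//     render->stopEncoder();                                // stop encoding and release the muxer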


void GLDrawTextVideoRender::creatPicTexture() {

    if (picData) {
        GLenum format = 0;
        if (picChannels == 1) {
            format = GL_RED;
        } else if (picChannels == 3) {
            format = GL_RGB;
        } else if (picChannels == 4) {
            format = GL_RGBA;
        }
        glGenTextures(1, &m_texturePicLoc);
        glBindTexture(GL_TEXTURE_2D, m_texturePicLoc);
        glTexImage2D(GL_TEXTURE_2D, 0, format, picWidth, picHeight, 0, format, GL_UNSIGNED_BYTE,
                     picData);
        glGenerateMipmap(GL_TEXTURE_2D);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        stbi_image_free(picData);
        picData = nullptr;
    } else {
        LOGE("creatPicTexture picData = (null)");
    }

    if (!m_texturePicLoc) {
        LOGE("creatPicTexture Error Create PIC texture");
    }

}

void GLDrawTextVideoRender::bindTextVertexData() {
    glm::vec2 viewport(m_backingWidth, m_backingHeight);
    glm::mat4 projection = glm::ortho(0.0f, static_cast<GLfloat>(m_backingWidth), 0.0f,
                                      static_cast<GLfloat>(m_backingHeight));
    textGLShader->use();
    textGLShader->setMat4("projection", projection);
    checkGlError("drawTextShader->setMat4(projection");


    // Configure VAO/VBO for texture quads
    glGenVertexArrays(1, &VAO);
    checkGlError("glGenVertexArrays(1, &VAO);");

    glGenBuffers(1, &VBO);
    glBindVertexArray(VAO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat) * 6 * 4, NULL, GL_DYNAMIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), 0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);
    checkGlError("glBindVertexArray");
}
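
// Note: the VBO above holds a single glyph quad (6 vertices * vec4), each vertex packing
// <x, y, s, t>; RenderText() refills it per character with glBufferSubData(), and the
// orthographic projection set on textGLShader maps those coordinates to screen pixels.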

void GLDrawTextVideoRender::RenderText(std::string text, GLfloat x, GLfloat y, GLfloat scale,
                                       glm::vec3 color, glm::vec2 viewport) {
    // Activate the corresponding render state
    textGLShader->use();
    checkGlError("drawTextShader->use()");
    textGLShader->setVec3("textColor", color.x, color.y, color.z);

    glActiveTexture(GL_TEXTURE0);
    checkGlError("glActiveTexture");
    glBindVertexArray(VAO);
    checkGlError("glBindVertexArray(VAO)");

    // Iterate over every character in the text
    std::string::const_iterator c;

    LOGE("RenderText x:%f == y:%f", x, y);
    LOGE("RenderText viewportX:%f == viewportY:%f", viewport.x, viewport.y);

    for (c = text.begin(); c != text.end(); c++) {
        Character ch = Characters[*c];

        GLfloat xpos = x + ch.Bearing.x * scale;
        GLfloat ypos = y - (ch.Size.y - ch.Bearing.y) * scale;

        GLfloat w = ch.Size.x * scale;
        GLfloat h = ch.Size.y * scale;

//        LOGE("TextRenderSample::RenderText [xpos,ypos,w,h]=[%f, %f, %f, %f], ch.advance >> 6 = %d"
//                , xpos, ypos, w, h, ch.Advance >> 6);

        // Update the VBO for this character
        GLfloat vertices[6][4] = {
                {xpos,     ypos + h, 0.0, 0.0},
                {xpos,     ypos,     0.0, 1.0},
                {xpos + w, ypos,     1.0, 1.0},

                {xpos,     ypos + h, 0.0, 0.0},
                {xpos + w, ypos,     1.0, 1.0},
                {xpos + w, ypos + h, 1.0, 0.0}
        };
        // Render the glyph texture over the quad
        glBindTexture(GL_TEXTURE_2D, ch.TextureID);
        checkGlError("glBindTexture");
        // Update the contents of the VBO
        glBindBuffer(GL_ARRAY_BUFFER, VBO);
        checkGlError("glBindBuffer");
        // Be sure to use glBufferSubData and not glBufferData
        glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(vertices), vertices);
        checkGlError("glBufferSubData");
        glBindBuffer(GL_ARRAY_BUFFER, 0);
        checkGlError("glBindBuffer(GL_ARRAY_BUFFER");
        // Draw the quad
        glDrawArrays(GL_TRIANGLES, 0, 6);
        checkGlError("glDrawArrays(GL_TRIANGLES");

        // Advance the cursor to the origin of the next glyph; the advance is in 1/64-pixel units,
        // so shift right by 6 to convert it to pixels (2^6 = 64).
        x += (ch.Advance >> 6) * scale;
    }
    glBindVertexArray(0);
    glBindTexture(GL_TEXTURE_2D, 0);
}


void GLDrawTextVideoRender::LoadFacesByASCII(const char *path) {
    LOGE("GLDrawTextVideoRender::LoadFacesByASCII path:%s", path);
    // FreeType
    FT_Library ft;
    // All functions return a value different than 0 whenever an error occurred
    if (FT_Init_FreeType(&ft))
        LOGE("ERROR::FREETYPE: Could not init FreeType Library");

    // Load font as face
    FT_Face face;
    if (FT_New_Face(ft, path, 0, &face))
        LOGE("ERROR::FREETYPE: Failed to load font");

    // Set size to load glyphs as
    FT_Set_Pixel_Sizes(face, 0, 48);

    // Disable byte-alignment restriction
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    // Load first 128 characters of ASCII set
    for (GLubyte c = 0; c < 128; c++) {
        // Load character glyph
        if (FT_Load_Char(face, c, FT_LOAD_RENDER)) {
            LOGE("ERROR::FREETYTPE: Failed to load Glyph");
            continue;
        }
        // Generate texture
        GLuint texture;
        glGenTextures(1, &texture);
        checkGlError("LoadFacesByASCII glGenTextures");
//        LOGE("fore c= %d",c);
//        LOGE("texture === %d",texture);

        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(
                GL_TEXTURE_2D,
                0,
                // GL_LUMINANCE is used here (instead of GL_RED) because of:
                // https://stackoverflow.com/questions/70285879/android12-opengles3-0-glteximage2d-0x502-error
                GL_LUMINANCE,
                face->glyph->bitmap.width,
                face->glyph->bitmap.rows,
                0,
                GL_LUMINANCE,
                GL_UNSIGNED_BYTE,
                face->glyph->bitmap.buffer
        );
        // Set texture options
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // Now store character for later use
        Character character = {
                texture,
                glm::ivec2(face->glyph->bitmap.width, face->glyph->bitmap.rows),
                glm::ivec2(face->glyph->bitmap_left, face->glyph->bitmap_top),
                static_cast<GLuint>(face->glyph->advance.x)
        };
        Characters.insert(std::pair<GLchar, Character>(c, character));
    }
    glBindTexture(GL_TEXTURE_2D, 0);
    checkGlError("glBindTexture(GL_TEXTURE_2D, 0);");

    // Destroy FreeType once we're finished
    FT_Done_Face(face);
    FT_Done_FreeType(ft);
    LOGE("FT_Done_FreeType");

}
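
To make the glyph-layout arithmetic in RenderText easier to follow on its own, here is a minimal, self-contained sketch of the same pen-advance loop with the OpenGL calls removed. The glyph metrics in it are made-up placeholder values rather than anything taken from the project; the only real convention it relies on is that FreeType stores the advance in 1/64-pixel units, which is why RenderText shifts it right by 6.

#include <cstdio>
#include <map>
#include <string>

// Stripped-down stand-in for the Character struct used by RenderText(),
// with invented metric values purely for illustration.
struct GlyphMetrics {
    int sizeX, sizeY;        // bitmap size in pixels
    int bearingX, bearingY;  // offsets from the pen origin / baseline in pixels
    unsigned int advance;    // horizontal advance in 1/64-pixel units (FreeType convention)
};

int main() {
    std::map<char, GlyphMetrics> glyphs = {
            {'H', {30, 34, 2, 34, 34u << 6}},
            {'i', {8,  36, 2, 36, 12u << 6}},
    };
    float x = 300.0f, y = 500.0f, scale = 1.0f;  // same pen origin as the watermark text
    for (char c: std::string("Hi")) {
        const GlyphMetrics &g = glyphs[c];
        float xpos = x + g.bearingX * scale;
        float ypos = y - (g.sizeY - g.bearingY) * scale;  // drop below the baseline for descenders
        printf("'%c' quad at (%.0f, %.0f), size %dx%d\n", c, xpos, ypos, g.sizeX, g.sizeY);
        x += (g.advance >> 6) * scale;                    // advance the pen in whole pixels
    }
    return 0;
}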


五、GitHub project address:

All of the code above is available at GitHub - wangyongyao1989/WyFFmpeg: 音视频相关基础实现. Writing code takes real effort, so if you find this useful, please give the project a star. Thanks 🙏.


Original article: https://blog.csdn.net/wangyongyao1989/article/details/143776795
