OpenGL Texture C++ Camera Filter Video Recording

      I. Preface:

        GitHub repository: GitHub - wangyongyao1989/WyFFmpeg: basic audio/video implementations

        Articles in this series:

        1.  OpenGL Texture C++ Camera video preview

        2.  OpenGL Texture C++ Camera Filter effects;

        3.  OpenGL custom SurfaceView Texture C++ Camera video preview;

        4.  OpenGL Texture C++ Camera Filter video recording;

        Result:

Demo video: Camera filter video recording

        II. Background for filter video recording:

  •  Article 1 in this series, OpenGL Texture C++ Camera video preview, builds an OpenGL environment on GLSurfaceView and uploads the Camera's YUV data into the three textures created for the fragment shader to display the video image.
  •  Article 2, OpenGL Texture C++ Camera Filter effects, optimizes the render pass of article 1 by passing in fragment shader programs for different filter types, so the Camera preview can switch between filters.
  •  Article 3, OpenGL custom SurfaceView Texture C++ Camera video preview, replaces article 1's GLSurfaceView with a custom GLSurfaceView-like class that creates an OpenGL environment on its own GLThread, again uploading the Camera's YUV data into three fragment-shader textures for display.
  •  This article, 4. OpenGL Texture C++ Camera Filter video recording, builds on the code of articles 1/2/3 and the classes WindowSurface.java / EglCore.java / TextureMovieEncoder2.java / VideoEncoderCore.java from Google's open-source grafika project. It creates a recording surface, switches the EGL context between it and the display surface (eglMakeCurrent) and swaps buffers (eglSwapBuffers) for the recorded and displayed frames, and finally, inside VideoEncoderCore, drains the encoded data from MediaCodec and writes it via MediaMuxer into an MP4 file.

        III. Three off-screen rendering approaches on Android:

        On Android there are three ways to send the same OpenGL output to multiple targets at once (the screen and an encoder): drawing twice, using an FBO, and using BlitFramebuffer.

  •  Draw-twice: OpenGL/EGL renders each incoming video frame with the shader program and outputs the result to the SurfaceView's Surface for on-screen display. Then eglMakeCurrent switches the current Surface from the SurfaceView's Surface to MediaCodec's Surface, the shader program renders again, and the result goes to MediaCodec's Surface for encoding. In other words, the shader program runs twice, each pass feeding a different target Surface, hence the name.
  •  FBO: the scene is rendered once and the result saved into an FBO. A second shader program (one specialized for sampling the FBO texture) then renders the FBO contents to the screen; after switching Surfaces, the same second shader renders the FBO contents again, this time into MediaCodec's Surface for encoding. With the FBO approach the original scene is rendered only once and cached, and the cached result is re-rendered to each target, which improves efficiency.
  •  BlitFramebuffer: no FBO cache is used. As in the draw-twice method, the frame is first rendered into the current Surface, but not yet presented on screen, so the current Surface acts as a buffer. After switching so that MediaCodec's Surface becomes current, the glBlitFramebuffer API provided by OpenGL ES 3.0 copies the pixels from the original Surface into the current one, and eglSwapBuffers sends them to the encoder. The current Surface is then switched back to the SurfaceView's Surface and eglSwapBuffers is called again to present the frame on screen. The scene is rendered only once yet reaches multiple targets, making this the most efficient method.

        RecordFBOActivity.java in grafika provides implementations of all three approaches:

        The code is as follows:

private void doFrame(long timeStampNanos) {
            // If we're not keeping up 60fps -- maybe something in the system is busy, maybe
            // recording is too expensive, maybe the CPU frequency governor thinks we're
            // not doing and wants to drop the clock frequencies -- we need to drop frames
            // to catch up.  The "timeStampNanos" value is based on the system monotonic
            // clock, as is System.nanoTime(), so we can compare the values directly.
            //
            // Our clumsy collision detection isn't sophisticated enough to deal with large
            // time gaps, but it's nearly cost-free, so we go ahead and do the computation
            // either way.
            //
            // We can reduce the overhead of recording, as well as the size of the movie,
            // by recording at ~30fps instead of the display refresh rate.  As a quick hack
            // we just record every-other frame, using a "recorded previous" flag.

            update(timeStampNanos);

            long diff = System.nanoTime() - timeStampNanos;
            long max = mRefreshPeriodNanos - 2000000;   // if we're within 2ms, don't bother
            if (diff > max) {
                // too much, drop a frame
                Log.d(TAG, "diff is " + (diff / 1000000.0) + " ms, max " + (max / 1000000.0) +
                        ", skipping render");
                mRecordedPrevious = false;
                mPreviousWasDropped = true;
                mDroppedFrames++;
                return;
            }

            boolean swapResult;

            if (!mRecordingEnabled || mRecordedPrevious) {
                mRecordedPrevious = false;
                // Render the scene, swap back to front.
                draw();
                swapResult = mWindowSurface.swapBuffers();
            } else {
                mRecordedPrevious = true;

                // recording
                if (mRecordMethod == RECMETHOD_DRAW_TWICE) {
                    //Log.d(TAG, "MODE: draw 2x");
                    // Draw-twice method
                    // Draw for display, swap.
                    draw();
                    swapResult = mWindowSurface.swapBuffers();

                    // Draw for recording, swap.
                    mVideoEncoder.frameAvailableSoon();
                    mInputWindowSurface.makeCurrent();
                    // If we don't set the scissor rect, the glClear() we use to draw the
                    // light-grey background will draw outside the viewport and muck up our
                    // letterboxing.  Might be better if we disabled the test immediately after
                    // the glClear().  Of course, if we were clearing the frame background to
                    // black it wouldn't matter.
                    //
                    // We do still need to clear the pixels outside the scissor rect, of course,
                    // or we'll get garbage at the edges of the recording.  We can either clear
                    // the whole thing and accept that there will be a lot of overdraw, or we
                    // can issue multiple scissor/clear calls.  Some GPUs may have a special
                    // optimization for zeroing out the color buffer.
                    //
                    // For now, be lazy and zero the whole thing.  At some point we need to
                    // examine the performance here.
                    GLES20.glClearColor(0f, 0f, 0f, 1f);
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

                    GLES20.glViewport(mVideoRect.left, mVideoRect.top,
                            mVideoRect.width(), mVideoRect.height());
                    GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
                    GLES20.glScissor(mVideoRect.left, mVideoRect.top,
                            mVideoRect.width(), mVideoRect.height());
                    draw();
                    GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
                    mInputWindowSurface.setPresentationTime(timeStampNanos);
                    mInputWindowSurface.swapBuffers();

                    // Restore.
                    GLES20.glViewport(0, 0, mWindowSurface.getWidth(), mWindowSurface.getHeight());
                    mWindowSurface.makeCurrent();

                } else if (mEglCore.getGlVersion() >= 3 &&
                        mRecordMethod == RECMETHOD_BLIT_FRAMEBUFFER) {
                    //Log.d(TAG, "MODE: blitFramebuffer");
                    // Draw the frame, but don't swap it yet.
                    draw();
                    // BlitFramebuffer method
                    mVideoEncoder.frameAvailableSoon();
                    mInputWindowSurface.makeCurrentReadFrom(mWindowSurface);
                    // Clear the pixels we're not going to overwrite with the blit.  Once again,
                    // this is excessive -- we don't need to clear the entire screen.
                    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                    GlUtil.checkGlError("before glBlitFramebuffer");
                    Log.v(TAG, "glBlitFramebuffer: 0,0," + mWindowSurface.getWidth() + "," +
                            mWindowSurface.getHeight() + "  " + mVideoRect.left + "," +
                            mVideoRect.top + "," + mVideoRect.right + "," + mVideoRect.bottom +
                            "  COLOR_BUFFER GL_NEAREST");
                    GLES30.glBlitFramebuffer(
                            0, 0, mWindowSurface.getWidth(), mWindowSurface.getHeight(),
                            mVideoRect.left, mVideoRect.top, mVideoRect.right, mVideoRect.bottom,
                            GLES30.GL_COLOR_BUFFER_BIT, GLES30.GL_NEAREST);
                    int err;
                    if ((err = GLES30.glGetError()) != GLES30.GL_NO_ERROR) {
                        Log.w(TAG, "ERROR: glBlitFramebuffer failed: 0x" +
                                Integer.toHexString(err));
                    }
                    mInputWindowSurface.setPresentationTime(timeStampNanos);
                    mInputWindowSurface.swapBuffers();

                    // Now swap the display buffer.
                    mWindowSurface.makeCurrent();
                    swapResult = mWindowSurface.swapBuffers();

                } else {
                    //Log.d(TAG, "MODE: offscreen + blit 2x");
                    // FBO method
                    // Render offscreen.
                    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebuffer);
                    GlUtil.checkGlError("glBindFramebuffer");
                    draw();

                    // Blit to display.
                    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
                    GlUtil.checkGlError("glBindFramebuffer");
                    mFullScreen.drawFrame(mOffscreenTexture, mIdentityMatrix);
                    swapResult = mWindowSurface.swapBuffers();

                    // Blit to encoder.
                    mVideoEncoder.frameAvailableSoon();
                    mInputWindowSurface.makeCurrent();
                    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);    // again, only really need to
                    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);     //  clear pixels outside rect
                    GLES20.glViewport(mVideoRect.left, mVideoRect.top,
                            mVideoRect.width(), mVideoRect.height());
                    mFullScreen.drawFrame(mOffscreenTexture, mIdentityMatrix);
                    mInputWindowSurface.setPresentationTime(timeStampNanos);
                    mInputWindowSurface.swapBuffers();

                    // Restore previous values.
                    GLES20.glViewport(0, 0, mWindowSurface.getWidth(), mWindowSurface.getHeight());
                    mWindowSurface.makeCurrent();
                }
            }

            mPreviousWasDropped = false;

            if (!swapResult) {
                // This can happen if the Activity stops without waiting for us to halt.
                Log.w(TAG, "swapBuffers failed, killing renderer thread");
                shutdown();
                return;
            }

            // Update the FPS counter.
            //
            // Ideally we'd generate something approximate quickly to make the UI look
            // reasonable, then ease into longer sampling periods.
            final int NUM_FRAMES = 120;
            final long ONE_TRILLION = 1000000000000L;
            if (mFpsCountStartNanos == 0) {
                mFpsCountStartNanos = timeStampNanos;
                mFpsCountFrame = 0;
            } else {
                mFpsCountFrame++;
                if (mFpsCountFrame == NUM_FRAMES) {
                    // compute thousands of frames per second
                    long elapsed = timeStampNanos - mFpsCountStartNanos;
                    mActivityHandler.sendFpsUpdate((int)(NUM_FRAMES * ONE_TRILLION / elapsed),
                            mDroppedFrames);

                    // reset
                    mFpsCountStartNanos = timeStampNanos;
                    mFpsCountFrame = 0;
                }
            }
        }

        IV. MP4 video recording:

        VideoEncoderCore in grafika configures MediaCodec / MediaMuxer / MediaCodec.BufferInfo with the parameters recording requires, then creates an input Surface and wraps it in a WindowSurface to produce mInputWindowSurface. The draw-twice / FBO / BlitFramebuffer paths from section III then use this mInputWindowSurface for the EGL context switch (eglMakeCurrent) and buffer swap (eglSwapBuffers).

  •  Configuring MediaCodec / MediaMuxer / MediaCodec.BufferInfo with the parameters recording requires:
/**
     * Configures encoder and muxer state, and prepares the input Surface.
     */
    public VideoEncoderCore(int width, int height, int bitRate, File outputFile)
            throws IOException {
        mBufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);

        // Set some properties.  Failing to specify some of these can cause the MediaCodec
        // configure() call to throw an unhelpful exception.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
        if (VERBOSE) Log.d(TAG, "format: " + format);

        // Create a MediaCodec encoder, and configure it with our format.  Get a Surface
        // we can use for input and wrap it with a class that handles the EGL work.
        mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
        mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mInputSurface = mEncoder.createInputSurface();
        mEncoder.start();

        // Create a MediaMuxer.  We can't add the video track and start() the muxer here,
        // because our MediaFormat doesn't have the Magic Goodies.  These can only be
        // obtained from the encoder after it has started processing data.
        //
        // We're not actually interested in multiplexing audio.  We just want to convert
        // the raw H.264 elementary stream we get from MediaCodec into a .mp4 file.
        mMuxer = new MediaMuxer(outputFile.toString(),
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        mTrackIndex = -1;
        mMuxerStarted = false;
    }
  •  Draining the encoded data from MediaCodec and writing it out with MediaMuxer:
/**
     * Extracts all pending data from the encoder and forwards it to the muxer.
     * <p>
     * If endOfStream is not set, this returns when there is no more data to drain.  If it
     * is set, we send EOS to the encoder, and then iterate until we see EOS on the output.
     * Calling this with endOfStream set should be done once, right before stopping the muxer.
     * <p>
     * We're just using the muxer to get a .mp4 file (instead of a raw H.264 stream).  We're
     * not recording audio.
     */
    public void drainEncoder(boolean endOfStream) {
        final int TIMEOUT_USEC = 10000;
        if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

        if (endOfStream) {
            if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
            mEncoder.signalEndOfInputStream();
        }

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (!endOfStream) {
                    break;      // out of while
                } else {
                    if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // should happen before receiving buffers, and should only happen once
                if (mMuxerStarted) {
                    throw new RuntimeException("format changed twice");
                }
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + newFormat);

                // now that we have the Magic Goodies, start the muxer
                mTrackIndex = mMuxer.addTrack(newFormat);
                mMuxer.start();
                mMuxerStarted = true;
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                        encoderStatus);
                // let's ignore it
            } else {
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                    if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    mBufferInfo.size = 0;
                }

                if (mBufferInfo.size != 0) {
                    if (!mMuxerStarted) {
                        throw new RuntimeException("muxer hasn't started");
                    }

                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

                    mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                    if (VERBOSE) {
                        Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                                mBufferInfo.presentationTimeUs);
                    }
                }

                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        Log.w(TAG, "reached end of stream unexpectedly");
                    } else {
                        if (VERBOSE) Log.d(TAG, "end of stream reached");
                    }
                    break;      // out of while
                }
            }
        }
    }

        V. Implementing video recording in C++ via BlitFramebuffer:

        With grafika's three off-screen rendering approaches and its MP4 recording pipeline covered, we can now combine them with the code from articles 1/2/3 to implement filter video recording for a custom SurfaceView Texture C++ Camera preview.

1. Initializing the OpenGL environment:

        As in article 3, OpenGL custom SurfaceView Texture C++ Camera video preview, a custom SurfaceView creates the OpenGL environment and displays the video. Here grafika's core classes EglCore/WindowSurface replace article 3's manual initialization of the required EGLContext, EGLDisplay, and EGL window surface objects:

        Article 3's original setup code:

void
OpenglesSurfaceViewVideoRender::init(ANativeWindow *window, AAssetManager *assetManager,
                                     size_t width,
                                     size_t height) {
    LOGI("OpenglesSurfaceViewVideoRender init==%d, %d", width, height);
    m_backingWidth = width;
    m_backingHeight = height;
    ///EGL
    //1 Create and initialize the EGL display
    display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (display == EGL_NO_DISPLAY) {
        LOGE("eglGetDisplay failed!");
        return;
    }
    if (EGL_TRUE != eglInitialize(display, 0, 0)) {
        LOGE("eglInitialize failed!");
        return;
    }
    //2 Surface
    //2-1 Window surface configuration
    //Output configuration
    EGLConfig config;
    EGLint configNum;
    EGLint configSpec[] = {
            EGL_RED_SIZE, 8,
            EGL_GREEN_SIZE, 8,
            EGL_BLUE_SIZE, 8,
            EGL_SURFACE_TYPE, EGL_WINDOW_BIT, EGL_NONE
    };
    if (EGL_TRUE != eglChooseConfig(display, configSpec, &config, 1, &configNum)) {
        LOGE("eglChooseConfig failed!");
        return;
    }
    //Create the window surface
    ANativeWindow_acquire(window);
    ANativeWindow_setBuffersGeometry(window, 0, 0, AHARDWAREBUFFER_FORMAT_R8G8B8A8_UNORM);
    winsurface = eglCreateWindowSurface(display, config, window, 0);
    if (winsurface == EGL_NO_SURFACE) {
        LOGE("eglCreateWindowSurface failed!");
        return;
    }
 
    //3 Create the associated EGL context
    const EGLint ctxAttr[] = {
            EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE
    };
    EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, ctxAttr);
    if (context == EGL_NO_CONTEXT) {
        LOGE("eglCreateContext failed!");
        return;
    }
    if (EGL_TRUE != eglMakeCurrent(display, winsurface, winsurface, context)) {
        LOGE("eglMakeCurrent failed!");
        return;
    }
 
    useProgram();
    createTextures();
 
}
 

       Replaced with the EglCore/WindowSurface creation approach:

void EGLSurfaceViewVideoRender::OnSurfaceCreated() {
    // FLAG_TRY_GLES3 is added here because the BlitFramebuffer path below needs a GLES 3 context.
    m_EglCore = new EglCore(eglGetCurrentContext(), FLAG_RECORDABLE | FLAG_TRY_GLES3);
    if (!m_EglCore) {
        LOGE("new EglCore failed!");
        return;
    }

    LOGE("OnSurfaceCreated m_ANWindow:%p", m_ANWindow);

    m_WindowSurface = new WindowSurface(m_EglCore, m_ANWindow);
    if (!m_WindowSurface) {
        LOGE("new WindowSurface failed!");
        return;
    }
    m_WindowSurface->makeCurrent();
}

         2. The EglCore class:

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/10/11.
// Adapted from the EglCore.java class in Google's open-source grafika project: https://github.com/google/grafika


#include "../includeopengl/EglCore.h"
#include <assert.h>

EglCore::EglCore() {
    init(NULL, 0);
}


EglCore::~EglCore() {
    release();
}

/**
 * Constructor
 * @param sharedContext
 * @param flags
 */
EglCore::EglCore(EGLContext sharedContext, int flags) {
    init(sharedContext, flags);
}

/**
 * Initialization
 * @param sharedContext
 * @param flags
 * @return
 */
bool EglCore::init(EGLContext sharedContext, int flags) {
    assert(mEGLDisplay == EGL_NO_DISPLAY);
    if (mEGLDisplay != EGL_NO_DISPLAY) {
        LOGE("EGL already set up");
        return false;
    }
    if (sharedContext == NULL) {
        sharedContext = EGL_NO_CONTEXT;
    }

    mEGLDisplay = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    assert(mEGLDisplay != EGL_NO_DISPLAY);
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        LOGE("unable to get EGL14 display.\n");
        return false;
    }

    if (!eglInitialize(mEGLDisplay, 0, 0)) {
        mEGLDisplay = EGL_NO_DISPLAY;
        LOGE("unable to initialize EGL14");
        return false;
    }

    // Try GLES 3 first
    if ((flags & FLAG_TRY_GLES3) != 0) {
        EGLConfig config = getConfig(flags, 3);
        if (config != NULL) {
            int attrib3_list[] = {
                    EGL_CONTEXT_CLIENT_VERSION, 3,
                    EGL_NONE
            };
            EGLContext context = eglCreateContext(mEGLDisplay, config,
                                                  sharedContext, attrib3_list);
            checkEglError("eglCreateContext");
            if (eglGetError() == EGL_SUCCESS) {
                mEGLConfig = config;
                mEGLContext = context;
                mGlVersion = 3;
            }
        }
    }
    // Fall back to GLES 2 if a GLES 3 context could not be created
    if (mEGLContext == EGL_NO_CONTEXT) {
        EGLConfig config = getConfig(flags, 2);
        assert(config != NULL);
        int attrib2_list[] = {
                EGL_CONTEXT_CLIENT_VERSION, 2,
                EGL_NONE
        };
        EGLContext context = eglCreateContext(mEGLDisplay, config,
                                              sharedContext, attrib2_list);
        checkEglError("eglCreateContext");
        if (eglGetError() == EGL_SUCCESS) {
            mEGLConfig = config;
            mEGLContext = context;
            mGlVersion = 2;
        }
    }

    // Look up the address of eglPresentationTimeANDROID
    eglPresentationTimeANDROID = (EGL_PRESENTATION_TIME_ANDROIDPROC)
            eglGetProcAddress("eglPresentationTimeANDROID");
    if (!eglPresentationTimeANDROID) {
        LOGE("eglPresentationTimeANDROID is not available!");
    }

    

    int values[1] = {0};
    eglQueryContext(mEGLDisplay, mEGLContext, EGL_CONTEXT_CLIENT_VERSION, values);
    LOGD("EGLContext created, client version %d", values[0]);

    return true;
}


/**
 * Find a suitable EGLConfig
 * @param flags
 * @param version
 * @return
 */
EGLConfig EglCore::getConfig(int flags, int version) {
    int renderableType = EGL_OPENGL_ES2_BIT;
    if (version >= 3) {
        renderableType |= EGL_OPENGL_ES3_BIT_KHR;
    }
    int attribList[] = {
            EGL_RED_SIZE, 8,
            EGL_GREEN_SIZE, 8,
            EGL_BLUE_SIZE, 8,
            EGL_ALPHA_SIZE, 8,
            //EGL_DEPTH_SIZE, 16,
            //EGL_STENCIL_SIZE, 8,
            EGL_RENDERABLE_TYPE, renderableType,
            EGL_NONE, 0,      // placeholder for recordable [@-3]
            EGL_NONE
    };
    int length = sizeof(attribList) / sizeof(attribList[0]);
    if ((flags & FLAG_RECORDABLE) != 0) {
        attribList[length - 3] = EGL_RECORDABLE_ANDROID;
        attribList[length - 2] = 1;
    }
    EGLConfig configs = NULL;
    int numConfigs;
    if (!eglChooseConfig(mEGLDisplay, attribList, &configs, 1, &numConfigs)) {
        LOGW("unable to find RGB8888 / %d  EGLConfig", version);
        return NULL;
    }
    return configs;
}

/**
 * Release resources
 */
void EglCore::release() {
    if (mEGLDisplay != EGL_NO_DISPLAY) {
        eglMakeCurrent(mEGLDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
        eglDestroyContext(mEGLDisplay, mEGLContext);
        eglReleaseThread();
        eglTerminate(mEGLDisplay);
    }

    mEGLDisplay = EGL_NO_DISPLAY;
    mEGLContext = EGL_NO_CONTEXT;
    mEGLConfig = NULL;
}

/**
 * Get the EGLContext
 * @return
 */
EGLContext EglCore::getEGLContext() {
    return mEGLContext;
}

/**
 * Destroy an EGLSurface
 * @param eglSurface
 */
void EglCore::releaseSurface(EGLSurface eglSurface) {
    eglDestroySurface(mEGLDisplay, eglSurface);
}

/**
 * Create a window EGLSurface
 * @param surface
 * @return
 */
EGLSurface EglCore::createWindowSurface(ANativeWindow *surface) {
    assert(surface != NULL);
    if (surface == NULL) {
        LOGE("ANativeWindow is NULL!");
        return NULL;
    }
    int surfaceAttribs[] = {
            EGL_NONE
    };

    ANativeWindow_acquire(surface);
    ANativeWindow_setBuffersGeometry(surface, 0, 0, AHARDWAREBUFFER_FORMAT_R8G8B8A8_UNORM);

    LOGD("eglCreateWindowSurface start");
    EGLSurface eglSurface = eglCreateWindowSurface(mEGLDisplay, mEGLConfig, surface, surfaceAttribs);
    checkEglError("eglCreateWindowSurface");
    assert(eglSurface != NULL);
    if (eglSurface == NULL) {
        LOGE("EGLSurface is NULL!");
        return NULL;
    }
    return eglSurface;
}

/**
 * Create an EGLSurface for off-screen rendering
 * @param width
 * @param height
 * @return
 */
EGLSurface EglCore::createOffscreenSurface(int width, int height) {
    int surfaceAttribs[] = {
            EGL_WIDTH, width,
            EGL_HEIGHT, height,
            EGL_NONE
    };
    EGLSurface eglSurface = eglCreatePbufferSurface(mEGLDisplay, mEGLConfig, surfaceAttribs);
    assert(eglSurface != NULL);
    if (eglSurface == NULL) {
        LOGE("Surface was null");
        return NULL;
    }
    return eglSurface;
}

/**
 * Make this surface and the EGL context current
 * @param eglSurface
 */
void EglCore::makeCurrent(EGLSurface eglSurface) {
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        LOGD("Note: makeCurrent w/o display.\n");
    }
    if (!eglMakeCurrent(mEGLDisplay, eglSurface, eglSurface, mEGLContext)) {
        // TODO: throw an exception
        LOGD("Note: eglMakeCurrent error.\n");
    }
}

/**
 * Make current with separate draw and read surfaces
 * @param drawSurface
 * @param readSurface
 */
void EglCore::makeCurrent(EGLSurface drawSurface, EGLSurface readSurface) {
    if (mEGLDisplay == EGL_NO_DISPLAY) {
        LOGD("Note: makeCurrent w/o display.\n");
    }
    if (!eglMakeCurrent(mEGLDisplay, drawSurface, readSurface, mEGLContext)) {
        // TODO: throw an exception
    }
}

/**
 * Detach the context from any surface
 */
void EglCore::makeNothingCurrent() {
    if (!eglMakeCurrent(mEGLDisplay, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT)) {
        // TODO: throw an exception
    }
}

/**
 * Swap buffers (present the frame)
 * @param eglSurface
 * @return
 */
bool EglCore::swapBuffers(EGLSurface eglSurface) {
    return eglSwapBuffers(mEGLDisplay, eglSurface);
}

/**
 * Set the presentation timestamp (pts)
 * @param eglSurface
 * @param nsecs
 */
void EglCore::setPresentationTime(EGLSurface eglSurface, long nsecs) {
    eglPresentationTimeANDROID(mEGLDisplay, eglSurface, nsecs);
}

/**
 * Whether the given surface and this context are current
 * @param eglSurface
 * @return
 */
bool EglCore::isCurrent(EGLSurface eglSurface) {
    return mEGLContext == eglGetCurrentContext() &&
           eglSurface == eglGetCurrentSurface(EGL_DRAW);
}

/**
 * Query a surface attribute
 * @param eglSurface
 * @param what
 * @return
 */
int EglCore::querySurface(EGLSurface eglSurface, int what) {
    int value;
    eglQuerySurface(mEGLDisplay, eglSurface, what, &value);
    return value;
}

/**
 * Query an EGL string
 * @param what
 * @return
 */
const char* EglCore::queryString(int what) {
    return eglQueryString(mEGLDisplay, what);
}

/**
 * Get the GLES version
 * @return
 */
int EglCore::getGlVersion() {
    return mGlVersion;
}

/**
 * Check for EGL errors
 * @param msg
 */
void EglCore::checkEglError(const char *msg) {
    int error;
    if ((error = eglGetError()) != EGL_SUCCESS) {
        // TODO: throw an exception
        LOGE("%s: EGL error: %x", msg, error);
    }
}

        3. The WindowSurface class:

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/10/11.
// Adapted from the WindowSurface.java class in Google's open-source grafika project: https://github.com/google/grafika


#include "../includeopengl/WindowSurface.h"
#include <assert.h>

WindowSurface::WindowSurface(EglCore *eglCore, ANativeWindow *window, bool releaseSurface)
        : EglSurfaceBase(eglCore) {
    mSurface = window;
    createWindowSurface(mSurface);
    mReleaseSurface = releaseSurface;
}

WindowSurface::WindowSurface(EglCore *eglCore, ANativeWindow *window)
        : EglSurfaceBase(eglCore) {
    createWindowSurface(window);
    mSurface = window;
}

void WindowSurface::release() {
    releaseEglSurface();
    if (mSurface != NULL) {
        ANativeWindow_release(mSurface);
        mSurface = NULL;
    }

}

void WindowSurface::recreate(EglCore *eglCore) {
    assert(mSurface != NULL);
    if (mSurface == NULL) {
        LOGE("not yet implemented ANativeWindow");
        return;
    }
    mEglCore = eglCore;
    createWindowSurface(mSurface);
}
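
        Both WindowSurface constructors delegate to an EglSurfaceBase base class that this article does not list. For reference, below is a minimal sketch of what that base class might look like, modeled on grafika's EglSurfaceBase.java and on the calls used elsewhere in this article (makeCurrent, makeCurrentReadFrom, swapBuffers, setPresentationTime); the actual class in the repository may differ:

// Minimal sketch of EglSurfaceBase (an assumption; the repository's version may differ).
// It owns an EGLSurface and forwards all EGL work to EglCore.
class EglSurfaceBase {
public:
    explicit EglSurfaceBase(EglCore *eglCore) : mEglCore(eglCore) {}

    // Wrap an ANativeWindow in an EGL window surface.
    void createWindowSurface(ANativeWindow *window) {
        mEglSurface = mEglCore->createWindowSurface(window);
    }

    // Make this surface current for both drawing and reading.
    void makeCurrent() {
        mEglCore->makeCurrent(mEglSurface);
    }

    // Draw into this surface while reading from another one --
    // the call the BlitFramebuffer path relies on.
    void makeCurrentReadFrom(EglSurfaceBase &readSurface) {
        mEglCore->makeCurrent(mEglSurface, readSurface.mEglSurface);
    }

    // Present the frame; returns false if the swap failed.
    bool swapBuffers() {
        return mEglCore->swapBuffers(mEglSurface);
    }

    // Forward the presentation timestamp (in nanoseconds) to EGL.
    void setPresentationTime(long nsecs) {
        mEglCore->setPresentationTime(mEglSurface, nsecs);
    }

    int getWidth() { return mEglCore->querySurface(mEglSurface, EGL_WIDTH); }
    int getHeight() { return mEglCore->querySurface(mEglSurface, EGL_HEIGHT); }

    // Destroy the EGL surface; releasing the ANativeWindow is the subclass's job.
    void releaseEglSurface() {
        mEglCore->releaseSurface(mEglSurface);
        mEglSurface = EGL_NO_SURFACE;
    }

protected:
    EglCore *mEglCore;
    EGLSurface mEglSurface = EGL_NO_SURFACE;
};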

       4. BlitFramebuffer rendering:

          m_InputWindowSurface performs the EGL context switch (eglMakeCurrent) and buffer swap (eglSwapBuffers). After the blit, m_InputWindowSurface holds the same frame data that OpenGL has just rendered for display:

void EGLSurfaceViewVideoRender::OnDrawFrame() {
    // Set the clear color before clearing
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    if (!updateTextures() /*|| !useProgram()*/) return;

    // Draw to the on-screen window
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

//    LOGE("OnDrawFrame thread:%ld", pthread_self());
    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2->frameAvailableSoon();
    }
    if (m_InputWindowSurface != nullptr) {
        m_InputWindowSurface->makeCurrentReadFrom(*m_WindowSurface);
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        checkGlError("before glBlitFramebuffer");
        // offX/offY/off_right/off_bottom describe the destination rect inside the encoder surface
        glBlitFramebuffer(0, 0, m_backingWidth, m_backingHeight, offX, offY, off_right, off_bottom,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
//        m_InputWindowSurface->setPresentationTime(40002204);
        m_InputWindowSurface->swapBuffers();

    }

    // Switch back to m_WindowSurface and present on screen
    m_WindowSurface->makeCurrent();
    m_WindowSurface->swapBuffers();

}

        5. The TextureMovieEncoder2 class:

        Creates a video-recording worker thread, separate from the GLThread render thread, and manages the state of the recording session.

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/10/15.
//

#include "../includeopengl/TextureMovieEncoder2.h"

TextureMovieEncoder2::TextureMovieEncoder2(VideoEncoderCore *videoEncoderCore) {
    m_VideoEncoderCore = videoEncoderCore;

}

TextureMovieEncoder2::~TextureMovieEncoder2() {
    if (m_VideoEncoderCore != nullptr) {
        m_VideoEncoderCore->release();
        m_VideoEncoderCore = nullptr;
    }
    quit();
}


void TextureMovieEncoder2::stopRecording() {
    postMessage(MSG_STOP_RECORDING);
}

bool TextureMovieEncoder2::isRecording() {

    return false;
}

void TextureMovieEncoder2::frameAvailableSoon() {
//    LOGE("TextureMovieEncoder2::frameAvailableSoon");
    postMessage(MSG_FRAME_AVAILABLE);
}

void TextureMovieEncoder2::handleFrameAvailable() {
//    LOGE("TextureMovieEncoder2::handleMessage handleFrameAvailable");
    m_VideoEncoderCore->drainEncoder(false);
}

void TextureMovieEncoder2::handleStopRecording() {
    LOGE("TextureMovieEncoder2::handleMessage handleStopRecording");
    m_VideoEncoderCore->drainEncoder(true);
    m_VideoEncoderCore->release();
}


void TextureMovieEncoder2::handleMessage(LooperMessage *msg) {
    Looper::handleMessage(msg);
    switch (msg->what) {
        case MSG_STOP_RECORDING: {
//            LOGE("TextureMovieEncoder2::handleMessage MSG_STOP_RECORDING");
            handleStopRecording();
        }
            break;
        case MSG_FRAME_AVAILABLE:
//            LOGE("TextureMovieEncoder2::handleMessage MSG_FRAME_AVAILABLE");
            handleFrameAvailable();
            break;
        default:
            break;
    }
}
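
        TextureMovieEncoder2 inherits from a Looper base class (postMessage / handleMessage / quit) that this article does not list. To make the threading model concrete, here is a minimal sketch of such a message loop; the structure and names are assumptions, and the repository's actual Looper may differ:

// Minimal sketch of a Looper-style message loop (an assumption, not the repo's code).
// A dedicated std::thread dequeues messages and dispatches them to handleMessage(),
// so drainEncoder() runs off the GLThread render thread.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct LooperMessage { int what; };

class Looper {
public:
    Looper() : mThread(&Looper::loop, this) {}
    virtual ~Looper() { quit(); }

    // Called from any thread (e.g. the render thread posting MSG_FRAME_AVAILABLE).
    void postMessage(int what) {
        std::lock_guard<std::mutex> lock(mMutex);
        mQueue.push(LooperMessage{what});
        mCond.notify_one();
    }

    // Stops the loop; messages already queued are still processed before exit.
    void quit() {
        {
            std::lock_guard<std::mutex> lock(mMutex);
            mRunning = false;
            mCond.notify_one();
        }
        if (mThread.joinable()) mThread.join();
    }

protected:
    virtual void handleMessage(LooperMessage *msg) { (void) msg; }

private:
    void loop() {
        for (;;) {
            LooperMessage msg{};
            {
                std::unique_lock<std::mutex> lock(mMutex);
                mCond.wait(lock, [this] { return !mQueue.empty() || !mRunning; });
                if (mQueue.empty()) return;   // mRunning is false and nothing is pending
                msg = mQueue.front();
                mQueue.pop();
            }
            handleMessage(&msg);              // runs on the encoder thread
        }
    }

    std::mutex mMutex;
    std::condition_variable mCond;
    std::queue<LooperMessage> mQueue;
    std::thread mThread;
    bool mRunning = true;
};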

        6. The VideoEncoderCore class:

        Configures MediaCodec / MediaMuxer / MediaCodec.BufferInfo with the parameters recording requires, and creates an input surface that gets wrapped in a WindowSurface to produce m_InputWindowSurface.

//  Author : wangyongyao https://github.com/wangyongyao1989
// Created by MMM on 2024/10/15.
//

#include "../includeopengl/VideoEncoderCore.h"


VideoEncoderCore::VideoEncoderCore(size_t width, size_t height, size_t bitRate,
                                   const char *outPutFile) {
    LOGD("VideoEncoderCore- width: %d, height: %d, bitRate: %d ,outPutFile: %s", width, height,
         bitRate, outPutFile);
    m_MediaMuxer_fp = fopen(outPutFile, "wb+"); // open/create the output MP4 file
    if (m_MediaMuxer_fp == nullptr) {
        LOGE("MediaCodecMuxer:: Mp4 file fopen err!");
        return;
    }

    // AMediaMuxer_new() takes a raw file descriptor, so convert the FILE* here.
    m_MediaMuxer_fd = fileno(m_MediaMuxer_fp);
    if (m_MediaMuxer_fd < 0) {
        perror("mp4 file err: ");
        LOGE("MediaCodecMuxer:: Mp4 file open err! = %d", m_MediaMuxer_fd);
    }
    m_AMediaFormat = AMediaFormat_new();
    // H.264 Advanced Video Coding
    AMediaFormat_setString(m_AMediaFormat, AMEDIAFORMAT_KEY_MIME, MIME_TYPE);
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_HEIGHT, height);
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_COLOR_FORMAT, COLOR_FormatSurface);
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_BIT_RATE, bitRate);
    // 30fps
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_FRAME_RATE, 30);
    // 5 seconds between I-frames
    AMediaFormat_setInt32(m_AMediaFormat, AMEDIAFORMAT_KEY_I_FRAME_INTERVAL, 5);

    m_AMediaCodec = AMediaCodec_createEncoderByType(MIME_TYPE);
    if (m_AMediaCodec == NULL) {
        LOGE("ERROR: AMediaCodec_createEncoderByType");
        return;
    }

    media_status_t configureStatus = AMediaCodec_configure(m_AMediaCodec, m_AMediaFormat, NULL,
                                                           NULL,
                                                           AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
    if (configureStatus != AMEDIA_OK) {
        LOGE("ERROR: AMediaCodec_createEncoderByType");
        return;
    }
    //Create the encoder's input surface; frames rendered into it are encoded
    media_status_t createInputSurfaceStatus = AMediaCodec_createInputSurface(m_AMediaCodec,
                                                                             &m_Encoder_WindowSurface);
    if (createInputSurfaceStatus != AMEDIA_OK) {
        LOGE("ERROR: AMediaCodec_createInputSurface :%d", createInputSurfaceStatus);
        return;
    }

    media_status_t codecStart = AMediaCodec_start(m_AMediaCodec);
    if (codecStart != AMEDIA_OK) {
        LOGE("ERROR: AMediaCodec_start");
        return;
    }

    // Create the muxer that writes the MP4 container
    m_AMediaMuxer = AMediaMuxer_new(m_MediaMuxer_fd, AMEDIAMUXER_OUTPUT_FORMAT_MPEG_4);
    LOGD(" AMediaMuxer_new OK");

    mTrackIndex = -1;
    mMuxerStarted = false;

}

VideoEncoderCore::~VideoEncoderCore() {
    release();
}


void VideoEncoderCore::drainEncoder(bool endOfStream) {
//    LOGE("drainEncoder thread:%ld", pthread_self());
    if (endOfStream) {
        LOGE("sending EOS to encoder");
        AMediaCodec_signalEndOfInputStream(m_AMediaCodec);
        // No early return here: keep draining below until the encoder emits its EOS flag.
    }

    while (true) {
        AMediaCodecBufferInfo info;
        //time out usec 1
        ssize_t status = AMediaCodec_dequeueOutputBuffer(m_AMediaCodec, &info, 1);
//        LOGW("AMediaCodec_dequeueOutputBuffer status %d", status);

        if (status == AMEDIACODEC_INFO_TRY_AGAIN_LATER) {

            if (!endOfStream) {
                break;
            } else {
                LOGI("video no output available, spinning to await EOS");
            }
        } else if (status == AMEDIACODEC_INFO_OUTPUT_BUFFERS_CHANGED) {
            // not expected for an encoder
        } else if (status == AMEDIACODEC_INFO_OUTPUT_FORMAT_CHANGED) {
            if (mMuxerStarted) {
                LOGW("format changed twice");
            }

            AMediaFormat *fmt = AMediaCodec_getOutputFormat(m_AMediaCodec);
            const char *s = AMediaFormat_toString(fmt);
            LOGI("video output format %s", s);

            mTrackIndex = AMediaMuxer_addTrack(m_AMediaMuxer, fmt);

            if (mTrackIndex != -1) {

                LOGI("AMediaMuxer_start");
                AMediaMuxer_start(m_AMediaMuxer);
                mMuxerStarted = true;
            }

        } else {
            uint8_t *encodeData = AMediaCodec_getOutputBuffer(m_AMediaCodec, status,
                                                              NULL/* out_size */);
            if (encodeData == NULL) {
                LOGE("encoder output buffer was null");
            }
            if ((info.flags & AMEDIACODEC_BUFFER_FLAG_CODEC_CONFIG) != 0) {
                LOGI("ignoring AMEDIACODEC_BUFFER_FLAG_CODEC_CONFIG");
                info.size = 0;
            }

            size_t dataSize = info.size;

            if (dataSize != 0) {

                if (!mMuxerStarted) {
                    LOGE("muxer has't started");
                }
//                info.presentationTimeUs = frameIndex * 1000000L / frameRate;
//                LOGI("AMediaMuxer_writeSampleData video size %d", dataSize);
                AMediaMuxer_writeSampleData(m_AMediaMuxer, mTrackIndex, encodeData, &info);
            } else {
                LOGI("Info emptye %d", dataSize);
            }

            AMediaCodec_releaseOutputBuffer(m_AMediaCodec, status, false);

            if ((info.flags & AMEDIACODEC_BUFFER_FLAG_END_OF_STREAM) != 0) {

                if (!endOfStream) {
                    LOGW("reached end of stream unexpectly");
                } else {
                    LOGI("video end of stream reached");
                }
                break;
            }
        }
    }


}

void VideoEncoderCore::release() {
    if (m_AMediaCodec != nullptr) {
        AMediaCodec_stop(m_AMediaCodec);
    }
    if (m_AMediaMuxer != nullptr) {
        AMediaMuxer_stop(m_AMediaMuxer);
    }

    if (m_AMediaCodec != nullptr) {
        AMediaCodec_delete(m_AMediaCodec);
        m_AMediaCodec = nullptr;
    }
    if (m_AMediaMuxer != nullptr) {
        AMediaMuxer_delete(m_AMediaMuxer);
        m_AMediaMuxer = nullptr;
    }

    if (m_MediaMuxer_fp != nullptr) {
        fclose(m_MediaMuxer_fp); // close the FILE*; calling delete on a FILE* is wrong
        m_MediaMuxer_fp = nullptr;
    }
}

ANativeWindow *VideoEncoderCore::getInputSurface() {
    return m_Encoder_WindowSurface;
}
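
        For completeness, here is a sketch of how these pieces might be wired together when recording starts: VideoEncoderCore supplies the encoder's input ANativeWindow, which gets wrapped in a WindowSurface sharing the preview's EglCore, and TextureMovieEncoder2 runs the drain loop on its own thread. The startRecording/stopRecording entry points and the m_VideoEncoderCore member below are illustrative assumptions; the repository's actual glue code may differ:

// Hypothetical glue code -- names and parameter values are illustrative, not from the repo.
void EGLSurfaceViewVideoRender::startRecording(const char *outputPath) {
    // Example parameters: 1280x720 at 4 Mbps.
    m_VideoEncoderCore = new VideoEncoderCore(1280, 720, 4000000, outputPath);

    // Wrap the encoder's input surface in a WindowSurface that shares m_EglCore,
    // so OnDrawFrame() can makeCurrentReadFrom()/blit into it.
    m_InputWindowSurface = new WindowSurface(m_EglCore,
                                             m_VideoEncoderCore->getInputSurface());

    // The worker thread that drains AMediaCodec output into the muxer.
    m_TextureMovieEncoder2 = new TextureMovieEncoder2(m_VideoEncoderCore);
}

void EGLSurfaceViewVideoRender::stopRecording() {
    if (m_TextureMovieEncoder2 != nullptr) {
        m_TextureMovieEncoder2->stopRecording();  // posts MSG_STOP_RECORDING (EOS + release)
        delete m_TextureMovieEncoder2;            // destructor joins the encoder thread
        m_TextureMovieEncoder2 = nullptr;
        m_VideoEncoderCore = nullptr;             // released by ~TextureMovieEncoder2()
    }
    if (m_InputWindowSurface != nullptr) {
        m_InputWindowSurface->release();
        delete m_InputWindowSurface;
        m_InputWindowSurface = nullptr;
    }
}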

        7. Recording filtered video:

        Article 2, OpenGL Texture C++ Camera Filter effects, already handles filter switching by passing in the shader programs for each filter. On that basis, the complete OpenGL Texture C++ Camera Filter video-recording feature is done.

        All of the project code is in the GitHub repository: GitHub - wangyongyao1989/WyFFmpeg: basic audio/video implementations
