Android H.264 Encoding, Part 1: Encoding YUV to H.264 with x264 (Software Encoding)

This article walks through an x264 encoding example on the Android platform: capturing NV21 frames from the camera, converting them to YUV420, and finally encoding them to H.264.

First, the overall flow:

1. The camera callback delivers NV21 YUV frames;

2. Convert NV21 to YUV420 (I420);

3. Encode to H.264 with x264 and pass the result back to the Java layer via a callback;

4. Write the data out, producing a .h264 file;

5. Play the file with VLC or another player.


The Android Java-layer code is fairly simple; a quick walkthrough:

When the demo starts, the SurfaceView displays the frames captured by the Camera. The Activity needs to implement the SurfaceHolder.Callback and Camera.PreviewCallback interfaces.

SurfaceHolder.Callback declares the following methods:

void surfaceCreated(SurfaceHolder holder);

void surfaceChanged(SurfaceHolder holder, int format, int width, int height);

void surfaceDestroyed(SurfaceHolder holder);

surfaceCreated is usually where variables are initialized. In this example it initializes the x264 encoder and opens the camera:

@Override
public void surfaceCreated(SurfaceHolder holder) {
    // Initialize the native x264 encoder, then open the back camera and start the preview
    x264.initX264Encode(width, height, fps, bitrate);
    camera = getBackCamera();
    startcamera(camera);
}
Since this is only a demo, width, height, fps, and bitrate are hard-coded. A more careful implementation would first query the camera's supported preview sizes to choose width and height, read the camera's frame rate, and only then initialize the x264 encoder.

surfaceDestroyed is where resources should be released. In this example it closes the camera, closes the encoder, and closes the output file:

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    // Stop the preview and release the camera
    if (null != camera) {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
        camera = null;
    }
    // Close the native x264 encoder
    x264.CloseX264Encode();
    // Flush and close the .h264 output file
    try {
        outputStream.flush();
        outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

Camera.PreviewCallback declares a single method:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    time += timespan;
    // Convert the NV21 preview frame to I420 and push it to the native encoder
    byte[] yuv420 = new byte[width * height * 3 / 2];
    YUV420SP2YUV420(data, yuv420, width, height);
    x264.PushOriStream(yuv420, yuv420.length, time);
}
The callback delivers data as raw YUV bytes. The YUV format is configured in startcamera:

parameters.setPreviewFormat(ImageFormat.NV21);
Strictly speaking, the preview formats supported by the device's camera should be checked first, but virtually every device supports NV21, so it is simply set to NV21 here.

The frame is then converted from NV21 to YUV420 (I420), because the x264 encoder is configured for planar 4:2:0 input. The conversion code is as follows:

private void YUV420SP2YUV420(byte[] yuv420sp, byte[] yuv420, int width, int height)
{
    if (yuv420sp == null || yuv420 == null) return;
    int framesize = width * height;
    int i = 0, j = 0;
    // Copy the Y plane as-is
    for (i = 0; i < framesize; i++)
    {
        yuv420[i] = yuv420sp[i];
    }
    // NV21 stores interleaved VU after the Y plane (V first, then U).
    // De-interleave V into the I420 V plane (offset framesize * 5 / 4)
    i = 0;
    for (j = 0; j < framesize/2; j += 2)
    {
        yuv420[i + framesize*5/4] = yuv420sp[j + framesize];
        i++;
    }
    // De-interleave U into the I420 U plane (offset framesize)
    i = 0;
    for (j = 1; j < framesize/2; j += 2)
    {
        yuv420[i + framesize] = yuv420sp[j + framesize];
        i++;
    }
}

After the conversion, the native interface is called to encode the YUV frame to H.264; once a frame has been encoded, it is delivered back through a callback:

x264.PushOriStream(yuv420, yuv420.length, time);

private x264sdk.listener l = new x264sdk.listener() {

    @Override
    public void h264data(byte[] buffer, int length) {
        // Append the encoded H.264 data to the output file
        try {
            outputStream.write(buffer, 0, length);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};
Once the data comes back, it is written to the file. That is essentially all of the Java code.

Now a brief look at the native x264 encoder:

1. Initializing the x264 encoder

void x264Encode::initX264Encode(int width, int height, int fps, int bitrate)

This function takes the video resolution, frame rate, and bitrate from the upper layer. See the sample project for the complete code; only the key points are covered here.

x264 supports multi-threaded encoding; to enable it, set the _x264_param->i_threads parameter. A rough sketch of the initialization is shown below.
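Below is a minimal sketch of what the initialization might look like. The member names (_x264_param, _x264_encoder, _x264_picture) and the exact parameter choices are assumptions for illustration, not copied from the sample project:

// Sketch only: requires <x264.h>; member names are assumed, not the project's actual ones.
void x264Encode::initX264Encode(int width, int height, int fps, int bitrate) {
    _x264_param = new x264_param_t;
    // "ultrafast" + "zerolatency" suits a live camera source: no B-frames, no lookahead delay
    x264_param_default_preset(_x264_param, "ultrafast", "zerolatency");

    _x264_param->i_width   = width;
    _x264_param->i_height  = height;
    _x264_param->i_csp     = X264_CSP_I420;   // we feed planar YUV 4:2:0
    _x264_param->i_fps_num = fps;
    _x264_param->i_fps_den = 1;
    _x264_param->i_threads = 1;               // set > 1 to enable multi-threaded encoding
    _x264_param->rc.i_rc_method = X264_RC_ABR;
    _x264_param->rc.i_bitrate   = bitrate;    // kbit/s
    _x264_param->b_repeat_headers = 1;        // emit SPS/PPS before every keyframe

    x264_param_apply_profile(_x264_param, "baseline");

    _x264_encoder = x264_encoder_open(_x264_param);
    x264_picture_alloc(&_x264_picture, X264_CSP_I420, width, height);
}

Repeating the SPS/PPS headers in the stream is what allows the raw .h264 file to be played directly in VLC without a container.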


2. Encoding YUV to H.264

void x264Encode::startEncoder(uint8_t *dataptr, char *&bufdata, int &buflen, int &isKeyFrame)

The first parameter is the YUV data; the second returns the address of the encoded H.264 data; the third returns its length; the fourth returns whether the frame is an I-frame.
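As a rough sketch of what this call might look like (again assuming the member names from the initialization sketch, and that bufdata already points to a buffer large enough for one encoded frame):

// Sketch only: assumes _x264_param / _x264_picture / _x264_encoder from the init sketch above.
void x264Encode::startEncoder(uint8_t *dataptr, char *&bufdata, int &buflen, int &isKeyFrame) {
    int ysize = _x264_param->i_width * _x264_param->i_height;

    // Split the incoming I420 buffer into the three planes x264 expects
    memcpy(_x264_picture.img.plane[0], dataptr, ysize);                       // Y
    memcpy(_x264_picture.img.plane[1], dataptr + ysize, ysize / 4);           // U
    memcpy(_x264_picture.img.plane[2], dataptr + ysize * 5 / 4, ysize / 4);   // V
    _x264_picture.i_pts++;

    x264_nal_t *nals = NULL;
    int nal_count = 0;
    x264_picture_t pic_out;
    int frame_size = x264_encoder_encode(_x264_encoder, &nals, &nal_count,
                                         &_x264_picture, &pic_out);
    if (frame_size <= 0) {
        buflen = 0;
        return;
    }

    // x264 guarantees the NAL payloads are contiguous in memory, so one copy is enough
    memcpy(bufdata, nals[0].p_payload, frame_size);
    buflen = frame_size;
    isKeyFrame = pic_out.b_keyframe;
}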


3. Calling back to Java

The encoded data is passed up via the h264callbackFunc callback to the JNI layer, and the JNI layer uses CallVoidMethod to call back into the Java layer. The JNI callback code is as follows:


void CALLBACK H264DataCallBackFunc(void* pdata, int datalen)
{
    h264datacallback.name = "H264DataCallBackFunc";
    h264datacallback.signature = "([BI)V";
    JavaEnv java;
    if (java.istarch) {
        // Attach the native encoder thread to the JVM so it can call into Java
        JNIEnv* menv = NULL;
        VM->AttachCurrentThread(&menv, NULL);
        // Copy the encoded H.264 data into a Java byte[]
        jbyteArray pcmdata = menv->NewByteArray(datalen);
        menv->SetByteArrayRegion(pcmdata, 0, datalen, (jbyte*)pdata);
        // Invoke the Java-layer callback with the byte[] and its length
        java.env->CallVoidMethod(ehobj, h264datacallback.getMID(java.env, jclz), pcmdata, datalen);
    }
}


4. Write the file -> finish -> close the encoder
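Before the encoder is closed, x264 may still be holding delayed frames. The usual pattern is to keep calling x264_encoder_encode with a NULL input picture until nothing is left, then release everything. A minimal sketch, with the same assumed member names as above:

// Sketch only: flush any delayed frames, then free the encoder and the picture buffers.
void x264Encode::CloseX264Encode() {
    if (_x264_encoder) {
        x264_nal_t *nals = NULL;
        int nal_count = 0;
        x264_picture_t pic_out;
        // Passing NULL as the input picture drains frames still buffered inside x264
        while (x264_encoder_delayed_frames(_x264_encoder) > 0) {
            int frame_size = x264_encoder_encode(_x264_encoder, &nals, &nal_count, NULL, &pic_out);
            if (frame_size <= 0) break;
            // Hand the remaining data to the same callback used for normal frames
            H264DataCallBackFunc(nals[0].p_payload, frame_size);
        }
        x264_encoder_close(_x264_encoder);
        _x264_encoder = NULL;
    }
    x264_picture_clean(&_x264_picture);
    delete _x264_param;
    _x264_param = NULL;
}

On the Java side this corresponds to the surfaceDestroyed path shown earlier: close the encoder first, then flush and close the .h264 file.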

The complete sample project can be downloaded here:


https://github.com/sszhangpengfei/android_x264_encoder


