Android's MediaCodec (API 16): AAC + AVC / H.264 live stream is unstable

http://stackoverflow.com/questions/36313646/androids-mediacodec-api-16-aac-avc-h-264-live-stream-is-unstable


I have an application (Qt + Android) that creates a live stream from Android's Camera (AVC) + AudioRecorder (AAC) and then sends the encoded data to an RTMP server using the librtmp library (v 2.4).

The main AVC MediaCodec function:

public void videoEncode(byte[] data) {
    // Video buffers
    videoCodecInputBuffers = videoMediaCodec.getInputBuffers();
    videoCodecOutputBuffers = videoMediaCodec.getOutputBuffers();

    int inputBufferIndex = videoMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        videoInputBuffer = videoCodecInputBuffers[inputBufferIndex];
        videoCodecInputData = YV12toYUV420Planar(data, encWidth * encHeight);
        videoInputBuffer.clear();
        videoInputBuffer.put(videoCodecInputData);
        videoMediaCodec.queueInputBuffer(inputBufferIndex, 0, videoCodecInputData.length, 0, 0);
    }

    // Get AVC/H.264 frame
    int outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    while(outputBufferIndex >= 0) {
        videoOutputBuffer = videoCodecOutputBuffers[outputBufferIndex];
        videoOutputBuffer.get(videoCodecOutputData, 0, videoBufferInfo.size);

        // H.264 / AVC header
        if(videoCodecOutputData[0] == 0x00 && videoCodecOutputData[1] == 0x00 && videoCodecOutputData[2] == 0x00 && videoCodecOutputData[3] == 0x01) {

            // I-frame
            boolean keyFrame = false;
            if((videoBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) == MediaCodec.BUFFER_FLAG_SYNC_FRAME) {
                resetTimestamp();
                keyFrame = true;
            }

            int currentTimestamp = cameraAndroid.calcTimestamp();
            if(prevTimestamp == currentTimestamp) currentTimestamp++;
            sendVideoData(videoCodecOutputData, videoBufferInfo.size, currentTimestamp, keyFrame); // Native C func
            prevTimestamp = currentTimestamp;

            // SPS / PPS sent
            spsPpsFrame = true;
        }

        videoMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = videoMediaCodec.dequeueOutputBuffer(videoBufferInfo, 0);
    }
}
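
YV12toYUV420Planar is not listed above; roughly, it only needs to swap the two chroma planes (a minimal sketch, assuming the standard YV12 = Y + V + U and COLOR_FormatYUV420Planar / I420 = Y + U + V layouts - the real helper may differ):

private byte[] YV12toYUV420Planar(byte[] yv12, int ySize) {
    // Sketch only: YV12 and I420 share the Y plane; the chroma planes are swapped.
    byte[] i420 = new byte[yv12.length];
    int chromaSize = ySize / 4;                                           // each chroma plane is ySize / 4
    System.arraycopy(yv12, 0, i420, 0, ySize);                            // Y plane unchanged
    System.arraycopy(yv12, ySize, i420, ySize + chromaSize, chromaSize);  // V plane of YV12 -> V position of I420
    System.arraycopy(yv12, ySize + chromaSize, i420, ySize, chromaSize);  // U plane of YV12 -> U position of I420
    return i420;
}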

The main AAC MediaCodec function:

public void audioEncode(byte[] data) {

    // Audio buffers
    audioCodecInputBuffers = audioMediaCodec.getInputBuffers();
    audioCodecOutputBuffers = audioMediaCodec.getOutputBuffers();

    // Add raw chunk into buffer
    int inputBufferIndex = audioMediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        audioInputBuffer = audioCodecInputBuffers[inputBufferIndex];
        audioInputBuffer.clear();
        audioInputBuffer.put(data);
        audioMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
    }

    // Encode AAC
    int outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    while(outputBufferIndex >= 0) {
        audioOutputBuffer = audioCodecOutputBuffers[outputBufferIndex];
        audioOutputBuffer.get(audioCodecOutputData, 0, audioBufferInfo.size);

        if(spsPpsFrame || esdsChunk) {
            int currentTimestamp = cameraAndroid.calcTimestamp();
            if(prevTimestamp == currentTimestamp) currentTimestamp++;
            sendAudioData(audioCodecOutputData, audioBufferInfo.size, currentTimestamp); // Native C func
            prevTimestamp = currentTimestamp;
            esdsChunk = false;
        }

        // Next chunk
        audioMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = audioMediaCodec.dequeueOutputBuffer(audioBufferInfo, 0);
    }
}
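
cameraAndroid.calcTimestamp() and resetTimestamp() are not listed either. The sketch below is only one hypothetical reading (milliseconds since a base time), included so the calls above are easier to follow; the actual implementation may differ:

// Hypothetical sketch - the real calcTimestamp()/resetTimestamp() are not shown in the post.
private long timestampBase = System.currentTimeMillis();

public int calcTimestamp() {
    // Milliseconds elapsed since the base time.
    return (int) (System.currentTimeMillis() - timestampBase);
}

public void resetTimestamp() {
    // Moves the base time forward (called above on every sync frame).
    timestampBase = System.currentTimeMillis();
}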

Camera frames are encoded in setPreviewCallbackWithBuffer, and AudioRecorder's chunks in a separate thread:

audioThread = new Thread(new Runnable() {
    public void run() {
        audioBufferSize = AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        while(!Thread.currentThread().isInterrupted()) {
            int ret = mic.read(audioCodecInputData, 0, audioBufferSize);
            if(ret >= 0)
                cameraAndroid.audioEncode(audioCodecInputData);
        }
    }
});
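
The AudioRecord (mic) setup itself is not listed; roughly it looks like this (a sketch, matching the 44100 Hz mono / PCM 16-bit settings further below):

// Assumed AudioRecord setup (not shown above); sample rate, channel config and
// PCM format must match the AAC encoder's MediaFormat.
audioBufferSize = AudioRecord.getMinBufferSize(44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
mic = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, audioBufferSize);
audioCodecInputData = new byte[audioBufferSize];
mic.startRecording();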

sendVideoData and sendAudioData are native C functions (librtmp functions + JNI):

public synchronized native void sendVideoData(byte[] buf, int size, int timestamp, boolean keyFrame);
public synchronized native void sendAudioData(byte[] buf, int size, int timestamp);
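
The native library behind these declarations is loaded with System.loadLibrary; the name here is only a placeholder:

static {
    // Hypothetical name - the actual .so built from the JNI glue + librtmp is not named in the post.
    System.loadLibrary("rtmpjni");
}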

The main thing I can't understand is: why is the live stream completely unstable when I play it in Adobe Flash Player? The first 1-2 seconds of the stream are correct, but after that I only see I-frames every 2 seconds (videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2)), and the audio is badly broken: I hear it for a few milliseconds around each I-frame and then it cuts out.

Can someone show me the correct way to create a stable live stream, please? Where am I going wrong?

Also, I'm posting the AVC/AAC MediaCodec settings here (maybe something is wrong there?):

// H.264/AVC (advanced video coding) format
MediaFormat videoMediaFormat = MediaFormat.createVideoFormat("video/avc", encWidth, encHeight);
videoMediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
videoMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, encWidth * encHeight * 4);                        // bits per second
videoMediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, fps);                                           // FPS
videoMediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, iFrameInterval);                          // interval in seconds between I-frames
videoMediaCodec = MediaCodec.createEncoderByType("video/avc");
videoMediaCodec.configure(videoMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
videoMediaCodec.start();

// AAC (advanced audio coding) format
MediaFormat audioMediaFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 1);              // mime-type, sample rate, channel count
audioMediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64 * 1000);                                       // 64 kbit/s (value in bit/s)
audioMediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
audioMediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, audioBufferSize);                           // 4096 (default) / 4736 * 1 (min audio buffer size)
audioMediaCodec = MediaCodec.createEncoderByType("audio/mp4a-latm");
audioMediaCodec.configure(audioMediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
audioMediaCodec.start();
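
The camera side (setPreviewCallbackWithBuffer) mentioned above is wired roughly like this (a sketch - none of it is listed in the post, and the buffer size and exact parameters are only illustrative):

// Assumed camera wiring: each YV12 preview buffer is passed to videoEncode()
// and then handed back to the camera for reuse.
Camera.Parameters params = camera.getParameters();
params.setPreviewFormat(ImageFormat.YV12);            // matches the YV12toYUV420Planar conversion
params.setPreviewSize(encWidth, encHeight);
camera.setParameters(params);

camera.addCallbackBuffer(new byte[encWidth * encHeight * 3 / 2]);   // may need to be larger due to YV12 stride alignment
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    public void onPreviewFrame(byte[] data, Camera cam) {
        cameraAndroid.videoEncode(data);
        cam.addCallbackBuffer(data);                  // recycle the preview buffer
    }
});
camera.startPreview();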

Update: I tried to play the stream with ffmpeg (thanks @Robert Rowntree), and this is what I constantly see in the console:

Non-monotonous DTS in output stream 0:1; previous: 95054, current: 46136; changing to 95056. This may result in incorrect timestamps in the output file.

So I checked the output from the Android app, but I can't find any wrong lines (a = encoded AAC chunk, v = encoded AVC frame, the integer value is the timestamp in milliseconds): output.txt

Are those timestamps correct?

Can you play your stream with other clients? VLC, say... – Robert Rowntree, Mar 30 '16 at 18:46
 
Yeah, as I wrote above - the stream is playable, but unstable. If I disable the audio before streaming, everything is OK. – 0x0000dead, Mar 30 '16 at 20:09
