In this article we will use C2SoftHevcDec as an example to see how a Codec2 component should implement the interfaces provided by SimpleC2Component. After reading it, we should have a clear idea of how to implement a Codec2 component of our own.
1. onInit
When a component starts for the first time, or starts again for the first time after a reset, SimpleC2Component calls the subclass's onInit method to create and initialize the decoder.
C2SoftHevcDec relies on the external static library libhevcdec. We will not look into that library in detail, and most of the descriptions of its API calls in this article are my own educated guesses.
c2_status_t C2SoftHevcDec::onInit() {
    // 1. Create and initialize the decoder
    status_t err = initDecoder();
    return err == OK ? C2_OK : C2_CORRUPTED;
}

status_t C2SoftHevcDec::initDecoder() {
    // 2. Create the decoder instance
    if (OK != createDecoder()) return UNKNOWN_ERROR;
    mNumCores = MIN(getCpuCoreCount(), MAX_NUM_CORES);
    // 3. Compute the stride from the width
    mStride = ALIGN128(mWidth);
    mSignalledError = false;
    resetPlugin();
    (void) setNumCores();
    // 4. Put the decoder in IVD_DECODE_FRAME mode and pass the stride to the decoder
    if (OK != setParams(mStride, IVD_DECODE_FRAME)) return UNKNOWN_ERROR;
    (void) getVersion();
    return OK;
}
The onInit method calls createDecoder to create the decoder, then sets the decoder to IVD_DECODE_FRAME, which I understand to be the mode for decoding whole data frames.
status_t C2SoftHevcDec::createDecoder() {
    ivdext_create_ip_t s_create_ip = {};
    ivdext_create_op_t s_create_op = {};

    s_create_ip.s_ivd_create_ip_t.u4_size = sizeof(ivdext_create_ip_t);
    s_create_ip.s_ivd_create_ip_t.e_cmd = IVD_CMD_CREATE;
    s_create_ip.s_ivd_create_ip_t.u4_share_disp_buf = 0;
    s_create_ip.s_ivd_create_ip_t.e_output_format = mIvColorformat;
    s_create_ip.s_ivd_create_ip_t.pf_aligned_alloc = ivd_aligned_malloc;
    s_create_ip.s_ivd_create_ip_t.pf_aligned_free = ivd_aligned_free;
    s_create_ip.s_ivd_create_ip_t.pv_mem_ctxt = nullptr;
    s_create_op.s_ivd_create_op_t.u4_size = sizeof(ivdext_create_op_t);
    // 1. Create the decoder instance
    IV_API_CALL_STATUS_T status = ivdec_api_function(mDecHandle,
                                                     &s_create_ip,
                                                     &s_create_op);
    if (status != IV_SUCCESS) {
        ALOGE("error in %s: 0x%x", __func__,
              s_create_op.s_ivd_create_op_t.u4_error_code);
        return UNKNOWN_ERROR;
    }
    // 2. Store the returned handle
    mDecHandle = (iv_obj_t*)s_create_op.s_ivd_create_op_t.pv_handle;
    mDecHandle->pv_fxns = (void *)ivdec_api_function;
    mDecHandle->u4_size = sizeof(iv_obj_t);
    return OK;
}
ivdext_create_ip_t is the input argument, with the command set to IVD_CMD_CREATE; ivdext_create_op_t is the output argument. The mDecHandle passed to ivdec_api_function is still NULL at this point; once the call returns, the handle stored in the output argument is assigned to mDecHandle, and every subsequent operation on the decoder instance goes through this handle.
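Every one of those subsequent calls follows the same input/output-struct pattern. setParams, which initDecoder uses above but whose body is not shown, is a good example. The sketch below is based on how the AOSP soft decoders drive the ivd control API (field names come from the libhevc ivd headers); treat it as an approximation rather than the verbatim C2SoftHevcDec source:
status_t C2SoftHevcDec::setParams(size_t stride, IVD_VIDEO_DECODE_MODE_T dec_mode) {
    // Sketch: configure the decode mode and display stride through the ivd control API.
    ivd_ctl_set_config_ip_t s_set_dyn_params_ip = {};
    ivd_ctl_set_config_op_t s_set_dyn_params_op = {};

    s_set_dyn_params_ip.u4_size = sizeof(ivd_ctl_set_config_ip_t);
    s_set_dyn_params_ip.e_cmd = IVD_CMD_VIDEO_CTL;          // control command ...
    s_set_dyn_params_ip.e_sub_cmd = IVD_CMD_CTL_SETPARAMS;  // ... sub-command: set params
    s_set_dyn_params_ip.u4_disp_wd = (UWORD32) stride;      // display width == stride
    s_set_dyn_params_ip.e_frm_skip_mode = IVD_SKIP_NONE;
    s_set_dyn_params_ip.e_frm_out_mode = IVD_DISPLAY_FRAME_OUT;
    s_set_dyn_params_ip.e_vid_dec_mode = dec_mode;          // IVD_DECODE_FRAME or IVD_DECODE_HEADER
    s_set_dyn_params_op.u4_size = sizeof(ivd_ctl_set_config_op_t);

    // Every call after creation goes through the handle obtained from IVD_CMD_CREATE.
    IV_API_CALL_STATUS_T status = ivdec_api_function(mDecHandle,
                                                     &s_set_dyn_params_ip,
                                                     &s_set_dyn_params_op);
    if (status != IV_SUCCESS) {
        ALOGE("error in %s: 0x%x", __func__, s_set_dyn_params_op.u4_error_code);
        return UNKNOWN_ERROR;
    }
    return OK;
}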
After init comes start, but SimpleC2Component does not provide an onStart method; in other words, once onInit finishes, the component is already in the running state.
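To connect this back to the goal stated at the top (knowing how to implement our own component), here is a deliberately minimal, hypothetical sketch of the onInit contract as seen from a SimpleC2Component subclass. C2MySoftDec, MyDecCtx and my_dec_create are invented names used purely for illustration and do not exist in AOSP:
// Hypothetical sketch only: MyDecCtx and my_dec_create() stand in for whatever
// decoder library your own component wraps.
struct MyDecCtx;                                   // opaque decoder context
MyDecCtx *my_dec_create(uint32_t w, uint32_t h);   // placeholder factory function

c2_status_t C2MySoftDec::onInit() {
    // Create and fully configure the underlying decoder here. SimpleC2Component
    // has no onStart(), so once onInit() returns C2_OK the component is treated
    // as running and process() may be called at any time.
    mDecCtx = my_dec_create(mWidth, mHeight);
    return (mDecCtx != nullptr) ? C2_OK : C2_CORRUPTED;
}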
2. process
Once the component has started it can begin processing data, so next we look at the process method. The code is fairly long, so we will read it in segments:
void C2SoftHevcDec::process(
        const std::unique_ptr<C2Work> &work,
        const std::shared_ptr<C2BlockPool> &pool) {
    // 1. Initialize the output fields of the C2Work
    work->result = C2_OK;
    work->workletsProcessed = 0u;
    work->worklets.front()->output.configUpdate.clear();
    work->worklets.front()->output.flags = work->input.flags;
    // 2. If the decoder has hit an error, or output EOS has already been signalled, return an error
    if (mSignalledError || mSignalledOutputEos) {
        work->result = C2_BAD_VALUE;
        return;
    }

    size_t inOffset = 0u;
    size_t inSize = 0u;
    // 3. Get the input frameIndex
    uint32_t workIndex = work->input.ordinal.frameIndex.peeku() & 0xFFFFFFFF;
    C2ReadView rView = mDummyReadView;
    if (!work->input.buffers.empty()) {
        // 4. Get the read view
        rView = work->input.buffers[0]->data().linearBlocks().front().map().get();
        // 5. Read the capacity
        inSize = rView.capacity();
        if (inSize && rView.error()) {
            ALOGE("read view map failed %d", rView.error());
            work->result = rView.error();
            return;
        }
    }
    // 6. Check whether the input carries the EOS flag
    bool eos = ((work->input.flags & C2FrameData::FLAG_END_OF_STREAM) != 0);
    bool hasPicture = false;
    size_t inPos = 0;
    // ......
}
Calling process requires a reference to a C2Work and a reference to the output C2BlockPool. The C2Work here is both an input and an output parameter: the output-related information is recorded into it, and it eventually carries the decoding result back to CCodec.
- First, initialize the output fields of the C2Work: result defaults to C2_OK and records any problem encountered while processing the data; workletsProcessed defaults to 0, meaning the data of this C2Work has not been processed yet; worklets is a list, but in practice only a single output is stored in it, so front() is used to get the first C2Worklet. For more on C2Work, see Android Codec2(二六)C2Work;
- If the decoder has hit an error, or output EOS has already been signalled, set work->result to C2_BAD_VALUE;
- Get the input frameIndex carried by the C2Work;
- Get the C2ReadView from the C2Buffer; for the mapping process, see Android Codec2(二十)C2Buffer与Codec2Buffer and Android Codec2(十九)C2LinearBlock.
Next comes a while loop that processes the input data:
while (inPos < inSize) {
    if (C2_OK != ensureDecoderState(pool)) {
        mSignalledError = true;
        work->workletsProcessed = 1u;
        work->result = C2_CORRUPTED;
        return;
    }
    // ......
}
inPos is the current cursor into the data and inSize is the data length. The while loop first calls ensureDecoderState:
c2_status_t C2SoftHevcDec::ensureDecoderState(const std::shared_ptr<C2BlockPool> &pool) {
    if (!mDecHandle) {
        ALOGE("not supposed to be here, invalid decoder context");
        return C2_CORRUPTED;
    }

    if (mOutBlock &&
            (mOutBlock->width() != ALIGN128(mWidth) || mOutBlock->height() != mHeight)) {
        mOutBlock.reset();
    }
    if (!mOutBlock) {
        uint32_t format = HAL_PIXEL_FORMAT_YV12;
        C2MemoryUsage usage = { C2MemoryUsage::CPU_READ, C2MemoryUsage::CPU_WRITE };
        c2_status_t err =
            pool->fetchGraphicBlock(ALIGN128(mWidth), mHeight, format, usage, &mOutBlock);
        if (err != C2_OK) {
            ALOGE("fetchGraphicBlock for Output failed with status %d", err);
            return err;
        }
        ALOGV("provided (%dx%d) required (%dx%d)",
              mOutBlock->width(), mOutBlock->height(), ALIGN128(mWidth), mHeight);
    }
    return C2_OK;
}
ensureDecoderState calls the C2BlockPool's fetchGraphicBlock method and stores the obtained C2GraphicBlock in mOutBlock. For the fetch process, see Android Codec2(二三)C2BufferQueueBlockPool - Ⅱ.
// 1. Map the output block to get a writable graphic view
C2GraphicView wView = mOutBlock->map().get();
if (wView.error()) {
    ALOGE("graphic view map failed %d", wView.error());
    work->result = wView.error();
    return;
}

ihevcd_cxa_video_decode_ip_t s_hevcd_decode_ip = {};
ihevcd_cxa_video_decode_op_t s_hevcd_decode_op = {};
ivd_video_decode_ip_t *ps_decode_ip = &s_hevcd_decode_ip.s_ivd_video_decode_ip_t;
ivd_video_decode_op_t *ps_decode_op = &s_hevcd_decode_op.s_ivd_video_decode_op_t;
// 2. Prepare the arguments for the decode call
if (!setDecodeArgs(ps_decode_ip, ps_decode_op, &rView, &wView,
                   inOffset + inPos, inSize - inPos, workIndex)) {
    // 3. Preparation failed
    mSignalledError = true;
    work->workletsProcessed = 1u;
    work->result = C2_CORRUPTED;
    return;
}
- First, map the C2GraphicBlock (mOutBlock) to obtain a writable C2GraphicView; for the mapping process, see Android Codec2(二四)C2GraphicBlock;
- Call setDecodeArgs to prepare for processing the input data;
- If the call returns an error, set mSignalledError to true, set workletsProcessed to 1 to indicate this work item has been processed, and set result to C2_CORRUPTED to indicate that an error occurred.
bool C2SoftHevcDec::setDecodeArgs(ivd_video_decode_ip_t *ps_decode_ip,
                                  ivd_video_decode_op_t *ps_decode_op,
                                  C2ReadView *inBuffer,
                                  C2GraphicView *outBuffer,
                                  size_t inOffset,
                                  size_t inSize,
                                  uint32_t tsMarker) {
    uint32_t displayStride = mStride;
    if (outBuffer) {
        C2PlanarLayout layout;
        layout = outBuffer->layout();
        displayStride = layout.planes[C2PlanarLayout::PLANE_Y].rowInc;
    }
    uint32_t displayHeight = mHeight;
    size_t lumaSize = displayStride * displayHeight;
    size_t chromaSize = lumaSize >> 2;

    if (mStride != displayStride) {
        mStride = displayStride;
        if (OK != setParams(mStride, IVD_DECODE_FRAME)) return false;
    }

    ps_decode_ip->u4_size = sizeof(ihevcd_cxa_video_decode_ip_t);
    ps_decode_ip->e_cmd = IVD_CMD_VIDEO_DECODE;
    if (inBuffer) {
        // 1. Set the frameIndex, the input data address, and the input buffer size
        ps_decode_ip->u4_ts = tsMarker;
        ps_decode_ip->pv_stream_buffer = const_cast<uint8_t *>(inBuffer->data() + inOffset);
        ps_decode_ip->u4_num_Bytes = inSize;
    } else {
        ps_decode_ip->u4_ts = 0;
        ps_decode_ip->pv_stream_buffer = nullptr;
        ps_decode_ip->u4_num_Bytes = 0;
    }
    // 2. Set the output buffer sizes and addresses
    ps_decode_ip->s_out_buffer.u4_min_out_buf_size[0] = lumaSize;
    ps_decode_ip->s_out_buffer.u4_min_out_buf_size[1] = chromaSize;
    ps_decode_ip->s_out_buffer.u4_min_out_buf_size[2] = chromaSize;
    if (outBuffer) {
        if (outBuffer->height() < displayHeight) {
            ALOGE("Output buffer too small: provided (%dx%d) required (%ux%u)",
                  outBuffer->width(), outBuffer->height(), displayStride, displayHeight);
            return false;
        }
        ps_decode_ip->s_out_buffer.pu1_bufs[0] = outBuffer->data()[C2PlanarLayout::PLANE_Y];
        ps_decode_ip->s_out_buffer.pu1_bufs[1] = outBuffer->data()[C2PlanarLayout::PLANE_U];
        ps_decode_ip->s_out_buffer.pu1_bufs[2] = outBuffer->data()[C2PlanarLayout::PLANE_V];
    } else {
        ps_decode_ip->s_out_buffer.pu1_bufs[0] = mOutBufferFlush;
        ps_decode_ip->s_out_buffer.pu1_bufs[1] = mOutBufferFlush + lumaSize;
        ps_decode_ip->s_out_buffer.pu1_bufs[2] = mOutBufferFlush + lumaSize + chromaSize;
    }
    ps_decode_ip->s_out_buffer.u4_num_bufs = 3;
    ps_decode_op->u4_size = sizeof(ihevcd_cxa_video_decode_op_t);
    ps_decode_op->u4_output_present = 0;
    return true;
}
setDecodeArgs initializes the arguments for the decoder API call: it writes the input buffer address, offset, and data length into the input argument ivd_video_decode_ip_t, and it also records the output buffer addresses in the s_out_buffer field of that same input argument.
if (false == mHeaderDecoded) {
    /* Decode header and get dimensions */
    setParams(mStride, IVD_DECODE_HEADER);
}
If the HEVC stream header has not been decoded yet, the decoder must first be given the IVD_DECODE_HEADER parameter so that it decodes the header. Then we move on to the actual data processing:
mTimeStart = systemTime();
nsecs_t delay = mTimeStart - mTimeEnd;
// 1. Decode the data
(void) ivdec_api_function(mDecHandle, ps_decode_ip, ps_decode_op);
mTimeEnd = systemTime();
nsecs_t decodeTime = mTimeEnd - mTimeStart;
ALOGV("decodeTime=%6" PRId64 " delay=%6" PRId64 " numBytes=%6d", decodeTime, delay,
      ps_decode_op->u4_num_bytes_consumed);
// 2. Check the decode result
if (IVD_MEM_ALLOC_FAILED == (ps_decode_op->u4_error_code & IVD_ERROR_MASK)) {
    ALOGE("allocation failure in decoder");
    mSignalledError = true;
    work->workletsProcessed = 1u;
    work->result = C2_CORRUPTED;
    return;
} else if (IVD_STREAM_WIDTH_HEIGHT_NOT_SUPPORTED ==
           (ps_decode_op->u4_error_code & IVD_ERROR_MASK)) {
    ALOGE("unsupported resolution : %dx%d", mWidth, mHeight);
    mSignalledError = true;
    work->workletsProcessed = 1u;
    work->result = C2_CORRUPTED;
    return;
// 3. IVD_RES_CHANGED: the resolution has changed
} else if (IVD_RES_CHANGED == (ps_decode_op->u4_error_code & IVD_ERROR_MASK)) {
    ALOGV("resolution changed");
    drainInternal(DRAIN_COMPONENT_NO_EOS, pool, work);
    resetDecoder();
    resetPlugin();
    work->workletsProcessed = 0u;
    /* Decode header and get new dimensions */
    setParams(mStride, IVD_DECODE_HEADER);
    (void) ivdec_api_function(mDecHandle, ps_decode_ip, ps_decode_op);
} else if (IS_IVD_FATAL_ERROR(ps_decode_op->u4_error_code)) {
    ALOGE("Fatal error in decoder 0x%x", ps_decode_op->u4_error_code);
    mSignalledError = true;
    work->workletsProcessed = 1u;
    work->result = C2_CORRUPTED;
    return;
}
The system time is sampled before and after the ivdec_api_function call, which gives us the decoding time. After decoding, the result is judged from u4_error_code in the ivd_video_decode_op_t output argument; if an error is returned, workletsProcessed is set to 1 and result is set to C2_CORRUPTED.
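The rest of the loop body is elided in this article, but it is worth noting how the loop makes progress: the cursor is advanced by the number of bytes the decoder reports as consumed, and a decoded picture, if one is present, is handed off to finishWork. Roughly, as a simplified sketch rather than the complete AOSP code:
// Simplified sketch of how the loop makes progress (the real code also updates the
// picture size after a header decode, tracks hasPicture, and handles EOS/flush paths).
inPos += ps_decode_op->u4_num_bytes_consumed;   // skip past the bytes the decoder consumed
if (ps_decode_op->u4_output_present) {
    // u4_ts carries back the workIndex that setDecodeArgs stored in u4_ts
    finishWork(ps_decode_op->u4_ts, work);
}
// the loop then re-evaluates inPos < inSize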
3. finishWork
4. drainInternal
5. drain
6. others
Read the original article:
Android Codec2(三十)C2SoftHevcDec - Ⅱ