1. Video Encoding with OpenH264

Encoding Process

Initializing ISVCEncoder

ISVCEncoder* encoder_;
int rv = WelsCreateSVCEncoder(&encoder_);
if (0 != rv || !encoder_) {
        //error
}

ISVCEncoder provides a set of encoding-related interfaces, such as initializing the encoder, setting encoding parameters, and encoding frames.
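
As a quick orientation, a rough sketch of the overall call sequence used in the rest of this section (error handling omitted):

ISVCEncoder* encoder = nullptr;
WelsCreateSVCEncoder(&encoder);     // create the encoder
SEncParamBase param;                // basic parameters (SEncParamExt + InitializeExt also works)
// ... fill in param as shown below ...
encoder->Initialize(&param);        // initialize with the chosen parameters
// for each frame: encoder->EncodeFrame(&pic, &info);
encoder->Uninitialize();            // tear down
WelsDestroySVCEncoder(encoder);     // release the encoder object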

Setting Encoding Parameters

OpenH264 provides two structs for setting encoding parameters: SEncParamBase and SEncParamExt. SEncParamBase exposes only the most basic parameters; its definition is as follows:

typedef struct TagEncParamBase {
  EUsageType
  iUsageType;                 ///< application type;1.CAMERA_VIDEO_REAL_TIME:camera video signal; 2.SCREEN_CONTENT_REAL_TIME:screen content signal;

  int       iPicWidth;        ///< width of picture in luminance samples (the maximum of all layers if multiple spatial layers presents)
  int       iPicHeight;       ///< height of picture in luminance samples((the maximum of all layers if multiple spatial layers presents)
  int       iTargetBitrate;   ///< target bitrate desired, in unit of bps
  RC_MODES  iRCMode;          ///< rate control mode
  float     fMaxFrameRate;    ///< maximal input frame rate
} SEncParamBase, *PEncParamBase;

The iUsageType field specifies the application type, one of:

typedef enum {
  CAMERA_VIDEO_REAL_TIME,      ///< camera video for real-time communication
  SCREEN_CONTENT_REAL_TIME,    ///< screen content signal
  CAMERA_VIDEO_NON_REAL_TIME
} EUsageType;

iRCMode specifies the rate-control mode; OpenH264 provides the following modes:

typedef enum {
  RC_QUALITY_MODE = 0,     ///< quality mode
  RC_BITRATE_MODE = 1,     ///< bitrate mode
  RC_BUFFERBASED_MODE = 2, ///< no bitrate control,only using buffer status,adjust the video quality
  RC_TIMESTAMP_MODE = 3,   ///< rate control based on timestamp
  RC_BITRATE_MODE_POST_SKIP = 4, ///< this is in-building RC MODE, WILL BE DELETED after algorithm tuning!
  RC_OFF_MODE = -1,         ///< rate control off mode
} RC_MODES;

An example of initializing the encoder:

SEncParamBase paramBase;
paramBase.iPicWidth = width_;
paramBase.iPicHeight = height_;
paramBase.fMaxFrameRate = fps_;
paramBase.iTargetBitrate = 10 * width_ * height_;
paramBase.iUsageType = CAMERA_VIDEO_REAL_TIME;
paramBase.iRCMode = RC_BITRATE_MODE;
int ret = encoder_->Initialize(&paramBase);
if (0 != ret) {
        //error
}

Encoding

Encoding uses two structs: SSourcePicture, which holds the picture data to be encoded, and SFrameBSInfo, which receives the encoded output.
The definition of SSourcePicture:

typedef struct Source_Picture_s {
  int       iColorFormat;          ///< color space type
  int       iStride[4];            ///< stride for each plane pData
  unsigned char*  pData[4];        ///< plane pData
  int       iPicWidth;             ///< luma picture width in x coordinate
  int       iPicHeight;            ///< luma picture height in y coordinate
  long long uiTimeStamp;           ///< timestamp of the source picture, unit: millisecond
} SSourcePicture;
  • iColorFormat: the color space type, e.g. videoFormatI420;
  • iStride: the stride of each plane; for background on planes and strides, see explanations of stride and plane for YUV images;
  • pData: pointers to each plane.

Before encoding, SFrameBSInfo only needs to be zeroed with memset. It is defined as follows:
typedef struct {
  int           iLayerNum;
  SLayerBSInfo  sLayerInfo[MAX_LAYER_NUM_OF_FRAME];

  EVideoFrameType eFrameType;
  int iFrameSizeInBytes;
  long long uiTimeStamp;
} SFrameBSInfo, *PFrameBSInfo;

typedef struct {
  unsigned char uiTemporalId;
  unsigned char uiSpatialId;
  unsigned char uiQualityId;
  EVideoFrameType eFrameType;
  unsigned char uiLayerType;

  /**
   * The sub sequence layers are ordered hierarchically based on their dependency on each other so that any picture in a layer shall not be
   * predicted from any picture on any higher layer.
  */
  int   iSubSeqId;                ///< refer to D.2.11 Sub-sequence information SEI message semantics
  int   iNalCount;              ///< count number of NAL coded already
  int*  pNalLengthInByte;       ///< length of NAL size in byte from 0 to iNalCount-1
  unsigned char*  pBsBuf;       ///< buffer of bitstream contained
} SLayerBSInfo, *PLayerBSInfo;

The SFrameBSInfo struct is more involved; its use is explained below.

The encoding flow for I420 data:

  SSourcePicture pic;
  memset(&pic, 0, sizeof(pic));
  pic.iPicWidth = width_;
  pic.iPicHeight = height_;
  pic.iColorFormat = videoFormatI420;
  pic.iStride[0] = pic.iPicWidth;
  pic.iStride[1] = pic.iStride[2] = pic.iPicWidth >> 1;
  pic.pData[0] = (unsigned char *) i420Buffer;
  pic.pData[1] = pic.pData[0] + width_ * height_;
  pic.pData[2] = pic.pData[1] + (width_ * height_ >> 2);
  SFrameBSInfo info;
  memset(&info, 0, sizeof(SFrameBSInfo));
  int rv = encoder_->EncodeFrame(&pic, &info);

  int retSize = 0;
  if (rv != cmResultSuccess) {
    // error info
    return retSize;
  }
  if (info.eFrameType != videoFrameTypeSkip) {
    int type = info.eFrameType;
    for (int i = 0; i < info.iLayerNum; ++i) {
      const SLayerBSInfo &layerInfo = info.sLayerInfo[i];
      int layerSize = 0;
      for (int j = 0; j < layerInfo.iNalCount; ++j) {
        layerSize += layerInfo.pNalLengthInByte[j];
      }
      memcpy((char *) (oBuf + retSize), (char *) layerInfo.pBsBuf, layerSize);
      retSize += layerSize;
    }
  }

Here i420Buffer points to the raw YUV data and oBuf points to the output H.264 stream buffer. As shown above, the process consists of a few steps:

  • Define SSourcePicture and SFrameBSInfo, and fill in SSourcePicture;
  • Encode;
  • Check whether encoding succeeded;
  • Check the frame type; if it is not a skipped frame, read out the encoded data.

The iLayerNum field of SFrameBSInfo is the number of layers produced for the frame. The encoded H.264 data is stored in the sLayerInfo array; within each layer, pBsBuf points to the encoded bitstream, and its length is the sum of the entries of the pNalLengthInByte array, whose element count is given by the layer's iNalCount member.

Releasing the Encoder

Call ISVCEncoder's Uninitialize function first, then call WelsDestroySVCEncoder.
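
For example (a minimal sketch, assuming encoder_ is the ISVCEncoder* created earlier with WelsCreateSVCEncoder):

if (encoder_) {
    encoder_->Uninitialize();          // release internal encoder resources
    WelsDestroySVCEncoder(encoder_);   // destroy the encoder object
    encoder_ = nullptr;
}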

Implementation

Defining the Interface

Before implementing the OpenH264 encoder, we first define a set of common video-encoder operations:

  • Open the encoder: Open
  • Close the encoder: Close
  • Configure the encoder: Configure or ReConfigure
  • Get the encoder configuration: GetConfig
  • Encode: Encode
  • Request a keyframe: ForceKeyframe

Based on these definitions, we can abstract a virtual base class in C++:

namespace toy {
	// Encoder base class
	class VideoEncoder {
	public:
		// Parameter definitions
		struct Setting {
			Setting() {
				fps = 15;
				frame_skip = false;
			}
			uint32_t width; // video width
			uint32_t height; // video height
			uint32_t bitrate_bps; // target bitrate
			uint32_t fps; // frame rate
			bool frame_skip; // whether frame skipping is enabled
			uint32_t qp; // QP value
		};
		
		// Static factory for creating an encoder
		static std::shared_ptr<VideoEncoder> Create(VideoCodecType type);

		virtual ~VideoEncoder() {}

		// Open the encoder
		virtual bool Open(const Setting& param) = 0;

		// Reconfigure the encoder
		virtual void ReConfigure(const Setting& param) = 0;

		// Return the encoder configuration
		virtual Setting GetConfig() = 0;

		// Encode (planar YUV input)
		virtual bool Encode(
			const uint8_t** yuv_data,// uint8_t* yuv_data[3]
			uint8_t* pkt,
			size_t& pkt_size,
			bool& is_keyframe,
			bool& got_output
		) = 0;

		// Encode (contiguous YUV input)
		virtual bool Encode(
			const uint8_t* yuv_data,
			uint8_t* pkt,
			size_t& pkt_size,
			bool& is_keyframe,
			bool& got_output
		) = 0;

		// Request a keyframe
		virtual void ForceKeyframe() = 0;

		// Close the encoder
		virtual void Close() = 0;
	};
}
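
The static Create factory is not implemented in the original article; a minimal sketch might look like the following, assuming a VideoCodecType enum that includes a kH264 value and the OpenH264Encoder class shown later:

std::shared_ptr<toy::VideoEncoder> toy::VideoEncoder::Create(VideoCodecType type) {
	switch (type) {
	case VideoCodecType::kH264:               // assumed enum value
		return std::make_shared<OpenH264Encoder>();
	default:
		return nullptr;
	}
}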

Some OpenH264 Concepts

  • ISVCEncoder: represents an encoder
  • TagEncParamExt: represents the encoder parameters
  • Source_Picture_s: represents a picture
  • Keyframe: an I-frame (the other two frame types are P-frames and B-frames); in real-time audio/video, usually only I-frames and P-frames are used
  • Group of Pictures (GOP): roughly speaking, the pictures between two keyframes form one GOP

Using OpenH264 for Video Encoding

A few notes:

  • This implementation does not use temporal layers (enable them if you need SVC) and does not use spatial layers (only a single resolution is encoded).
  • The GOP is set fairly large to save bitrate, since I-frames are usually large; when needed, the encoder detects scene changes on its own and encodes a keyframe, and you can also force an I-frame through the interface (see the sketch after this list).
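
These choices map to a handful of OpenH264 settings; a minimal sketch, assuming encoder_ and encode_param_ are set up as in the implementation below:

// Single spatial layer and no temporal layers.
encode_param_->iSpatialLayerNum = 1;
encode_param_->iTemporalLayerNum = 1;
// Large fixed GOP (this implementation uses 400 frames) to save bitrate.
encode_param_->uiIntraPeriod = 400;
// Let the encoder insert a keyframe when it detects a scene change.
encode_param_->bEnableSceneChangeDetect = true;
// Force an IDR frame on demand, e.g. when a receiver requests one.
encoder_->ForceIntraFrame(true);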

Header file:

// OpenH264Encoder
#ifndef __VIDEO_OPENH264_ENCODER_H__
#define __VIDEO_OPENH264_ENCODER_H__
#include "video_codec.h"

class ISVCEncoder;
struct TagEncParamExt;
struct Source_Picture_s;

class OpenH264Encoder :public toy::VideoEncoder
{
public:
	OpenH264Encoder();
	~OpenH264Encoder();

	// Open / initialize the encoder
	virtual bool Open(const Setting& param) override;

	// Close and release
	virtual void Close() override;

	// Encode
	virtual bool Encode(
		const uint8_t** yuv_data,
		uint8_t* pkt,
		size_t& pkt_size,
		bool& is_keyframe,
		bool& got_output
	) override;

	virtual bool Encode(
		const uint8_t* yuv_data,
		uint8_t* pkt,
		size_t& pkt_size,
		bool& is_keyframe,
		bool& got_output
	) override;

	// Adjust bitrate and other settings
	virtual void ReConfigure(const Setting& param) override;

	// Force an I-frame
	virtual void ForceKeyframe() override;

	virtual Setting GetConfig() override;

private:
	void SetOption(const Setting& param);
private:
	ISVCEncoder* encoder_;
	TagEncParamExt* encode_param_;
	Source_Picture_s* picture_;
	Setting setting_;
	uint8_t* picture_buffer_;
	bool is_open_;
	bool temporal_;
	unsigned int gop_size_;
	bool enable_fixed_gop_;
	uint64_t encoded_frame_count_;
	uint64_t timestamp_;
};

#endif // __VIDEO_OPENH264_ENCODER_H__

Source file implementation:

#include "openh264_encoder.h"
#include "openh264/wels/codec_api.h"
#include <cstdio>
#include <memory>

// Minimum and maximum bitrate bounds
#define kVideoBitrateMin 80000
#define kVideoBitrateMax 6000000

OpenH264Encoder::OpenH264Encoder()
{
	encoder_ = nullptr;
	encode_param_ = new SEncParamExt;
	picture_ = new SSourcePicture;
	picture_buffer_ = nullptr;
	is_open_ = false;
	temporal_ = false;
	gop_size_ = 400;
	enable_fixed_gop_ = true;
	encoded_frame_count_ = 0;
	timestamp_ = 0;
}

OpenH264Encoder::~OpenH264Encoder()
{
	Close();
	if (encode_param_)
	{
		delete encode_param_;
		encode_param_ = nullptr;
	}
	if (picture_)
	{
		delete picture_;
		picture_ = nullptr;
	}
}

void OpenH264Encoder::Close()
{
	if (encoder_) {
		encoder_->Uninitialize();
		WelsDestroySVCEncoder(encoder_);
		encoder_ = NULL;
	}
	if (picture_buffer_)
	{
		delete[] picture_buffer_;
		picture_buffer_ = nullptr;
	}
	is_open_ = false;
}
bool OpenH264Encoder::Open(const Setting& param){
	Close();
	// Create the encoder object
	int err = WelsCreateSVCEncoder(&encoder_);
	if (cmResultSuccess != err) {
		Close();
		return false;
	}
	temporal_ = false;

	// Get the default parameters
	encoder_->GetDefaultParams(encode_param_);
	// Complexity
	ECOMPLEXITY_MODE complexityMode = HIGH_COMPLEXITY;
	// Rate-control mode
	RC_MODES rc_mode = RC_BITRATE_MODE;
	//int iMaxQp = qp_;
	//int iMinQp = 0;
	bool bEnableAdaptiveQuant = false;

	rc_mode = RC_BITRATE_MODE;//RC_QUALITY_MODE

	// Other parameters: resolution, bitrate, frame rate, etc.
	encode_param_->iPicWidth = param.width;
	encode_param_->iPicHeight = param.height;
	encode_param_->iTargetBitrate = param.bitrate_bps;
	encode_param_->iMaxBitrate = kVideoBitrateMax;
	encode_param_->bEnableAdaptiveQuant = bEnableAdaptiveQuant;
	encode_param_->iRCMode = rc_mode;
	encode_param_->fMaxFrameRate = param.fps;
	encode_param_->iComplexityMode = complexityMode;
	encode_param_->iNumRefFrame = -1;
	encode_param_->bEnableFrameSkip = param.frame_skip;
	encode_param_->eSpsPpsIdStrategy = CONSTANT_ID;

	encode_param_->iEntropyCodingModeFlag = 0;//1;
	//encode_param_->bEnablePsnr = false;
	encode_param_->bEnableSSEI = true;
	encode_param_->bEnableSceneChangeDetect = true;

	// Set the QP range as needed; a larger QP means a lower bitrate (and worse image quality)
	encode_param_->iMaxQp = 38;
	encode_param_->iMinQp = 16;
	// Temporal layering is not used for now
	if (temporal_)
	{
		encode_param_->iTemporalLayerNum = 4;
	}

	SetOption(param);

	// Spatial layer
	SSpatialLayerConfig *spaticalLayerCfg = &(encode_param_->sSpatialLayers[SPATIAL_LAYER_0]);
	spaticalLayerCfg->iVideoWidth = param.width;
	spaticalLayerCfg->iVideoHeight = param.height;
	spaticalLayerCfg->fFrameRate = param.fps;
	spaticalLayerCfg->iSpatialBitrate = encode_param_->iTargetBitrate;
	spaticalLayerCfg->iMaxSpatialBitrate = encode_param_->iMaxBitrate;
	//spaticalLayerCfg->uiProfileIdc= PRO_SCALABLE_BASELINE;
	
	// Single slice; some decoders cannot decode multiple slices
	spaticalLayerCfg->sSliceArgument.uiSliceMode = SM_SINGLE_SLICE;

	// Initialize
	if (cmResultSuccess != (err = encoder_->InitializeExt(encode_param_))) {
		Close();
		return false;
	}
	
	// Log level
	int log_level = WELS_LOG_ERROR;
	encoder_->SetOption(ENCODER_OPTION_TRACE_LEVEL, &log_level);

	// Initialize the picture object
	memset(picture_, 0, sizeof(SSourcePicture));
	picture_->iPicWidth = param.width;
	picture_->iPicHeight = param.height;
	// Only YUV420P (I420) input is supported
	picture_->iColorFormat = videoFormatI420;
	picture_->iStride[0] = picture_->iPicWidth;
	picture_->iStride[1] = picture_->iStride[2] = picture_->iPicWidth >> 1;
	//yuvData = CFDataCreateMutable(kCFAllocatorDefault, inputSize.width * inputSize.height * 3 >> 1);
	picture_buffer_ = new uint8_t[picture_->iPicWidth * picture_->iPicHeight * 3 >> 1];
	picture_->pData[0] = (unsigned char*)picture_buffer_;//CFDataGetMutableBytePtr(yuvData);
	picture_->pData[1] = picture_->pData[0] + picture_->iPicWidth * picture_->iPicHeight;
	picture_->pData[2] = picture_->pData[1] + (picture_->iPicWidth * picture_->iPicHeight >> 2);

	is_open_ = true;

	setting_ = param;

	return true;
}

void OpenH264Encoder::SetOption(const Setting& param) {
	// Bitrate change
	if (param.bitrate_bps != setting_.bitrate_bps) {

		setting_.bitrate_bps = param.bitrate_bps;

		encode_param_->iTargetBitrate = setting_.bitrate_bps;
		encode_param_->sSpatialLayers[SPATIAL_LAYER_0].iSpatialBitrate = setting_.bitrate_bps;

		SBitrateInfo bitrate_info;
		bitrate_info.iLayer = SPATIAL_LAYER_0;
		bitrate_info.iBitrate = setting_.bitrate_bps;
		int err = encoder_->SetOption(ENCODER_OPTION_BITRATE, &bitrate_info);
		if (err != 0) {
			// TODO
		}
	}

	// Frame-rate change
	if (param.fps != setting_.fps) {
		int err = 0;

		setting_.fps = param.fps;

		// Note: if no fixed I-frame interval is set, by default one I-frame is output every fps frames (about once per second)
		int uiIntraPeriod = param.fps;

		if (temporal_)
		{
			uiIntraPeriod = 16;
			if (param.fps > 16)
			{
				uiIntraPeriod = 32;
			}
		}

		// If a fixed I-frame interval is enabled, use the default interval (400 frames by default)
		if (enable_fixed_gop_)
		{
			uiIntraPeriod = gop_size_;
		}

		SSpatialLayerConfig *spaticalLayerCfg = &(encode_param_->sSpatialLayers[SPATIAL_LAYER_0]);

		// Set the I-frame interval
		if (encode_param_->uiIntraPeriod != uiIntraPeriod)
		{
			encode_param_->uiIntraPeriod = uiIntraPeriod;
			err = encoder_->SetOption(ENCODER_OPTION_IDR_INTERVAL, &uiIntraPeriod);
			if (err != 0)
			{
				printf("Apply new video idr interval %d code %d\n", uiIntraPeriod, err);
			}
		}

		// Set the frame rate
		if (spaticalLayerCfg->fFrameRate != param.fps)
		{
			float fps = param.fps;
			err = encoder_->SetOption(ENCODER_OPTION_FRAME_RATE, (void*)&fps);
			if (err != 0)
			{
				printf("Apply new video frame rate %d code %d\n", (int)param.fps, err);
			}
		}

		encode_param_->uiIntraPeriod = uiIntraPeriod;
		spaticalLayerCfg->fFrameRate = setting_.fps;
	}

	// Frame skipping
	if (encode_param_->bEnableFrameSkip != param.frame_skip)
	{
		setting_.frame_skip = param.frame_skip;

		int err = encoder_->SetOption(ENCODER_OPTION_RC_FRAME_SKIP, &setting_.frame_skip);
		if (err != 0)
		{
			printf("SetFrameSkip %d code %d \r\n", setting_.frame_skip, err);
			return;
		}
		encode_param_->bEnableFrameSkip = setting_.frame_skip;
	}

	// TODO:QP
}

void OpenH264Encoder::ReConfigure(const Setting& param) {
	if (is_open_ == false) {
		Open(param);
		return;
	}

	// The resolution changed; restart the encoder
	if (param.width != setting_.width || param.height != setting_.height) {
		Close();
		if (Open(param) == false) {
			// TODO:
		}
		return;
	}

	SetOption(param);

	setting_ = param;

	return ;
}

bool OpenH264Encoder::Encode( 
	const uint8_t** yuv_data, 
	uint8_t* pkt, 
	size_t& pkt_size, 
	bool& is_keyframe, 
	bool& got_output )
{
	got_output = false;
	pkt_size = 0;

	if (!is_open_) {
		return false;
	}

	if (!encoder_)
	{
		return false;
	}

	encoded_frame_count_++;

	picture_->uiTimeStamp = timestamp_++; 

	int y_size = picture_->iPicWidth * picture_->iPicHeight;
	memcpy(picture_->pData[0] + 0,				yuv_data[0], y_size);
	memcpy(picture_->pData[0] + y_size,			yuv_data[1], y_size / 4);
	memcpy(picture_->pData[0] + y_size * 5 / 4, yuv_data[2], y_size / 4);

	int iFrameSize = 0;

	SFrameBSInfo encoded_frame_info;
	memset(&encoded_frame_info, 0, sizeof(SFrameBSInfo));

	int err = encoder_->EncodeFrame(picture_, &encoded_frame_info);

	if (err) {
		Close();
		return false;
	}

	if (encoded_frame_info.eFrameType == videoFrameTypeInvalid) {
		return false;
	}

	int temporal_id = 0;

	if (encoded_frame_info.eFrameType != videoFrameTypeSkip) {
		int iLayer = 0;
		while (iLayer < encoded_frame_info.iLayerNum) {
			SLayerBSInfo* pLayerBsInfo = &(encoded_frame_info.sLayerInfo[iLayer]);
			if (pLayerBsInfo != NULL) {
				int iLayerSize = 0;
				temporal_id = pLayerBsInfo->uiTemporalId;
				int iNalIdx = pLayerBsInfo->iNalCount - 1;
				do {
					iLayerSize += pLayerBsInfo->pNalLengthInByte[iNalIdx];
					--iNalIdx;
				} while (iNalIdx >= 0);
				memcpy(pkt + iFrameSize, pLayerBsInfo->pBsBuf, iLayerSize);
				iFrameSize += iLayerSize;
			}
			++iLayer;
		}

		got_output = true;
	}
	else {
#ifdef _DEBUG
		printf("!!!!videoFrameTypeSkip---!\n");
#endif
		is_keyframe = false;
	}
	if (iFrameSize > 0)
	{
		pkt_size = iFrameSize;

		EVideoFrameType ft_temp = encoded_frame_info.eFrameType;
		if (ft_temp == videoFrameTypeIDR || ft_temp == videoFrameTypeI)
		{
			is_keyframe = true;
		}
		else if (ft_temp == videoFrameTypeP)
		{
			is_keyframe = false;
			if (temporal_)
			{
				if (temporal_id == 0 || temporal_id == 1)
				{
					is_keyframe = true;
				}
			}
		}
		else
		{
			is_keyframe = false;
		}
	}

	return iFrameSize > 0;
}

bool OpenH264Encoder::Encode(
	const uint8_t* yuv_data,
	uint8_t* pkt,
	size_t& pkt_size,
	bool& is_keyframe,
	bool& got_output
)
{
	const uint8_t* yuv[3] = { 0 };
	if (yuv_data == NULL) {
		return Encode(yuv, pkt, pkt_size, is_keyframe, got_output);
	}
	else {
		int y_size = setting_.width * setting_.height;
		yuv[0] = yuv_data;
		yuv[1] = yuv_data + y_size;
		yuv[2] = yuv_data + y_size * 5 / 4;
		return Encode(yuv, pkt, pkt_size, is_keyframe, got_output);
	}

}

// Force an I-frame
void OpenH264Encoder::ForceKeyframe()
{
	if (!is_open_) {
		return;
	}
	
	encoder_->ForceIntraFrame(true);
}

toy::VideoEncoder::Setting OpenH264Encoder::GetConfig() {
	return setting_;
}

// Whether temporal layering is used
//bool OpenH264Encoder::IsTemporal()
//{
//	return temporal_used_;
//}

Source article links:
Explanation: https://www.jianshu.com/p/5208f37090a9
Usage: https://blog.youkuaiyun.com/NB_vol_1/article/details/103376649

OpenH264 is an open-source H.264 encoder and decoder released by Cisco.

Encoder features:

  • Constrained Baseline Profile up to Level 5.2 (4096x2304)
  • Arbitrary resolution, not constrained to multiples of 16x16
  • Rate control with adaptive quantization, or constant quantization
  • Slice options: 1 slice per frame, N slices per frame, N macroblocks per slice, or N bytes per slice
  • Multiple threads automatically used for multiple slices
  • Temporal scalability up to 4 layers in a dyadic hierarchy
  • Spatial simulcast up to 4 resolutions from a single input
  • Long Term Reference (LTR) frames
  • Memory Management Control Operation (MMCO)
  • Reference picture list modification
  • Single reference frame for inter prediction
  • Multiple reference frames when using LTR and/or 3-4 temporal layers
  • Periodic and on-demand Instantaneous Decoder Refresh (IDR) frame insertion
  • Dynamic changes to bit rate, frame rate, and resolution
  • Annex B byte stream output
  • YUV 4:2:0 planar input

Decoder features:

  • Constrained Baseline Profile up to Level 5.2 (4096x2304)
  • Arbitrary resolution, not constrained to multiples of 16x16
  • Single thread for all slices
  • Long Term Reference (LTR) frames
  • Memory Management Control Operation (MMCO)
  • Reference picture list modification
  • Multiple reference frames when specified in Sequence Parameter Set (SPS)
  • Annex B byte stream input
  • YUV 4:2:0 planar output

Supported operating systems:

  • Windows 64-bit and 32-bit (initial release is only 32-bit, 64-bit will follow soon)
  • Mac OS X 64-bit (initial release does not include this target, will follow soon)
  • Linux 64-bit and 32-bit (initial release is only 32-bit, 64-bit will follow soon)
  • Android 32-bit (initial release does not include this target, will follow soon)
  • iOS 64-bit and 32-bit (not supported yet, may be added in the future)

Supported processors:

  • Intel x86, optionally with MMX/SSE (no AVX yet, help is welcome)
  • ARMv7, optionally with NEON (initial release does not include this target, will follow later)
  • Any architecture using C/C++ fallback functions