GStreamer Usage Notes (iOS)
References
GitLab source repository
GStreamer iOS demo code
GStreamer iOS tutorial: best to download the sample code, run it yourself, and study the configuration.
GStreamer tutorial (Chinese)
GStreamer tutorial (Chinese, WeChat account 大话Linux): good for understanding the basic GStreamer concepts.
Environment Integration and Project Setup
Download and Installation
GStreamer does not support Swift Package Manager, CocoaPods, Carthage, or similar dependency managers.
The official distribution ships an iOS framework, but it has to be installed via a .pkg; several versions are offered for download. After downloading, double-click the package; if installation is blocked, go to System Settings -> Privacy & Security and allow it.
After a successful install the SDK lives at "$(HOME)/Library/Developer/GStreamer/iPhone.sdk". If you chose a different install path, adjust the configuration below accordingly.
Project Configuration
Add the GStreamer framework to the project: click +, choose Add Other…, and browse to the install path.
You also need to add supporting system libraries such as libiconv.2.
Build Settings
Search paths:
Framework Search Paths: "$(HOME)/Library/Developer/GStreamer/iPhone.sdk"
Header Search Paths: "$(HOME)/Library/Developer/GStreamer/iPhone.sdk/GStreamer.framework/Headers"
Other Linker Flags:
-lresolv
-lc++
-framework CoreFoundation
-framework Foundation
-framework AVFoundation
-framework CoreMedia
-framework CoreVideo
-framework CoreAudio
-framework AudioToolbox
-framework OpenGLES
-framework AssetsLibrary
-framework QuartzCore
-weak_framework VideoToolbox
-framework IOSurface
-framework Metal
Plugin Registration
GStreamer is built around plugins, so basic initialization and plugin registration are required.
Without it you will run into errors such as no element "appsrc".
gst_ios_init
The simplest approach is to copy gst_ios_init.h and gst_ios_init.m from the official demo and call gst_ios_init (ideally right after app launch, and in any case before GStreamer is first used).
Uncommenting macros in gst_ios_init registers the plugins for the corresponding features.
/*
 * Controls plugin registration: leave a macro uncommented if you need that feature.
 * GStreamer functionality ships as plugins; the macros below statically register the corresponding plugin groups.
 */
#define GST_IOS_PLUGINS_CORE // GStreamer core
#define GST_IOS_PLUGINS_CODECS // decoders
//#define GST_IOS_PLUGINS_ENCODING // encoding (e.g. recording, transcoding)
//#define GST_IOS_PLUGINS_NET // network streaming (e.g. RTSP, HLS)
#define GST_IOS_PLUGINS_PLAYBACK // playback
//#define GST_IOS_PLUGINS_VIS // audio visualization (e.g. spectrum displays)
#define GST_IOS_PLUGINS_SYS // system integration (OS-level sinks/sources)
#define GST_IOS_PLUGINS_EFFECTS // effects
//#define GST_IOS_PLUGINS_CAPTURE // capture from camera or microphone
//#define GST_IOS_PLUGINS_CODECS_GPL // GPL-licensed codecs (e.g. x264)
//#define GST_IOS_PLUGINS_CODECS_RESTRICTED // patent-encumbered codecs (e.g. H.264, AAC)
//#define GST_IOS_PLUGINS_NET_RESTRICTED // patent-encumbered network protocols
//#define GST_IOS_PLUGINS_GES // video editing (GStreamer Editing Services)
//#define GST_IOS_GIO_MODULE_GNUTLS
void gst_ios_init (void);
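For reference, a minimal call-site sketch (assuming gst_ios_init.h/.m from the official demo are already in the project):

#import "gst_ios_init.h"

// AppDelegate: register GStreamer and its statically linked plugins once,
// before any pipeline is built.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    gst_ios_init();
    return YES;
}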
Manual Registration
Besides gst_ios_init, plugins can also be registered manually; a fuller sketch follows below.
gst_init_check (NULL, NULL, NULL); // or gst_init
GST_PLUGIN_STATIC_DECLARE(opus); // declare the Opus plugin
GST_PLUGIN_STATIC_REGISTER(opus); // register the Opus plugin
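A slightly fuller sketch of manual registration, assuming the opus static library is linked (GST_PLUGIN_STATIC_DECLARE has to sit at file scope; the register macro runs after GStreamer is initialized):

#include <gst/gst.h>

// Declare the statically linked plugin at file scope.
GST_PLUGIN_STATIC_DECLARE(opus);

static void register_static_plugins(void) {
    GError *error = NULL;
    // Initialize the GStreamer library first (like gst_init, but reports failure).
    if (!gst_init_check(NULL, NULL, &error)) {
        g_printerr("gst_init_check failed: %s\n", error ? error->message : "unknown error");
        g_clear_error(&error);
        return;
    }
    // Register the Opus plugin with the default registry.
    GST_PLUGIN_STATIC_REGISTER(opus);
}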
Swift Usage
The original plan was to wrap the framework with SPM, but the framework ships without a module map, which makes that awkward.
For now it is wrapped in Objective-C instead.
Errors Encountered
no element "appsrc"
The relevant plugin has not been registered; follow the official gst_ios_init (or do the equivalent yourself).
With gst_ios_init already in place, other errors of the same kind are handled the same way, for example:
no element "decodebin": uncomment #define GST_IOS_PLUGINS_PLAYBACK.
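A quick runtime check for whether an element is actually registered, before building the pipeline (a sketch using the factory lookup API):

// If "decodebin" is missing, the matching GST_IOS_PLUGINS_* macro in
// gst_ios_init.h probably still needs to be uncommented.
GstElementFactory *factory = gst_element_factory_find("decodebin");
if (factory == NULL) {
    NSLog(@"decodebin is not registered - check the plugin macros in gst_ios_init.h");
} else {
    gst_object_unref(factory);
}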
_gst_plugin_y4m_register
Build error: "_gst_plugin_y4m_register", referenced from:
_gst_ios_init in gst_ios_init.o
Temporary workaround: comment out the y4m-related lines in gst_ios_init.
The plugin set presumably differs between GStreamer versions.
Undefined symbols
Undefined symbols errors came up many times during the build, mainly because the required system frameworks were not linked.
Error:
Undefined symbols for architecture arm64:
"_AudioComponentFindNext", referenced from:
_gst_core_audio_open_device in GStreamer[arm64][771](libgstosxaudio_a-gstosxcoreaudiocommon.c.o)
"_AudioComponentInstanceDispose", referenced from:
_gst_core_audio_close in GStreamer[arm64][767](libgstosxaudio_a-gstosxcoreaudio.c.o)
_OUTLINED_FUNCTION_4 in GStreamer[arm64][771](libgstosxaudio_a-gstosxcoreaudiocommon.c.o)
"_AudioComponentInstanceNew", referenced from:
_gst_core_audio_open_device in GStreamer[arm64][771](libgstosxaudio_a-gstosxcoreaudiocommon.c.o)
"_AudioOutputUnitStart", referenced from:
_gst_core_audio_io_proc_start in GStreamer[arm64][771](libgstosxaudio_a-gstosxcoreaudiocommon.c.o)
"_AudioOutputUnitStop", referenced from:
_gst_core_audio_io_proc_stop in GStreamer[arm64][771](libgstosxaudio_a-gstosxcoreaudiocommon.c.o)
"_AudioQueueAllocateBuffer", referenced from:
_gst_atdec_set_format in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_gst_atdec_handle_frame in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_gst_atdec_offline_render in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueDispose", referenced from:
_gst_atdec_destroy_queue in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueEnqueueBuffer", referenced from:
_gst_atdec_handle_frame in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueFlush", referenced from:
_gst_atdec_handle_frame in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueFreeBuffer", referenced from:
_gst_atdec_set_format in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_gst_atdec_set_format in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_gst_atdec_buffer_emptied in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_OUTLINED_FUNCTION_15 in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueNewOutput", referenced from:
_gst_atdec_set_format in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
"_AudioQueueOfflineRender", referenced from:
_gst_atdec_set_format in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
_gst_atdec_offline_render in GStreamer[arm64][95](libgstapplemedia_a-atdec.c.o)
The cause: flags such as -framework CoreFoundation were missing from Other Linker Flags in Build Settings, and the corresponding system frameworks had not been added to the project either.
Either approach works; pick one and add the missing system libraries.
Usage
Debug Logging
The value of GST_DEBUG has the format [category]:[level].
Debug levels:
// The higher the number, the more verbose the output.
0: NONE - no output
1: ERROR
2: WARNING
3: FIXME
4: INFO - shows state changes and the like; a good default for general troubleshooting
5: DEBUG - the usual level for detailed debugging
6: LOG - very verbose
7: TRACE
9: MEMDUMP - everything, including memory dumps; use with care
Setting it via the Xcode scheme
Add a GST_DEBUG environment variable under Scheme -> Arguments -> Environment Variables and set it to the desired value.
Examples:
4 or *:4: INFO-level logging for every category. A common setting that shows pipeline state changes, data flow, and similar details.
GST_PIPELINE:4,GST_BUFFER:5: detailed logs for just the pipeline and buffer categories, filtering out everything you do not care about.
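The same variable can also be set from code, which keeps the category:level syntax without editing the scheme; a sketch, on the assumption that it runs before gst_init/gst_ios_init (GStreamer reads GST_DEBUG at init time):

#include <stdlib.h>

// INFO for everything, DEBUG for the GST_BUFFER category.
// Must run before GStreamer is initialized.
setenv("GST_DEBUG", "*:4,GST_BUFFER:5", 1);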
Controlling it from code
After GStreamer has been initialized:
#import <gst/gst.h>
// In your initialization code, e.g. the AppDelegate or the init method of GStreamerOggOpusDecoder
void enableGStreamerDebug() {
// Enable DEBUG-level logging for all categories
gst_debug_set_default_threshold(GST_LEVEL_DEBUG);
// Or control individual categories more precisely:
// gst_debug_set_threshold_for_name("pipeline", GST_LEVEL_DEBUG);
// gst_debug_set_threshold_for_name("opusdec", GST_LEVEL_INFO);
}
// Call this when the application starts
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
// ...
if (gst_init_check(NULL, NULL, NULL)) {
enableGStreamerDebug(); // enable debugging
NSLog(@"GStreamer initialized with debug enabled.");
}
// ...
}
Player
.h
#import <Foundation/Foundation.h>
NS_ASSUME_NONNULL_BEGIN
@class GStreamerPlayer;
@protocol GStreamerPlayerDelegate <NSObject>
@optional
/// Called when playback actually started (pipeline entered PLAYING).
- (void)gstreamerPlayerDidStart:(GStreamerPlayer *)player;
/// Called when playback was paused (pipeline entered PAUSED or pause request returned).
- (void)gstreamerPlayerDidPause:(GStreamerPlayer *)player;
/// Called when playback was resumed (explicit resume/play called or pipeline entered PLAYING).
- (void)gstreamerPlayerDidResume:(GStreamerPlayer *)player;
/// Called when playback finished (EOS received and pipeline stopped).
- (void)gstreamerPlayerDidFinish:(GStreamerPlayer *)player;
/// Delivers decoded PCM data that is about to be played.
- (void)gstreamerPlayer:(GStreamerPlayer *)player willPlayData:(NSData *)pcmData;
/// Called when the player encounters an error.
- (void)gstreamerPlayer:(GStreamerPlayer *)player didFailWithError:(NSError *)error;
@end
@interface GStreamerPlayer : NSObject
@property (nonatomic, weak, nullable) id<GStreamerPlayerDelegate> delegate;
// MARK: - Public Methods
/// Initializes and configures the GStreamer pipeline. Must be called before any playback-control method.
/// - Parameter capsString: see ``GStreamerCapsGenerator`` for how to generate this string
- (BOOL)setupPipelineCaps:(NSString *)capsString;
/// Starts playback and runs the GMainLoop.
- (void)start;
/// Pauses the pipeline.
- (void)pause;
/// Resumes playback from the paused state.
- (void)play;
/// Pushes data to be played.
/// @param data the NSData object to push.
- (void)pushData:(NSData *)data;
/// Sends an end-of-stream signal to appsrc.
- (void)stopStream;
/// Stops the pipeline and releases all resources. This method is idempotent.
- (void)stop;
@end
NS_ASSUME_NONNULL_END
.m
#import "GStreamerPlayer.h"
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#import "gst_ios_init.h"
#import "GStreamerManager.h"
// Forward declarations for callbacks
gboolean busCallback(GstBus *bus, GstMessage *message, gpointer userData);
void onPadAdded(GstElement *element, GstPad *pad, gpointer userData);
static GstFlowReturn on_new_sample_from_sink(GstElement *appsink, gpointer user_data);
@interface GStreamerPlayer ()
@property (nonatomic, assign) GstElement *pipeline;
@property (nonatomic, assign) GstElement *appsrc;
@property (nonatomic, assign) GstBus *bus;
@property (nonatomic, assign) GMainLoop *loop;
@property (nonatomic, assign) guint busWatchId;
- (BOOL)handleMessage:(GstBus *)bus message:(GstMessage *)message;
@end
gboolean busCallback(GstBus *bus, GstMessage *message, gpointer userData) {
GStreamerPlayer *player = (__bridge GStreamerPlayer *)userData;
return (gboolean)[player handleMessage:bus message:message];
}
void onPadAdded(GstElement *element, GstPad *pad, gpointer userData) {
GStreamerPlayer *player = (__bridge GStreamerPlayer *)userData;
if (!player) return;
// Get audioconvert and link dynamically (same as before)
GstElement *audioconvert = gst_bin_get_by_name(GST_BIN(player.pipeline), "audioconvert");
if (!audioconvert) {
NSLog(@"Error: Could not find audioconvert element in onPadAdded.");
return;
}
GstPad *sinkpad = gst_element_get_static_pad(audioconvert, "sink");
if (gst_pad_is_linked(sinkpad)) {
gst_object_unref(sinkpad);
gst_object_unref(audioconvert);
return;
}
GstPadLinkReturn ret = gst_pad_link(pad, sinkpad);
if (GST_PAD_LINK_FAILED(ret)) {
NSLog(@"Failed to link pads!");
} else {
NSLog(@"Successfully linked decodebin pad to audioconvert.");
}
gst_object_unref(sinkpad);
gst_object_unref(audioconvert);
}
// appsink new-sample callback: called in GStreamer thread context
static GstFlowReturn on_new_sample_from_sink(GstElement *appsink, gpointer user_data) {
GStreamerPlayer *self = (__bridge GStreamerPlayer *)user_data;
if (!self) return GST_FLOW_ERROR;
GstSample *sample = gst_app_sink_pull_sample(GST_APP_SINK(appsink));
if (!sample) return GST_FLOW_OK;
GstBuffer *buffer = gst_sample_get_buffer(sample);
if (buffer) {
GstMapInfo map;
if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
// Copy the PCM bytes because we must unmap/unref immediately
NSData *pcm = [NSData dataWithBytes:map.data length:map.size];
gst_buffer_unmap(buffer, &map);
// Dispatch to main thread for delegate callbacks (UI safe)
dispatch_async(dispatch_get_main_queue(), ^{
// Use didPlayData/didRenderData (consistent with header)
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayer:willPlayData:)]) {
[self.delegate gstreamerPlayer:self willPlayData:pcm];
}
});
}
}
gst_sample_unref(sample);
return GST_FLOW_OK;
}
@implementation GStreamerPlayer
- (instancetype)init {
self = [super init];
if (self) {
//[GStreamerManager checkAutoGstInit];
_pipeline = NULL;
_appsrc = NULL;
_bus = NULL;
_loop = NULL;
_busWatchId = 0;
}
return self;
}
- (BOOL)setupPipelineCaps:(NSString *)capsString {
if (self.pipeline) {
NSLog(@"Warning: Pipeline already exists. Stopping and cleaning up the old one.");
[self stop];
}
GError *error = NULL;
// Use tee to branch to system sink and appsink for render callback
const char *pipelineStr =
"appsrc name=source ! queue ! decodebin name=decoder ! audioconvert name=audioconvert ! audioresample ! tee name=audio_t "
"audio_t. ! queue ! autoaudiosink "
"audio_t. ! queue ! appsink name=mysink";
self.pipeline = gst_parse_launch(pipelineStr, &error);
if (error) {
NSLog(@"Error: Failed to create pipeline: %s", error->message);
g_error_free(error);
return NO;
}
// get appsrc
self.appsrc = gst_bin_get_by_name(GST_BIN(self.pipeline), "source");
if (!self.appsrc) {
NSLog(@"Error: Failed to get appsrc element.");
gst_object_unref(self.pipeline);
self.pipeline = NULL;
return NO;
}
// set appsrc caps and properties
GstCaps *caps = gst_caps_from_string(capsString.UTF8String);
g_object_set(self.appsrc, "caps", caps, "format", GST_FORMAT_TIME, "is-live", TRUE, "min-percent", 10, NULL); // min-percent: emit need-data when the internal queue drops below 10% (default queue size is 200k)
gst_caps_unref(caps);
// connect decodebin pad-added
GstElement *decoder = gst_bin_get_by_name(GST_BIN(self.pipeline), "decoder");
if (!decoder) {
NSLog(@"Error: Failed to get decodebin element.");
// cleanup
if (self.appsrc) { gst_object_unref(self.appsrc); self.appsrc = NULL; }
gst_object_unref(self.pipeline);
self.pipeline = NULL;
return NO;
}
g_signal_connect(decoder, "pad-added", G_CALLBACK(onPadAdded), (__bridge void *)self);
gst_object_unref(decoder);
// Configure appsink
GstElement *appsink = gst_bin_get_by_name(GST_BIN(self.pipeline), "mysink");
if (!appsink) {
NSLog(@"Error: Failed to get appsink element.");
if (self.appsrc) { gst_object_unref(self.appsrc); self.appsrc = NULL; }
gst_object_unref(self.pipeline);
self.pipeline = NULL;
return NO;
}
// It's safer to accept raw audio; if you know exact format you can set more specific caps
GstCaps *sinkCaps = gst_caps_from_string("audio/x-raw");
g_object_set(appsink,
"emit-signals", TRUE,
"sync", FALSE,
"caps", sinkCaps,
NULL);
gst_caps_unref(sinkCaps);
// connect new-sample
g_signal_connect(appsink, "new-sample", G_CALLBACK(on_new_sample_from_sink), (__bridge void *)self);
gst_object_unref(appsink);
// Bus and main loop; save source id for removal later
self.loop = g_main_loop_new(NULL, FALSE);
self.bus = gst_element_get_bus(self.pipeline);
// gst_bus_add_watch returns a gulong/guint source id we must remove with g_source_remove
self.busWatchId = gst_bus_add_watch(self.bus, busCallback, (__bridge void *)self);
if (self.busWatchId == 0) {
NSLog(@"Warning: gst_bus_add_watch returned 0 (failed to add watch).");
}
NSLog(@"Pipeline setup successful for format: %@", capsString);
return YES;
}
- (void)start {
if (!self.pipeline) {
NSLog(@"Error: Pipeline is not set up. Call setupPipelineCaps: first.");
return;
}
GstStateChangeReturn ret = gst_element_set_state(self.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
NSLog(@"Error: Failed to set pipeline to PLAYING state.");
} else {
NSLog(@"Pipeline is PLAYING.");
// run main loop on background queue
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
if (self.loop) g_main_loop_run(self.loop);
});
}
}
- (void)pause {
if (!self.pipeline) return;
GstStateChangeReturn ret = gst_element_set_state(self.pipeline, GST_STATE_PAUSED);
if (ret == GST_STATE_CHANGE_FAILURE) {
NSLog(@"Error: Failed to pause pipeline.");
} else {
NSLog(@"Pipeline is PAUSED.");
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayerDidPause:)]) {
[self.delegate gstreamerPlayerDidPause:self];
}
});
}
}
- (void)play {
if (!self.pipeline) return;
GstStateChangeReturn ret = gst_element_set_state(self.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
NSLog(@"Error: Failed to resume pipeline.");
} else {
NSLog(@"Pipeline is PLAYING.");
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayerDidResume:)]) {
[self.delegate gstreamerPlayerDidResume:self];
}
});
}
}
- (void)pushData:(NSData *)data {
if (!self.appsrc || !self.pipeline) return;
if (!data || data.length == 0) return;
GstBuffer *buffer = gst_buffer_new_allocate(NULL, (gsize)data.length, NULL);
if (!buffer) {
NSLog(@"Error: Failed to allocate GstBuffer.");
NSError *error = [NSError errorWithDomain:@"com.svrglass.gstreamer" code:-1 userInfo:@{NSLocalizedDescriptionKey: @"Failed to allocate GstBuffer"}];
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayer:didFailWithError:)]) {
[self.delegate gstreamerPlayer:self didFailWithError:error];
}
});
return;
}
GstMapInfo mapInfo;
if (gst_buffer_map(buffer, &mapInfo, GST_MAP_WRITE)) {
memcpy(mapInfo.data, [data bytes], [data length]);
gst_buffer_unmap(buffer, &mapInfo);
} else {
gst_buffer_unref(buffer);
NSError *error = [NSError errorWithDomain:@"com.svrglass.gstreamer" code:-2 userInfo:@{NSLocalizedDescriptionKey: @"Failed to map GstBuffer"}];
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayer:didFailWithError:)]) {
[self.delegate gstreamerPlayer:self didFailWithError:error];
}
});
return;
}
GstFlowReturn flowReturn = gst_app_src_push_buffer(GST_APP_SRC(self.appsrc), buffer);
if (flowReturn != GST_FLOW_OK) {
NSLog(@"Error pushing buffer to appsrc: %d. Unrefing buffer.", flowReturn);
// If push failed, buffer may or may not be owned by appsrc; safe to unref here
gst_buffer_unref(buffer);
NSError *error = [NSError errorWithDomain:@"com.svrglass.gstreamer" code:flowReturn userInfo:@{NSLocalizedDescriptionKey: [NSString stringWithFormat:@"gst_app_src_push_buffer returned %d", flowReturn]}];
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayer:didFailWithError:)]) {
[self.delegate gstreamerPlayer:self didFailWithError:error];
}
});
}
}
- (void)stopStream {
if (!self.appsrc) {
NSLog(@"Error: appsrc is nil, cannot send end-of-stream.");
return;
}
GstFlowReturn flowReturn = gst_app_src_end_of_stream(GST_APP_SRC(self.appsrc));
if (flowReturn != GST_FLOW_OK) {
NSLog(@"Error sending end-of-stream signal: %d", flowReturn);
} else {
NSLog(@"End-of-stream signal sent successfully.");
}
}
- (void)stop {
if (!self.pipeline) return;
NSLog(@"Stopping pipeline and cleaning up resources.");
// Remove bus watch using saved source id
if (self.busWatchId != 0) {
g_source_remove(self.busWatchId);
self.busWatchId = 0;
}
gst_element_set_state(self.pipeline, GST_STATE_NULL);
if (self.loop) {
g_main_loop_quit(self.loop);
}
if (self.bus) { gst_object_unref(self.bus); self.bus = NULL; }
if (self.appsrc) { gst_object_unref(self.appsrc); self.appsrc = NULL; }
if (self.loop) { g_main_loop_unref(self.loop); self.loop = NULL; }
gst_object_unref(self.pipeline);
self.pipeline = NULL;
NSLog(@"Pipeline stopped and resources cleaned up.");
}
- (BOOL)handleMessage:(GstBus *)bus message:(GstMessage *)message {
if (!message) return YES;
switch (GST_MESSAGE_TYPE(message)) {
case GST_MESSAGE_EOS: {
NSLog(@"End-of-Stream received.");
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayerDidFinish:)]) {
[self.delegate gstreamerPlayerDidFinish:self];
}
[self stop];
});
break;
}
case GST_MESSAGE_ERROR: {
GError *err = NULL;
gchar *debug = NULL;
gst_message_parse_error(message, &err, &debug);
NSString *errorMessage = err && err->message ? [NSString stringWithUTF8String:err->message] : @"Unknown GStreamer error";
NSString *debugInfo = debug ? [NSString stringWithUTF8String:debug] : @"No debug info";
if (err) g_error_free(err);
if (debug) g_free(debug);
NSError *nsError = [NSError errorWithDomain:@"com.svrglass.gstreamer" code:-1 userInfo:@{NSLocalizedDescriptionKey: errorMessage, @"GStreamerDebugInfo": debugInfo}];
dispatch_async(dispatch_get_main_queue(), ^{
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayer:didFailWithError:)]) {
[self.delegate gstreamerPlayer:self didFailWithError:nsError];
}
[self stop];
});
break;
}
case GST_MESSAGE_STATE_CHANGED: {
if (GST_MESSAGE_SRC(message) == GST_OBJECT(self.pipeline)) {
GstState oldState, newState, pending;
gst_message_parse_state_changed(message, &oldState, &newState, &pending);
NSLog(@"Pipeline state changed from %s to %s", gst_element_state_get_name(oldState), gst_element_state_get_name(newState));
dispatch_async(dispatch_get_main_queue(), ^{
if (newState == GST_STATE_PLAYING) {
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayerDidStart:)]) {
[self.delegate gstreamerPlayerDidStart:self];
}
} else if (newState == GST_STATE_PAUSED) {
if (self.delegate && [self.delegate respondsToSelector:@selector(gstreamerPlayerDidPause:)]) {
[self.delegate gstreamerPlayerDidPause:self];
}
}
});
}
break;
}
default:
break;
}
return YES;
}
- (void)dealloc {
[self stop];
}
@end
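A minimal usage sketch for the player (the caps string here is an example for pushing an Ogg stream into appsrc; in practice generate it with GStreamerCapsGenerator or match your own input format):

// Illustrative only: `self` must conform to GStreamerPlayerDelegate and
// `oggChunk` stands for whatever compressed data your source delivers.
GStreamerPlayer *player = [[GStreamerPlayer alloc] init];
player.delegate = self;
if ([player setupPipelineCaps:@"application/ogg"]) {
    [player start];
    [player pushData:oggChunk];   // call repeatedly as data arrives
    [player stopStream];          // send EOS when the source is exhausted
}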
Decoding Ogg/Opus to PCM
#import "GStreamerOggOpusDecoder.h"
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#import "gst_ios_init.h"
#import "GStreamerManager.h"
// MARK: - Objective-C Implementation
@interface GStreamerOggOpusDecoder ()
// GStreamer objects; assign means we release them ourselves
@property (nonatomic, assign) GstElement *pipeline;
@property (nonatomic, assign) GstElement *appsrc;
@property (nonatomic, assign) GstElement *appsink;
@property (nonatomic, assign) GstBus *bus;
@property (nonatomic, assign) GMainLoop *loop;
@property (nonatomic, assign) guint busWatchId;
// State
@property (nonatomic, assign) GstCaps *audioCaps; // stores the desired output audio format
@property (nonatomic, assign) BOOL hasStartedParsing; // whether the "did start parsing" callback has been sent
// Message handling
- (BOOL)handleMessage:(GstBus *)bus message:(GstMessage *)message;
@end
/**
 * GStreamer bus message callback.
 */
gboolean busCallbackDecode(GstBus *bus, GstMessage *message, gpointer userData) {
GStreamerOggOpusDecoder *player = (__bridge GStreamerOggOpusDecoder *)userData;
return (gboolean)[player handleMessage:bus message:message];
}
/**
 * Callback for appsink's "new-sample" signal.
 * Called whenever appsink receives a newly decoded audio sample.
 */
GstFlowReturn newSampleCallback(GstAppSink *appsink, gpointer userData) {
GStreamerOggOpusDecoder *decoder = (__bridge GStreamerOggOpusDecoder *)userData;
// 1. Pull the sample from appsink (we own the returned reference)
GstSample *sample = gst_app_sink_pull_sample(appsink);
if (!sample) {
return GST_FLOW_OK;
}
// 2. Get the buffer from the sample
GstBuffer *buffer = gst_sample_get_buffer(sample);
if (!buffer) {
gst_sample_unref(sample);
return GST_FLOW_OK;
}
// 3. Map the buffer data into memory
GstMapInfo mapInfo;
if (gst_buffer_map(buffer, &mapInfo, GST_MAP_READ)) {
// 4. Copy the PCM data into an NSData (must copy; the map pointer cannot be held across threads)
NSData *pcmData = [NSData dataWithBytes:mapInfo.data length:mapInfo.size]; // copy, otherwise the underlying buffer may be reused and end up holding other data
// 5. Call the delegate on the main thread (safe)
if (decoder.delegate && [decoder.delegate respondsToSelector:@selector(decoder:didOutputPCMData:)]) {
dispatch_async(dispatch_get_main_queue(), ^{
[decoder.delegate decoder:decoder didOutputPCMData:pcmData];
});
}
// 6. Unmap the memory (do this right after copying)
gst_buffer_unmap(buffer, &mapInfo);
}
// 7. Release the sample reference
gst_sample_unref(sample);
return GST_FLOW_OK;
}
@implementation GStreamerOggOpusDecoder
// MARK: - Initialization
- (instancetype)init {
self = [super init];
if (self) {
[GStreamerManager checkAutoGstInit];
_pipeline = NULL;
_appsrc = NULL;
_appsink = NULL;
_bus = NULL;
_loop = NULL;
_busWatchId = 0;
_audioCaps = NULL;
_hasStartedParsing = NO; // initialized to NO
}
return self;
}
// MARK: - Pipeline Setup
- (BOOL)setupDecoderPipelineWithAppSink:(NSString *)caps {
if (self.pipeline) {
NSLog(@"Warning: Pipeline already exists. Stopping and cleaning up the old one.");
[self stop];
}
// 1. Define the decode pipeline: Ogg/Opus in, raw audio out
GError *error = NULL;
self.pipeline = gst_parse_launch("appsrc name=source ! queue ! oggdemux ! opusdec ! audioconvert ! appsink name=sink", &error);
if (error) {
NSLog(@"Error: Failed to create pipeline: %s", error->message);
g_error_free(error);
return NO;
}
// 2. Get and configure appsrc
self.appsrc = gst_bin_get_by_name(GST_BIN(self.pipeline), "source");
if (!self.appsrc) {
NSLog(@"Error: Failed to get appsrc element.");
gst_object_unref(self.pipeline);
self.pipeline = NULL;
return NO;
}
// appsrc pushes Ogg container data
GstCaps *oggCaps = gst_caps_from_string("application/ogg");
// A more general appsrc configuration: BYTES format with stream-type set to STREAM (adjust as needed)
g_object_set(self.appsrc,
"caps", oggCaps,
"format", GST_FORMAT_BYTES,
"is-live", FALSE,
"stream-type", GST_APP_STREAM_TYPE_STREAM,
NULL);
gst_caps_unref(oggCaps);
// 3. Get and configure appsink
self.appsink = gst_bin_get_by_name(GST_BIN(self.pipeline), "sink");
if (!self.appsink) {
NSLog(@"Error: Failed to get appsink element.");
if (self.appsrc) { gst_object_unref(self.appsrc); self.appsrc = NULL; }
gst_object_unref(self.pipeline);
self.pipeline = NULL;
return NO;
}
// Set the desired output audio format on appsink (the caps are copied)
if (caps && caps.length > 0) {
self.audioCaps = gst_caps_from_string([caps UTF8String]);
if (self.audioCaps) {
g_object_set(self.appsink, "caps", self.audioCaps, NULL);
// Note: appsink does not take sole ownership of our caps pointer; we unref it in stop
}
}
// 连接 "new-sample" 信号,用于接收解码后的数据
g_object_set(self.appsink, "emit-signals", TRUE, "sync", FALSE, NULL);
g_signal_connect(self.appsink, "new-sample", G_CALLBACK(newSampleCallback), (__bridge void *)self);
// 4. Set up the bus and GMainLoop
self.loop = g_main_loop_new(NULL, 0);
self.bus = gst_element_get_bus(self.pipeline);
if (self.bus) {
// gst_bus_add_watch returns a GSource id (guint); remove it later with g_source_remove
self.busWatchId = gst_bus_add_watch(self.bus, busCallbackDecode, (__bridge void *)self);
} else {
NSLog(@"Warning: Failed to get bus from pipeline.");
}
NSLog(@"Decoder pipeline setup successful.");
return YES;
}
// MARK: - Playback Control
- (BOOL)start {
if (!self.pipeline) {
NSLog(@"Error: Pipeline is not set up. Call setupDecoderPipelineWithAppSink: first.");
return NO;
}
GstStateChangeReturn ret = gst_element_set_state(self.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
NSLog(@"Error: Failed to set pipeline to PLAYING state.");
return NO;
} else {
NSLog(@"Pipeline is PLAYING.");
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
if (self.loop) {
g_main_loop_run(self.loop);
}
});
return YES;
}
}
- (void)pause {
if (!self.pipeline) { return; }
gst_element_set_state(self.pipeline, GST_STATE_PAUSED);
NSLog(@"Pipeline is PAUSED.");
}
- (void)play {
if (!self.pipeline) { return; }
gst_element_set_state(self.pipeline, GST_STATE_PLAYING);
NSLog(@"Pipeline is PLAYING.");
}
// MARK: - Data Injection
- (void)pushOggData:(NSData * _Nullable)data isEnd:(BOOL)isEnd {
if (!self.appsrc || !self.pipeline) { return; }
if (data.length == 0 && !isEnd) { return; }
GstBuffer *buffer = NULL;
if (data.length > 0) {
buffer = gst_buffer_new_allocate(NULL, (gsize)data.length, NULL);
if (!buffer) {
NSLog(@"Error: Failed to allocate GstBuffer.");
return;
}
GstMapInfo mapInfo;
if (gst_buffer_map(buffer, &mapInfo, GST_MAP_WRITE)) {
memcpy(mapInfo.data, data.bytes, data.length);
gst_buffer_unmap(buffer, &mapInfo);
} else {
NSLog(@"Error: Failed to map GstBuffer.");
gst_buffer_unref(buffer);
return;
}
}
GstFlowReturn flowReturn = GST_FLOW_OK;
if (isEnd) {
// If a buffer was allocated but isEnd was also requested, release it (we do not push it)
if (buffer) {
gst_buffer_unref(buffer);
buffer = NULL;
}
flowReturn = gst_app_src_end_of_stream(GST_APP_SRC(self.appsrc)); // once EOS has been sent, the pipeline accepts no more data
NSLog(@"End-of-stream signal sent. flowReturn=%d", flowReturn);
} else {
// Push the buffer (appsrc takes ownership of the reference on success)
flowReturn = gst_app_src_push_buffer(GST_APP_SRC(self.appsrc), buffer);
// After a successful push appsrc owns the buffer; do not unref it again
if (flowReturn != GST_FLOW_OK) {
NSLog(@"Error pushing data to appsrc: %d.", flowReturn);
// If the push failed, it did not take ownership, so release the buffer
if (buffer) {
gst_buffer_unref(buffer);
buffer = NULL;
}
}
}
}
// MARK: - Cleanup
- (void)stop {
if (!self.pipeline) { return; }
NSLog(@"Stopping pipeline and cleaning up resources.");
// Remove the bus watch (g_source_remove; busWatchId came from gst_bus_add_watch)
if (self.busWatchId != 0) {
g_source_remove(self.busWatchId);
self.busWatchId = 0;
}
// Set the pipeline to the NULL state
gst_element_set_state(self.pipeline, GST_STATE_NULL);
// Quit the main loop
if (self.loop) {
g_main_loop_quit(self.loop);
}
// Release the bus, elements, loop, caps, etc.
if (self.bus) {
gst_object_unref(self.bus);
self.bus = NULL;
}
if (self.appsrc) {
gst_object_unref(self.appsrc);
self.appsrc = NULL;
}
if (self.appsink) {
gst_object_unref(self.appsink);
self.appsink = NULL;
}
if (self.loop) {
g_main_loop_unref(self.loop);
self.loop = NULL;
}
if (self.audioCaps) {
gst_caps_unref(self.audioCaps);
self.audioCaps = NULL;
}
if (self.pipeline) {
gst_object_unref(self.pipeline);
self.pipeline = NULL;
}
// Reset the state flag
self.hasStartedParsing = NO;
NSLog(@"Pipeline stopped and resources cleaned up.");
}
// MARK: - Message Handling
- (BOOL)handleMessage:(GstBus *)bus message:(GstMessage *)message {
if (!message) return YES;
switch (GST_MESSAGE_TYPE(message)) {
case GST_MESSAGE_EOS: {
NSLog(@"End-of-Stream received.");
__weak typeof(self) weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
__strong typeof(weakSelf) strongSelf = weakSelf;
if (strongSelf && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(decoderDidFinishParsing:)]) {
[strongSelf.delegate decoderDidFinishParsing:strongSelf];
}
});
// stop handles all cleanup
[self stop];
break;
}
case GST_MESSAGE_ERROR: {
GError *err = NULL;
gchar *debug = NULL;
gst_message_parse_error(message, &err, &debug);
NSLog(@"GStreamer Error: %s\nDebug Info: %s", err ? err->message : "unknown", debug ? debug : "none");
// Convert the GError into an NSError
NSString *errorDomain = @"com.yourapp.gstreamer.error";
NSMutableDictionary *userInfo = [NSMutableDictionary dictionary];
if (err && err->message) {
userInfo[NSLocalizedDescriptionKey] = [NSString stringWithUTF8String:err->message];
}
if (debug) {
userInfo[@"GStreamerDebugInfo"] = [NSString stringWithUTF8String:debug];
}
NSError *nsError = [NSError errorWithDomain:errorDomain code:(err ? err->code : -1) userInfo:userInfo];
if (err) { g_error_free(err); }
if (debug) { g_free(debug); }
__weak typeof(self) weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
__strong typeof(weakSelf) strongSelf = weakSelf;
if (strongSelf && strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(decoder:didFailWithError:)]) {
[strongSelf.delegate decoder:strongSelf didFailWithError:nsError];
}
});
// Stop the pipeline after an error
[self stop];
break;
}
case GST_MESSAGE_STATE_CHANGED: {
// Only care about state changes of the pipeline itself
if (GST_MESSAGE_SRC(message) == GST_OBJECT(self.pipeline)) {
GstState oldState, newState, pendingState;
gst_message_parse_state_changed(message, &oldState, &newState, &pendingState);
NSLog(@"Pipeline state changed from %s to %s",
gst_element_state_get_name(oldState),
gst_element_state_get_name(newState));
__weak typeof(self) weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
__strong typeof(weakSelf) strongSelf = weakSelf;
if (!strongSelf) return;
if (newState == GST_STATE_PLAYING) {
// Only notify the delegate the first time we enter PLAYING
if (!strongSelf.hasStartedParsing) {
strongSelf.hasStartedParsing = YES;
if (strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(decoderDidStartParsing:)]) {
[strongSelf.delegate decoderDidStartParsing:strongSelf];
}
}
} else if (newState == GST_STATE_PAUSED) {
if (strongSelf.delegate && [strongSelf.delegate respondsToSelector:@selector(decoderDidPauseParsing:)]) {
[strongSelf.delegate decoderDidPauseParsing:strongSelf];
}
} else if (newState == GST_STATE_NULL) {
// After the pipeline stops, reset the flag so decoding can start again
strongSelf.hasStartedParsing = NO;
}
});
}
break;
}
default: {
break;
}
}
return YES;
}
// MARK: - Dealloc
- (void)dealloc {
[self stop];
}
@end
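A matching usage sketch for the decoder (the caps string asks appsink for interleaved 16-bit PCM, which suits Opus's native 48 kHz output; the file read is only an example data source):

GStreamerOggOpusDecoder *decoder = [[GStreamerOggOpusDecoder alloc] init];
decoder.delegate = self; // receives decoder:didOutputPCMData: on the main thread
NSString *pcmCaps = @"audio/x-raw,format=S16LE,layout=interleaved,rate=48000,channels=1";
if ([decoder setupDecoderPipelineWithAppSink:pcmCaps] && [decoder start]) {
    NSData *ogg = [NSData dataWithContentsOfFile:oggFilePath]; // `oggFilePath` is illustrative
    [decoder pushOggData:ogg isEnd:NO];
    [decoder pushOggData:nil isEnd:YES]; // signal end-of-stream; EOS arrives via decoderDidFinishParsing:
}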
Other Notes
While building a player with GStreamer, one ogg/opus file stuttered no matter how the parameters were tuned; even playing it directly with GStreamer's playbin stuttered, yet ffmpeg and the system player both handled it fine. It is unclear whether the file's encoding was at fault. In the end GStreamer is only used for the parsing/decoding part (decoding to PCM), and playback is done through AudioQueue.