http://blog.sina.com.cn/s/articlelist_2160998997_4_1.html
1. Introduction to the PLAYBACK plugin
Early releases shipped both playbin and playbin2, but in the latest releases playbin2 has stabilized and replaced playbin, which is no longer maintained. The official site describes it as follows:
Playbin2 provides a stand-alone everything-in-one abstraction for an audio and/or video player.
playbin2 is considered stable now. It is the preferred playback API now, and replaces the old playbin element, which is no longer supported.
In the example above no audio/video sink is described; playbin itself is responsible for finding the best elements.
In terms of the figure below, gst-launch is the application here, and the pipeline is a playbin.
You can also skip playbin and describe your own pipeline directly; gst-launch is then just an application driving a hand-built pipeline:
gst-launch gnlfilesource location=file:///tmp/myfile.mov start=0 duration=2000000000 ! autovideosink
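For comparison, the two styles look like this on the command line. This is a sketch: the file path and the hand-written element chain are illustrative, and the exact tool name (gst-launch vs. gst-launch-1.0) depends on the installed GStreamer generation.

```shell
# Let playbin pick the demuxer/decoders/sinks automatically:
gst-launch playbin uri=file:///tmp/myfile.mov

# Or spell the pipeline out by hand:
gst-launch filesrc location=/tmp/myfile.mov ! decodebin ! autovideosink
```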
set_state() must respect the state machine's legal transitions, which GStreamer enumerates as GstStateChange (each value packs the current and the next state together):
typedef enum
{
  GST_STATE_CHANGE_NULL_TO_READY     = (GST_STATE_NULL << 3) | GST_STATE_READY,
  GST_STATE_CHANGE_READY_TO_PAUSED   = (GST_STATE_READY << 3) | GST_STATE_PAUSED,
  GST_STATE_CHANGE_PAUSED_TO_PLAYING = (GST_STATE_PAUSED << 3) | GST_STATE_PLAYING,
  GST_STATE_CHANGE_PLAYING_TO_PAUSED = (GST_STATE_PLAYING << 3) | GST_STATE_PAUSED,
  GST_STATE_CHANGE_PAUSED_TO_READY   = (GST_STATE_PAUSED << 3) | GST_STATE_READY,
  GST_STATE_CHANGE_READY_TO_NULL     = (GST_STATE_READY << 3) | GST_STATE_NULL
} GstStateChange;
static void
gst_play_bin_class_init (GstPlayBinClass * klass)
{
  /* ... */
}
How a message is created:
//Create a new element-specific message
GstMessage *
gst_message_new_element (GstObject * src, GstStructure * structure)
{
  return gst_message_new_custom (GST_MESSAGE_ELEMENT, src, structure);
}
All supported message types are collected in GstMessageType:
typedef enum
{
  GST_MESSAGE_UNKNOWN       = 0,
  GST_MESSAGE_EOS           = (1 << 0),
  GST_MESSAGE_ERROR         = (1 << 1),
  GST_MESSAGE_WARNING       = (1 << 2),
  GST_MESSAGE_INFO          = (1 << 3),
  GST_MESSAGE_TAG           = (1 << 4),
  GST_MESSAGE_BUFFERING     = (1 << 5),
  GST_MESSAGE_STATE_CHANGED = (1 << 6),
  /* ... */
} GstMessageType;
//post the message to the bus
gboolean gst_bus_post (GstBus * bus, GstMessage * message)
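Put together, an element (or the application) can post its own message like this. A sketch: the message name and the "progress" field are made-up examples; the GstBus/GstStructure/GstMessage calls are the standard public API.

```c
#include <gst/gst.h>

/* Post a custom, element-specific message on the pipeline's bus. */
static gboolean
post_custom_message (GstElement * pipeline)
{
  GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));

  /* The payload of an ELEMENT message is a free-form GstStructure. */
  GstStructure *s = gst_structure_new ("my-custom-message",
      "progress", G_TYPE_INT, 42, NULL);
  GstMessage *msg = gst_message_new_element (GST_OBJECT (pipeline), s);

  /* The bus takes ownership of the message. */
  gboolean ok = gst_bus_post (bus, msg);
  gst_object_unref (bus);
  return ok;
}
```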
Example: in gst-launch it is the application itself that processes the messages, in its event loop:
static EventLoopResult
event_loop (GstElement * pipeline, gboolean blocking, GstState target_state)
{
  /* ... polls the bus and switches on the message type ... */
}
Another way to handle messages: since every message reaches the application through the bus, you can instead install a watch function on the bus
to handle the messages the pipeline posts to the application:
static gboolean
my_bus_callback (GstBus * bus, GstMessage * message, gpointer data)
{
  /* ... */
}
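Filled out, the watch pattern looks like this, adapted from the standard GstBus example in the GStreamer manual (the pipeline construction itself is omitted; the GLib main loop is the usual boilerplate):

```c
#include <gst/gst.h>

/* Called from the GLib main loop whenever a message arrives on the bus. */
static gboolean
my_bus_callback (GstBus * bus, GstMessage * message, gpointer data)
{
  GMainLoop *loop = data;

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR:{
      GError *err = NULL;
      gchar *dbg = NULL;

      gst_message_parse_error (message, &err, &dbg);
      g_printerr ("Error: %s\n", err->message);
      g_error_free (err);
      g_free (dbg);
      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }
  return TRUE;                  /* keep the watch installed */
}

/* After building the pipeline, hook the watch up once: */
static guint
install_bus_watch (GstElement * pipeline, GMainLoop * loop)
{
  GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  guint id = gst_bus_add_watch (bus, my_bus_callback, loop);
  gst_object_unref (bus);
  return id;
}
```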
3. playbin2
enum
{
  /* signal ids used below, e.g. SIGNAL_ABOUT_TO_FINISH, ...,
     SIGNAL_GET_AUDIO_PAD, ..., LAST_SIGNAL */
};
When a signal is emitted, the bound handler function is invoked automatically.
Binding a signal to a handler, method 1: in class_init():
gst_play_bin_signals[SIGNAL_GET_AUDIO_PAD] =
    g_signal_new ("get-audio-pad", G_TYPE_FROM_CLASS (klass),
    G_SIGNAL_RUN_LAST | G_SIGNAL_ACTION,
    G_STRUCT_OFFSET (GstPlayBinClass, get_audio_pad), NULL, NULL,
    gst_play_marshal_OBJECT__INT, GST_TYPE_PAD, 1, G_TYPE_INT);
klass->get_audio_pad = gst_play_bin_get_audio_pad;
The other way to bind a signal to a handler (a callback function) is g_signal_connect:
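The g_signal_connect route looks like this, using playbin2's real "about-to-finish" signal as the example; the handler name and the next-uri logic are made up for illustration:

```c
#include <gst/gst.h>

/* Hypothetical handler: playbin2 emits "about-to-finish" shortly before
 * the current uri ends, so the application can queue the next one. */
static void
about_to_finish_cb (GstElement * playbin, gpointer user_data)
{
  const gchar *next_uri = user_data;
  g_object_set (playbin, "uri", next_uri, NULL);
}

/* Bind the handler to the signal after creating the element. */
static void
hook_up_gapless (GstElement * playbin, const gchar * next_uri)
{
  g_signal_connect (playbin, "about-to-finish",
      G_CALLBACK (about_to_finish_cb), (gpointer) next_uri);
}
```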
playsink:
For playsink, see the separate blog post.
It can handle both audio and video files and features
- automatic file type recognition and based on that automatic selection and usage of the right audio/video/subtitle demuxers/decoders
- visualisations for audio files
- subtitle support for video files. Subtitles can be stored in external files.
- stream selection between different video/audio/subtitles streams
- meta info (tag) extraction
- easy access to the last video frame
- buffering when playing streams over a network
- volume control with mute option
Usage
A playbin2 element can be created just like any other element using gst_element_factory_make()
. The file/URI to play should be set via the "uri" property. This must be an absolute URI, relative file paths are not allowed. Example URIs are file:///home/joe/movie.avi or http://www.joedoe.com/foo.ogg
Playbin is a GstPipeline. It will notify the application of everything that's happening (errors, end of stream, tags found, state changes, etc.) by posting messages on its GstBus. The application needs to watch the bus.
Playback can be initiated by setting the element to PLAYING state using gst_element_set_state()
. Note that the state change will take place in the background in a separate thread; when the function returns, playback is probably not happening yet and any errors might not have occurred yet. Applications using playbin should ideally be written to deal with things completely asynchronously.
When playback has finished (an EOS message has been received on the bus) or an error has occurred (an ERROR message has been received on the bus) or the user wants to play a different track, playbin should be set back to READY or NULL state, then the "uri" property should be set to the new location and then playbin be set to PLAYING state again.
Seeking can be done using gst_element_seek_simple()
or gst_element_seek()
on the playbin element. Again, the seek will not be executed instantaneously, but will be done in a background thread. When the seek call returns the seek will most likely still be in process. An application may wait for the seek to finish (or fail) using gst_element_get_state()
with -1 as the timeout, but this will block the user interface and is not recommended at all.
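Sketched as code (the 30-second target is just an example; GST_SECOND and the flag names come from the public GstSeekFlags API):

```c
#include <gst/gst.h>

/* Seek the playbin element to 30 s into the stream.  FLUSH makes the
 * pipeline drop queued data so the seek takes effect quickly; KEY_UNIT
 * snaps to the nearest keyframe. */
static gboolean
seek_to_30s (GstElement * playbin)
{
  return gst_element_seek_simple (playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
}
```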
Applications may query the current position and duration of the stream via gst_element_query_position()
and gst_element_query_duration()
and setting the format passed to GST_FORMAT_TIME. If the query was successful, the duration or position will have been returned in units of nanoseconds.
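The usage notes above fit together into a minimal player. This is a sketch against the 0.10-era API this post discusses (in GStreamer 1.0 the element is simply called playbin); the movie URI is the documentation's example:

```c
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* playbin2 builds the whole decode/display pipeline internally. */
  GstElement *play = gst_element_factory_make ("playbin2", "play");
  g_object_set (play, "uri", "file:///home/joe/movie.avi", NULL);

  /* The state change happens asynchronously in a background thread. */
  gst_element_set_state (play, GST_STATE_PLAYING);

  /* Block until EOS or an error is posted on the bus. */
  GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (play));
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);

  gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (play, GST_STATE_NULL);
  gst_object_unref (play);
  return 0;
}
```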
2. Code walkthrough
//data-structure definitions
struct _GstPlayBin
{
  /* ... */
};
struct _GstPipeline {
  /* ... */
};
struct _GstPipelineClass {
  /* ... */
};
//function-pointer (virtual method) definitions
struct _GstPlayBinClass
{
  /* ... */
};
The relationship between these classes: GstPlayBin embeds GstPipeline as its parent instance, so a playbin is a GstPipeline; GstPlayBinClass likewise derives from GstPipelineClass. The per-URI state playbin manages is tracked in a GstSourceGroup:
struct _GstSourceGroup
{
  /* ... */
};
Initialization:
enum
{
  /* property ids: PROP_0, PROP_URI, PROP_AUDIO_SINK, PROP_VIDEO_SINK, ... */
};
In the initialization of the playbin class,
the function pointers defined above are filled in with their implementations.
static void gst_play_bin_set_property (GObject * object, guint prop_id,
    const GValue * value, GParamSpec * pspec)
{
  /* ... */
}
Core function:
//State-machine control: transitions among the four states NULL/READY/PAUSED/PLAYING.
//Going from NULL to PLAYING has to pass through READY and PAUSED in between.
static GstStateChangeReturn
gst_play_bin_change_state (GstElement * element, GstStateChange transition)
{
  /* ... */
}
The state machine requires that by the READY state all needed resources have been allocated, including creating the elements.
setup_next_source()->activate_group()->
static gboolean
activate_group (GstPlayBin * playbin, GstSourceGroup * group, GstState target)
{
  /* ... */
}
Playbin coordinates its work through signals. Internally it uses the media type (the type_found() function) to load the matching plugins and assemble the pipeline.
Note the following:
gint
main (gint argc, gchar * argv[])
{
  /* ... */
}
Question 1: the early playbin required the user to specify the type of every element inside it; the new playbin no longer requires this.
DecodeBin:
struct _GstURIDecodeBin
{
  /* ... */
};
struct _GstURIDecodeBinClass
{
  /* ... */
};
Definition of the PLAYBACK plugin:
static gboolean
plugin_init (GstPlugin * plugin)
{
#ifdef ENABLE_NLS
  bindtextdomain (GETTEXT_PACKAGE, LOCALEDIR);
#endif
  /* register the decodebin / uridecodebin / playbin2 / playsink /
     subtitleoverlay element factories here */
  return TRUE;
}
GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
    GST_VERSION_MINOR,
    "playback",
    "various playback elements", plugin_init, VERSION, GST_LICENSE,
    GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN);
This declares a plugin and, at initialization time, calls the element registration functions of decodebin/playbin2/playsink/subtitle_overlay and so on.
Summary: an element really revolves around two structs: one defines the element's data members, the other defines the pointers to the member functions the element supports. Implementing an element is essentially implementing those functions.
Part 2: how the playback plugin works
The selector's data structure:
struct _GstSourceSelect
{
  /* ... */
};
The audiosink/videosink/textsink can be set through properties (PROP_VIDEO_SINK/PROP_AUDIO_SINK/PROP_TEXT_SINK).
activate_group() creates the uridecodebin:
{
  /* ... */
}
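In outline, that step of activate_group() looks like the sketch below: create a uridecodebin, point it at the group's uri, and react to pads as they appear. The callback name and uri are made up; "pad-added" and the factory/bin calls are the public API.

```c
#include <gst/gst.h>

/* Hypothetical callback; playbin's real code connects several of these. */
static void
pad_added_cb (GstElement * decodebin, GstPad * pad, gpointer user_data)
{
  /* link the new pad to the matching selector/sink here */
}

static GstElement *
make_uridecodebin (GstElement * playbin, const gchar * uri)
{
  GstElement *dec = gst_element_factory_make ("uridecodebin", NULL);

  g_object_set (dec, "uri", uri, NULL);
  g_signal_connect (dec, "pad-added", G_CALLBACK (pad_added_cb), NULL);
  gst_bin_add (GST_BIN (playbin), dec);
  return dec;
}
```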
Overall, elements are located by matching the caps set on the pads; this can be seen in several of playbin's functions.
From the playbin2 documentation:
Advanced Usage: specifying the audio and video sink
By default, if no audio sink or video sink has been specified via the "audio-sink" or "video-sink" property, playbin will use the autoaudiosink and autovideosink elements to find the first-best available output method. This should work in most cases, but is not always desirable. Often either the user or application might want to specify more explicitly what to use for audio and video output. (If the user does not describe an audiosink/videosink, playbin looks for the most suitable sink automatically; ideally the user describes one explicitly.)
If the application wants more control over how audio or video should be output, it may create the audio/video sink elements itself (for example using gst_element_factory_make()
) and provide them to playbin using the "audio-sink" or "video-sink" property.
GNOME-based applications, for example, will usually want to create gconfaudiosink and gconfvideosink elements and make playbin use those, so that output happens to whatever the user has configured in the GNOME Multimedia System Selector configuration dialog.
The sink elements do not necessarily need to be ready-made sinks. It is possible to create container elements that look like a sink to playbin, but in reality contain a number of custom elements linked together. This can be achieved by creating a GstBin and putting elements in there and linking them, and then creating a sink GstGhostPad for the bin and pointing it to the sink pad of the first element within the bin. This can be used for a number of purposes, for example to force output to a particular format or to modify or observe the data before it is output.
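The bin-with-a-ghost-pad technique reads like this in code; the videoflip element is just an example of a custom element placed in front of the real sink:

```c
#include <gst/gst.h>

/* Build a "sink" that is really a bin: a filter linked to a real sink,
 * exposed to playbin through a ghost pad. */
static GstElement *
make_filtered_video_sink (void)
{
  GstElement *bin = gst_bin_new ("video-sink-bin");
  GstElement *flip = gst_element_factory_make ("videoflip", NULL);
  GstElement *sink = gst_element_factory_make ("autovideosink", NULL);
  GstPad *pad;

  gst_bin_add_many (GST_BIN (bin), flip, sink, NULL);
  gst_element_link (flip, sink);

  /* Point a ghost "sink" pad at the first element's sink pad. */
  pad = gst_element_get_static_pad (flip, "sink");
  gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
  gst_object_unref (pad);
  return bin;
}

/* Then: g_object_set (playbin, "video-sink",
 *     make_filtered_video_sink (), NULL); */
```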
It is also possible to 'suppress' audio and/or video output by using 'fakesink' elements (or capture it from there using the fakesink element's "handoff" signal, which, nota bene, is fired from the streaming thread!).