RTSP streaming with GStreamer

1. Install GStreamer 1.0

sudo apt-get update  
sudo apt-get install gstreamer1.0  
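As a quick sanity check after installation, the 1.0 tool suite can report its version (this assumes the `gstreamer1.0` packages pulled in `gst-inspect-1.0`, which is the usual case):

```shell
# Query the installed core library version; failure here means the
# gstreamer1.0 tools are not on PATH yet.
gst-inspect-1.0 --version
```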

2. Install the libraries required by gst-rtsp

sudo apt-get install libgstreamer*  
sudo apt-get install gstreamer-tools gstreamer0.10-plugins-base gstreamer0.10-plugins-good gstreamer0.10-plugins-bad gstreamer0.10-plugins-ugly  

3. Build and install gst-rtsp

wget http://gstreamer.freedesktop.org/src/gst-rtsp/gst-rtsp-0.10.8.tar.bz2  
bzip2 -d gst-rtsp-0.10.8.tar.bz2  
tar -xvf gst-rtsp-0.10.8.tar  
cd gst-rtsp-0.10.8/  
./configure  
make

4. Test gst-rtsp

./test-readme
./test-launch --gst-debug=3 "( v4l2src ! video/x-raw,width=640,height=480 ! omxh264enc ! h264parse ! rtph264pay name=pay0 pt=96 )"

Play the stream at rtsp://192.168.1.110:8554/test
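The published stream can also be checked with a gst-launch client pipeline instead of a standalone player; the URL is the one printed by test-launch in the previous step, and `latency=100` is an assumed tuning value:

```shell
# Connect to the RTSP server, depayload the RTP/H.264 stream,
# decode it, and render to the default video sink.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.110:8554/test latency=100 ! \
  rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```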

### GStreamer UDP Streaming Configuration and Examples

GStreamer is a powerful framework for building multimedia applications, including network streaming over protocols such as UDP. Below are some configurations and examples showing how to use GStreamer for UDP-based video/audio streaming.

#### Basic UDP Stream Example

A simple pipeline can send an H.264-encoded stream via UDP. To match the RTP-based receiver shown next, the sender must packetize the encoded video with `rtph264pay` (the MPEG-TS muxing in the original example would not be understood by an RTP receiver):

```bash
gst-launch-1.0 videotestsrc ! x264enc ! rtph264pay ! udpsink host=192.168.1.1 port=5000
```

This pipeline generates test video content (`videotestsrc`), encodes it with `x264enc`, packetizes it as RTP (`rtph264pay`), and sends the resulting data over UDP to the specified IP address (in this case, `192.168.1.1`) on port `5000`. This setup assumes that another machine or application is listening at the given destination[^1].

#### Receiving the UDP Stream

To receive the streamed data sent by the above sender, set up a receiver like so:

```bash
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! avdec_h264 ! autovideosink
```

Here, `udpsrc` listens for incoming packets on port `5000` and expects an RTP/H.264 stream as declared by its capabilities (`caps`). After depayloading (`rtph264depay`) and decoding (`avdec_h264`), the output plays back through whatever video sink the system provides (`autovideosink`).

#### Configuring Multiple Sources Using the DeepStream Framework

For more complex scenarios involving multiple sources, NVIDIA's DeepStream SDK provides tools for configuring several inputs simultaneously, along with GPU acceleration settings and other parameters.
An illustrative configuration snippet might look like the following, where two MP4 files act as inputs while using the batch-processing features of the SGIE components:

```ini
[source-list]
num-source-bins=2
list=file:///path/to/video1.mp4;file:///path/to/video2.mp4
sgie-batch-size=8

[source-attr-all]
enable=1
type=3
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=100
rtsp-reconnect-interval-sec=0
```

Such setups may require additional tuning for specific requirements, but they offer flexibility when handling many simultaneous feeds in the compute-intensive pipelines typical of AI-driven analytics[^2].

#### Alternative: USB Cameras with MJPEG Streams over HTTP

When working directly with locally connected hardware such as webcams, one alternative is **MJPG-streamer**, which serves live footage from an attached camera over a plain HTTP endpoint, with no knowledge of the underlying transport layers required beyond basic networking[^3]. Install the dependencies first (a Linux environment is assumed):
```bash
sudo apt-get install libjpeg8-dev imagemagick ffmpeg v4l-utils
git clone https://github.com/jacksonliam/mjpg-streamer.git
cd mjpg-streamer/
make USE_LIBV4L2=true clean all
./mjpg_streamer -i "./input_uvc.so -d /dev/video0 -r 1280x720 -f 30" -o "./output_http.so -w ./www"
```

The last line starts MJPG-streamer on device `/dev/video0`, capturing 1280x720 frames at 30 fps and serving them from the default `./www` directory inside the current folder.
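With MJPG-streamer running, its HTTP output listens on port 8080 by default; a single frame can then be pulled with curl (host and port here assume the defaults of `output_http.so`):

```shell
# Fetch one JPEG frame from the running streamer
curl -o snapshot.jpg "http://localhost:8080/?action=snapshot"
# The continuous MJPEG stream is available at ?action=stream
```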