RTSP Client: Receiving and Storing Data (the testRTSPClient example in the live555 library)

1. Introduction to testRTSPClient

testRTSPClient is a simple client example that demonstrates the RTSP exchange in detail. It involves two core concepts of an RTSP session: the Source and the Sink.

A Source produces data; a Sink consumes it.

testRTSPClient is deliberately minimal: apart from receiving the data that the server sends, it does nothing else, which makes it a convenient base to modify for our own projects.
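To see how the two fit together in practice, here is roughly how testRTSPClient wires them up in its continueAfterSETUP() callback (abridged, with error handling omitted): for each media subsession it creates a DummySink and tells it to start pulling frames from that subsession's source.

// Abridged sketch of testRTSPClient's continueAfterSETUP(): the Sink consumes what the Source produces.
scs.subsession->sink = DummySink::createNew(env, *scs.subsession, rtspClient->url());
// (you would normally use your own MediaSink subclass here instead of DummySink)
if (scs.subsession->sink != NULL) {
  scs.subsession->miscPtr = rtspClient; // lets the subsession handler functions find the RTSPClient
  scs.subsession->sink->startPlaying(*(scs.subsession->readSource()),
                                     subsessionAfterPlaying, scs.subsession);
}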

2. Building and Running testRTSPClient

Building and running under Linux is more convenient, but since my machine is weak and a virtual machine runs sluggishly on it, I moved over to Windows to tinker instead.

On Windows, you only need to pull this one file into the project to build it. We use mediaServer as the server and testRTSPClient as the client.

Of course, a camera or any other physical device that supports the RTSP protocol can also act as the server.

Start mediaServer first, then enter the address that mediaServer prints into the command-argument settings of the testRTSPClient project, and launch testRTSPClient.
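For example, if mediaServer reports a base URL such as rtsp://192.168.1.100:8554/ and a file named test.264 is sitting in its working directory (the address and filename here are only illustrative), testRTSPClient is started with the full stream URL as its argument:

testRTSPClient rtsp://192.168.1.100:8554/test.264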

3. Walking Through the Core Code of testRTSPClient

1) Before diving into the code, it helps to get an overview of the overall framework; this blogger drew a flowchart of it: http://blog.youkuaiyun.com/smilestone_322/article/details/17297817
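Roughly, the call chain in testRTSPClient.cpp is: main() calls openURL() for each URL, which sends the DESCRIBE request; continueAfterDESCRIBE() parses the returned SDP and creates the media session; setupNextSubsession() sends SETUP for each subsession; continueAfterSETUP() creates a DummySink and calls startPlaying() on it; and continueAfterPLAY() runs once PLAY has been sent. From then on the receive loop lives in DummySink, shown below.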

void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
          struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
  // We've just received a frame of data.  (Optionally) print out information about it:
#ifdef DEBUG_PRINT_EACH_RECEIVED_FRAME
  if (fStreamId != NULL) envir() << "Stream \"" << fStreamId << "\"; ";
  envir() << fSubsession.mediumName() << "/" << fSubsession.codecName() << ":\tReceived " << frameSize << " bytes";
  if (numTruncatedBytes > 0) envir() << " (with " << numTruncatedBytes << " bytes truncated)";
  char uSecsStr[6+1]; // used to output the 'microseconds' part of the presentation time
  sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec);
  envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr;
  if (fSubsession.rtpSource() != NULL && !fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) {
    envir() << "!"; // mark the debugging output to indicate that this presentation time is not RTCP-synchronized
  }
  envir() << "\n";
#endif

  // Then continue, to request the next frame of data:
  continuePlaying();
}

Boolean DummySink::continuePlaying() {
  if (fSource == NULL) return False; // sanity check (should not happen)

  // Request the next frame of data from our input source.  "afterGettingFrame()" will get called later, when it arrives:
  fSource->getNextFrame(fReceiveBuffer, DUMMY_SINK_RECEIVE_BUFFER_SIZE,
                        afterGettingFrame, this,
                        onSourceClosure, this);
  return True;
}
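The two functions above form the whole receive loop. continuePlaying() calls getNextFrame(), handing the source the buffer fReceiveBuffer and registering afterGettingFrame() as the completion callback (and onSourceClosure() for when the stream ends). When one complete frame has been copied into fReceiveBuffer, afterGettingFrame() runs, optionally prints some information about it, and then calls continuePlaying() again to request the next frame, so the loop keeps itself going. Any per-frame processing you want to add (saving to disk, decoding, forwarding) belongs right before that final continuePlaying() call, which is exactly what the next example does.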

2) A user has built on testRTSPClient and written the received data out as an H.264 file: http://blog.youkuaiyun.com/occupy8/article/details/36426821

void DummySink::afterGettingFrame(void* clientData, unsigned frameSize, unsigned numTruncatedBytes,
          struct timeval presentationTime, unsigned durationInMicroseconds) {
  DummySink* sink = (DummySink*)clientData;
  sink->afterGettingFrame(frameSize, numTruncatedBytes, presentationTime, durationInMicroseconds);
}

// If you don't want to see debugging output for each received frame, then comment out the following line:
#define DEBUG_PRINT_EACH_RECEIVED_FRAME 1

void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
          struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
  // We've just received a frame of data.  (Optionally) print out information about it:
#ifdef DEBUG_PRINT_EACH_RECEIVED_FRAME
  if (fStreamId != NULL) envir() << "Stream \"" << fStreamId << "\"; ";
  envir() << fSubsession.mediumName() << "/" << fSubsession.codecName() << ":\tReceived " << frameSize << " bytes";
  if (numTruncatedBytes > 0) envir() << " (with " << numTruncatedBytes << " bytes truncated)";
  char uSecsStr[6+1]; // used to output the 'microseconds' part of the presentation time
  sprintf(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec);
  envir() << ".\tPresentation time: " << (unsigned)presentationTime.tv_sec << "." << uSecsStr;
  if (fSubsession.rtpSource() != NULL && !fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) {
    envir() << "!"; // mark the debugging output to indicate that this presentation time is not RTCP-synchronized
  }
  envir() << "\n";
#endif

  // One complete frame has arrived in fReceiveBuffer; save it to a file:
  if (!strcmp(fSubsession.mediumName(), "video"))
  {
    if (firstFrame) // "firstFrame" is a Boolean member added to DummySink for this change
    {
      // Write the SPS/PPS parameter sets (from the SDP's sprop-parameter-sets) once, at the start of the file.
      // This assumes parseSPropParameterSets() returns two records: sps[0] = SPS, sps[1] = PPS.
      unsigned int num;
      SPropRecord* sps = parseSPropParameterSets(fSubsession.fmtp_spropparametersets(), num);
      // For an H.264 video stream, prepend the Annex-B start code to each NAL unit:
      unsigned char start_code[4] = {0x00, 0x00, 0x00, 0x01};
      FILE* fp = fopen("test.264", "a+b");
      if (fp)
      {
        fwrite(start_code, 4, 1, fp);
        fwrite(sps[0].sPropBytes, sps[0].sPropLength, 1, fp);
        fwrite(start_code, 4, 1, fp);
        fwrite(sps[1].sPropBytes, sps[1].sPropLength, 1, fp);
        fclose(fp);
        fp = NULL;
      }
      delete[] sps;
      firstFrame = False;
    }

    // Append the start code followed by the frame data itself:
    char head[4] = {0x00, 0x00, 0x00, 0x01};
    FILE* fp = fopen("test.264", "a+b");
    if (fp)
    {
      fwrite(head, 4, 1, fp);
      fwrite(fReceiveBuffer, frameSize, 1, fp);
      fclose(fp);
      fp = NULL;
    }
  }

  // Then continue, to request the next frame of data:
  continuePlaying();
}

Boolean DummySink::continuePlaying() {
  if (fSource == NULL) return False; // sanity check (should not happen)

  // Request the next frame of data from our input source.  "afterGettingFrame()" will get called later, when it arrives:
  fSource->getNextFrame(fReceiveBuffer, DUMMY_SINK_RECEIVE_BUFFER_SIZE,
                        afterGettingFrame, this,
                        onSourceClosure, this);
  return True;
}
The fReceiveBuffer into which testRTSPClient receives each frame does not include the start code (start_code[4] = {0x00, 0x00, 0x00, 0x01}); whether you are writing the data to a file or feeding it to a player, you have to prepend it yourself.
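One more practical note: each callback delivers at most DUMMY_SINK_RECEIVE_BUFFER_SIZE bytes, which testRTSPClient defines as 100000. If numTruncatedBytes is ever greater than 0, the frame did not fit and the saved stream will be broken, so enlarge that define (for high-resolution H.264, a few hundred kilobytes is a reasonable choice).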

3) The testRTSPClient example can also receive and record multiple streams at once; someone online has already implemented this, so I am carrying it over here:

 http://blog.chinaunix.net/uid-15063109-id-4482932.html
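The groundwork is already in stock testRTSPClient: main() opens every URL passed on the command line and lets a single event loop drive all of them, so receiving (and, with the file-writing change above, recording) several streams at once largely comes down to passing several rtsp:// URLs. Roughly, abridged from testRTSPClient's main():

// Abridged from testRTSPClient's main(): one event loop drives every RTSP session.
TaskScheduler* scheduler = BasicTaskScheduler::createNew();
UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

// There are argc-1 URLs: argv[1] through argv[argc-1].  Open and start streaming each one:
for (int i = 1; i <= argc-1; ++i) {
  openURL(*env, argv[0], argv[i]);
}

// All subsequent activity takes place within this event loop:
env->taskScheduler().doEventLoop(&eventLoopWatchVariable);

When recording more than one stream this way, each DummySink should of course write to its own file rather than the single test.264 used in the example above.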