PJMEDIA Delay Buffer Implementation

PJMEDIA implements an adaptive delay buffer that adjusts its buffering strategy to variations in the input rate. It uses the WSOLA algorithm to preserve audio quality and is suited to real-time audio streaming.

PJMEDIA implements an adaptive delay buffer. A delay buffer behaves much like a fixed jitter buffer: it buffers frames that do not arrive at fixed intervals, so that the caller can fetch frames from it continuously. The delay buffer is useful when get() and put() operations are not evenly spaced in time, for example when the caller, with no regular pattern, performs several put() operations in a rapid burst and then several get() operations in a rapid burst. With a delay buffer, even when the instantaneous get and put rates do not match, frames written in bursts can always be read back at a steady rate.

The delay buffer is adaptive: it continuously learns the optimal delay for the real-time audio stream. Once the optimal delay has been determined, it is applied to the stream's buffer; when the buffer holds too few or too many speech samples, the sample count can be expanded or shrunk without distorting the speech quality.

The PJMEDIA delay buffer is mainly used in PJMEDIA_SND_PORT, PJMEDIA_SPLITCOMB, and PJMEDIA_CONF. It uses the WSOLA (Waveform Similarity Overlap-Add) algorithm.

 

The delay buffer's external interface defines operations to create the buffer, read from it (consume), write to it (produce), destroy it, and reset it.

Basic attributes of the delay buffer:

--Samples per frame (samples_per_frame)

--Frame duration (ptime), in milliseconds

--Channel count (channel_count)

--Maximum number of buffered samples (max_cnt)

--Effective number of buffered samples (eff_cnt): a value that keeps the optimal balance between latency and stability, computed from the burst level.

Learned attributes of the delay buffer:

--Burst level counter: records the number of consecutive identical operations. For example, if the producer performs three put operations in a row with no intervening get, the counter is 3. As soon as the operation switches, it is reset to 1.

--Last operation performed: consume (get) or produce (put), used to check whether the current operation matches the previous one; if it does, the burst level counter is incremented.

--Maximum burst level: records the largest value reached by the burst level counter.

--Maximum-burst-level recalculation timer: initialized to 2000; each time the operation switches, it is decremented by twice the product of the current burst level counter and ptime.

 

When creating a delay buffer, the caller specifies the sampling rate (clock_rate), the samples per frame (samples_per_frame), the channel count (channel_count), and the maximum delay to buffer (max_delay, 400 ms by default). The frame duration ptime is derived from the sampling rate, samples per frame, and channel count as follows:

Frame duration: ptime = samples_per_frame * 1000 / clock_rate / channel_count;

Maximum buffered sample count: max_cnt = samples_per_frame * (max_delay / ptime);

The effective buffered sample count is initialized to half the maximum: eff_cnt = max_cnt / 2;

The maximum-burst-level recalculation timer is set so that recalculation happens every 2000 ms.

 

1. Recomputing the effective sample count from the maximum burst level

 

   When the maximum-burst-level timer drops to zero or below, the effective sample count is recomputed:

   

   new_eff_cnt = (max_level * samples_per_frame) / 2;

   eff_cnt = (eff_cnt + new_eff_cnt * 3) >> 2;

   To ensure eff_cnt is a multiple of channel_count, it is aligned as follows:

   if (eff_cnt % channel_count)
        eff_cnt += channel_count - (eff_cnt % channel_count);

   max_level is reset to 0;

   recalc_timer is reset to its initial value (2000);

 

2. Shrinking the delay buffer

 

      Shrinking uses the WSOLA algorithm, which shortens the playback time without affecting the pitch (frequency). The shrink operation takes one parameter: the number of samples to remove.

      When the operation switches and the current operation is put, the buffer needs to shrink if the number of buffered samples exceeds the sum of eff_cnt and samples_per_frame. The shrink amount is half of samples_per_frame.

      Before writing data into the buffer, check whether the remaining space can hold one frame of samples. If it cannot, shrink as well, by just enough to make room for one frame. If shrinking fails, advance the buffer pointer directly to discard the oldest data in the buffer.

 

      Growing and shrinking the buffer follow the smoothing principle of fast growth and slow decay.
