HTML Form Handler Sample

This article describes how to extend the Windows Communication Foundation (WCF) Web programming model to handle HTML form posts, and walks through a sample that parses form data and implements a custom request formatter.

Source: MSDN http://msdn.microsoft.com/en-us/library/bb943485.aspx

 

Download sample

 

 

This sample shows how to extend the Windows Communication Foundation (WCF) Web Programming Model to handle HTML form posts, such as those produced by a Web browser.

Note:
This sample requires that .NET Framework version 3.5 is installed to build and run. Visual Studio 2008 is required to open the project and solution files.

 

 

Parsing Form Data

HTML form posts are encoded as a series of name-value pairs inside an HTTP POST entity body with a content type of application/x-www-form-urlencoded. The ParseQueryString method can parse these pairs into a NameValueCollection when given the raw entity body string. To allow this name/value collection to be passed as a parameter to a WCF service operation, the FormDataRequestFormatter class in the sample uses the IDispatchMessageFormatter extensibility point.
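As a standalone illustration (not part of the sample itself), the following minimal sketch shows how ParseQueryString decodes a form-urlencoded entity body, including '+' and percent-encoded characters. It assumes a reference to the System.Web assembly.

```csharp
using System;
using System.Collections.Specialized;
using System.Web;

class ParseDemo
{
    static void Main()
    {
        // Raw entity body, as a browser would send it for two form fields.
        string body = "name=John+Doe&city=New%20York";

        // ParseQueryString URL-decodes each pair into a NameValueCollection.
        NameValueCollection form = HttpUtility.ParseQueryString(body);

        Console.WriteLine(form["name"]); // John Doe
        Console.WriteLine(form["city"]); // New York
    }
}
```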

The FormDataRequestFormatter class's implementation of DeserializeRequest uses ParseQueryString to parse the entity body into a NameValueCollection. A Language-Integrated Query (LINQ) expression is then used to populate the remaining method parameters, whose values are available through the UriTemplateMatch used to dispatch the request to the operation.

public void DeserializeRequest(System.ServiceModel.Channels.Message message, object[] parameters)
{
    if (WebOperationContext.Current.IncomingRequest.ContentType
            != "application/x-www-form-urlencoded")
        throw new InvalidDataException("Unexpected content type");

    Stream s = StreamMessageHelper.GetStream(message);
    string formData = new StreamReader(s).ReadToEnd();
    NameValueCollection parsedForm =
        System.Web.HttpUtility.ParseQueryString(formData);
    UriTemplateMatch match =
        message.Properties["UriTemplateMatchResults"] as UriTemplateMatch;
    ParameterInfo[] paramInfos = operation.SyncMethod.GetParameters();
    var binder = CreateParameterBinder(match);
    object[] values = (from p in paramInfos
                       select binder(p)).ToArray<object>();
    values[paramInfos.Length - 1] = parsedForm;
    values.CopyTo(parameters, 0);
}

private Func<ParameterInfo, object> CreateParameterBinder(UriTemplateMatch match)
{
    QueryStringConverter converter = new QueryStringConverter();
    return delegate(ParameterInfo pi)
    {
        string value = match.BoundVariables[pi.Name];
        if (converter.CanConvert(pi.ParameterType) && value != null)
            return converter.ConvertStringToValue(value,
                                                  pi.ParameterType);
        else
            return value;
    };
}

Extending WebHttpBehavior with a custom RequestFormatter

You can derive a class from WebHttpBehavior to extend the WCF runtime on a per-operation basis. In the sample, FormProcessingBehavior overrides GetRequestDispatchFormatter to plug in a FormDataRequestFormatter for any Web invoke operation whose last parameter is a NameValueCollection.

public class FormProcessingBehavior : WebHttpBehavior
{
    protected override IDispatchMessageFormatter GetRequestDispatchFormatter(OperationDescription operationDescription, ServiceEndpoint endpoint)
    {
        //Messages[0] is the request message
        MessagePartDescriptionCollection parts = 
                 operationDescription.Messages[0].Body.Parts;

        //This formatter looks for [WebInvoke] operations that have
        // their last parameter typed as NameValueCollection
        if (operationDescription.Behaviors.Find<WebInvokeAttribute>() 
                != null &&
            parts.Count > 0 &&
            parts[parts.Count - 1].Type == typeof(NameValueCollection))
        {
            return new FormDataRequestFormatter(operationDescription);
        }
        else
        {
            return base.GetRequestDispatchFormatter(
                      operationDescription, endpoint);
        }
    }
}
Implementing a Form Processing Service

The FormProcessingBehavior hides the details of HTML form processing. The service implementation can then be written without special knowledge of HTML forms, as shown in the following sample code.

[OperationContract]
[WebInvoke(UriTemplate = "ProcessForm/{templateParam1}/{templateParam2}")]
public Message ProcessForm(string templateParam1, string templateParam2, NameValueCollection formData)
{
    DumpValues(Console.Out, templateParam1, templateParam2, formData);

    return StreamMessageHelper.CreateMessage(
        MessageVersion.None, "",
        "text/plain",
        delegate(Stream output)
        {
            TextWriter writer = new StreamWriter(output);
            DumpValues(writer, templateParam1, templateParam2, formData);
            writer.Flush(); // flush buffered output before the stream is closed
        });
}
Note:
For a detailed description of the StreamMessageHelper class, see the Push-Style Streaming Sample.

 

 

Hosting the Form Processing Service

The service is hosted using the ServiceHost class. The custom FormProcessingBehavior is added to the ServiceEndpoint manually before calling Open as shown in the following sample code.

ServiceHost host = new ServiceHost(typeof(Service), new Uri("http://localhost:8000/FormTest"));

ServiceEndpoint endpoint = host.AddServiceEndpoint(typeof(Service), new WebHttpBinding(), "");
endpoint.Behaviors.Add(new FormProcessingBehavior());

Additionally, the HTTP GET endpoint that exists by default (the endpoint that produces the default HTML help page) is removed by disabling the ServiceMetadataBehavior and ServiceDebugBehavior as shown in the following sample code.

ServiceMetadataBehavior smb = host.Description.Behaviors.Find<ServiceMetadataBehavior>();
if (smb != null)
{
    smb.HttpGetEnabled = false;
    smb.HttpsGetEnabled = false;
}

ServiceDebugBehavior sdb = host.Description.Behaviors.Find<ServiceDebugBehavior>();
if (sdb != null)
{
    sdb.HttpHelpPageEnabled = false;
}

Running the Sample

To view the output of the sample, compile and run the HtmlFormProcessing project and then navigate to http://localhost:8000/FormTest with a Web browser.
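Besides a browser, you can exercise the service from code. The following hedged sketch (not part of the sample) posts form data with WebClient.UploadValues, which sends the body as application/x-www-form-urlencoded; the URI segments map to templateParam1 and templateParam2, and the field names here are purely illustrative.

```csharp
using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class PostDemo
{
    static void Main()
    {
        // Illustrative form fields; the service receives them as a
        // NameValueCollection in its last parameter.
        NameValueCollection form = new NameValueCollection();
        form.Add("firstName", "John");
        form.Add("lastName", "Doe");

        using (WebClient client = new WebClient())
        {
            // UploadValues URL-encodes the pairs and sets the content type
            // to application/x-www-form-urlencoded automatically.
            byte[] response = client.UploadValues(
                "http://localhost:8000/FormTest/ProcessForm/a/b", form);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}
```

The response is the text/plain dump produced by the ProcessForm operation shown earlier.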

See Also

Other Resources
Push-Style Streaming Sample