Analysis of the get*Context methods

How to obtain a Context in Android
This article walks through the main ways of obtaining a Context in Android (getApplicationContext, getBaseContext, Activity.this and getContext), explains how they differ, and describes when each applies.

1 Analysis of getApplicationContext

This method belongs to the ContextImpl class and returns the application Context. Its code is:

public Context getApplicationContext() {
    return (mPackageInfo != null) ?
            mPackageInfo.getApplication() : mMainThread.getApplication();
}
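From application code you can confirm what this returns with a quick check. The following is a minimal sketch (the class name ContextCheckActivity is ours, not from the framework): getApplicationContext() and Activity.getApplication() resolve to the same process-wide Application object.

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

public class ContextCheckActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Both calls hand back the single Application created by makeApplication().
        Log.d("ContextCheck", "same instance: "
                + (getApplicationContext() == getApplication()));
    }
}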

1.1 Analysis of mPackageInfo.getApplication()

If mPackageInfo (a LoadedApk, of which there is exactly one per package) is not null, its getApplication method is called; that method simply returns LoadedApk's mApplication field. mApplication is assigned in only one place in LoadedApk:

public Application makeApplication(boolean forceDefaultAppClass,
        Instrumentation instrumentation) {
    if (mApplication != null) {
        // If mApplication is already set, return it directly: one APK only ever
        // holds a single Application object.
        return mApplication;
    }

    // Otherwise create a new Application object.
    Application app = null;

    String appClass = mApplicationInfo.className;
    if (forceDefaultAppClass || (appClass == null)) {
        // If no application class name is given, or the default class is forced,
        // fall back to the default class name.
        appClass = "android.app.Application";
    }

    // Build the new Application object.
    try {
        java.lang.ClassLoader cl = getClassLoader();
        // First create a ContextImpl object ...
        ContextImpl appContext = new ContextImpl();
        appContext.init(this, null, mActivityThread); // ... and initialize it.
        app = mActivityThread.mInstrumentation.newApplication(
                cl, appClass, appContext);
        appContext.setOuterContext(app);
    } catch (Exception e) {
        ……
    }

    // Add this app to the mAllApplications list.
    mActivityThread.mAllApplications.add(app);
    mApplication = app; // This is where mApplication gets assigned.
    ……
    return app;
}

From the analysis above, mPackageInfo.getApplication() returns the single Application object of the current APK.

1.2 Analysis of mMainThread.getApplication()

This branch runs when mPackageInfo is null. (The details of mPackageInfo itself have not been fully analyzed yet and will be added later.)

mMainThread is an ActivityThread, so the answer is in ActivityThread.java. Its getApplication method returns the mInitialApplication field, which is also an Application. Using the same approach, let's track down where that field is assigned.

It is assigned in two places in ActivityThread. The first is:

private void handleBindApplication(AppBindData data) {
    ……
    try {
        // If the app is being launched for full backup or restore, bring it up in
        // a restricted environment with the base application class.
        Application app = data.info.makeApplication(data.restrictedBackupMode, null);
        mInitialApplication = app;
    }
    ……
}

Clearly mInitialApplication is created by makeApplication, so it is the same object as the Application returned by mPackageInfo.getApplication(). A few words about handleBindApplication: it runs when the application is first started. More concretely, when an application is launched, ActivityManagerService (via its onTransact path) calls ActivityThread.bindApplication, passing the ApplicationInfo together with the other related arguments; bindApplication packs them into an AppBindData object named data and posts it with sendMessage(H.BIND_APPLICATION, data); the system then calls handleBindApplication(AppBindData data) from the message handler ActivityThread.handleMessage, which is where the application Context is created.
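To make that dispatch concrete, here is a simplified, self-contained sketch of the pattern described above. It is not the real ActivityThread source: the class name BindDispatchSketch, the AppBindData stub, the constant value and the method bodies are placeholders standing in for the hidden framework members.

import android.os.Handler;
import android.os.Looper;
import android.os.Message;

class BindDispatchSketch {
    static final int BIND_APPLICATION = 110;  // illustrative constant
    static class AppBindData { }              // stand-in for the framework's AppBindData

    private final Handler h = new Handler(Looper.getMainLooper()) {
        @Override
        public void handleMessage(Message msg) {
            if (msg.what == BIND_APPLICATION) {
                // The AppBindData packed in bindApplication() arrives on the main thread here.
                handleBindApplication((AppBindData) msg.obj);
            }
        }
    };

    void bindApplication(AppBindData data) {
        // Mirrors the role of ActivityThread.bindApplication(): wrap the arguments
        // and post a BIND_APPLICATION message to the main thread.
        h.sendMessage(h.obtainMessage(BIND_APPLICATION, data));
    }

    void handleBindApplication(AppBindData data) {
        // In the real ActivityThread this is where data.info.makeApplication(...) is called.
    }
}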

The second assignment looks like this:

private void attach(boolean system) {
    if (!system) {
        ……
    } else {
        // For the system process (system_server), the following code runs.
        // Don't set application object here -- if the system crashes,
        // we can't display an alert, we just want to die die die.
        android.ddm.DdmHandleAppName.setAppName("system_process",
                UserHandle.myUserId());
        try {
            mInstrumentation = new Instrumentation();
            ContextImpl context = new ContextImpl();
            context.init(getSystemContext().mPackageInfo, null, this);
            Application app = Instrumentation.newApplication(Application.class, context);
            mAllApplications.add(app);
            mInitialApplication = app;
            app.onCreate();
        } catch (Exception e) {
            throw new RuntimeException(
                    "Unable to instantiate Application():" + e.toString(), e);
        }
    }
}

Comparing this code with the makeApplication method above, both do essentially the same thing: create an Application object. However, attach is only called when the framework process (system_server) starts; it is never called while an ordinary application is running, so it can be ignored here.

In short, for an application, getApplicationContext always returns the application's one and only application Context.

2 Analysis of getBaseContext

This method is defined in ContextWrapper. Its code is:

public Context getBaseContext() {
    return mBase;
}

mBase is simply a Context object: getBaseContext() returns the Context that was passed to the ContextWrapper constructor or set through attachBaseContext(). The SDK documentation says little about it, and calling it directly is generally discouraged.
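From the application side, the usual place where that base Context shows up is attachBaseContext() on a custom Application (Application is itself a ContextWrapper). A minimal sketch, with MyApplication as a made-up class name:

import android.app.Application;
import android.content.Context;

public class MyApplication extends Application {
    @Override
    protected void attachBaseContext(Context base) {
        // 'base' is the Context the framework hands in; after this call it is
        // what getBaseContext() returns for this ContextWrapper.
        super.attachBaseContext(base);
        // Early initialization that needs a Context (e.g. MultiDex.install(this)) goes here.
    }
}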

3 Analysis of Activity.this

Activity.this is the current Activity, which is itself a Context. That Context belongs to this particular Activity and is destroyed together with it.
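Why the distinction matters in practice: window-bound UI such as a dialog must be given the Activity's Context, not the application Context. A minimal sketch (DialogActivity is a made-up name):

import android.app.Activity;
import android.app.AlertDialog;
import android.os.Bundle;

public class DialogActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The Activity context carries the window token the dialog needs.
        new AlertDialog.Builder(DialogActivity.this)
                .setMessage("hello")
                .show();
        // Passing getApplicationContext() here instead typically fails with a
        // WindowManager.BadTokenException, because the application Context has no window.
    }
}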

4 getContext

This method is special, not because it is complicated but because it is inconsistent: not every class has a getContext method, and where it exists it does not always mean the same thing. In android.view.View it returns the Context the view is running in, which is normally the current Activity's Context; in ContentProvider it returns the application Context (as can be seen in ActivityThread.installProvider). So its return value has to be judged case by case.
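A small illustration of the View case: View.getContext() normally hands back the Context the view was created with, which is usually, but not guaranteed to be, the hosting Activity, so code that needs the Activity should check before casting. The helper below is ours, not a framework API:

import android.app.Activity;
import android.content.Context;
import android.view.View;

final class ViewContextUtil {
    // Returns the hosting Activity if the view's Context is one, otherwise null.
    static Activity activityOf(View view) {
        Context c = view.getContext();
        return (c instanceof Activity) ? (Activity) c : null;
    }
}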

Summary

With all of the Context-related methods covered, the following points become easy to understand:

this: the Context of the current Activity; its lifetime is the lifetime of that Activity.

this.getApplicationContext(): returns the application Context of the whole app; its lifetime is the lifetime of the whole application (see the sketch below).
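The practical consequence of the two lifetimes: anything that lives longer than one Activity should keep the application Context rather than the Activity itself, otherwise the Activity cannot be garbage collected. A hypothetical singleton as a sketch:

import android.content.Context;

public final class Settings {
    private static Settings instance;
    private final Context appContext;

    private Settings(Context context) {
        // Store only the application Context so no Activity is pinned in memory.
        this.appContext = context.getApplicationContext();
    }

    public static synchronized Settings get(Context context) {
        if (instance == null) {
            instance = new Settings(context);
        }
        return instance;
    }

    public Context getContext() {
        return appContext;
    }
}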

 

Reposted from: https://www.cnblogs.com/wanyuanchun/p/3828603.html
