No result defined for action com.igit.contract.web.action.ContContractAction and result input

This article analyzes common form-submission errors in web development, including data-type problems, duplicate parameter submission, and issues caused by file uploads, and provides a fix for each. Worked examples help developers avoid these mistakes and improve the stability and user experience of their web applications.


Causes:

1. The data passed to the action has the wrong type — for example, a non-numeric string submitted for a numeric property.
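As a rough sketch of why a type mismatch ends in the `input` result: Struts2's params interceptor converts each raw request string to the declared type of the matching setter; when that conversion throws, the framework records a field error and dispatches to the `input` result. The snippet below only imitates that conversion step in plain Java (the field name `contractId` is made up for illustration):

```java
public class ConversionSketch {
    public static void main(String[] args) {
        // Raw value as it arrives from e.g. <input name="contractId" value="abc"/>
        String submitted = "abc";
        try {
            // Struts2 would attempt the same String -> int conversion
            int contractId = Integer.parseInt(submitted);
            System.out.println("converted: " + contractId);
        } catch (NumberFormatException e) {
            // Struts2 catches this, adds a field error, and returns "input";
            // without a <result name="input"> mapping, you get the 404 above
            System.out.println("conversion failed -> result \"input\"");
        }
    }
}
```

Running this prints the failure branch, which is exactly the situation in which Struts2 looks for an `input` result and, finding none, raises the error in the title.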

2. Duplicate data is submitted to the action — for example, the form contains an input named `id` and the action URL also carries a parameter named `id`, or the form contains two inputs both named `id`.
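A sketch of this second case (action name and values are made up): the same parameter arrives twice, once in the query string and once as a form field.

```html
<!-- "id" is submitted twice: in the action URL and as a hidden field -->
<form action="contract.action?id=100" method="post">
    <input type="hidden" name="id" value="200"/>
    <input type="submit" value="Save"/>
</form>
```

The servlet container then delivers `id` as a two-element array, which Struts2 cannot convert into a scalar property such as `Long id`; the conversion failure again produces the `input` result.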

3. The JSP form contains a file-upload field such as `<input type="file" name="aFile"/>`. A form that uploads files will trigger this error unless the `<form>` tag declares `enctype="multipart/form-data"` (and uses POST): `<form name="a" action="" method="post" enctype="multipart/form-data">......</form>`

For more on `enctype="multipart/form-data"`, see the article I reposted at http://blog.youkuaiyun.com/truong/article/details/8296018
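Whatever the trigger, the error message itself means the action has no `input` result mapped in `struts.xml`. Defining one turns the 404 into a proper validation page. A sketch using the action class from the error message (the JSP paths are illustrative):

```xml
<action name="contContract" class="com.igit.contract.web.action.ContContractAction">
    <result name="success">/contract/list.jsp</result>
    <!-- shown when type conversion or validation fails -->
    <result name="input">/contract/edit.jsp</result>
</action>
```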

### Related case: "No result defined for action ... and result error"

A closely related error is `HTTP Status 404 - No result defined for action xxx and result error`: the action returned the `error` result, but no matching view is configured in either the action's local results or the package's global results.

#### Fix

1. **Check the action's local result configuration.** In `struts.xml`, make sure the action declares a result named `error` pointing at an existing view:

```xml
<action name="login" class="com.example.LoginAction">
    <result name="success">/home.jsp</result>
    <result name="error">/error.jsp</result> <!-- add the error result -->
</action>
```

2. **Or define a global result** shared by several actions; the package containing `<global-results>` must be extended by (or be) the action's package:

```xml
<package name="default" extends="struts-default">
    <global-results>
        <result name="error">/error.jsp</result>
    </global-results>
    <!-- other action configurations -->
</package>
```

3. **Verify the result path.** Confirm that `/error.jsp` actually exists under the web root, and specify the result type (e.g. `type="redirect"`) when redirecting or chaining actions.

4. **Check for action-name conflicts (rare).** Renaming the action, e.g. from `login` to `userLogin`, can rule out a clash with another configuration.

5. **Optionally map exceptions** to the `error` result so unhandled exceptions do not surface as a 404:

```xml
<action name="login" class="com.example.LoginAction">
    <exception-mapping result="error" exception="java.lang.Exception"/>
    <result name="error">/error.jsp</result>
</action>
```

#### Example of a corrected configuration

```xml
<package name="user" extends="struts-default" namespace="/user">
    <global-results>
        <result name="error">/error.jsp</result>
    </global-results>
    <action name="login" class="com.example.LoginAction">
        <result type="redirect">/dashboard.jsp</result>
        <result name="input">/login.jsp</result>
        <!-- the error result is handled by the global configuration -->
    </action>
</package>
```