lessons_27: The Two-Phase Construction Pattern

This article introduces a technique called "two-phase construction", which splits the construction process into two stages: resource-independent initialization and resource-dependent initialization. This prevents half-constructed objects from ever reaching the caller, eliminating a common source of bugs in software development.

1. Analyzing the constructor

A constructor only determines how an object is initialized; it does not determine whether a complete object is produced. If an initialization step fails partway through, the caller still receives an object, but an incomplete one (a half-constructed object). Such an object is syntactically legal, yet unusable, and is a major source of bugs.

For a constructor:
(1) It only provides an opportunity to initialize member variables automatically.
(2) It cannot guarantee that the initialization logic actually succeeds.
(3) Executing a return statement ends the constructor immediately, but the object is still created.

2. Preventing half-constructed objects

In engineering practice, the construction process divides into:
(1) Resource-independent initialization: operations that cannot fail.
(2) Resource-dependent initialization: operations that may fail, such as memory allocation, file access, network access, or opening a device.

Solution: two-phase construction.
The result of two-phase construction:
Either a complete object is returned, or NULL is returned; a half-constructed object is never produced.

#include <cstdio>

class TwoPhaseCons 
{
private:
    TwoPhaseCons()   // first-phase constructor
    {   
        // resource-independent initialization only
    }
    bool construct() // second-phase constructor
    { 
        // resource-dependent initialization; return false on any failure
        return true; 
    }
public:
    static TwoPhaseCons* NewInstance(); // object creation function
};

TwoPhaseCons* TwoPhaseCons::NewInstance() 
{
    TwoPhaseCons* ret = new TwoPhaseCons();

    // If the second phase fails, destroy the partial object and return NULL
    if( !(ret && ret->construct()) ) 
    {
        delete ret;
        ret = NULL;
    }

    return ret;
}

int main()
{
    TwoPhaseCons* obj = TwoPhaseCons::NewInstance();
    printf("obj = %p\n", obj);
    delete obj;

    return 0;
}

Conclusion: two-phase construction reliably prevents half-constructed objects and removes a common source of bugs.

`Stack BSW Unanalyzed`, `Stack Total`, `BSW Involve`, `BSW Involve Unclosed`, `HorizontalExpansion Count`, `HorizontalExpansion PN`, `HorizontalExpansion Unfinished`, `HorizontalExpansion Delay`, `BSW Analysis Conclusions`, `Stack Classification`, `BSW Reason Classification`, `BSW Analyzed`, `BSW Unanalyzed`, `BSW Staff List`, `Found By Classification`, `High Priority Unanalyzed`, `Stack_Analyzed`, `Stack_Unanalyzed`, update_over_15, update_over_7, `Responsible`, `Email`) VALUES (%(Summary_m0)s, %(Issue_key_m0)s, %(Issue_id_m0)s, %(Parent_id_m0)s, %(Issue_Type_m0)s, %(Status_m0)s, %(Project_key_m0)s, %(Project_name_m0)s, %(Project_type_m0)s, %(Project_lead_m0)s, %(Project_description_m0)s, %(Project_url_m0)s, %(Priority_m0)s, %(Resolution_m0)s, %(Assignee_m0)s, %(Reporter_m0)s, %(Creator_m0)s, %(Created_m0)s, %(Updated_m0)s, %(Last_Viewed_m0)s, %(Resolved_m0)s, %(Affects_Version/s_m0)s, %(Fix_Version/s_m0)s, %(Component/s_m0)s, %(Due_Date_m0)s, %(Labels_m0)s, %(Description_m0)s, %(Environment_m0)s, %(Watchers_m0)s, %(Log_Work_m0)s, %(Security_Level_m0)s, %(Attachment_m0)s, %(total_m0)s, %(Custom_field_AControl_NoZ_m0)s, %(Custom_field_ACustom_1Z_m0)s, %(Custom_field_ACustom_2Z_m0)s, %(Custom_field_ADate_of_first_ResponseZ_m0)s, %(Custom_field_AExperience_OwnerZ_m0)s, %(Custom_field_AFixed_in_releaseZ_m0)s, %(Custom_field_AFound_ByZ_m0)s, %(Custom_field_AFound_in_HW_versionZ_m0)s, %(Custom_field_AFound_in_SW_versionZ_m0)s, %(Custom_field_AKOCHI_Issue_TypeZ_m0)s, %(Custom_field_AKOCHI_Issue_Type_TransitionZ_m0)s, %(Custom_field_ALast_CommentZ_m0)s, %(Custom_field_AMatterZ_m0)s, %(Custom_field_AProject_NumberZ_m0)s, %(Custom_field_ARankZ_m0)s, %(Custom_field_ARealisation_plannedZ_m0)s, %(Custom_field_AReport_NumberZ_m0)s, %(Custom_field_AReport_TypeZ_m0)s, %(Custom_field_AReported_ByZ_m0)s, %(Custom_field_ARiskZ_m0)s, %(Custom_field_AStart_DateZ_m0)s, %(Custom_field_ATeamZ_m0)s, %(Comment_m0)s, %(Involvement_of_BSW_m0)s, %(Date_m0)s, %(ProjectNum_m0)s, 
%(BSW_Self_Test_m0)s, %(BSW_Issue_m0)s, %(Stack_BSW_Analyzed_m0)s, %(Stack_BSW_Unanalyzed_m0)s, %(Stack_Total_m0)s, %(BSW_Involve_m0)s, %(BSW_Involve_Unclosed_m0)s, %(HorizontalExpansion_Count_m0)s, %(HorizontalExpansion_PN_m0)s, %(HorizontalExpansion_Unfinished_m0)s, %(HorizontalExpansion_Delay_m0)s, %(BSW_Analysis_Conclusions_m0)s, %(Stack_Classification_m0)s, %(BSW_Reason_Classification_m0)s, %(BSW_Analyzed_m0)s, %(BSW_Unanalyzed_m0)s, %(BSW_Staff_List_m0)s, %(Found_By_Classification_m0)s, %(High_Priority_Unanalyzed_m0)s, %(Stack_Analyzed_m0)s, %(Stack_Unanalyzed_m0)s, %(update_over_15_m0)s, %(update_over_7_m0)s, %(Responsible_m0)s, %(Email_m0)s) AS new ON DUPLICATE KEY UPDATE `Summary` = new.`Summary`, `Issue id` = new.`Issue id`, `Parent id` = new.`Parent id`, `Issue Type` = new.`Issue Type`, `Status` = new.`Status`, `Project key` = new.`Project key`, `Project name` = new.`Project name`, `Project type` = new.`Project type`, `Project lead` = new.`Project lead`, `Project description` = new.`Project description`, `Project url` = new.`Project url`, `Priority` = new.`Priority`, `Resolution` = new.`Resolution`, `Assignee` = new.`Assignee`, `Reporter` = new.`Reporter`, `Creator` = new.`Creator`, `Created` = new.`Created`, `Updated` = new.`Updated`, `Last Viewed` = new.`Last Viewed`, `Resolved` = new.`Resolved`, `Affects Version/s` = new.`Affects Version/s`, `Fix Version/s` = new.`Fix Version/s`, `Component/s` = new.`Component/s`, `Due Date` = new.`Due Date`, `Labels` = new.`Labels`, `Description` = new.`Description`, `Environment` = new.`Environment`, `Watchers` = new.`Watchers`, `Log Work` = new.`Log Work`, `Security Level` = new.`Security Level`, `Attachment` = new.`Attachment`, total = new.total, `Custom field (Control No)` = new.`Custom field (Control No)`, `Custom field (Custom_1)` = new.`Custom field (Custom_1)`, `Custom field (Custom_2)` = new.`Custom field (Custom_2)`, `Custom field (Date of first Response)` = new.`Custom field (Date of first Response)`, 
`Custom field (Experience Owner)` = new.`Custom field (Experience Owner)`, `Custom field (Fixed in release)` = new.`Custom field (Fixed in release)`, `Custom field (Found By)` = new.`Custom field (Found By)`, `Custom field (Found in HW version)` = new.`Custom field (Found in HW version)`, `Custom field (Found in Mech Version)` = new.`Custom field (Found in Mech Version)`, `Custom field (Found in SVN version)` = new.`Custom field (Found in SVN version)`, `Custom field (Found in SW version)` = new.`Custom field (Found in SW version)`, `Custom field (KOCHI Issue Type)` = new.`Custom field (KOCHI Issue Type)`, `Custom field (KOCHI Issue Type Transition)` = new.`Custom field (KOCHI Issue Type Transition)`, `Custom field (Last Comment)` = new.`Custom field (Last Comment)`, `Custom field (Matter)` = new.`Custom field (Matter)`, `Custom field (Project Number)` = new.`Custom field (Project Number)`, `Custom field (Project Number P)` = new.`Custom field (Project Number P)`, `Custom field (Rank)` = new.`Custom field (Rank)`, `Custom field (Realisation planned)` = new.`Custom field (Realisation planned)`, `Custom field (Report Number)` = new.`Custom field (Report Number)`, `Custom field (Report Type)` = new.`Custom field (Report Type)`, `Custom field (Reported By)` = new.`Custom field (Reported By)`, `Custom field (Risk)` = new.`Custom field (Risk)`, `Custom field (Start Date)` = new.`Custom field (Start Date)`, `Custom field (Team)` = new.`Custom field (Team)`, `Comment` = new.`Comment`, `Involvement of BSW` = new.`Involvement of BSW`, `Date` = new.`Date`, `ProjectNum` = new.`ProjectNum`, `BSW Self Test` = new.`BSW Self Test`, `BSW Issue` = new.`BSW Issue`, `Stack BSW Analyzed` = new.`Stack BSW Analyzed`, `Stack BSW Unanalyzed` = new.`Stack BSW Unanalyzed`, `Stack Total` = new.`Stack Total`, `BSW Involve` = new.`BSW Involve`, `BSW Involve Unclosed` = new.`BSW Involve Unclosed`, `HorizontalExpansion Count` = new.`HorizontalExpansion Count`, `HorizontalExpansion PN` = 
new.`HorizontalExpansion PN`, `HorizontalExpansion Unfinished` = new.`HorizontalExpansion Unfinished`, `HorizontalExpansion Delay` = new.`HorizontalExpansion Delay`, `BSW Analysis Conclusions` = new.`BSW Analysis Conclusions`, `Stack Classification` = new.`Stack Classification`, `BSW Reason Classification` = new.`BSW Reason Classification`, `BSW Analyzed` = new.`BSW Analyzed`, `BSW Unanalyzed` = new.`BSW Unanalyzed`, `BSW Staff List` = new.`BSW Staff List`, `Found By Classification` = new.`Found By Classification`, `High Priority Unanalyzed` = new.`High Priority Unanalyzed`, `Stack_Analyzed` = new.`Stack_Analyzed`, `Stack_Unanalyzed` = new.`Stack_Unanalyzed`, update_over_15 = new.update_over_15, update_over_7 = new.update_over_7, `Responsible` = new.`Responsible`, `Email` = new.`Email`] [parameters: {'Summary_m0': 'KOCHI_LL-738: LL_BYD_UWB Communication Distance Issue', 'Issue_key_m0': 'P09192-3', 'Issue_id_m0': '1315369', 'Parent_id_m0': '1302030.0', 'Issue_Type_m0': 'Sub-ToDo', 'Status_m0': 'Closed', 'Project_key_m0': 'P09192', 'Project_name_m0': 'P09192 GWM_BLE_BNWC_P14', 'Project_type_m0': 'software', 'Project_lead_m0': 'wang161', 'Project_description_m0': '', 'Project_url_m0': '', 'Priority_m0': 'Medium', 'Resolution_m0': "Won't Do", 'Assignee_m0': 'liu033', 'Reporter_m0': 'yu017', 'Creator_m0': 'yu017', 'Created_m0': Timestamp('2025-06-12 08:11:00'), 'Updated_m0': Timestamp('2025-06-16 04:38:00'), 'Last_Viewed_m0': Timestamp('2025-11-24 19:24:00'), 'Resolved_m0': Timestamp('2025-06-16 04:38:00'), 'Affects_Version/s_m0': '', 'Fix_Version/s_m0': '', 'Component/s_m0': '', 'Due_Date_m0': Timestamp('2025-07-10 00:00:00'), 'Labels_m0': '', 'Description_m0': 'Please check the relevance of KOCHI_LL-738 in the project\n\n*Evidence:*\r\nThe isolation between the UWB and BLE antennas must be greater than 10 dB for future designs', 'Environment_m0': '', 'Watchers_m0': '', 'Log_Work_m0': '', 'Security_Level_m0': '', 'Attachment_m0': '', 'total_m0': 0, 
'Custom_field_AControl_NoZ_m0': '', 'Custom_field_ACustom_1Z_m0': '', 'Custom_field_ACustom_2Z_m0': '', 'Custom_field_ADate_of_first_ResponseZ_m0': None, 'Custom_field_AExperience_OwnerZ_m0': '', 'Custom_field_AFixed_in_releaseZ_m0': '', 'Custom_field_AFound_ByZ_m0': '', 'Custom_field_AFound_in_HW_versionZ_m0': '', 'Custom_field_AFound_in_SW_versionZ_m0': '', 'Custom_field_AKOCHI_Issue_TypeZ_m0': '', 'Custom_field_AKOCHI_Issue_Type_TransitionZ_m0': '', 'Custom_field_ALast_CommentZ_m0': '', 'Custom_field_AMatterZ_m0': 'Hardware', 'Custom_field_AProject_NumberZ_m0': '', 'Custom_field_ARankZ_m0': '1|i47fog:', 'Custom_field_ARealisation_plannedZ_m0': '', 'Custom_field_AReport_NumberZ_m0': '', 'Custom_field_AReport_TypeZ_m0': '', 'Custom_field_AReported_ByZ_m0': '', 'Custom_field_ARiskZ_m0': '', 'Custom_field_AStart_DateZ_m0': '', 'Custom_field_ATeamZ_m0': '', 'Comment_m0': '16.06.2025 04:38;yu017;N/A\r\n\r\nno UWB;;;', 'Involvement_of_BSW_m0': 'No', 'Date_m0': Timestamp('2025-11-25 02:24:10'), 'ProjectNum_m0': 'P09192', 'BSW_Self_Test_m0': 0, 'BSW_Issue_m0': 0, 'Stack_BSW_Analyzed_m0': 0, 'Stack_BSW_Unanalyzed_m0': 0, 'Stack_Total_m0': 0, 'BSW_Involve_m0': 0, 'BSW_Involve_Unclosed_m0': 0, 'HorizontalExpansion_Count_m0': 0, 'HorizontalExpansion_PN_m0': None, 'HorizontalExpansion_Unfinished_m0': 0, 'HorizontalExpansion_Delay_m0': 0, 'BSW_Analysis_Conclusions_m0': 'Non-BSW', 'Stack_Classification_m0': 'Unclassed', 'BSW_Reason_Classification_m0': '底层未参与分析', 'BSW_Analyzed_m0': 0, 'BSW_Unanalyzed_m0': 0, 'BSW_Staff_List_m0': 'No', 'Found_By_Classification_m0': 'Others', 'High_Priority_Unanalyzed_m0': 0, 'Stack_Analyzed_m0': 0, 'Stack_Unanalyzed_m0': 0, 'update_over_15_m0': 0, 'update_over_7_m0': 0, 'Responsible_m0': 'Bob', 'Email_m0': 'bob@example.com'}] (Background on this error at: https://sqlalche.me/e/20/e3q8) 2025-11-25 17:25:01,783 - ERROR - 行级操作失败: P09192-2, (pymysql.err.OperationalError) (1054, "Unknown column 'new.Custom field (Found in Mech Version)' in 'field 
list'") [SQL: INSERT INTO jira_task_bsw_deal (`Summary`, `Issue key`, `Issue id`, `Parent id`, `Issue Type`, `Status`, `Project key`, `Project name`, `Project type`, `Project lead`, `Project description`, `Project url`, `Priority`, `Resolution`, `Assignee`, `Reporter`, `Creator`, `Created`, `Updated`, `Last Viewed`, `Resolved`, `Affects Version/s`, `Fix Version/s`, `Component/s`, `Due Date`, `Labels`, `Description`, `Environment`, `Watchers`, `Log Work`, `Security Level`, `Attachment`, total, `Custom field (Control No)`, `Custom field (Custom_1)`, `Custom field (Custom_2)`, `Custom field (Date of first Response)`, `Custom field (Experience Owner)`, `Custom field (Fixed in release)`, `Custom field (Found By)`, `Custom field (Found in HW version)`, `Custom field (Found in SW version)`, `Custom field (KOCHI Issue Type)`, `Custom field (KOCHI Issue Type Transition)`, `Custom field (Last Comment)`, `Custom field (Matter)`, `Custom field (Project Number)`, `Custom field (Rank)`, `Custom field (Realisation planned)`, `Custom field (Report Number)`, `Custom field (Report Type)`, `Custom field (Reported By)`, `Custom field (Risk)`, `Custom field (Start Date)`, `Custom field (Team)`, `Comment`, `Involvement of BSW`, `Date`, `ProjectNum`, `BSW Self Test`, `BSW Issue`, `Stack BSW Analyzed`, `Stack BSW Unanalyzed`, `Stack Total`, `BSW Involve`, `BSW Involve Unclosed`, `HorizontalExpansion Count`, `HorizontalExpansion PN`, `HorizontalExpansion Unfinished`, `HorizontalExpansion Delay`, `BSW Analysis Conclusions`, `Stack Classification`, `BSW Reason Classification`, `BSW Analyzed`, `BSW Unanalyzed`, `BSW Staff List`, `Found By Classification`, `High Priority Unanalyzed`, `Stack_Analyzed`, `Stack_Unanalyzed`, update_over_15, update_over_7, `Responsible`, `Email`) VALUES (%(Summary_m0)s, %(Issue_key_m0)s, %(Issue_id_m0)s, %(Parent_id_m0)s, %(Issue_Type_m0)s, %(Status_m0)s, %(Project_key_m0)s, %(Project_name_m0)s, %(Project_type_m0)s, %(Project_lead_m0)s, %(Project_description_m0)s, 
%(Project_url_m0)s, %(Priority_m0)s, %(Resolution_m0)s, %(Assignee_m0)s, %(Reporter_m0)s, %(Creator_m0)s, %(Created_m0)s, %(Updated_m0)s, %(Last_Viewed_m0)s, %(Resolved_m0)s, %(Affects_Version/s_m0)s, %(Fix_Version/s_m0)s, %(Component/s_m0)s, %(Due_Date_m0)s, %(Labels_m0)s, %(Description_m0)s, %(Environment_m0)s, %(Watchers_m0)s, %(Log_Work_m0)s, %(Security_Level_m0)s, %(Attachment_m0)s, %(total_m0)s, %(Custom_field_AControl_NoZ_m0)s, %(Custom_field_ACustom_1Z_m0)s, %(Custom_field_ACustom_2Z_m0)s, %(Custom_field_ADate_of_first_ResponseZ_m0)s, %(Custom_field_AExperience_OwnerZ_m0)s, %(Custom_field_AFixed_in_releaseZ_m0)s, %(Custom_field_AFound_ByZ_m0)s, %(Custom_field_AFound_in_HW_versionZ_m0)s, %(Custom_field_AFound_in_SW_versionZ_m0)s, %(Custom_field_AKOCHI_Issue_TypeZ_m0)s, %(Custom_field_AKOCHI_Issue_Type_TransitionZ_m0)s, %(Custom_field_ALast_CommentZ_m0)s, %(Custom_field_AMatterZ_m0)s, %(Custom_field_AProject_NumberZ_m0)s, %(Custom_field_ARankZ_m0)s, %(Custom_field_ARealisation_plannedZ_m0)s, %(Custom_field_AReport_NumberZ_m0)s, %(Custom_field_AReport_TypeZ_m0)s, %(Custom_field_AReported_ByZ_m0)s, %(Custom_field_ARiskZ_m0)s, %(Custom_field_AStart_DateZ_m0)s, %(Custom_field_ATeamZ_m0)s, %(Comment_m0)s, %(Involvement_of_BSW_m0)s, %(Date_m0)s, %(ProjectNum_m0)s, %(BSW_Self_Test_m0)s, %(BSW_Issue_m0)s, %(Stack_BSW_Analyzed_m0)s, %(Stack_BSW_Unanalyzed_m0)s, %(Stack_Total_m0)s, %(BSW_Involve_m0)s, %(BSW_Involve_Unclosed_m0)s, %(HorizontalExpansion_Count_m0)s, %(HorizontalExpansion_PN_m0)s, %(HorizontalExpansion_Unfinished_m0)s, %(HorizontalExpansion_Delay_m0)s, %(BSW_Analysis_Conclusions_m0)s, %(Stack_Classification_m0)s, %(BSW_Reason_Classification_m0)s, %(BSW_Analyzed_m0)s, %(BSW_Unanalyzed_m0)s, %(BSW_Staff_List_m0)s, %(Found_By_Classification_m0)s, %(High_Priority_Unanalyzed_m0)s, %(Stack_Analyzed_m0)s, %(Stack_Unanalyzed_m0)s, %(update_over_15_m0)s, %(update_over_7_m0)s, %(Responsible_m0)s, %(Email_m0)s) AS new ON DUPLICATE KEY UPDATE `Summary` = 
new.`Summary`, `Issue id` = new.`Issue id`, `Parent id` = new.`Parent id`, `Issue Type` = new.`Issue Type`, `Status` = new.`Status`, `Project key` = new.`Project key`, `Project name` = new.`Project name`, `Project type` = new.`Project type`, `Project lead` = new.`Project lead`, `Project description` = new.`Project description`, `Project url` = new.`Project url`, `Priority` = new.`Priority`, `Resolution` = new.`Resolution`, `Assignee` = new.`Assignee`, `Reporter` = new.`Reporter`, `Creator` = new.`Creator`, `Created` = new.`Created`, `Updated` = new.`Updated`, `Last Viewed` = new.`Last Viewed`, `Resolved` = new.`Resolved`, `Affects Version/s` = new.`Affects Version/s`, `Fix Version/s` = new.`Fix Version/s`, `Component/s` = new.`Component/s`, `Due Date` = new.`Due Date`, `Labels` = new.`Labels`, `Description` = new.`Description`, `Environment` = new.`Environment`, `Watchers` = new.`Watchers`, `Log Work` = new.`Log Work`, `Security Level` = new.`Security Level`, `Attachment` = new.`Attachment`, total = new.total, `Custom field (Control No)` = new.`Custom field (Control No)`, `Custom field (Custom_1)` = new.`Custom field (Custom_1)`, `Custom field (Custom_2)` = new.`Custom field (Custom_2)`, `Custom field (Date of first Response)` = new.`Custom field (Date of first Response)`, `Custom field (Experience Owner)` = new.`Custom field (Experience Owner)`, `Custom field (Fixed in release)` = new.`Custom field (Fixed in release)`, `Custom field (Found By)` = new.`Custom field (Found By)`, `Custom field (Found in HW version)` = new.`Custom field (Found in HW version)`, `Custom field (Found in Mech Version)` = new.`Custom field (Found in Mech Version)`, `Custom field (Found in SVN version)` = new.`Custom field (Found in SVN version)`, `Custom field (Found in SW version)` = new.`Custom field (Found in SW version)`, `Custom field (KOCHI Issue Type)` = new.`Custom field (KOCHI Issue Type)`, `Custom field (KOCHI Issue Type Transition)` = new.`Custom field (KOCHI Issue Type 
Transition)`, `Custom field (Last Comment)` = new.`Custom field (Last Comment)`, `Custom field (Matter)` = new.`Custom field (Matter)`, `Custom field (Project Number)` = new.`Custom field (Project Number)`, `Custom field (Project Number P)` = new.`Custom field (Project Number P)`, `Custom field (Rank)` = new.`Custom field (Rank)`, `Custom field (Realisation planned)` = new.`Custom field (Realisation planned)`, `Custom field (Report Number)` = new.`Custom field (Report Number)`, `Custom field (Report Type)` = new.`Custom field (Report Type)`, `Custom field (Reported By)` = new.`Custom field (Reported By)`, `Custom field (Risk)` = new.`Custom field (Risk)`, `Custom field (Start Date)` = new.`Custom field (Start Date)`, `Custom field (Team)` = new.`Custom field (Team)`, `Comment` = new.`Comment`, `Involvement of BSW` = new.`Involvement of BSW`, `Date` = new.`Date`, `ProjectNum` = new.`ProjectNum`, `BSW Self Test` = new.`BSW Self Test`, `BSW Issue` = new.`BSW Issue`, `Stack BSW Analyzed` = new.`Stack BSW Analyzed`, `Stack BSW Unanalyzed` = new.`Stack BSW Unanalyzed`, `Stack Total` = new.`Stack Total`, `BSW Involve` = new.`BSW Involve`, `BSW Involve Unclosed` = new.`BSW Involve Unclosed`, `HorizontalExpansion Count` = new.`HorizontalExpansion Count`, `HorizontalExpansion PN` = new.`HorizontalExpansion PN`, `HorizontalExpansion Unfinished` = new.`HorizontalExpansion Unfinished`, `HorizontalExpansion Delay` = new.`HorizontalExpansion Delay`, `BSW Analysis Conclusions` = new.`BSW Analysis Conclusions`, `Stack Classification` = new.`Stack Classification`, `BSW Reason Classification` = new.`BSW Reason Classification`, `BSW Analyzed` = new.`BSW Analyzed`, `BSW Unanalyzed` = new.`BSW Unanalyzed`, `BSW Staff List` = new.`BSW Staff List`, `Found By Classification` = new.`Found By Classification`, `High Priority Unanalyzed` = new.`High Priority Unanalyzed`, `Stack_Analyzed` = new.`Stack_Analyzed`, `Stack_Unanalyzed` = new.`Stack_Unanalyzed`, update_over_15 = new.update_over_15, 
update_over_7 = new.update_over_7, `Responsible` = new.`Responsible`, `Email` = new.`Email`] [parameters: {'Summary_m0': 'KOCHI_LL-696: Avoid component missing during BRD import by step format ', 'Issue_key_m0': 'P09192-2', 'Issue_id_m0': '1315368', 'Parent_id_m0': '1302030.0', 'Issue_Type_m0': 'Sub-ToDo', 'Status_m0': 'Closed', 'Project_key_m0': 'P09192', 'Project_name_m0': 'P09192 GWM_BLE_BNWC_P14', 'Project_type_m0': 'software', 'Project_lead_m0': 'wang161', 'Project_description_m0': '', 'Project_url_m0': '', 'Priority_m0': 'Medium', 'Resolution_m0': 'Done', 'Assignee_m0': 'liu033', 'Reporter_m0': 'yu017', 'Creator_m0': 'yu017', 'Created_m0': Timestamp('2025-06-12 08:11:00'), 'Updated_m0': Timestamp('2025-09-09 11:55:00'), 'Last_Viewed_m0': Timestamp('2025-11-24 19:24:00'), 'Resolved_m0': Timestamp('2025-09-09 11:55:00'), 'Affects_Version/s_m0': '', 'Fix_Version/s_m0': '', 'Component/s_m0': '', 'Due_Date_m0': Timestamp('2025-07-10 00:00:00'), 'Labels_m0': '', 'Description_m0': 'Please check the relevance of KOCHI_LL-696 in the project\n\n*Evidence:*\r\nConsider this experience before important sampling delivery or design change.', 'Environment_m0': '', 'Watchers_m0': '', 'Log_Work_m0': '', 'Security_Level_m0': '', 'Attachment_m0': '', 'total_m0': 0, 'Custom_field_AControl_NoZ_m0': '', 'Custom_field_ACustom_1Z_m0': '', 'Custom_field_ACustom_2Z_m0': '', 'Custom_field_ADate_of_first_ResponseZ_m0': None, 'Custom_field_AExperience_OwnerZ_m0': '', 'Custom_field_AFixed_in_releaseZ_m0': '', 'Custom_field_AFound_ByZ_m0': '', 'Custom_field_AFound_in_HW_versionZ_m0': '', 'Custom_field_AFound_in_SW_versionZ_m0': '', 'Custom_field_AKOCHI_Issue_TypeZ_m0': '', 'Custom_field_AKOCHI_Issue_Type_TransitionZ_m0': '', 'Custom_field_ALast_CommentZ_m0': '', 'Custom_field_AMatterZ_m0': 'Hardware', 'Custom_field_AProject_NumberZ_m0': '', 'Custom_field_ARankZ_m0': '1|i47fo8:', 'Custom_field_ARealisation_plannedZ_m0': '', 'Custom_field_AReport_NumberZ_m0': '', 
'Custom_field_AReport_TypeZ_m0': '', 'Custom_field_AReported_ByZ_m0': '', 'Custom_field_ARiskZ_m0': '', 'Custom_field_AStart_DateZ_m0': '', 'Custom_field_ATeamZ_m0': '', 'Comment_m0': '', 'Involvement_of_BSW_m0': 'No', 'Date_m0': Timestamp('2025-11-25 02:24:10'), 'ProjectNum_m0': 'P09192', 'BSW_Self_Test_m0': 0, 'BSW_Issue_m0': 0, 'Stack_BSW_Analyzed_m0': 0, 'Stack_BSW_Unanalyzed_m0': 0, 'Stack_Total_m0': 0, 'BSW_Involve_m0': 0, 'BSW_Involve_Unclosed_m0': 0, 'HorizontalExpansion_Count_m0': 0, 'HorizontalExpansion_PN_m0': None, 'HorizontalExpansion_Unfinished_m0': 0, 'HorizontalExpansion_Delay_m0': 0, 'BSW_Analysis_Conclusions_m0': 'Non-BSW', 'Stack_Classification_m0': 'Unclassed', 'BSW_Reason_Classification_m0': '底层未参与分析', 'BSW_Analyzed_m0': 0, 'BSW_Unanalyzed_m0': 0, 'BSW_Staff_List_m0': 'No', 'Found_By_Classification_m0': 'Others', 'High_Priority_Unanalyzed_m0': 0, 'Stack_Analyzed_m0': 0, 'Stack_Unanalyzed_m0': 0, 'update_over_15_m0': 0, 'update_over_7_m0': 0, 'Responsible_m0': 'Bob', 'Email_m0': 'bob@example.com'}] (Background on this error at: https://sqlalche.me/e/20/e3q8) 2025-11-25 17:25:01,807 - ERROR - 行级操作失败: P09192-1, (pymysql.err.OperationalError) (1054, "Unknown column 'new.Custom field (Found in Mech Version)' in 'field list'") [SQL: INSERT INTO jira_task_bsw_deal (`Summary`, `Issue key`, `Issue id`, `Parent id`, `Issue Type`, `Status`, `Project key`, `Project name`, `Project type`, `Project lead`, `Project description`, `Project url`, `Priority`, `Resolution`, `Assignee`, `Reporter`, `Creator`, `Created`, `Updated`, `Last Viewed`, `Resolved`, `Affects Version/s`, `Fix Version/s`, `Component/s`, `Due Date`, `Labels`, `Description`, `Environment`, `Watchers`, `Log Work`, `Security Level`, `Attachment`, total, `Custom field (Control No)`, `Custom field (Custom_1)`, `Custom field (Custom_2)`, `Custom field (Date of first Response)`, `Custom field (Experience Owner)`, `Custom field (Fixed in release)`, `Custom field (Found By)`, `Custom field (Found in HW 
version)`, `Custom field (Found in SW version)`, `Custom field (KOCHI Issue Type)`, `Custom field (KOCHI Issue Type Transition)`, `Custom field (Last Comment)`, `Custom field (Matter)`, `Custom field (Project Number)`, `Custom field (Rank)`, `Custom field (Realisation planned)`, `Custom field (Report Number)`, `Custom field (Report Type)`, `Custom field (Reported By)`, `Custom field (Risk)`, `Custom field (Start Date)`, `Custom field (Team)`, `Comment`, `Involvement of BSW`, `Date`, `ProjectNum`, `BSW Self Test`, `BSW Issue`, `Stack BSW Analyzed`, `Stack BSW Unanalyzed`, `Stack Total`, `BSW Involve`, `BSW Involve Unclosed`, `HorizontalExpansion Count`, `HorizontalExpansion PN`, `HorizontalExpansion Unfinished`, `HorizontalExpansion Delay`, `BSW Analysis Conclusions`, `Stack Classification`, `BSW Reason Classification`, `BSW Analyzed`, `BSW Unanalyzed`, `BSW Staff List`, `Found By Classification`, `High Priority Unanalyzed`, `Stack_Analyzed`, `Stack_Unanalyzed`, update_over_15, update_over_7, `Responsible`, `Email`) VALUES (%(Summary_m0)s, %(Issue_key_m0)s, %(Issue_id_m0)s, %(Parent_id_m0)s, %(Issue_Type_m0)s, %(Status_m0)s, %(Project_key_m0)s, %(Project_name_m0)s, %(Project_type_m0)s, %(Project_lead_m0)s, %(Project_description_m0)s, %(Project_url_m0)s, %(Priority_m0)s, %(Resolution_m0)s, %(Assignee_m0)s, %(Reporter_m0)s, %(Creator_m0)s, %(Created_m0)s, %(Updated_m0)s, %(Last_Viewed_m0)s, %(Resolved_m0)s, %(Affects_Version/s_m0)s, %(Fix_Version/s_m0)s, %(Component/s_m0)s, %(Due_Date_m0)s, %(Labels_m0)s, %(Description_m0)s, %(Environment_m0)s, %(Watchers_m0)s, %(Log_Work_m0)s, %(Security_Level_m0)s, %(Attachment_m0)s, %(total_m0)s, %(Custom_field_AControl_NoZ_m0)s, %(Custom_field_ACustom_1Z_m0)s, %(Custom_field_ACustom_2Z_m0)s, %(Custom_field_ADate_of_first_ResponseZ_m0)s, %(Custom_field_AExperience_OwnerZ_m0)s, %(Custom_field_AFixed_in_releaseZ_m0)s, %(Custom_field_AFound_ByZ_m0)s, %(Custom_field_AFound_in_HW_versionZ_m0)s, %(Custom_field_AFound_in_SW_versionZ_m0)s, 
%(Custom_field_AKOCHI_Issue_TypeZ_m0)s, %(Custom_field_AKOCHI_Issue_Type_TransitionZ_m0)s, %(Custom_field_ALast_CommentZ_m0)s, %(Custom_field_AMatterZ_m0)s, %(Custom_field_AProject_NumberZ_m0)s, %(Custom_field_ARankZ_m0)s, %(Custom_field_ARealisation_plannedZ_m0)s, %(Custom_field_AReport_NumberZ_m0)s, %(Custom_field_AReport_TypeZ_m0)s, %(Custom_field_AReported_ByZ_m0)s, %(Custom_field_ARiskZ_m0)s, %(Custom_field_AStart_DateZ_m0)s, %(Custom_field_ATeamZ_m0)s, %(Comment_m0)s, %(Involvement_of_BSW_m0)s, %(Date_m0)s, %(ProjectNum_m0)s, %(BSW_Self_Test_m0)s, %(BSW_Issue_m0)s, %(Stack_BSW_Analyzed_m0)s, %(Stack_BSW_Unanalyzed_m0)s, %(Stack_Total_m0)s, %(BSW_Involve_m0)s, %(BSW_Involve_Unclosed_m0)s, %(HorizontalExpansion_Count_m0)s, %(HorizontalExpansion_PN_m0)s, %(HorizontalExpansion_Unfinished_m0)s, %(HorizontalExpansion_Delay_m0)s, %(BSW_Analysis_Conclusions_m0)s, %(Stack_Classification_m0)s, %(BSW_Reason_Classification_m0)s, %(BSW_Analyzed_m0)s, %(BSW_Unanalyzed_m0)s, %(BSW_Staff_List_m0)s, %(Found_By_Classification_m0)s, %(High_Priority_Unanalyzed_m0)s, %(Stack_Analyzed_m0)s, %(Stack_Unanalyzed_m0)s, %(update_over_15_m0)s, %(update_over_7_m0)s, %(Responsible_m0)s, %(Email_m0)s) AS new ON DUPLICATE KEY UPDATE `Summary` = new.`Summary`, `Issue id` = new.`Issue id`, `Parent id` = new.`Parent id`, `Issue Type` = new.`Issue Type`, `Status` = new.`Status`, `Project key` = new.`Project key`, `Project name` = new.`Project name`, `Project type` = new.`Project type`, `Project lead` = new.`Project lead`, `Project description` = new.`Project description`, `Project url` = new.`Project url`, `Priority` = new.`Priority`, `Resolution` = new.`Resolution`, `Assignee` = new.`Assignee`, `Reporter` = new.`Reporter`, `Creator` = new.`Creator`, `Created` = new.`Created`, `Updated` = new.`Updated`, `Last Viewed` = new.`Last Viewed`, `Resolved` = new.`Resolved`, `Affects Version/s` = new.`Affects Version/s`, `Fix Version/s` = new.`Fix Version/s`, `Component/s` = new.`Component/s`, `Due 
Date` = new.`Due Date`, `Labels` = new.`Labels`, `Description` = new.`Description`, `Environment` = new.`Environment`, `Watchers` = new.`Watchers`, `Log Work` = new.`Log Work`, `Security Level` = new.`Security Level`, `Attachment` = new.`Attachment`, total = new.total, `Custom field (Control No)` = new.`Custom field (Control No)`, `Custom field (Custom_1)` = new.`Custom field (Custom_1)`, `Custom field (Custom_2)` = new.`Custom field (Custom_2)`, `Custom field (Date of first Response)` = new.`Custom field (Date of first Response)`, `Custom field (Experience Owner)` = new.`Custom field (Experience Owner)`, `Custom field (Fixed in release)` = new.`Custom field (Fixed in release)`, `Custom field (Found By)` = new.`Custom field (Found By)`, `Custom field (Found in HW version)` = new.`Custom field (Found in HW version)`, `Custom field (Found in Mech Version)` = new.`Custom field (Found in Mech Version)`, `Custom field (Found in SVN version)` = new.`Custom field (Found in SVN version)`, `Custom field (Found in SW version)` = new.`Custom field (Found in SW version)`, `Custom field (KOCHI Issue Type)` = new.`Custom field (KOCHI Issue Type)`, `Custom field (KOCHI Issue Type Transition)` = new.`Custom field (KOCHI Issue Type Transition)`, `Custom field (Last Comment)` = new.`Custom field (Last Comment)`, `Custom field (Matter)` = new.`Custom field (Matter)`, `Custom field (Project Number)` = new.`Custom field (Project Number)`, `Custom field (Project Number P)` = new.`Custom field (Project Number P)`, `Custom field (Rank)` = new.`Custom field (Rank)`, `Custom field (Realisation planned)` = new.`Custom field (Realisation planned)`, `Custom field (Report Number)` = new.`Custom field (Report Number)`, `Custom field (Report Type)` = new.`Custom field (Report Type)`, `Custom field (Reported By)` = new.`Custom field (Reported By)`, `Custom field (Risk)` = new.`Custom field (Risk)`, `Custom field (Start Date)` = new.`Custom field (Start Date)`, `Custom field (Team)` = new.`Custom 
field (Team)`, `Comment` = new.`Comment`, `Involvement of BSW` = new.`Involvement of BSW`, `Date` = new.`Date`, `ProjectNum` = new.`ProjectNum`, `BSW Self Test` = new.`BSW Self Test`, `BSW Issue` = new.`BSW Issue`, `Stack BSW Analyzed` = new.`Stack BSW Analyzed`, `Stack BSW Unanalyzed` = new.`Stack BSW Unanalyzed`, `Stack Total` = new.`Stack Total`, `BSW Involve` = new.`BSW Involve`, `BSW Involve Unclosed` = new.`BSW Involve Unclosed`, `HorizontalExpansion Count` = new.`HorizontalExpansion Count`, `HorizontalExpansion PN` = new.`HorizontalExpansion PN`, `HorizontalExpansion Unfinished` = new.`HorizontalExpansion Unfinished`, `HorizontalExpansion Delay` = new.`HorizontalExpansion Delay`, `BSW Analysis Conclusions` = new.`BSW Analysis Conclusions`, `Stack Classification` = new.`Stack Classification`, `BSW Reason Classification` = new.`BSW Reason Classification`, `BSW Analyzed` = new.`BSW Analyzed`, `BSW Unanalyzed` = new.`BSW Unanalyzed`, `BSW Staff List` = new.`BSW Staff List`, `Found By Classification` = new.`Found By Classification`, `High Priority Unanalyzed` = new.`High Priority Unanalyzed`, `Stack_Analyzed` = new.`Stack_Analyzed`, `Stack_Unanalyzed` = new.`Stack_Unanalyzed`, update_over_15 = new.update_over_15, update_over_7 = new.update_over_7, `Responsible` = new.`Responsible`, `Email` = new.`Email`] [parameters: {'Summary_m0': 'GW3 LL Workshop', 'Issue_key_m0': 'P09192-1', 'Issue_id_m0': '1302030', 'Parent_id_m0': '', 'Issue_Type_m0': 'ToDo', 'Status_m0': 'To Do', 'Project_key_m0': 'P09192', 'Project_name_m0': 'P09192 GWM_BLE_BNWC_P14', 'Project_type_m0': 'software', 'Project_lead_m0': 'wang161', 'Project_description_m0': '', 'Project_url_m0': '', 'Priority_m0': 'Medium', 'Resolution_m0': '', 'Assignee_m0': 'pan009', 'Reporter_m0': 'yu017', 'Creator_m0': 'yu017', 'Created_m0': Timestamp('2025-05-26 07:18:00'), 'Updated_m0': Timestamp('2025-06-16 07:08:00'), 'Last_Viewed_m0': Timestamp('2025-11-24 19:24:00'), 'Resolved_m0': NaT, 'Affects_Version/s_m0': '', 
'Fix_Version/s_m0': '', 'Component/s_m0': '', 'Due_Date_m0': Timestamp('2026-05-23 00:00:00'), 'Labels_m0': 'LL-Workshop', 'Description_m0': 'AEQ: Pan Lili\r\nThis ticket serves as main ticket for the initial lessons learned Workshop at GW3.\r\nDate: 20250616\r\nInvited:See attachment\r\nBe ... (406 characters truncated) ... implementation has to be checked by AEQ. If the implementation cannot be verified, the sub-task is reopened and assigned back to the originator.\r\n', 'Environment_m0': '', 'Watchers_m0': '', 'Log_Work_m0': '', 'Security_Level_m0': '', 'Attachment_m0': '11.06.2025 03:06;yu017;P09192-GWM-C06 BLE.xlsx;https://jira.kostal.com/secure/attachment/1234604/P09192-GWM-C06+BLE.xlsx||16.06.2025 07:08;yu017;答复 ... (32 characters truncated) ... s://jira.kostal.com/secure/attachment/1237494/%E7%AD%94%E5%A4%8D++P09192%E9%A1%B9%E7%9B%AE%E7%9A%84LL+kickoff%E5%AF%BC%E5%85%A5%E4%BC%9A%E8%AE%AE.msg', 'total_m0': 0, 'Custom_field_AControl_NoZ_m0': '', 'Custom_field_ACustom_1Z_m0': '', 'Custom_field_ACustom_2Z_m0': '', 'Custom_field_ADate_of_first_ResponseZ_m0': None, 'Custom_field_AExperience_OwnerZ_m0': '', 'Custom_field_AFixed_in_releaseZ_m0': '', 'Custom_field_AFound_ByZ_m0': '', 'Custom_field_AFound_in_HW_versionZ_m0': '', 'Custom_field_AFound_in_SW_versionZ_m0': '', 'Custom_field_AKOCHI_Issue_TypeZ_m0': '', 'Custom_field_AKOCHI_Issue_Type_TransitionZ_m0': '', 'Custom_field_ALast_CommentZ_m0': '', 'Custom_field_AMatterZ_m0': '', 'Custom_field_AProject_NumberZ_m0': '', 'Custom_field_ARankZ_m0': '1|i45am8:', 'Custom_field_ARealisation_plannedZ_m0': '', 'Custom_field_AReport_NumberZ_m0': '', 'Custom_field_AReport_TypeZ_m0': '', 'Custom_field_AReported_ByZ_m0': '', 'Custom_field_ARiskZ_m0': '', 'Custom_field_AStart_DateZ_m0': '', 'Custom_field_ATeamZ_m0': '', 'Comment_m0': '', 'Involvement_of_BSW_m0': 'No', 'Date_m0': Timestamp('2025-11-25 02:24:10'), 'ProjectNum_m0': 'P09192', 'BSW_Self_Test_m0': 0, 'BSW_Issue_m0': 0, 'Stack_BSW_Analyzed_m0': 0, 
'Stack_BSW_Unanalyzed_m0': 0, 'Stack_Total_m0': 0, 'BSW_Involve_m0': 0, 'BSW_Involve_Unclosed_m0': 0, 'HorizontalExpansion_Count_m0': 0, 'HorizontalExpansion_PN_m0': None, 'HorizontalExpansion_Unfinished_m0': 0, 'HorizontalExpansion_Delay_m0': 0, 'BSW_Analysis_Conclusions_m0': 'Non-BSW', 'Stack_Classification_m0': 'Unclassed', 'BSW_Reason_Classification_m0': '底层未参与分析', 'BSW_Analyzed_m0': 0, 'BSW_Unanalyzed_m0': 0, 'BSW_Staff_List_m0': 'No', 'Found_By_Classification_m0': 'Others', 'High_Priority_Unanalyzed_m0': 0, 'Stack_Analyzed_m0': 0, 'Stack_Unanalyzed_m0': 0, 'update_over_15_m0': 0, 'update_over_7_m0': 0, 'Responsible_m0': 'Bob', 'Email_m0': 'bob@example.com'}] (Background on this error at: https://sqlalche.me/e/20/e3q8) 2025-11-25 17:25:01,810 - INFO - 成功写入/更新 0 条记录 2025-11-25 17:25:01,810 - INFO - 成功写入 3894 条记录到 jira_task_bsw_deal 2025-11-25 17:25:01,810 - INFO - 数据处理完成,共写入 3894 条记录
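The failure in this log is MySQL error 1054: the generated statement references a column the live table does not have, and the engine only rejects it at execution time. A minimal stand-in sketch (stdlib `sqlite3` instead of MySQL; the table and column names are only illustrative) reproduces the same class of error:

```python
# Illustrative stand-in: sqlite3 instead of MySQL, schema reduced to one column.
# Referencing a column the live table lacks fails only when the statement runs --
# the sqlite analogue of MySQL's error 1054 ("Unknown column ... in 'field list'").
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE jira_task_bsw_deal ("Issue key" TEXT PRIMARY KEY)')

msg = ""
try:
    # The live table never had this column, so the engine rejects the UPDATE
    conn.execute('UPDATE jira_task_bsw_deal '
                 "SET \"Custom field (Found in Mech Version)\" = 'V1.0'")
except sqlite3.OperationalError as e:
    msg = str(e)

print(msg)
```

No amount of checking against a cached or reflected schema prevents this if the cache has drifted from the live table, which is the thread the analysis below pulls on.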
### Problem analysis

The error is `Unknown column 'new.Custom field (Found in Mech Version)' in 'field list'` (MySQL error 1054): the `ON DUPLICATE KEY UPDATE` clause of the generated UPSERT references a column the target table does not have.

In principle the existing filter (`valid_columns = [col for col in dataframe.columns if col in target_columns]`) should prevent this, so the stale column name must be leaking in somewhere else. Three plausible paths:

1. The update dictionary is built from the *reflected* `Table` object, not from `valid_columns`. If the live table diverges from the reflected structure (a column renamed or dropped after reflection), stale column names reach the UPDATE clause.
2. Column-name matching is exact: case, spaces and parentheses all count. Depending on the MySQL configuration, a name that differs only in case between the DataFrame and the table can pass one comparison and fail the other.
3. The row-by-row fallback that runs after a batch failure reuses the same, possibly stale, table object, so whatever schema change broke the batch breaks every retried row as well.

Re-reflecting the table before every row would always be correct but is far too slow. The workable compromise: reflect once up front, filter strictly against the live column list, double-check every column when building the update dictionary (it must be a non-primary-key column *and* present in `valid_columns`), and in the fallback re-filter each individual record and rebuild the update dictionary per row.

From the error log, then, the root cause is that the target table `jira_task_bsw_deal` lacks fields such as `Custom field (Found in Mech Version)` and `Custom field (Found in SVN version)`, which makes the UPSERT statement SQLAlchemy generates fail. Complete fix:

```python
import logging

from sqlalchemy import MetaData, Table, inspect, text
from sqlalchemy.dialects.mysql import insert


def write_to_database(dataframe, target_engine, target_table):
    """Safe UPSERT via dynamic column filtering."""
    inspector = inspect(target_engine)

    # Create the target table if it does not exist yet
    if not inspector.has_table(target_table):
        logging.info(f"创建新表: {target_table}")
        dataframe.to_sql(
            name=target_table,
            con=target_engine,
            index=False,
            if_exists='replace'
        )
        return dataframe

    # Actual column names of the target table (case-sensitive)
    target_columns = [col['name'] for col in inspector.get_columns(target_table)]

    # Keep only the DataFrame columns the target table really has
    valid_columns = [col for col in dataframe.columns if col in target_columns]
    skipped_columns = set(dataframe.columns) - set(valid_columns)
    if skipped_columns:
        logging.warning(f"跳过目标表不存在的列: {', '.join(skipped_columns)}")
    filtered_df = dataframe[valid_columns]

    # Primary-key information
    pk_constraints = inspector.get_pk_constraint(target_table)
    primary_keys = pk_constraints['constrained_columns'] if pk_constraints else []

    # Make sure a primary key exists
    if 'Issue key' not in primary_keys:
        logging.warning("目标表缺少主键约束,正在添加...")
        with target_engine.begin() as conn:
            conn.execute(text(
                f"ALTER TABLE `{target_table}` ADD PRIMARY KEY (`Issue key`);"
            ))
        primary_keys = ['Issue key']

    # Process the data in batches
    batch_size = 500
    total_batches = (len(filtered_df) + batch_size - 1) // batch_size
    total_success = 0

    # Reflect the current table structure
    metadata = MetaData()
    try:
        metadata.reflect(bind=target_engine, only=[target_table])
        table = Table(target_table, metadata, autoload_with=target_engine)
    except Exception as e:
        logging.error(f"表结构反射失败: {str(e)}")
        raise

    for i in range(0, len(filtered_df), batch_size):
        batch_df = filtered_df.iloc[i:i + batch_size]
        records = batch_df.to_dict(orient='records')

        try:
            with target_engine.begin() as conn:
                # Build the INSERT statement
                stmt = insert(table).values(records)

                # Build the update dict dynamically: only columns the target
                # table actually has, and never the primary key
                update_dict = {}
                for col in table.columns:
                    if col.name not in primary_keys and col.name in valid_columns:
                        update_dict[col.name] = stmt.inserted[col.name]

                # Execute the insert with conflict resolution
                if update_dict:  # guard against an empty update dict
                    upsert_stmt = stmt.on_duplicate_key_update(**update_dict)
                    result = conn.execute(upsert_stmt)
                    total_success += result.rowcount
                else:
                    logging.warning("更新字典为空,跳过批次")

            logging.info(f"已处理批次 [{i // batch_size + 1}/{total_batches}]")

        except Exception as batch_e:
            logging.error(f"批量操作失败: {str(batch_e)}")
            logging.info("回退到逐行处理...")

            # Row-by-row fallback
            for record_idx, record in enumerate(records):
                try:
                    # Re-filter this record's columns
                    valid_record = {k: v for k, v in record.items()
                                    if k in valid_columns}

                    with target_engine.begin() as conn:
                        stmt = insert(table).values([valid_record])

                        # Per-row update dict
                        row_update_dict = {}
                        for col in table.columns:
                            if (col.name not in primary_keys
                                    and col.name in valid_record):
                                row_update_dict[col.name] = stmt.inserted[col.name]

                        if row_update_dict:
                            upsert_stmt = stmt.on_duplicate_key_update(**row_update_dict)
                            conn.execute(upsert_stmt)
                            total_success += 1
                        else:
                            logging.warning(
                                f"行级更新字典为空,跳过记录 "
                                f"{record.get('Issue key', 'UNKNOWN')}"
                            )
                except Exception as row_e:
                    key = record.get('Issue key', f"UNKNOWN-{record_idx}")
                    logging.error(f"行级操作失败: {key}, {str(row_e)}")

    logging.info(f"成功写入/更新 {total_success} 条记录")
    return filtered_df
```

### Key fixes

1. **Stricter dynamic column filtering**:
   ```python
   # Actual column names of the target table (case-sensitive)
   target_columns = [col['name'] for col in inspector.get_columns(target_table)]
   # Keep only the DataFrame columns the target table really has
   valid_columns = [col for col in dataframe.columns if col in target_columns]
   ```
   - Filters strictly against the target table's real column names
   - Respects case-sensitive column naming in the target table

2. **Safe update-dictionary construction**:
   ```python
   update_dict = {}
   for col in table.columns:
       if col.name not in primary_keys and col.name in valid_columns:
           update_dict[col.name] = stmt.inserted[col.name]
   ```
   - Double validation of column existence (reflected table structure *and* valid-column list)
   - Never references a nonexistent column

3. **Hardened row-level processing**:
   ```python
   # Re-filter this record's columns
   valid_record = {k: v for k, v in record.items() if k in valid_columns}

   # Per-row update dict
   row_update_dict = {}
   for col in table.columns:
       if col.name not in primary_keys and col.name in valid_record:
           row_update_dict[col.name] = stmt.inserted[col.name]
   ```
   - Each row is filtered independently
   - Guarantees no reference to a column the target table lacks

4. **Robustness improvements**:
   - Empty-dict guards prevent no-op statements
   - More detailed error logs (including the failing record's Issue key)
   - Exception handling around table reflection

### Usage notes

1. **Column-name consistency**:
   - Make sure source column names match the target table's exactly
   - Take special care with names containing spaces or special characters

2. **Performance tuning**:
   - Batch size can be adjusted to the database's capacity (500-2000 suggested)
   - Consider index tuning for very large data volumes

3. **Error handling**:
   - Failed records are logged individually and skipped
   - The success counter reflects the number of rows actually written

This resolves the 1054 errors caused by columns missing from the target table: the dynamic filter guarantees that only columns the table actually has are ever referenced, while the original behaviour is preserved.
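The central idea of the fix (read the live schema first, then drop any record keys the table does not actually have before emitting SQL) can be demonstrated end to end with a stdlib-only sketch. Here `sqlite3` and `PRAGMA table_info` stand in for MySQL and SQLAlchemy's `inspector.get_columns()`, and the table, record contents and column names are all made up for illustration:

```python
# Hedged stdlib sketch of dynamic column filtering (sqlite3 stands in for MySQL).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE demo ("Issue key" TEXT PRIMARY KEY, "Status" TEXT)')

# Incoming record carries a column the live table does not have
record = {"Issue key": "P09192-1", "Status": "To Do",
          "Custom field (Found in Mech Version)": "V1.0"}

# Reflect the live schema (analogous to inspector.get_columns() in SQLAlchemy);
# PRAGMA table_info yields (cid, name, type, notnull, dflt_value, pk) rows
actual_cols = [row[1] for row in conn.execute("PRAGMA table_info(demo)")]

# Drop every key the table lacks before any SQL is generated
valid = {k: v for k, v in record.items() if k in actual_cols}

cols = ", ".join(f'"{c}"' for c in valid)
placeholders = ", ".join("?" for _ in valid)
conn.execute(f"INSERT INTO demo ({cols}) VALUES ({placeholders})",
             list(valid.values()))

print(sorted(valid))  # the unknown column was filtered out
print(conn.execute('SELECT "Issue key", "Status" FROM demo').fetchone())
```

Without the filtering step the INSERT would fail exactly like the batch in the log, since the statement would name a column the engine cannot find.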