SQL LEFT JOIN query returns doubled data

While writing a SQL query, I ran into a problem where LEFT JOINs caused SUM and COUNT to double-count. Pre-aggregating the package table into a derived table and referencing it from the main query eliminated the duplicate counting and made the package count and weight figures accurate.

Duplicate SUM and COUNT results with LEFT JOIN

While developing today, I needed to compute per-order statistics for package count and package weight. The original code:

SELECT ao.sent_order, COUNT(wp.sent_order) num, wg.pack_no,
      GROUP_CONCAT(DISTINCT wg.goods_name) AS goods_name,
      SUM(c.weight) weight
FROM  advance_order ao
LEFT JOIN packaging_costs c ON ao.sent_order = c.old_sent_order
LEFT JOIN advance_pack wp ON ao.sent_order = wp.sent_order
LEFT JOIN shipping_goods wg ON wp.pack_no = wg.pack_no
WHERE ao.cabinet_no = 'WM000000025'
GROUP BY ao.sent_order

Each LEFT JOIN can match several rows per order, so the joins fan out: every packaging_costs row is paired with every advance_pack row for the same order, and SUM and COUNT end up accumulating the same values several times over. The sketch below shows the fan-out.
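To see it concretely, here is a minimal sketch with the aggregates stripped out (the order number 'SO001' is a hypothetical placeholder). For an order with two cost rows and three pack rows, the query returns 2 × 3 = 6 rows, so SUM(c.weight) adds each weight three times and COUNT(wp.sent_order) reports 6 instead of 3.

-- Same joins as above, without aggregation: each cost row is
-- repeated once per matching pack row, and vice versa.
SELECT ao.sent_order, c.weight, wp.pack_no
FROM advance_order ao
LEFT JOIN packaging_costs c ON ao.sent_order = c.old_sent_order
LEFT JOIN advance_pack wp ON ao.sent_order = wp.sent_order
WHERE ao.sent_order = 'SO001'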
Solution:
Aggregate the package table first in a derived table (a subquery in the FROM clause), aliased wp, and reference it from the main query:

SELECT ao.sent_order, wp.num, wg.pack_no, GROUP_CONCAT(DISTINCT wg.goods_name) AS goods_name
FROM  wms_advance_order ao
LEFT JOIN (SELECT COUNT(old_pack_no) num, sent_order, old_pack_no FROM wms_advance_pack GROUP BY sent_order) AS wp
ON ao.sent_order = wp.sent_order
LEFT JOIN wms_shipping_goods wg ON wp.old_pack_no = wg.pack_no
WHERE  ao.cabinet_no = 'WM000000025'
GROUP BY ao.sent_order
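One caveat: the derived table selects old_pack_no while grouping only by sent_order. MySQL's default ONLY_FULL_GROUP_BY mode (5.7 and later) rejects this, and even where it is permitted, the pack number returned for each order is arbitrary; wrapping the column in ANY_VALUE(old_pack_no) (or an aggregate such as MAX) makes that choice explicit.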

This resolved the double counting.
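The rewrite drops the SUM(c.weight) column from the first query. If the weight is still needed, the same pre-aggregation trick applies: sum the weights per order in a second derived table, so neither join can multiply the other's rows. A minimal sketch, assuming packaging_costs keeps the table and column names from the first query:

-- Each derived table is already one row per order, so the outer
-- query needs no GROUP BY and nothing gets double-counted.
SELECT ao.sent_order, wp.num, c.weight
FROM wms_advance_order ao
LEFT JOIN (SELECT sent_order, COUNT(old_pack_no) num
           FROM wms_advance_pack
           GROUP BY sent_order) AS wp ON ao.sent_order = wp.sent_order
LEFT JOIN (SELECT old_sent_order, SUM(weight) weight
           FROM packaging_costs
           GROUP BY old_sent_order) AS c ON ao.sent_order = c.old_sent_order
WHERE ao.cabinet_no = 'WM000000025'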
