Duplicate key or integrity constraint violation

While testing Hibernate, the author hit a duplicate-key / integrity-constraint-violation exception. Investigation showed that the table column had been defined with not null unique, while several records with identical content were inserted in one go and the primary key was auto-generated, which is what triggered the error.

While testing Hibernate, the following exception was thrown:

[Duplicate key or integrity constraint violation message from server: "Duplicate entry 'rick0@foobar.com' for key 2"]; SQL was [] for task [Hibernate operation]
org.springframework.dao.DataIntegrityViolationException: Hibernate operation: Duplicate key or integrity constraint violation message from server: "Duplicate entry 'rick0@foobar.com' for key 2"; nested exception is java.sql.SQLException: Duplicate key or integrity constraint violation message from server: "Duplicate entry 'rick0@foobar.com' for key 2"
java.sql.SQLException: Duplicate key or integrity constraint violation message from server: "Duplicate entry 'rick0@foobar.com' for key 2"
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:1997)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1167)

This problem took a long time to track down and had me going in circles before I finally found the cause: the table column was defined with not null unique, but the N records I inserted in one go all carried the same content. Since only the primary key was auto-generated (and therefore always distinct), the identical values in the unique column collided and the exception above was thrown.
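The failure mode is easy to reproduce. Below is a minimal sketch, assuming a JPA-annotated entity and a Spring HibernateTemplate-based DAO (the original code may well have used hbm.xml mappings instead); the names Contact, email, and insertContacts are hypothetical and not taken from the post.

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.GenerationType;
    import javax.persistence.Id;

    @Entity
    public class Contact {

        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY) // primary key is auto-generated, so it never collides
        private Long id;

        // Maps to a column declared NOT NULL UNIQUE; MySQL builds a separate unique index for it,
        // which is the "key 2" named in the error message.
        @Column(nullable = false, unique = true)
        private String email;

        public Long getId() { return id; }
        public String getEmail() { return email; }
        public void setEmail(String email) { this.email = email; }
    }

    // Hypothetical test helper: the DAO is assumed to be wired with a Spring HibernateTemplate.
    public void insertContacts(org.springframework.orm.hibernate3.HibernateTemplate hibernateTemplate) {
        for (int i = 0; i < 10; i++) {
            Contact c = new Contact();
            c.setEmail("rick0@foobar.com");            // identical value every iteration -> "Duplicate entry ... for key 2"
            // c.setEmail("rick" + i + "@foobar.com"); // fix: give the unique column a distinct value per record
            hibernateTemplate.save(c);
        }
    }

Either give the unique column a distinct value for each record, or drop the unique constraint if duplicates are legitimate; the auto-generated primary key cannot prevent the collision because the unique index is on the email column, not on id.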
