Fixing a Flink CDC SQL Server-to-Paimon sync job stuck at Busy 100%: Failed to convert value xxx to type TINYINT

This article describes a problem encountered while using Flink CDC to sync data from SQL Server to Paimon: the job gets stuck at the CDC MultiplexWriter operator at Busy 100%. The cause is a type-conversion error in Paimon's handling of TINYINT, which throws an out-of-range exception. The fix is to modify Paimon's source so that TINYINT is handled the same way as SMALLINT, recompile and replace the jar, and finally restart the Flink job.


Resolving the stuck Flink CDC streaming job

Problem description

A streaming sync job launched with Flink CDC replicates SQL Server data into a Paimon data lake. In the Flink web UI, the sync state is abnormal: the job is stuck at the CDC MultiplexWriter operator, permanently at Busy 100%, as shown below:
[Figure: abnormal Flink sync job state]
After the job has been stuck for a while, the Flink Dashboard Exceptions page shows the following exception:

org.apache.flink.util.FlinkRuntimeException: Exceeded checkpoint tolerable failure threshold. The latest checkpoint failed due to Checkpoint expired before completing., view the Checkpoint History tab or the Job Manager log to find out why continuous checkpoints failed.
	at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.checkFailureAgainstCounter(CheckpointFailureManager.java:212)
	at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleJobLevelCheckpointException(CheckpointFailureManager.java:169)
	at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleCheckpointException(CheckpointFailureManager.java:122)
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.abortPendingCheckpoint(CheckpointCoordinator.java:2155)
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.abortPendingCheckpoint(CheckpointCoordinator.java:2134)
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.access$700(CheckpointCoordinator.java:101)
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator$CheckpointCanceller.run(CheckpointCoordinator.java:2216)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Root cause

Open the Task Managers log page, or search the taskmanager.log file under Hadoop's log directory, and locate the error:

2024-06-11 18:17:52,318 INFO  org.apache.paimon.flink.sink.cdc.CdcRecordUtils     [] - Failed to convert value 162 to type TINYINT. Waiting for schema update. 
java.lang.NumberFormatException: Value out of range. Value:"162" Radix:10
        at java.lang.Byte.parseByte(Byte.java:151) ~[?:1.8.0_371]
        at java.lang.Byte.valueOf(Byte.java:205) ~[?:1.8.0_371]
        at java.lang.Byte.valueOf(Byte.java:231) ~[?:1.8.0_371]
        at org.apache.paimon.utils.TypeUtils.castFromStringInternal(TypeUtils.java:125) ~[paimon-flink-1.17-0.5.jar:0.5]
        at org.apache.paimon.utils.TypeUtils.castFromCdcValueString(TypeUtils.java:60) ~[paimon-flink-1.17-0.5.jar:0.5]
        at org.apache.paimon.flink.sink.cdc.CdcRecordUtils.toGenericRow(CdcRecordUtils.java:103) ~[paimon-flink-1.17-0.5.jar:0.5]
        at org.apache.paimon.flink.sink.cdc.CdcRecordUtils.toGenericRow(CdcRecordUtils.java:103) ~[paimon-flink-1.17-0.5.jar:0.5]
        at org.apache.paimon.flink.sink.cdc.CdcRecordStoreMultiWriteOperator.processElement(CdcRecordStoreMultiWriteOperator.java:153) ~[paimon-flink-1.17-0.5.jar:0.5]
        at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask$StreamTaskNetworkOutput.emitRecord(OneInputStreamTask.java:237) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.processElement(AbstractStreamTaskNetworkInput.java:146) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.emitNext(AbstractStreamTaskNetworkInput.java:110) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:550) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:231) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:839) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:788) ~[flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:952) [flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:931) [flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:745) [flink-dist-1.17.1.jar:1.17.1]
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:562) [flink-dist-1.17.1.jar:1.17.1]
        at java.lang.Thread.run(Thread.java:750) [?:1.8.0_371]
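The root of the NumberFormatException in the trace above is a range mismatch: SQL Server's TINYINT is unsigned (0 to 255), while Java's byte is signed (-128 to 127), so any value above 127 cannot be parsed by Byte.valueOf. A minimal standalone reproduction (TinyintOverflowDemo and fitsInByte are illustrative names, not part of Paimon):

```java
public class TinyintOverflowDemo {
    // Returns true if Byte.valueOf can parse the string, false on overflow.
    static boolean fitsInByte(String s) {
        try {
            Byte.valueOf(s);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // SQL Server TINYINT is unsigned (0..255); Java byte is signed (-128..127).
        System.out.println(fitsInByte("100")); // true
        System.out.println(fitsInByte("162")); // false: the same failure as in the log
        System.out.println(Short.valueOf("162")); // parsing as short (the SMALLINT path) works
    }
}
```

This is why the log shows the conversion failing for the value 162 but not for small values: any row whose TINYINT column holds a value in 128..255 triggers the exception, the record is never written, and the writer spins at Busy 100% waiting for a schema update that never comes.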

The stack trace points to org.apache.paimon.utils.TypeUtils. Looking at its source (the original post's snippet was cut off; the relevant branches, per the stack trace, are the TINYINT/SMALLINT cases):

private static Object castFromStringInternal(String s, DataType type, boolean isCdcValue) {
        BinaryString str = BinaryString.fromString(s);
        switch (type.getTypeRoot()) {
            // ... other type branches omitted ...
            case TINYINT:
                return Byte.valueOf(s);  // TypeUtils.java:125 — throws NumberFormatException for 128..255
            case SMALLINT:
                return Short.valueOf(s);
            // ... other type branches omitted ...
        }
}

The TINYINT branch parses the CDC string value with Byte.valueOf, which fails for unsigned SQL Server TINYINT values above 127 (such as 162 in the log above).
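As noted in the summary, the fix is to make Paimon handle TINYINT the same way as SMALLINT. A minimal, hypothetical sketch of that one-line change (TypeUtilsPatch and castTinyint are illustrative names, not the verbatim Paimon patch):

```java
// Hypothetical sketch of the fix to Paimon's
// org.apache.paimon.utils.TypeUtils#castFromStringInternal:
// parse TINYINT values the same way as SMALLINT so that unsigned
// SQL Server TINYINT values (0..255) no longer overflow Java's signed byte.
public class TypeUtilsPatch {
    static Object castTinyint(String s) {
        return Short.valueOf(s); // was: return Byte.valueOf(s);
    }

    public static void main(String[] args) {
        System.out.println(castTinyint("162")); // no longer throws
    }
}
```

After rebuilding paimon-flink-1.17-0.5.jar with this change and replacing the jar deployed with Flink, restart the sync job so the new conversion logic takes effect.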