Problem Description
A Flink CDC streaming job was started to synchronize data from SQL Server into a Paimon data lake. In the Flink web UI, the job showed an abnormal state: it was stuck at the CDC MultiplexWriter operator, which stayed at Busy 100%. As shown in the figure:

After the job had been stuck for a while, the following exception appeared on the Flink Dashboard's Exceptions page:
org.apache.flink.util.FlinkRuntimeException: Exceeded checkpoint tolerable failure threshold. The latest checkpoint failed due to Checkpoint expired before completing., view the Checkpoint History tab or the Job Manager log to find out why continuous checkpoints failed.
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.checkFailureAgainstCounter(CheckpointFailureManager.java:212)
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleJobLevelCheckpointException(CheckpointFailureManager.java:169)
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleCheckpointException(CheckpointFailureManager.java:122)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.abortPendingCheckpoint(CheckpointCoordinator.java:2155)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.abortPendingCheckpoint(CheckpointCoordinator.java:2134)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.access$700(CheckpointCoordinator.java:101)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator$CheckpointCanceller.run(CheckpointCoordinator.java:2216)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
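The "tolerable failure threshold" mentioned in the exception is configurable. As a side note, checkpoint tolerance can be raised in flink-conf.yaml (the values below are illustrative, not recommendations); note that this only delays the job failure and does not fix the underlying conversion error:

```yaml
# Illustrative flink-conf.yaml settings. Raising the tolerance only buys
# time before the job fails; it does not fix the root cause below.
execution.checkpointing.timeout: 30min
execution.checkpointing.tolerable-failed-checkpoints: 3
```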
Root Cause
Open the Task Managers log page, or search taskmanager.log under the Hadoop log directory, and locate the error:
2024-06-11 18:17:52,318 INFO org.apache.paimon.flink.sink.cdc.CdcRecordUtils [] - Failed to convert value 162 to type TINYINT. Waiting for schema update.
java.lang.NumberFormatException: Value out of range. Value:"162" Radix:10
at java.lang.Byte.parseByte(Byte.java:151) ~[?:1.8.0_371]
at java.lang.Byte.valueOf(Byte.java:205) ~[?:1.8.0_371]
at java.lang.Byte.valueOf(Byte.java:231) ~[?:1.8.0_371]
at org.apache.paimon.utils.TypeUtils.castFromStringInternal(TypeUtils.java:125) ~[paimon-flink-1.17-0.5.jar:0.5]
at org.apache.paimon.utils.TypeUtils.castFromCdcValueString(TypeUtils.java:60) ~[paimon-flink-1.17-0.5.jar:0.5]
at org.apache.paimon.flink.sink.cdc.CdcRecordUtils.toGenericRow(CdcRecordUtils.java:103) ~[paimon-flink-1.17-0.5.jar:0.5]
at org.apache.paimon.flink.sink.cdc.CdcRecordUtils.toGenericRow(CdcRecordUtils.java:103) ~[paimon-flink-1.17-0.5.jar:0.5]
at org.apache.paimon.flink.sink.cdc.CdcRecordStoreMultiWriteOperator.processElement(CdcRecordStoreMultiWriteOperator.java:153) ~[paimon-flink-1.17-0.5.jar:0.5]
at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask$StreamTaskNetworkOutput.emitRecord(OneInputStreamTask.java:237) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.processElement(AbstractStreamTaskNetworkInput.java:146) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.emitNext(AbstractStreamTaskNetworkInput.java:110) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:550) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:231) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:839) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:788) ~[flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:952) [flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:931) [flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:745) [flink-dist-1.17.1.jar:1.17.1]
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:562) [flink-dist-1.17.1.jar:1.17.1]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_371]
Note org.apache.paimon.utils.TypeUtils in the stack trace, and look at its source:
private static Object castFromStringInternal(String s, DataType type, boolean isCdcValue) {
    BinaryString str = BinaryString.fromString(s);
    // ... for TINYINT this path ends in Byte.valueOf(s) (see the stack
    // trace above), which rejects any value outside -128..127
}
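The failure is easy to reproduce outside Paimon. SQL Server's tinyint is unsigned (0 to 255), while java.lang.Byte is signed (-128 to 127), so Byte.valueOf("162") throws exactly the NumberFormatException seen in the log. A minimal standalone sketch (the class and method names here are ours, not Paimon's):

```java
// Minimal reproduction of the overflow, independent of Paimon.
public class TinyintOverflow {
    // Mirrors what the stack trace shows Paimon 0.5 doing for TINYINT.
    static byte parseAsTinyint(String s) {
        return Byte.valueOf(s); // throws NumberFormatException for values > 127
    }

    public static void main(String[] args) {
        System.out.println(parseAsTinyint("100")); // within byte range, OK
        try {
            parseAsTinyint("162"); // a valid SQL Server tinyint value
        } catch (NumberFormatException e) {
            System.out.println("overflow: " + e.getMessage());
        }
    }
}
```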
This article described a problem encountered when using Flink CDC to synchronize data from SQL Server to Paimon: the job got stuck at CDC MultiplexWriter at Busy 100%. The root cause is a type-conversion error in Paimon's handling of TINYINT, which produces an out-of-range exception. The solution is to modify the Paimon source so that TINYINT is handled the same way as SMALLINT, recompile, replace the jar, and restart the Flink job.
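A sketch of the fix just described, under the assumption that widening the parse to Short is acceptable for the downstream row type (this is our illustration of the idea, not the verbatim Paimon patch):

```java
// Illustrative version of the patched TINYINT branch: parse through
// Short, as the SMALLINT branch does, so unsigned SQL Server tinyint
// values in 128..255 no longer overflow java.lang.Byte.
public class TinyintCastFix {
    static short castTinyint(String s) {
        return Short.valueOf(s); // accepts the full 0..255 range
    }
}
```

After rebuilding, replace the original paimon-flink jar with the patched one and restart the Flink job.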