1. Submitting a Flink job fails:
Process finished with exit code 130
org.apache.flink.runtime.rest.NotFoundException: Job 6fe06e87852fa7a12e2276e2876c067a not found
    at org.apache.flink.runtime.rest.handler.job.AbstractExecutionGraphHandler.lambda$handleRequest$1(AbstractExecutionGraphHandler.java:89)
    at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
    at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
    at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
    at org.apache.flink.runtime.rest.handler.legacy.DefaultExecutionGraphCache.lambda$getExe... (frame truncated in the original log)
    at akka.actor.ActorCell.invoke(ActorCell.scala:561)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
    at akka.dispatch.Mailbox.run(Mailbox.scala:225)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
    ... 4 more
The submission failed because of how the jar was built: check whether the configuration files under resources were actually packed into the jar.
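One way to confirm that a resource file made it into the built jar is to list its entries. A minimal sketch using the JDK's java.util.jar API (the demo builds a throwaway jar in a temp file; in practice you would point containsEntry at your shaded artifact, whose path and resource names are placeholders here):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class JarResourceCheck {
    // Returns true if the jar contains the given entry,
    // e.g. "application.properties" packed from src/main/resources.
    public static boolean containsEntry(Path jar, String entry) throws IOException {
        try (JarFile jf = new JarFile(jar.toFile())) {
            return jf.getEntry(entry) != null;
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo scaffolding: build a tiny jar with one resource entry.
        Path jar = Files.createTempFile("demo", ".jar");
        try (OutputStream os = Files.newOutputStream(jar);
             JarOutputStream jos = new JarOutputStream(os)) {
            jos.putNextEntry(new JarEntry("application.properties"));
            jos.write("key=value".getBytes());
            jos.closeEntry();
        }
        System.out.println(containsEntry(jar, "application.properties")); // true
        System.out.println(containsEntry(jar, "missing.yaml"));           // false
    }
}
```

If an expected config file is missing from the entry list, the packaging step (not the cluster) is the thing to fix.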
2. java.lang.ClassCastException: cannot assign instance of org.apache.commons.collections.map.LinkedMap to field
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot instantiate user function.
at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperatorFactory(StreamConfig.java:275)
at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:126)
at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:459)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:526)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:721)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassCastException: cannot assign instance of org.apache.commons.collections.map.LinkedMap to field org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.pendingOffsetsToCommit of type org.apache.commons.collections.map.LinkedMap in instance of org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2287)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1417)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2293)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
Searches for this error all point to Flink's class loading mechanism; the fix is a configuration change. The Kafka connector classes are incompatible with Flink's inverted (child-first) class loading, so switch the resolve order to parent-first in conf/flink-conf.yaml and restart Flink:
classloader.resolve-order: parent-first
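The root cause can be reproduced in isolation: the same bytecode loaded through two class loaders that do not delegate to each other yields two distinct runtime classes, so an instance created by one loader cannot be assigned to a field typed by the other's class, which is exactly the "cannot assign instance of ... LinkedMap to field ... of type ... LinkedMap" message above. A minimal sketch (Payload stands in for LinkedMap; the runtime compilation is just scaffolding):

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassLoaderClash {
    // Compiles a trivial class to a temp dir and returns that dir.
    static Path compilePayload() throws Exception {
        Path dir = Files.createTempDirectory("clash");
        Path src = dir.resolve("Payload.java");
        Files.write(src, "public class Payload {}".getBytes());
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        javac.run(null, null, null, src.toString(), "-d", dir.toString());
        return dir;
    }

    // Loads Payload twice through two independent loaders (parent = null,
    // i.e. no delegation) -- analogous to the same class living both in the
    // user jar and in flink/lib under child-first class loading.
    public static boolean loadedTwiceIsSame() throws Exception {
        URL[] cp = { compilePayload().toUri().toURL() };
        Class<?> a = new URLClassLoader(cp, null).loadClass("Payload");
        Class<?> b = new URLClassLoader(cp, null).loadClass("Payload");
        return a == b;
    }

    public static void main(String[] args) throws Exception {
        // Two Class objects for the same bytecode are distinct at runtime.
        System.out.println("same runtime class? " + loadedTwiceIsSame()); // false
    }
}
```

parent-first resolution avoids the clash by always delegating to the cluster's class loader first, so only one copy of the class is ever defined.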
Background: submitting jars to this cluster used to work. After changing the code and rebuilding (packaging the project into a single fat jar with the maven-shade-plugin), job submission started failing with this error.
Comparing the jars under flink/lib in the production and test environments showed that the project's dependency jars were also sitting in the test environment's lib directory. After deleting those dependency jars and resubmitting, the job went through.
Fix: go into the Flink install directory's lib/ and delete the jars the project already bundles (keeping the jars that originally shipped with Flink), so the two copies cannot conflict.
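For reference, a typical maven-shade-plugin configuration for the fat-jar build mentioned in the background above (the version number and main class are placeholders; the ServicesResourceTransformer matters for Flink connectors because they register factories under META-INF/services, and those files must be merged, not overwritten):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- merge META-INF/services files from all dependencies -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
          <!-- entry point of the job; replace with your own main class -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.example.MyFlinkJob</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Marking the flink-* dependencies as scope "provided" also keeps them out of the fat jar in the first place, which reduces the chance of clashing with jars in flink/lib.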