You need to add the Hadoop dependencies to the project's pom.xml:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.2</version>
</dependency>
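
For context, this exception typically surfaces when a job points its checkpoint storage at HDFS. Below is a minimal sketch of such a job, assuming the legacy FsStateBackend API; the namenode address and checkpoint path are placeholders, not values from the original setup.

import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class HdfsCheckpointJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Take a checkpoint every 60 seconds
        env.enableCheckpointing(60_000);

        // Store checkpoints on HDFS. Resolving the "hdfs" scheme is the step that
        // fails with UnsupportedFileSystemSchemeException when no Hadoop file system
        // implementation is on the classpath. The address and path are placeholders.
        env.setStateBackend(new FsStateBackend("hdfs://namenode:8020/flink/checkpoints"));

        // Trivial pipeline so the job is runnable end to end
        env.fromElements(1, 2, 3).print();

        env.execute("hdfs-checkpoint-job");
    }
}

Without the Hadoop dependencies above on the classpath, submitting such a job fails with errors like the following: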
org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
Caused by: org.apache.flink.util.FlinkRuntimeException: Failed to create checkpoint storage at checkpoint coordinator side.
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
In summary, this article describes the UnsupportedFileSystemSchemeException error encountered when using Apache Flink, which stems from the failure to load a Hadoop file system implementation supporting the hdfs scheme. The fix is to add the Hadoop dependencies hadoop-common, hadoop-client, and hadoop-hdfs (version 2.7.2 here) so that Flink can interact with HDFS correctly.