Flink version: 1.9.1
1. Edit flink/conf/flink-conf.yaml, search for "history", and change the following entries:
#==============================================================================
# HistoryServer
#==============================================================================
# The HistoryServer is started and stopped via bin/historyserver.sh (start|stop)
#
# Directory to upload completed jobs to. Add this directory to the list of
# monitored directories of the HistoryServer as well (see below).
#jobmanager.archive.fs.dir: hdfs:///completed-jobs/
jobmanager.archive.fs.dir: hdfs://beh/flink/

# The address under which the web-based HistoryServer listens.
#historyserver.web.address: 0.0.0.0
historyserver.web.address: 10.198.162.24

# The port under which the web-based HistoryServer listens.
#historyserver.web.port: 8082
historyserver.web.port: 8083

# Comma separated list of directories to monitor for completed jobs.
#historyserver.archive.fs.dir: hdfs:///completed-jobs/
historyserver.archive.fs.dir: hdfs://beh/flink/
# Interval in milliseconds for refreshing the monitored directories.
#historyserver.archive.fs.refresh-interval: 10000
historyserver.archive.fs.refresh-interval: 10000
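After editing the configuration, a typical next step is to make sure the archive directory exists and then start the HistoryServer. The commands below are only a minimal sketch, assuming the paths and the HDFS nameservice "beh" from the example above, run from the Flink installation directory:

# create the archive directory on HDFS if it does not exist yet
hdfs dfs -mkdir -p hdfs://beh/flink/
# start the HistoryServer on the host configured as historyserver.web.address
bin/historyserver.sh start
# the web UI should then be reachable at http://10.198.162.24:8083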
2. Parameter descriptions
- jobmanager.archive.fs.dir: the directory to which the JobManager uploads the archive of a completed Flink job
- historyserver.archive.fs.dir: the HDFS directory monitored by the Flink HistoryServer process (see the check sketched after this list)
- historyserver.web.address: the host the HistoryServer process runs on, i.e. the address its web UI listens on
- historyserver.web.port: the port used by the HistoryServer web UI
- historyserver.archive.fs.refresh-interval: the interval, in milliseconds, at which the monitored directories are refreshed
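A quick way to verify that both sides agree, sketched with the example values above, is to list what the JobManager actually archived and ask the HistoryServer what it has picked up (the /jobs/overview REST path is assumed here from the regular Flink REST API; the web UI at the same address shows the same list):

# list the archives the JobManager has uploaded for completed jobs
hdfs dfs -ls hdfs://beh/flink/
# query the HistoryServer for the jobs it currently knows about
curl http://10.198.162.24:8083/jobs/overview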
3. Note that jobmanager.archive.fs.dir and historyserver.archive.fs.dir must point to the same path.
Example of a mismatch: jobmanager.archive.fs.dir was set to hdfs://beh/flink/
while historyserver.archive.fs.dir was set to hdfs://beh/flink/jobs.
With this mismatch, the archives of completed jobs were written to a directory the HistoryServer was not watching, so the history page showed no content.
After changing historyserver.archive.fs.dir to hdfs://beh/flink/ and running a job again, the history page displayed not only the new job but also the previously completed ones.
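A simple guard against this mismatch, sketched as a one-liner, is to compare the two keys directly in the config file; both printed lines should show the same path:

grep -E '^(jobmanager|historyserver)\.archive\.fs\.dir' conf/flink-conf.yaml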
Note 1: You can also pull job archives from other clusters into this directory and then start the HistoryServer to browse them in its web UI, for example as sketched below.
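A minimal sketch, where hdfs://other-cluster/flink/ is a placeholder for the other cluster's archive directory:

# copy the archived jobs from the other cluster into the monitored directory
hadoop distcp hdfs://other-cluster/flink/ hdfs://beh/flink/
# a running HistoryServer picks them up on the next refresh interval;
# otherwise start it now
bin/historyserver.sh start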
Note 2: Startup may fail with an error like the following:
2019-12-20 15:55:29,985 WARN  org.apache.flink.runtime.webmonitor.history.HistoryServer - Failed to create Path or FileSystem for directory 'hdfs://beh/flink/'. Directory will not be monitored.
org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:447)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:359)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.runtime.webmonitor.history.HistoryServer.<init>(HistoryServer.java:174)
at org.apache.flink.runtime.webmonitor.history.HistoryServer.<init>(HistoryServer.java:137)
at org.apache.flink.runtime.webmonitor.history.HistoryServer$1.call(HistoryServer.java:122)
at org.apache.flink.runtime.webmonitor.history.HistoryServer$1.call(HistoryServer.java:119)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.runtime.webmonitor.history.HistoryServer.main(HistoryServer.java:119)
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:58)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:443)
... 8 more
2019-12-20 15:55:29,988 ERROR org.apache.flink.runtime.webmonitor.history.HistoryServer - Failed to run HistoryServer.
org.apache.flink.util.FlinkException: Failed to validate any of the configured directories to monitor.
at org.apache.flink.runtime.webmonitor.history.HistoryServer.<init>(HistoryServer.java:183)
at org.apache.flink.runtime.webmonitor.history.HistoryServer.<init>(HistoryServer.java:137)
at org.apache.flink.runtime.webmonitor.history.HistoryServer$1.call(HistoryServer.java:122)
For the detailed fix, see https://my.oschina.net/u/2338224/blog/3101005 (thanks to the author for sharing it).
The cause is that the HDFS-related jar is missing from the Flink cluster's classpath. It can be downloaded from the address below (pick the build matching your own Hadoop and Flink versions):
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.6.5-9.0/
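A minimal sketch of the fix for the versions used here (Flink 1.9.x with Hadoop 2.6.5; the jar file name is derived from the Maven directory above and will differ for other versions): drop the shaded Hadoop uber jar into flink/lib on the HistoryServer host and restart the process. Exporting HADOOP_CLASSPATH before starting is an alternative way to make the hdfs:// scheme loadable.

# run from the Flink installation directory
cd lib
wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.6.5-9.0/flink-shaded-hadoop-2-uber-2.6.5-9.0.jar
cd ..
# restart the HistoryServer so it picks up the jar
bin/historyserver.sh stop
bin/historyserver.sh start

# alternative: expose the cluster's own Hadoop jars instead of the uber jar
# export HADOOP_CLASSPATH=$(hadoop classpath)
# bin/historyserver.sh start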