Symptom: a Flink cluster job that reads from Kafka and writes to HBase makes no progress, and no data appears in HBase. This typically happens when the Hadoop classes Flink needs are missing from its classpath. Fix:
1. Set the Hadoop environment variables:
HADOOP_HOME=/usr/lib/hadoop
HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
HADOOP_CLASSPATH=`hadoop classpath`
PATH=$PATH:$HADOOP_HOME/bin:......
export HADOOP_HOME HADOOP_MAPRED_HOME HADOOP_CLASSPATH PATH
2. Add the flink-shaded Hadoop uber jar to Flink's lib directory:
2.1 Check the Hadoop version:
hadoop version
2.2 Download the uber jar that matches your Hadoop version from:
https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/
cd $FLINK_HOME/lib
proxychains4 wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.6.5-10.0/flink-shaded-hadoop-2-uber-2.6.5-10.0.jar
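The version segment in the jar name must match the output of `hadoop version`. A minimal sketch of deriving the jar name from that output (the version string here is a hard-coded sample; on a real node, capture `hadoop version` instead, and the `-10.0` suffix is the shaded release number from the repository listing above):

```shell
# Sample output of `hadoop version` (hard-coded here for illustration;
# replace with: sample_output=$(hadoop version) on a real node).
sample_output="Hadoop 2.6.5
Subversion https://github.com/apache/hadoop -r ..."

# The first line is "Hadoop <version>"; take the second field.
ver=$(printf '%s\n' "$sample_output" | awk '/^Hadoop/ {print $2}')

# Assemble the matching uber-jar file name.
echo "flink-shaded-hadoop-2-uber-${ver}-10.0.jar"
```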
Then scp the jar to the same lib directory on every other node in the cluster.
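Distributing the jar can be sketched with a loop (the worker hostnames and the /opt/flink path are hypothetical; substitute your own, and drop the leading `echo` to actually copy):

```shell
# Hypothetical worker hostnames and Flink install path; adjust for your cluster.
WORKERS="worker1 worker2"
JAR=flink-shaded-hadoop-2-uber-2.6.5-10.0.jar

for host in $WORKERS; do
  # Dry run: prints the scp command for each node; remove 'echo' to execute.
  echo scp "$JAR" "$host:/opt/flink/lib/"
done
```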
3. Restart the cluster:
stop-cluster.sh
start-cluster.sh