```shell
/usr/custom/spark/bin/spark-sql --deploy-mode client
add jar hdfs://${clusterName}/user/hive/udf/udf.jar
```
The error message is as follows:
java.lang.ExceptionInInitializerError
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:662)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:889)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:947)
at java.io.DataInputStream.read(DataInputStream.java:100)
......
Caused by: java.lang.NullPointerException
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:746)
at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:376)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:662)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:889)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:947)
......
Solution:
Add the following configuration to `/usr/custom/spark/conf/spark-defaults.conf`: `spark.jars=hdfs://cbasNA/user/hive/udf/udf.jar`
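A minimal sketch of the resulting config file, assuming the cluster name and the JAR path shown above (replace both with your own values; `spark.jars` is a standard Spark property that distributes the listed JARs to driver and executors at startup):

```properties
# /usr/custom/spark/conf/spark-defaults.conf
# Load the Hive UDF jar from HDFS when spark-sql starts,
# instead of relying on a runtime "add jar" statement.
spark.jars=hdfs://cbasNA/user/hive/udf/udf.jar
```

Alternatively, the same effect can be achieved per session without editing the config file, by passing the JAR at launch: `spark-sql --deploy-mode client --jars hdfs://cbasNA/user/hive/udf/udf.jar`.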
Author: 伍柒大人_HQQ
Link: https://www.jianshu.com/p/d630582c8108
Source: 简书 (Jianshu)
Copyright belongs to the author. For reproduction in any form, please contact the author for authorization and cite the source.