Errors like "no XXX in java.library.path" mean that none of the directories on the current JVM's java.library.path contain the native library XXX we need. The fix is to add the directory holding that dependency to the java.library.path property.
cd /home1/chenxianyu11665/poi-unique
JARS=$(echo /home1/chenxianyu11665/poi-unique/search-api/lib/*.jar | tr ' ' ',')
echo $JARS
nohup spark-submit --class com.navinfo.datalake.task.unique.HuaweiUnique --name 'HuaweiUniqueNew' --master yarn --deploy-mode client --queue track --driver-memory 15g --num-executors 20 --executor-memory 15g --executor-cores 2 --conf spark.default.parallelism=100 --conf spark.executor.memoryOverheadFactor=0.3 --conf spark.driver.memoryOverheadFactor=0.3 \
--conf spark.driver.extraJavaOptions='-Dfile.encoding=utf-8' \
--conf spark.executor.extraJavaOptions='-Dfile.encoding=UTF-8 -Dsun.jnu.encoding=UTF-8' \
--conf spark.executor.extraJavaOptions='-Djava.library.path=./segment/lib_native' \
--conf spark.yarn.am.extraJavaOptions='-XX:+PrintGCDetails -Dfile.encoding=UTF-8 -Dsun.jnu.encoding=UTF-8' \
--conf spark.executorEnv.LD_LIBRARY_PATH=./segment/lib_native --conf spark.scheduler.maxRegisteredResourcesWaitingTime=1000000 --archives search-api.zip --jars $JARS poi-workflows-task-unique-1.0-SNAPSHOT.jar &
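A side note on the JARS line above: --jars expects a comma-separated list, while a shell glob expands to a space-separated one, so the echo ... | tr ' ' ',' idiom converts between the two. A minimal sketch of that conversion, using a throwaway temp directory in place of the real search-api/lib path (note the trick breaks if any jar path contains a space):

```shell
# Two dummy jars stand in for search-api/lib/*.jar
tmpdir=$(mktemp -d)
touch "$tmpdir/a.jar" "$tmpdir/b.jar"

# The glob expands space-separated; tr turns the spaces into the commas --jars wants
JARS=$(echo "$tmpdir"/*.jar | tr ' ' ',')
echo "$JARS"   # the two paths joined by a single comma
```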
However, every attempt to get the directory ./segment/lib_native onto java.library.path failed to solve the problem: neither specifying it via --conf spark.executor.extraJavaOptions='-Djava.library.path=./segment/lib_native' nor via spark.executorEnv.LD_LIBRARY_PATH worked. I still don't know why; one thing worth noting is that --conf spark.executor.extraJavaOptions is passed twice in the command above, and when spark-submit receives the same key more than once the last value wins, so the earlier encoding flags are silently dropped. If anyone knows the real cause, please share. The problem was finally solved as follows:
cd /home1/chenxianyu11665/poi-unique
JARS=$(echo /home1/chenxianyu11665/poi-unique/search-api/lib/*.jar | tr ' ' ',')
export LD_LIBRARY_PATH=./segment/lib_native:$LD_LIBRARY_PATH
echo $JARS
nohup spark-submit --class com.navinfo.datalake.task.unique.HuaweiUnique --name 'HuaweiUniqueNew' --master yarn --deploy-mode client --queue track --driver-memory 15g --num-executors 20 --executor-memory 15g --executor-cores 2 --conf spark.default.parallelism=100 --conf spark.executor.memoryOverheadFactor=0.3 --conf spark.driver.memoryOverheadFactor=0.3 \
--conf spark.driver.extraJavaOptions='-Dfile.encoding=utf-8' \
--conf spark.executor.extraJavaOptions='-Dfile.encoding=UTF-8 -Dsun.jnu.encoding=UTF-8 -Djava.library.path=./segment/lib_native' \
--conf spark.yarn.am.extraJavaOptions='-XX:+PrintGCDetails -Dfile.encoding=UTF-8 -Dsun.jnu.encoding=UTF-8' \
--conf spark.executorEnv.LD_LIBRARY_PATH=./segment/lib_native --conf spark.scheduler.maxRegisteredResourcesWaitingTime=1000000 --archives search-api.zip --jars $JARS poi-workflows-task-unique-1.0-SNAPSHOT.jar &
The key change is the line export LD_LIBRARY_PATH=./segment/lib_native:$LD_LIBRARY_PATH: in client mode the driver runs inside the spark-submit process, so it inherits this environment variable directly. I still don't know why the two earlier approaches failed.
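For reference, the export above prepends the native-library directory to whatever LD_LIBRARY_PATH already contains, so any existing entries keep working while the new directory is searched first. A minimal sketch of the prepend:

```shell
# Prepend the native lib dir; earlier entries take precedence at lookup time
export LD_LIBRARY_PATH=./segment/lib_native:$LD_LIBRARY_PATH

# The first colon-separated component is now our directory
echo "${LD_LIBRARY_PATH%%:*}"
```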