Fixing the HDFS and spark-shell startup warning: Unable to load native-hadoop library for your platform... using builtin-java

1. Problem

When starting Hadoop and spark-shell, warnings like the following appear:

start-dfs.sh
2018-10-03 09:43:31,795 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
spark-shell
2018-10-03 09:49:05 WARN  NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Although neither warning actually affects jobs, seeing WARN in the output is unsettling, so it is worth fixing.

2. Solution

(For a more detailed description of the problem, see: "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform..." explained.)

For the warning when starting Hadoop, add the lib/native directory under the Hadoop installation directory to the environment variables:

   export JAVA_LIBRARY_PATH=/usr/local/hadoop/lib/native
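
To make the setting persist across sessions, the export can go into a shell startup file, and Hadoop's checknative tool reports whether the native libraries are now found. A minimal sketch, assuming Hadoop is installed at /usr/local/hadoop and that ~/.bashrc is your shell's startup file:

   # Persist the native library path for new shells
   echo 'export JAVA_LIBRARY_PATH=/usr/local/hadoop/lib/native' >> ~/.bashrc
   source ~/.bashrc

   # Verify: "hadoop: true" in the output means the native library was loaded
   hadoop checknative -a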

For the spark-shell warning, add JAVA_LIBRARY_PATH to conf/spark-env.sh under the Spark installation directory:

  export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH
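
Taken together, conf/spark-env.sh might look like the following sketch (assuming the same /usr/local/hadoop install path; defining JAVA_LIBRARY_PATH in the same file avoids depending on the outer shell environment):

   # conf/spark-env.sh
   export JAVA_LIBRARY_PATH=/usr/local/hadoop/lib/native
   export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH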

After restarting, the warnings no longer appear.
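
For example, a quick way to confirm is to restart HDFS and reopen spark-shell, watching the startup output for the WARN line:

   stop-dfs.sh
   start-dfs.sh    # the NativeCodeLoader warning should no longer be printed
   spark-shell     # likewise in the spark-shell startup banner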