hadoop - WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

When running hdfs commands on hadoop-2.6.2, a WARN util.NativeCodeLoader message appears, saying the native-hadoop library cannot be loaded. I tried the common fixes found online, such as modifying hadoop-env.sh and log4j.properties, but the problem persisted. Setting the log level for NativeCodeLoader to ERROR in log4j.properties does suppress the warning.

With hadoop-2.6.2, every hdfs command prints the following warning:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
A common explanation is that the pre-built Hadoop package is 32-bit and therefore has problems on 64-bit systems, and quite a few people recompile the source and reinstall because of it. In my case the problem remained all the same.
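Before going down the recompile route, it is worth checking what the shipped library actually is. A minimal sketch, assuming the install path /usr/local/hadoop-2.6.2 that appears in the debug output below (output will differ per machine):

# Is the pre-built native library a 32-bit or 64-bit ELF object?
[root@hadoop0 hadoop]# file /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0
# Architecture of the host, for comparison
[root@hadoop0 hadoop]# uname -m

As the debug output further down shows, on this host the loader complains about the glibc version (GLIBC_2.14), not about the word size.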
The detailed diagnostics are as follows:

#1. Enable Hadoop debug logging
[root@hadoop1 hadoop]# export HADOOP_ROOT_LOGGER=DEBUG,console
#2. Run an hdfs command and inspect the debug output
[root@hadoop0 hadoop]# hadoop fs -ls
20/04/21 15:43:46 DEBUG util.Shell: setsid exited with exit code 0
20/04/21 15:43:47 DEBUG conf.Configuration: parsing URL jar:file:/usr/local/hadoop-2.6.2/share/hadoop/common/hadoop-common-2.6.2.jar!/core-default.xml
20/04/21 15:43:47 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@5593d23b
20/04/21 15:43:47 DEBUG conf.Configuration: parsing URL file:/usr/local/hadoop-2.6.2/etc/hadoop/core-site.xml
20/04/21 15:43:47 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@201da694
20/04/21 15:43:47 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time, about=, type=DEFAULT, always=false, sampleName=Ops)
20/04/21 15:43:47 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time, about=, type=DEFAULT, always=false, sampleName=Ops)
20/04/21 15:43:47 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[GetGroups], valueName=Time, about=, type=DEFAULT, always=false, sampleName=Ops)
20/04/21 15:43:47 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
20/04/21 15:43:47 DEBUG security.Groups:  Creating new Groups object
20/04/21 15:43:47 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
20/04/21 15:43:47 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version 'GLIBC_2.14' not found (required by /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0)
20/04/21 15:43:47 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop-2.6.2/lib/native
20/04/21 15:43:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/04/21 15:43:47 DEBUG util.PerformanceAdvisory: Falling back to shell based
20/04/21 15:43:47 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
20/04/21 15:43:47 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
20/04/21 15:43:47 DEBUG security.UserGroupInformation: hadoop login
20/04/21 15:43:47 DEBUG security.UserGroupInformation: hadoop login commit
20/04/21 15:43:47 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root
20/04/21 15:43:47 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: root" with name root
20/04/21 15:43:47 DEBUG security.UserGroupInformation: User entry: "root"
20/04/21 15:43:47 DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)
20/04/21 15:43:48 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
20/04/21 15:43:48 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
20/04/21 15:43:48 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
20/04/21 15:43:48 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = 
20/04/21 15:43:48 DEBUG hdfs.DFSClient: No KeyProvider found.
20/04/21 15:43:48 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
20/04/21 15:43:48 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@5001b9f5
20/04/21 15:43:48 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@187790ec
20/04/21 15:43:48 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
20/04/21 15:43:48 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
20/04/21 15:43:48 DEBUG ipc.Client: The ping interval is 60000 ms.
20/04/21 15:43:48 DEBUG ipc.Client: Connecting to hadoop0/192.168.1.5:9000
20/04/21 15:43:48 DEBUG ipc.Client: IPC Client (34104097) connection to hadoop0/192.168.1.5:9000 from root: starting, having connections 1
20/04/21 15:43:48 DEBUG ipc.Client: IPC Client (34104097) connection to hadoop0/192.168.1.5:9000 from root sending #0
20/04/21 15:43:48 DEBUG ipc.Client: IPC Client (34104097) connection to hadoop0/192.168.1.5:9000 from root got value #0
20/04/21 15:43:48 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 76ms
ls: '.': No such file or directory
20/04/21 15:43:48 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@187790ec
20/04/21 15:43:48 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@187790ec
20/04/21 15:43:48 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@187790ec
20/04/21 15:43:48 DEBUG ipc.Client: Stopping client
20/04/21 15:43:48 DEBUG ipc.Client: IPC Client (34104097) connection to hadoop0/192.168.1.5:9000 from root: closed
20/04/21 15:43:48 DEBUG ipc.Client: IPC Client (34104097) connection to hadoop0/192.168.1.5:9000 from root: stopped, remaining connections 0

#3. Turn Hadoop debug logging back off
[root@hadoop1 hadoop]# export HADOOP_ROOT_LOGGER=INFO,console

The debug output around the WARN message is:

20/04/21 15:43:47 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version 'GLIBC_2.14' not found (required by /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0)
20/04/21 15:43:47 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop-2.6.2/lib/native
20/04/21 15:43:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
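The UnsatisfiedLinkError is the actual root cause: the pre-built libhadoop.so.1.0.0 requires GLIBC_2.14, but the glibc shipped with this OS is older, so the native library cannot be loaded. This is easy to confirm; a minimal sketch, assuming the paths printed in the log above:

# glibc version installed on the host
[root@hadoop0 hadoop]# ldd --version | head -1
# GLIBC version symbols the local libc actually provides
[root@hadoop0 hadoop]# strings /lib64/libc.so.6 | grep '^GLIBC_'
# GLIBC versions the pre-built native library was linked against
[root@hadoop0 hadoop]# objdump -T /usr/local/hadoop-2.6.2/lib/native/libhadoop.so.1.0.0 | grep GLIBC_2.14

If GLIBC_2.14 does not show up in the local libc, the real fixes are upgrading glibc or rebuilding the native library against the glibc on this host; the two approaches below only deal with the warning itself.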

Solution 1: one fix that circulates online is to add the following to hadoop-env.sh:

export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"  

I tried this and it did not resolve the problem.
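Whichever fix is attempted, whether the native library is actually being picked up can be verified directly with the checknative subcommand, which is less ambiguous than watching for the warning; a minimal sketch:

# Report which native components Hadoop can load (hadoop, zlib, snappy, ...)
[root@hadoop0 hadoop]# hadoop checknative -a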

Solution 2: since this is only a WARN, it does not affect functionality and is merely ugly to look at, so the message can simply be silenced via log4j. Add the line below to $HADOOP_HOME/etc/hadoop/log4j.properties to raise the log level for NativeCodeLoader to ERROR:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

After this change the warning is indeed no longer printed.
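One way to apply and verify the change from the shell; a minimal sketch, assuming HADOOP_HOME is set and that hadoop fs -ls previously printed the warning:

# Append the logger override to log4j.properties
[root@hadoop0 hadoop]# echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' >> $HADOOP_HOME/etc/hadoop/log4j.properties
# Re-run a command that used to print the warning; it should be gone now
[root@hadoop0 hadoop]# hadoop fs -ls /

Note that this only hides the message: the native library is still not loaded, and Hadoop keeps falling back to the built-in Java implementations.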
