An error occurs when executing a command in Hadoop

By setting an environment variable you can turn on Hadoop's verbose debug output, which makes it much easier to pinpoint why a command failed. This post shows how to enable debug mode when running commands and how to use the resulting messages for troubleshooting.


To track down the cause of the error, enable Hadoop's debug output as follows:

[root@yts bin]# export HADOOP_ROOT_LOGGER=DEBUG,console
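The export only affects the current shell session. As a minimal sketch, you can also set the variable for a single invocation, so that just that one command prints debug output while everything else stays quiet:

[root@yts bin]# HADOOP_ROOT_LOGGER=DEBUG,console ./hadoop fs -ls /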

With this set, when a command fails you can scan its output for lines containing "error" (or WARN/Exception) to locate the cause; a filtering sketch and the full debug run of a failing command follow below.
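For long outputs, piping the command through grep helps. A minimal sketch (the 2>&1 redirect merges stderr into stdout, so log lines are captured regardless of which stream the console appender writes to):

[root@yts bin]# ./hadoop fs -mkdir test 2>&1 | grep -iE 'error|warn|exception'

The complete debug output of the failing -mkdir command looks like this: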

[root@yts bin]# ./hadoop fs -mkdir test
14/10/08 11:17:55 DEBUG util.Shell: setsid exited with exit code 0
14/10/08 11:17:56 DEBUG conf.Configuration: parsing URL jar:file:/usr/local/hadoop-2.5.1/share/hadoop/common/hadoop-common-2.5.1.jar!/core-default.xml
14/10/08 11:17:56 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@66048bfd
14/10/08 11:17:56 DEBUG conf.Configuration: parsing URL file:/usr/local/hadoop-2.5.1/etc/hadoop/core-site.xml
14/10/08 11:17:56 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@340f438e
14/10/08 11:17:56 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
14/10/08 11:17:56 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
14/10/08 11:17:56 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
14/10/08 11:17:56 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
14/10/08 11:17:56 DEBUG security.Groups: Creating new Groups object
14/10/08 11:17:56 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
14/10/08 11:17:56 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop-2.5.1/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /usr/local/hadoop-2.5.1/lib/native/libhadoop.so.1.0.0)
14/10/08 11:17:56 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop-2.5.1/lib/native
14/10/08 11:17:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/10/08 11:17:56 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
14/10/08 11:17:56 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
14/10/08 11:17:56 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
14/10/08 11:17:56 DEBUG security.UserGroupInformation: hadoop login
14/10/08 11:17:56 DEBUG security.UserGroupInformation: hadoop login commit
14/10/08 11:17:56 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root
14/10/08 11:17:56 DEBUG security.UserGroupInformation: UGI loginUser:root (auth:SIMPLE)
14/10/08 11:17:56 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
14/10/08 11:17:56 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
14/10/08 11:17:56 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
14/10/08 11:17:56 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
14/10/08 11:17:57 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
14/10/08 11:17:57 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@55a1c291
14/10/08 11:17:57 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@48524010
14/10/08 11:17:57 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
14/10/08 11:17:57 DEBUG ipc.Client: The ping interval is 60000 ms.
14/10/08 11:17:57 DEBUG ipc.Client: Connecting to yts/192.168.3.82:9000
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root: starting, having connections 1
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root sending #0
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root got value #0
14/10/08 11:17:57 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 143ms
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root sending #1
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root got value #1
14/10/08 11:17:57 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
mkdir: `test': No such file or directory
14/10/08 11:17:57 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@48524010
14/10/08 11:17:57 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@48524010
14/10/08 11:17:57 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@48524010
14/10/08 11:17:57 DEBUG ipc.Client: Stopping client
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root: closed
14/10/08 11:17:57 DEBUG ipc.Client: IPC Client (124407148) connection to yts/192.168.3.82:9000 from root: stopped, remaining connections 0
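In this run the line that matters is mkdir: `test': No such file or directory (the earlier GLIBC_2.14 warning only means the native-hadoop library could not be loaded; Hadoop falls back to the built-in Java classes and the command still runs). On Hadoop 2.x a relative path such as test is resolved against the user's HDFS home directory, by default /user/<username> (here /user/root), and -mkdir fails if that parent directory does not yet exist. Assuming that default layout, a common fix is to create the parent first, or simply pass -p:

[root@yts bin]# ./hadoop fs -mkdir -p /user/root
[root@yts bin]# ./hadoop fs -mkdir test

Alternatively, use an absolute path such as /user/root/test so the target does not depend on the home directory existing.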

Reposted from: https://www.cnblogs.com/yts1dx/p/4011019.html
