ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException

This post records a file-replication failure on a Hadoop cluster and how it was resolved. The root cause was that the hostnames in the configuration files did not match the actual IP addresses, combined with a misconfigured firewall. The problem was fixed by switching the configuration files to IP addresses, updating /etc/hosts, and disabling the firewall.
    2013-06-24 11:39:32,383 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:zqgame cause:java.io.IOException: File /data/zqhadoop/data/mapred/system/jobtracker.info could only be replicated to 0 nodes, instead of 1  
    2013-06-24 11:39:32,384 INFO org.apache.hadoop.ipc.Server: IPC Server handler 1 on 9000, call addBlock(/data/zqhadoop/data/mapred/system/jobtracker.info, DFSClient_NONMAPREDUCE_-344066732_1, null) from 192.168.216.133:59866: error: java.io.IOException: File /data/zqhadoop/data/mapred/system/jobtracker.info could only be replicated to 0 nodes, instead of 1  
    java.io.IOException: File /data/zqhadoop/data/mapred/system/jobtracker.info could only be replicated to 0 nodes, instead of 1  
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1920)  
            at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:783)  
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)  
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)  
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)  
            at java.lang.reflect.Method.invoke(Method.java:601)  
            at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)  
            at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1432)  
            at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1428)  
            at java.security.AccessController.doPrivileged(Native Method)  
            at javax.security.auth.Subject.doAs(Subject.java:415)  
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)  
            at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1426)  



At this point Hadoop is looking for DataNodes available to hold the block replica, but it cannot find any.
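Before touching any configuration, it is worth confirming that the NameNode really sees no live DataNodes. A minimal check, assuming the Hadoop 1.x command-line tools of this cluster (the grep filter is only illustrative):

    # How many DataNodes does the NameNode currently consider alive?
    zqgame@master:~/hadoop-1.2.0/bin$ ./hadoop dfsadmin -report | grep -i "datanodes available"

    # Are the expected daemons (NameNode, DataNode, JobTracker, TaskTracker) running at all?
    zqgame@master:~/hadoop-1.2.0/bin$ jps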

The problem lies in /etc/hosts and in $HADOOP_HOME/conf/mapred-site.xml and core-site.xml.

Solution:

1. Edit $HADOOP_HOME/conf/mapred-site.xml and core-site.xml and replace the hostname with the IP address.

core-site.xml

    zqgame@master:~/hadoop-1.2.0/bin$ more ../conf/core-site.xml
    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

    <!-- Put site-specific property overrides in this file. -->

    <configuration>
            <property>
                    <name>fs.default.name</name>
                    <value>hdfs://192.168.216.133:9000</value>
            </property>
            <property>
                    <name>hadoop.tmp.dir</name>
                    <value>/data/zqhadoop/data</value>
            </property>
    </configuration>
mapred-site.xml
    zqgame@master:~/hadoop-1.2.0/bin$ more ../conf/mapred-site.xml
    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

    <!-- Put site-specific property overrides in this file. -->

    <configuration>
            <property>
                    <name>mapred.job.tracker</name>
                    <value>192.168.216.133:9001</value>
            </property>
    </configuration>

2. Edit /etc/hosts and add a binding for the machine's own IP address.

    zqgame@master:~/hadoop-1.2.0/bin$ more /etc/hosts
    127.0.0.1       localhost
    127.0.1.1       master
    192.168.216.133 localhost.localdomain localhost

3. Disable the firewall. The original post does not show the commands used; a rough sketch follows.
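Assuming a 2013-era Linux distribution, one of the following would disable the firewall; run it on every node in the cluster:

    # Ubuntu / Debian (ufw)
    sudo ufw disable

    # CentOS / RHEL 6 (iptables service)
    sudo service iptables stop        # stop the firewall immediately
    sudo chkconfig iptables off       # keep it disabled across reboots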


PS: My hosts file is:


127.0.0.1 localhost
192.168.20.114 master
192.168.20.84 slave1
192.168.20.85 slave2
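Since core-site.xml and mapred-site.xml are only read at startup, the cluster has to be restarted for the changes to take effect. A minimal verification sequence for this Hadoop 1.2.0 layout (paths assumed from the prompts above) might look like:

    # restart the whole cluster so the new configuration is picked up
    zqgame@master:~/hadoop-1.2.0/bin$ ./stop-all.sh
    zqgame@master:~/hadoop-1.2.0/bin$ ./start-all.sh

    # NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker should all appear
    zqgame@master:~/hadoop-1.2.0/bin$ jps

    # "Datanodes available" should now be greater than 0
    zqgame@master:~/hadoop-1.2.0/bin$ ./hadoop dfsadmin -report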





Please credit the source when reposting: http://blog.youkuaiyun.com/weijonathan/article/details/9162619