hadoop ERROR security.UserGroupInformation: PriviledgedActionException as:root

Error description:

An error is thrown when running the hadoop-examples-1.2.1.jar example.

Command: ./bin/hadoop jar hadoop-examples-1.2.1.jar grep input_yumh out_yumh 'dfs[a-z.]+'


Error output:

15/10/20 22:04:41 INFO util.NativeCodeLoader: Loaded the native-hadoop library
15/10/20 22:04:41 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-root/mapred/staging/root67934146/.staging/job_local67934146_0001
15/10/20 22:04:41 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:java.net.UnknownHostException: gd36: gd36: Name or service not known

java.net.UnknownHostException: gd36: gd36: Name or service not known


When running Hadoop in local mode, the following exception is thrown:
ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:java.net.UnknownHostException: hadoop: hadoop: Name or service not known
Exception in thread "main" java.net.UnknownHostException: hadoop: hadoop: Name or service not known

Fix: add the hostname from /etc/hostname to /etc/hosts. Hadoop resolves the machine's own hostname when submitting a job; if that name is not listed in /etc/hosts (and no DNS entry exists), the UnknownHostException above is thrown. Adding the entry resolves the problem.
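For example, if the machine's hostname is gd36 (as in the log above), /etc/hosts should contain an entry mapping that name to the node's address. A minimal sketch (the 192.168.1.36 address is illustrative; use your node's real IP, or 127.0.0.1 for a single-machine local setup):

127.0.0.1      localhost
192.168.1.36   gd36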

If there is no /etc/hostname file, you can instead set the hostname to localhost:


Command: #hostname localhost


-------------------------------------------------------------------

Details on changing the hostname on Linux

#hostname

#cat /proc/sys/kernel/hostname

The two commands above print the same result.
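For example, assuming the hostname is gd36 as in the log above, both would print:

#hostname
gd36
#cat /proc/sys/kernel/hostname
gd36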

Changing the hostname of a running Linux system (no reboot required)

The hostname command sets the system's hostname:

#hostname newname

Here newname is the new hostname you want to set. The change takes effect immediately, but it is lost after a reboot. To change the hostname permanently, you must also edit the relevant configuration files.
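Which file to edit depends on the distribution; a sketch of the common cases (paths are typical defaults and may differ on your system):

On RHEL/CentOS 6 and earlier, edit /etc/sysconfig/network:
HOSTNAME=newname

On Debian/Ubuntu and most modern distributions, put the name in /etc/hostname:
newname

On systemd-based systems, hostnamectl changes both the running and the permanent hostname in one step:
#hostnamectl set-hostname newname

After changing the hostname, remember to update /etc/hosts as well, otherwise the UnknownHostException described above will come back.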

