kinit error: /etc/host.conf: line 3: bad command `nospoof on'

Running `kinit` prints the warning `/etc/host.conf: line 3: bad command 'nospoof on'`. The resolver in RHEL/CentOS 7.5 no longer supports the `nospoof` directive, so the fix is simply to edit /etc/host.conf and comment out the `nospoof on` line.
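A minimal sketch of the fix, demonstrated on a scratch copy of the file (the three-line sample content is an assumption; on a real system the target is /etc/host.conf and you should back it up first):

```shell
# Create a scratch copy resembling a host.conf that triggers the warning
cat > /tmp/host.conf <<'EOF'
multi on
order hosts,bind
nospoof on
EOF

# Comment out the directive the resolver no longer understands
sed -i 's/^nospoof/# nospoof/' /tmp/host.conf

# The offending line is now a comment; kinit will stop printing the warning
grep nospoof /tmp/host.conf
```

The same `sed` expression works in place on /etc/host.conf (run as root, e.g. after `cp /etc/host.conf /etc/host.conf.bak`).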
