Fixing Error running 'serializers': Cannot run program "url...": CreateProcess error=2, The system cannot find the file specified.

This article explains the Python interpreter path error that can appear when running a script in PyCharm and walks through the fix step by step, including how to point the IDE at a valid Python interpreter so it can correctly find and use the Python environment.

Problem description:

Running a script in PyCharm fails with the following error:

Error running 'serializers': Cannot run program "C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps\python.exe" (in directory "E:\django_restful\api"): CreateProcess error=2, The system cannot find the file specified.

Cause analysis:

The Python interpreter path configured in the IDE (PyCharm) is invalid, so the interpreter needs to be re-selected. On Windows, C:\Users\<user>\AppData\Local\Microsoft\WindowsApps\python.exe is often just the Microsoft Store app-execution alias, a stub rather than a real installation, so launching it directly can fail with "file not found".
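To locate a real interpreter to point PyCharm at, you can run a short script with any working Python (from a terminal, for example) and use the printed paths as candidates for the interpreter setting. This is a minimal diagnostic sketch, not part of the original post; the file name is assumed:

```python
# find_python.py - hypothetical diagnostic helper, not from the original post.
# Run it with any working Python to see where real interpreters live.
import shutil
import sys

# The interpreter currently executing this script.
print("Current interpreter:", sys.executable)

# The first "python" found on PATH; on Windows this may be the
# WindowsApps store stub rather than a real installation.
print("First python on PATH:", shutil.which("python"))
```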

Solution:

1. Open File > Settings and find the Project Interpreter option; the interpreter path shown there will be flagged as invalid. Click the Add button:

2. Browse to the real location of python.exe and add it as the project interpreter. This resolves the error; you can confirm the fix with the short check below.
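As a sanity check (a hypothetical verify.py, not from the original post), run a two-line script from inside PyCharm and confirm it reports the interpreter you just selected:

```python
# verify.py - run inside PyCharm after fixing the interpreter path.
import sys

# Should print the python.exe you just selected, not the
# WindowsApps stub path from the original error message.
print(sys.executable)
print(sys.version)
```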


Reference: https://blog.youkuaiyun.com/qwxwaty/article/details/80711385
