Hadoop: "Call From ubuntu/127.0.1.1 to localhost:8020 failed on connection exception"

This post shows how to change the fs.defaultFS property in Hadoop's core-site.xml, switching the default HDFS port from 9000 to 8020 so that the NameNode address matches the port the client is actually trying to reach.


Modify core-site.xml

<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/home/hadoop/app/hadoop/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

Change the line <value>hdfs://localhost:9000</value> to <value>hdfs://localhost:8020</value>, then restart HDFS so the NameNode listens on the new port.
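After restarting HDFS (for example with `stop-dfs.sh` followed by `start-dfs.sh`), it is worth confirming that the NameNode RPC port is actually listening before re-running the Spark job. A minimal sketch of such a check in plain Python (`port_is_open` is a name chosen here for illustration, not a Hadoop API):

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With the NameNode up on the new port, this should report True:
# port_is_open("localhost", 8020)
```

The same check can also be done from the shell with `netstat -tlnp | grep 8020` or `hdfs getconf -confKey fs.defaultFS`.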

The full error log, reflowed for readability:

/home/hadoop/anaconda3/envs/pyspark/bin/python3.8 /home/hadoop/PycharmProjects/pythonProject1/02/2.py
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Traceback (most recent call last):
  File "/home/hadoop/PycharmProjects/pythonProject1/02/2.py", line 11, in <module>
    pv_df = spark.read.csv("/home/hadoop/Documents/UserBehavior.csv", header=True, inferSchema=True)
  File "/home/hadoop/anaconda3/envs/pyspark/lib/python3.8/site-packages/pyspark/sql/readwriter.py", line 727, in csv
    return self._df(self._jreader.csv(self._spark._sc._jvm.PythonUtils.toSeq(path)))
  File "/home/hadoop/anaconda3/envs/pyspark/lib/python3.8/site-packages/py4j/java_gateway.py", line 1322, in __call__
    return_value = get_return_value(
  File "/home/hadoop/anaconda3/envs/pyspark/lib/python3.8/site-packages/pyspark/errors/exceptions/captured.py", line 169, in deco
    return f(*a, **kw)
  File "/home/hadoop/anaconda3/envs/pyspark/lib/python3.8/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o27.csv.
: java.net.ConnectException: Call From ubuntu/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:930)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:845)
    at org.apa
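The traceback makes the root cause visible: the path passed to `spark.read.csv` has no scheme, so Spark resolves it against `fs.defaultFS` and looks for `/home/hadoop/Documents/UserBehavior.csv` on HDFS at `localhost:9000` instead of on the local disk. Besides aligning the port in core-site.xml, an alternative workaround is to qualify the path with an explicit scheme. A small sketch (the helper `qualify_local` is hypothetical, introduced only for illustration):

```python
def qualify_local(path: str) -> str:
    """Prefix an absolute local path with the file:// scheme so it is
    not resolved against fs.defaultFS (i.e. not looked up on HDFS)."""
    if "://" in path:  # already qualified (file://, hdfs://, ...)
        return path
    return "file://" + path

local_csv = qualify_local("/home/hadoop/Documents/UserBehavior.csv")
# spark.read.csv(local_csv, header=True, inferSchema=True) would then
# read from the local filesystem rather than from HDFS.
```

Conversely, if the file really does live on HDFS, an explicit `hdfs://localhost:8020/...` path removes any dependence on the default in core-site.xml.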