The "Wrong FS ... expected: file:///" problem
This error frequently shows up in simple Hadoop API code. The snippet below is all it takes to trigger it:
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
fs.copyFromLocalFile(new Path("/home/ptrdu/input/file1"), new Path("hdfs://namenode:9000/user/ptrdu/move_input/"));
This is just a simple piece of code that copies a file from the local filesystem to HDFS, yet it fails with:
Wrong FS: hdfs://namenode:9000/home/ptrdu/move_input, expected: file:///
The root cause is that Configuration conf = new Configuration(); only loads the configuration files it can find on the classpath (in Hadoop 0.20.2 the default resources are core-default.xml and core-site.xml). When the cluster's conf directory is not on the classpath, its core-site.xml is never read, so fs.default.name keeps its built-in default of file:///. That core-site.xml looks like this:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/ptrdu/tmp</value>
  </property>
</configuration>
Since core-site.xml is what points the client at hdfs://namenode:9000, skipping it leaves FileSystem.get(conf) bound to the local filesystem, and any hdfs:// path is then rejected with the Wrong FS error.
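A quick way to confirm this is to print what a bare Configuration actually resolves. The sketch below is only illustrative (the class name is a placeholder), and the comments assume no core-site.xml is on the classpath:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class CheckDefaultFs {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // With no core-site.xml on the classpath this prints the built-in
        // default from core-default.xml: file:///
        System.out.println(conf.get("fs.default.name"));
        // The FileSystem handle is therefore the local filesystem, which is
        // why an hdfs:// Path is rejected with "Wrong FS ... expected: file:///".
        System.out.println(FileSystem.get(conf).getUri());
    }
}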
The fix:
Configuration conf = new Configuration();
// Explicitly load the cluster's core-site.xml so fs.default.name points at HDFS.
conf.addResource(new Path("/home/ptrdu/hadoop-0.20.2/conf/core-site.xml"));
FileSystem fs = FileSystem.get(conf);
fs.copyFromLocalFile(new Path("/home/ptrdu/input/file1"), new Path("hdfs://namenode:9000/user/ptrdu/move_input/"));
要将"/home/ptrdu/hadoop-0.20.2/conf/core-site.xml"路径转化为Path型,不然仍会报错。