Problem:
10/12/08 20:10:31 INFO hdfs.DFSClient: Could not obtain block blk_XXXXXXXXXXXXXXXXXXXXXX_YYYYYYYY from any node: java.io.IOException: No live nodes contain current block
Solution:
A Hadoop HDFS datanode has an upper bound on the number of files it will serve at any one time. The parameter that controls this bound is called xcievers (yes, the name is misspelled). Before doing any heavy loading, make sure you have configured Hadoop's conf/hdfs-site.xml to set the xcievers value to at least the following:
<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>4096</value>
</property>
Be sure to restart HDFS after making this configuration change.
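As a sketch of the edit itself, the property can also be added or updated programmatically with Python's standard XML tooling. The file path and helper name here are assumptions for illustration; point it at your cluster's actual conf/hdfs-site.xml:

```python
import xml.etree.ElementTree as ET

def set_xcievers(conf_path, value="4096"):
    """Add or update dfs.datanode.max.xcievers in an hdfs-site.xml file.

    conf_path is a hypothetical path to the config file; adjust for
    your installation (e.g. $HADOOP_HOME/conf/hdfs-site.xml).
    """
    tree = ET.parse(conf_path)
    root = tree.getroot()  # the top-level <configuration> element

    # If the property already exists, update its <value> in place.
    for prop in root.findall("property"):
        if prop.findtext("name") == "dfs.datanode.max.xcievers":
            prop.find("value").text = value
            break
    else:
        # Otherwise append a new <property> block.
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = "dfs.datanode.max.xcievers"
        ET.SubElement(prop, "value").text = value

    tree.write(conf_path)
```

After writing the file, the datanodes still need a restart for the new limit to take effect, exactly as noted above.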

This article describes how to resolve the problem of a Hadoop HDFS datanode being unable to serve enough files at once. Raising the dfs.datanode.max.xcievers parameter in the Hadoop configuration file hdfs-site.xml increases the number of files a datanode can serve concurrently.
