Environment
Windows 7 64-bit
JDK 1.6.0_45 (i586)
JDK 1.7.0_51 (i586)
Eclipse Kepler
hadoop-eclipse-plugin-1.2.1.jar
Hadoop 1.2.1 (32-bit libraries)
The Hadoop version on the server is exactly the same as the local one.
Note: the username I set for the Hadoop master is root.
Correct configuration
There is no need to change the Windows username on the local machine. hadoop.tmp.dir is set in core-site.xml:
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/tmp</value>
  <description>A base for other temporary directories.</description>
</property>
When the WordCount job is run from Eclipse on Windows, it fails with:
12/04/24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:dell cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-root\mapred\staging\root-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-root\mapred\staging\root-519341271\.staging to 0700
The driver creates its Configuration like this, with the extra settings commented out:
Configuration conf = new Configuration();
//conf.set("dfs.permissions", "false");
//conf.set("hadoop.job.user", "root");
//conf.set("mapred.job.tracker", "xxx.xxx.xxx:9001");
The other error recorded in the log is a ClassNotFoundException:
java.lang.RuntimeException: java.lang.ClassNotFoundException: wdc.WordCount$TokenizerMapper
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:718)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
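In my experience this ClassNotFoundException usually means the job was submitted to a remote JobTracker without a job jar, so the child JVMs on the TaskTracker nodes have no way to load wdc.WordCount$TokenizerMapper. A common workaround when running straight from Eclipse (a sketch, not from the original post; the jar path is hypothetical) is to export the project as a jar (File > Export... > Java > JAR file) and point mapred.jar at it in the driver above:

// In the driver above, before constructing the Job:
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "xxx.xxx.xxx:9001");   // placeholder address
// Ship the project's classes to the cluster; without a job jar the tasks
// cannot resolve wdc.WordCount$TokenizerMapper. The path is hypothetical:
conf.set("mapred.jar", "C:\\workspace\\wdc\\wordcount.jar");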
The dfs.permissions property (in hdfs-site.xml) that had been changed earlier:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS. If "false", permission checking is turned off, but all other behavior is unchanged. Switching from one parameter value to the other does not change the mode, owner or group of files or directories.
  </description>
</property>
A Baidu search shows this is a known problem with hadoop-core-1.2.1.jar: file permissions are handled differently on Windows than on Linux, so the error does not occur on Linux. The fix is to comment out the body of the checkReturnValue method in hadoop-1.2.1/src/core/org/apache/hadoop/fs/FileUtil.java and rebuild hadoop-core-1.2.1.jar:
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission) throws IOException {
  // if (!rv) {
  //   throw new IOException("Failed to set permissions of path: " + p +
  //                         " to " +
  //                         String.format("%04o", permission.toShort()));
  // }
}
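Why does this check only fail on Windows? At least on Windows without the native libraries, FileUtil.setPermission relies on the return values of the plain java.io.File setReadable/setWritable/setExecutable calls, and the Windows JVM cannot apply some of the POSIX-style bits, so those calls return false. A quick standalone probe (my own sketch, not part of Hadoop) makes this visible:

import java.io.File;
import java.io.IOException;

// On Windows, making a file unreadable or non-executable through java.io.File
// is not supported, so the calls below typically return false there, while on
// Linux they return true. That failed return value is exactly what
// checkReturnValue turns into "Failed to set permissions of path ... to 0700".
public class PermissionProbe {
  public static void main(String[] args) throws IOException {
    File f = File.createTempFile("perm-probe", ".tmp");
    f.deleteOnExit();
    System.out.println("setReadable(false, false)   -> " + f.setReadable(false, false));
    System.out.println("setWritable(false, false)   -> " + f.setWritable(false, false));
    System.out.println("setExecutable(false, false) -> " + f.setExecutable(false, false));
  }
}

Commenting out checkReturnValue simply ignores those failed calls, which is generally harmless for a development setup on Windows.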
I did not feel like recompiling hadoop-core-1.2.1.jar myself, so I found an already-patched copy via Baidu and put it in the root directory of my local Hadoop installation.
About this problem: I had seen the error before, which shows that the earlier dfs.permissions change had no effect; the real solution is to recompile (or replace) hadoop-core-1.2.1.jar.
I then switched back to my own Win7 account and the job still ran successfully, which confirms that the Win7 username does not need to be changed.