Downloading Hadoop with Git and Building a Local Eclipse Development Environment

This post records a problem I ran into while following the official guide to set up a Hadoop development environment in Eclipse, and how it was resolved. The compilation failed because Protocol Buffers was not installed locally; installing Protocol Buffers fixed the build.


[b][size=x-large]Problem Scenario[/size][/b]
According to the official guide at [url]http://wiki.apache.org/hadoop/EclipseEnvironment[/url], downloading Hadoop locally and building an Eclipse development environment takes only three commands:

$ git clone git://git.apache.org/hadoop-common.git
$ mvn install -DskipTests
$ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

However, after running the second command locally, I got the following error log:

[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (compile-proto) @ hadoop-common ---
[INFO] Executing tasks

main:
[exec] target/compile-proto.sh: line 17: protoc: command not found
[exec] target/compile-proto.sh: line 17: protoc: command not found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [2.389s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.698s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.761s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.729s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.353s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [1.998s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.227s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.132s]
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-yarn-api ................................... SKIPPED
[INFO] hadoop-yarn-common ................................ SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-server-common ......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.483s
[INFO] Finished at: Mon Jan 30 22:57:23 GMT+08:00 2012
[INFO] Final Memory: 24M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common


At this point I had not yet worked out the exact cause or a fix, so I simply recorded the failure.
[b][size=x-large]Analysis[/size][/b]
I re-ran the build with the -e switch to print more detailed error information:
$ mvn install -DskipTests -e

This produced the following detailed error:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.387s
[INFO] Finished at: Mon Jan 30 23:11:07 GMT+08:00 2012
[INFO] Final Memory: 19M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 127
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 127
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: /Users/apple/Documents/Hadoop-common-dev/hadoop-common/hadoop-common-project/hadoop-common/target/antrun/build-main.xml:23: exec returned: 127
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor16.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
... 21 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]

With this more detailed output, it becomes much easier to track down a solution.

Following the hint above, I visited [url]https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException[/url]
and found the following explanation:
Unlike many other errors, this exception is not generated by the Maven core itself but by a plugin. As a rule of thumb, plugins use this error to signal a problem in their configuration or the information they retrieved from the POM.
In other words, this error does not come from Maven itself; as a rule of thumb, one of the plugins used during the build raises this exception to signal that it did not get the configuration or information it needed from the POM.
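
In addition, the log above already printed "protoc: command not found", and an exit status of 127 from sh is the conventional "command not found" code. So a quick sanity check is to see whether protoc is visible on the PATH at all (a minimal sketch using standard shell commands, not part of the Hadoop build itself):

$ command -v protoc || echo "protoc is not on the PATH"
$ protoc --version    # prints something like "libprotoc 2.x" if Protocol Buffers is installed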

The next step was to confirm which plugin the Hadoop Maven build was using at the point of failure.
From the error log, the build invokes the maven-antrun-plugin. Since the failure happened while building hadoop-common, I looked at the pom.xml of the hadoop-common project and found the following:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>compile-proto</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo file="target/compile-proto.sh">
            PROTO_DIR=src/main/proto
            JAVA_DIR=target/generated-sources/java
            which cygpath 2> /dev/null
            if [ $? = 1 ]; then
              IS_WIN=false
            else
              IS_WIN=true
              WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
              WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
            fi
            mkdir -p $JAVA_DIR 2> /dev/null
            for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
            do
              if [ "$IS_WIN" = "true" ]; then
                protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
              else
                protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
              fi
            done
          </echo>
          <exec executable="sh" dir="${basedir}" failonerror="true">
            <arg line="target/compile-proto.sh"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>compile-test-proto</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo file="target/compile-test-proto.sh">
            PROTO_DIR=src/test/proto
            JAVA_DIR=target/generated-test-sources/java
            which cygpath 2> /dev/null
            if [ $? = 1 ]; then
              IS_WIN=false
            else
              IS_WIN=true
              WIN_PROTO_DIR=`cygpath --windows $PROTO_DIR`
              WIN_JAVA_DIR=`cygpath --windows $JAVA_DIR`
            fi
            mkdir -p $JAVA_DIR 2> /dev/null
            for PROTO_FILE in `ls $PROTO_DIR/*.proto 2> /dev/null`
            do
              if [ "$IS_WIN" = "true" ]; then
                protoc -I$WIN_PROTO_DIR --java_out=$WIN_JAVA_DIR $PROTO_FILE
              else
                protoc -I$PROTO_DIR --java_out=$JAVA_DIR $PROTO_FILE
              fi
            done
          </echo>
          <exec executable="sh" dir="${basedir}" failonerror="true">
            <arg line="target/compile-test-proto.sh"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>save-version</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <mkdir dir="${project.build.directory}/generated-sources/java"/>
          <exec executable="sh">
            <arg
              line="${basedir}/dev-support/saveVersion.sh ${project.version} ${project.build.directory}/generated-sources/java"/>
          </exec>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>generate-test-sources</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <mkdir dir="${project.build.directory}/generated-test-sources/java"/>
          <taskdef name="recordcc" classname="org.apache.hadoop.record.compiler.ant.RccTask">
            <classpath refid="maven.compile.classpath"/>
          </taskdef>
          <recordcc destdir="${project.build.directory}/generated-test-sources/java">
            <fileset dir="${basedir}/src/test/ddl" includes="**/*.jr"/>
          </recordcc>
        </target>
      </configuration>
    </execution>
    <execution>
      <id>create-log-dir</id>
      <phase>process-test-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!--
            TODO: there are tests (TestLocalFileSystem#testCopy) that fail if data
            TODO: from a previous run is present
          -->
          <delete dir="${test.build.data}"/>
          <mkdir dir="${test.build.data}"/>
          <mkdir dir="${hadoop.log.dir}"/>
          <copy toDir="${project.build.directory}/test-classes">
            <fileset dir="${basedir}/src/main/conf"/>
          </copy>
        </target>
      </configuration>
    </execution>
    <execution>
      <phase>pre-site</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <tasks>
          <copy file="src/main/resources/core-default.xml" todir="src/site/resources"/>
          <copy file="src/main/xsl/configuration.xsl" todir="src/site/resources"/>
        </tasks>
      </configuration>
    </execution>
  </executions>
</plugin>

The above is the maven-antrun-plugin configuration used by Maven in pom.xml. It contains this line:
<echo file="target/compile-proto.sh">

which looks puzzling at first: it writes out a shell script that calls protoc. Cross-referencing the HowToContribute wiki page [url]http://wiki.apache.org/hadoop/HowToContribute[/url], it can be inferred that the failure was most likely caused by Protocol Buffers not being installed locally, because that page explicitly states: [quote]Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work.[/quote] So the plan was to install Protocol Buffers locally and then compile and build again.
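
A minimal sketch of installing Protocol Buffers from a source tarball on a Unix-like system is shown below. The version number and download path are only illustrative; use whatever protobuf release your Hadoop branch expects, or a package manager if you prefer:

$ tar zxf protobuf-2.4.1.tar.gz   # illustrative release; pick the version your branch needs
$ cd protobuf-2.4.1
$ ./configure
$ make
$ sudo make install
$ protoc --version                # should now report the installed libprotoc version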


As expected, once Protocol Buffers was installed locally, the remaining two commands ran to completion. After that, all that was left was to import the projects under the source directory into Eclipse as described on the official wiki page, and the Hadoop source can then be studied and debugged from within Eclipse.
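
One small time-saver when retrying: the modules that already built successfully do not need to be rebuilt. As the failed build's own output suggests, the build can be resumed from the module that failed:

$ mvn install -DskipTests -rf :hadoop-common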