1. First, we should set up Java, Hadoop, and IntelliJ IDEA. I have installed Java 1.8, Hadoop 2.7.3 and IntelliJ IDEA 2016.3.
2. Start up HDFS and YARN.
3. Create a new Maven project, then click Next.
4. Input the GroupId and ArtifactId, then select Next.
5. Input the project name and location, and click Finish.
6. Set the run configuration via Run -> Edit Configurations. Set Program arguments to the input file path and the output file path (they can be in HDFS), then click OK. A driver sketch that consumes these arguments is shown below.
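For orientation, here is a minimal sketch of a driver class that reads those two Program arguments; the class name Driver, the job name, and the commented-out Mapper/Reducer names are illustrative only, not the project's actual code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Driver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "example job");
        job.setJarByClass(Driver.class);
        // Set your own Mapper/Reducer classes here, e.g.
        // job.setMapperClass(MyMapper.class);
        // job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // args[0] and args[1] are the Program arguments from the run configuration
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}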
7. Select File -> Project Structure to add the Hadoop jars, which are the common, yarn, mapreduce and hdfs jars from $HADOOP_HOME/share/hadoop. Click OK.
8. Then you can run your MapReduce program locally. A placeholder Mapper/Reducer pair is sketched below.
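What the program itself does is up to you; as a placeholder only, a word-count style Mapper and Reducer pair might look like this (class names are made up for illustration):

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCountExample {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // emit (word, 1) for each whitespace-separated token in the line
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // sum the counts for each word
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }
}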
9. If you want to commit the project to the Hadoop platform, you must set up an artifact to create the project jar. In this step, select File -> Project Structure to add an artifact. Click "+" and enter a name. Then click "+" under Output Layout. Finally, set Main Class to your project's main class. Click OK.
10. Build the artifact. Select Build -> Build Artifacts... -> MatrixMultiply (the artifact you created in the previous step).
11. Commit the project to the Hadoop platform. You should add the following code to your project (a fuller driver sketch is given after step 13):
conf.set("mapred.jar", "./out/artifacts/MatrixMultiply/MatrixMultiply.jar");
conf.set("mapreduce.framework.name", "yarn");12.Look at the result:According to the log , we can get that the job have committed to hadoop platform.
13. We can view the job's information at http://localhost:8088.
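For reference, here is a sketch of where the two conf.set calls from step 11 sit in the driver, before the Job is created. The artifact path must match the jar built in step 10, and the addresses in the comments are assumed defaults, not values taken from this particular setup:

Configuration conf = new Configuration();
// point the job at the jar built by IDEA in step 10
conf.set("mapred.jar", "./out/artifacts/MatrixMultiply/MatrixMultiply.jar");
// submit to YARN instead of running with the local job runner
conf.set("mapreduce.framework.name", "yarn");
// Depending on the cluster configuration, these may also be needed (assumed defaults):
// conf.set("fs.defaultFS", "hdfs://localhost:9000");
// conf.set("yarn.resourcemanager.hostname", "localhost");
Job job = Job.getInstance(conf, "MatrixMultiply");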
This article explains in detail how to install and configure Java, Hadoop and IntelliJ IDEA locally, and walks through the steps to create and run a MapReduce project, including setting the run configuration, adding the Hadoop dependency libraries, and building the project jar.