Tonight I worked through installing and configuring IntelliJ IDEA.
First, installation. I installed the following on Windows 7:
IntelliJ IDEA 13.1.4 Community Edition
apache-maven-3.2.3-bin.zip
Setting up an environment for reading the Spark source code:
Before using IntelliJ IDEA for this, install the Scala plugin: click Configure, then Plugins, then Browse repositories..., type scala in the search box, and select the Scala plugin to install. When the installation finishes, IntelliJ IDEA will ask to restart. After the restart, click Create New Project and set Project SDK to the JDK installation directory; it is best to keep the JDK version in the development environment consistent with the JDK version on the Spark cluster. Click Maven on the left, check Create from archetype, and choose org.scala-tools.archetypes:scala-archetype-simple. Click Next, then fill in GroupId, ArtifactId, and Version as needed. Click Next again; if Maven is not installed on this machine an error will be reported, so make sure Maven has already been installed. Click Next once more, enter the project name, and the last step of New Project is done.
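For reference, the same archetype project can also be generated outside the IDE with standard Maven; a sketch, where the groupId/artifactId/version values are placeholders of my own, not from the steps above:

mvn archetype:generate -DinteractiveMode=false -DarchetypeGroupId=org.scala-tools.archetypes -DarchetypeArtifactId=scala-archetype-simple -DgroupId=com.example -DartifactId=spark-demo -Dversion=1.0-SNAPSHOT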
After you click Finish, Maven automatically generates pom.xml and downloads the dependencies. We need to change the Scala version in pom.xml:
<properties>
  <scala.version>2.10.4</scala.version>
</properties>
Add the following between <dependencies></dependencies>:
<!-- Spark -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
</dependency>

<!-- HDFS -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version>
</dependency>

With that, the Spark development environment is set up.
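To check that the dependencies resolve correctly, a minimal smoke test like the following can be run; the object name SparkTest and the local[2] master are my own choices for illustration, not part of the original setup:

import org.apache.spark.SparkContext._
import org.apache.spark.{SparkConf, SparkContext}

object SparkTest {
  def main(args: Array[String]): Unit = {
    // Run against a local master so no cluster is needed for the smoke test.
    val conf = new SparkConf().setAppName("SparkTest").setMaster("local[2]")
    val sc = new SparkContext(conf)
    // Tiny word count: parallelize a few strings and count duplicates.
    val counts = sc.parallelize(Seq("spark", "hdfs", "spark"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)
    sc.stop()
  }
}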
But this setup is only convenient for browsing the Spark source code; it is not a programming environment.
To start, choose Close Project from the File menu, then Create New Project. Click Scala, click Non-SBT, click Next and enter the Project name. In the Set Scala Home field, download the matching version of the Scala package, then confirm to enter the development interface.
Open Project Structure and select Modules. Under Sources, add a New Folder named main to src, then add a New Folder named scala to main, and mark both main and scala as Sources (right-click). Under Libraries, click the plus sign and select the local Spark-Hadoop jar (for example, on the D drive: spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar). Click Apply, then OK. Add a Package named text under scala and create a Scala object in it.
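A minimal sketch of such an object; the package name text comes from the step above, while the object name WordCount and the args-based input/output paths are hypothetical placeholders of my own:

package text

import org.apache.spark.SparkContext._
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Master is taken from the environment (e.g. spark-submit), not hard-coded.
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)
    // args(0): input path, args(1): output path -- both supplied at run time.
    val counts = sc.textFile(args(0))
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.saveAsTextFile(args(1))
    sc.stop()
  }
}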
Use Maven to compile the source code. Click the tool-window button in the bottom-left corner, open the Maven Projects panel on the right, click package, and click the green triangle to start the build.
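If you prefer the command line, the equivalent build is plain Maven usage, run from the project root:

mvn clean package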
Under the target directory you can see the jar that Maven generated; this is the one that needs to be put on the Spark cluster to run:
java -jar xxxxxxxx.jar
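Note that on a Spark cluster the usual way to launch such a jar is spark-submit from the Spark distribution rather than java -jar. A sketch, where the class name refers to the hypothetical WordCount above and the master URL and input/output arguments are placeholders:

bin/spark-submit --class text.WordCount --master spark://host:7077 xxxxxxxx.jar <input-path> <output-path>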