Creating the Maven project
First install Apache Maven, then create a new Maven project in the IDE.
Set the GroupId and ArtifactId yourself; the ArtifactId is usually the project name.
Click Finish.
Under File -> Project Structure -> Global Libraries, confirm the Scala SDK version.
Look at the resulting directory structure; you can rename the java folder to scala (if the project uses both languages, create two folders under main: java and scala).
In the scala folder, create a new Scala class named WordCount.
WordCount code and configuration
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val sconf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(sconf)
    // args(0) = input path, args(1) = output directory
    sc.textFile(args(0))
      .flatMap(_.split(" "))   // split each line into words
      .map((_, 1))             // pair each word with a count of 1
      .reduceByKey(_ + _)      // sum the counts per word
      .sortBy(_._2, false)     // sort by count, descending
      .saveAsTextFile(args(1))
    sc.stop()
  }
}
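To see what each step of the chained transformations does, here is a plain-Scala sketch of the same pipeline on ordinary collections (no Spark required); `WordCountLocal` and its `count` helper are illustrative names, not part of the project above:

```scala
// Collections analogue of the Spark pipeline: a Seq of lines stands in
// for the RDD produced by textFile.
object WordCountLocal {
  def count(lines: Seq[String]): Seq[(String, Int)] =
    lines
      .flatMap(_.split(" "))                         // split each line into words
      .map((_, 1))                                   // pair each word with a count of 1
      .groupBy(_._1)                                 // collections stand-in for reduceByKey
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum the counts per word
      .toSeq
      .sortBy(-_._2)                                 // descending, like sortBy(_._2, false)

  def main(args: Array[String]): Unit =
    count(Seq("a b a", "b a")).foreach(println)
}
```

The only real difference from the Spark version is that `reduceByKey` aggregates in a distributed, partition-wise way, while `groupBy` materializes all pairs in memory first.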
Configure pom.xml (adjust the settings marked by the comments inside to your own environment); after any change, re-import the Maven changes.
<dependencies>
<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<!-- Change the following dependencies to match your own Scala, Spark, and Hadoop versions -->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.5</version>
</dependency>
</dependencies>
<build>
<!-- Main source directory; adjust to your own path. If you have test sources, also add a testSourceDirectory. -->
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.15.2</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<!--<transformers>-->
<!--<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">-->
<!--<mainClass></mainClass>-->
<!--</transformer>-->
<!--</transformers>-->
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<useUniqueVersions>false</useUniqueVersions>
<classpathPrefix>lib/</classpathPrefix>
<!-- Change to your own package name + class name; right-click the class -> Copy Reference -->
<mainClass>WordCount</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
Packaging the project
Double-click package in the Maven panel.
The build-success output:
You can see the packaged jar, WordCount-1.0.jar.
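Equivalently, you can package from the command line instead of the IDE's Maven panel (assuming mvn is on your PATH and you run it from the project root):

```shell
# Compile, run the shade plugin, and write the jar into target/
mvn clean package
```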
Running
Run it on the Spark cluster under Linux.
Enter the command:
/usr/local/apps/spark-2.2.0-bin-hadoop2.6/bin/spark-submit \   (replace with your own Spark install path/bin/spark-submit)
--class WordCount \   (the class reference, usually package name + class name; use Copy Reference)
--master spark://hdp-node-01:7077 \   (the master URL)
--executor-memory 512m \
--total-executor-cores 2 \
/home/WordCount-1.0.jar \   (path to the jar)
hdfs://hdp-node-01:9000/aaa.txt hdfs://hdp-node-01:9000/test1   (input file, then output directory; the output directory is created at run time and must not already exist)
The run results:
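Once the job finishes, you can inspect the output on HDFS; the path below follows the submit command above, so adjust it to your own cluster:

```shell
# List the part files written by saveAsTextFile
hdfs dfs -ls /test1
# Print the (word,count) pairs, highest count first
hdfs dfs -cat /test1/part-00000
```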