Source code: https://github.com/jimingkang/spark
Environment:
Scala 2.11.8
Spark: spark-2.0.2-bin-hadoop2.7
IDEA configuration reference: http://blog.youkuaiyun.com/javastart/article/details/43372977
Key code:
val conf = new SparkConf()
  .setAppName("Spark Pi")
  .setMaster("spark://192.168.1.104:7077")
  .setJars(Array("C:\\Users\\asys\\IdeaProjects\\TestScala\\out\\artifacts\\TestScala_jar\\TestScala.jar"))
val spark = new SparkContext(conf)
Of course, you must build the project first so that the jar file exists at
C:\\Users\\asys\\IdeaProjects\\TestScala\\out\\artifacts\\TestScala_jar\\TestScala.jar
You also need to start Spark on the Mac machine, and then add the following line at the beginning of the code:
new SparkConf().setAppName("Spark Pi").setMaster("spark://192.168.1.104:7077")
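Putting the pieces together, a minimal driver program might look like the sketch below. It follows the standard SparkPi Monte Carlo example shipped with Spark; the slice count, object name, master URL, and jar path here are illustrative and must match your own environment:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.math.random

// Sketch of a Pi-estimating driver submitted from IDEA to a remote
// standalone master. setJars ships the compiled artifact to the workers.
object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("Spark Pi")
      .setMaster("spark://192.168.1.104:7077")
      .setJars(Array("C:\\Users\\asys\\IdeaProjects\\TestScala\\out\\artifacts\\TestScala_jar\\TestScala.jar"))
    val sc = new SparkContext(conf)

    val slices = 2
    val n = 100000 * slices
    // Monte Carlo estimate: the fraction of random points in [-1, 1]^2
    // that fall inside the unit circle approaches π/4.
    val count = sc.parallelize(1 to n, slices).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")

    sc.stop()
  }
}
```

Running this requires the standalone master at 192.168.1.104:7077 to be up and the jar to have been built beforehand.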
Run result:
Pi is roughly 3.14646
16/12/10 17:33:50 INFO SparkUI: Stopped Spark web UI at http://192.168.1.103:4040
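The printed value 3.14646 comes from Monte Carlo sampling, so it varies slightly between runs. The core estimate, independent of Spark, can be sketched in plain Scala (names here are illustrative):

```scala
import scala.util.Random

// Local Monte Carlo estimate of Pi: sample points uniformly in [-1, 1]^2
// and count the fraction that land inside the unit circle (area ratio π/4).
object LocalPi {
  def estimatePi(n: Int, seed: Long = 42L): Double = {
    val rng = new Random(seed)
    val inside = (1 to n).count { _ =>
      val x = rng.nextDouble() * 2 - 1
      val y = rng.nextDouble() * 2 - 1
      x * x + y * y <= 1
    }
    4.0 * inside / n
  }

  def main(args: Array[String]): Unit =
    println(s"Pi is roughly ${estimatePi(200000)}")
}
```

The distributed version above does the same computation, but parallelizes the sampling across the cluster's workers.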