1. Install and configure JDK, MySQL, and Scala.
2. Install and configure Hadoop.
3. Install and configure Hive.
4. Install and configure Spark.
5. Copy the hive-site.xml file from Hive's conf directory into Spark's conf directory:
cp hive-site.xml /home/hadoop/app/spark-2.4.6-bin-hadoop2.7/conf
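What makes this copy step work is the metastore connection configuration inside hive-site.xml: Spark reads it to find the MySQL-backed Hive metastore. A minimal sketch of the relevant properties, assuming a local MySQL instance; the host, database name, user, and password below are placeholders, not values taken from this setup:

```xml
<configuration>
  <!-- JDBC URL of the MySQL database that backs the Hive metastore (placeholder host/db) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <!-- Driver class provided by the mysql-connector jar passed via --jars in step 6 -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- Placeholder credentials for the metastore database -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
```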
6. Start spark-shell, passing the MySQL JDBC driver jar so Spark can connect to the Hive metastore; this drops you into the Scala REPL:
spark-shell --master local[2] --jars ~/software/mysql-connector-java-5.1.27-bin.jar
7. Run a Spark SQL statement from the shell, for example to query a table:
scala> spark.sql("select * from emp1").show
20/07/17 20:26:43 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
+---+---------+
| id| name|
+---+---------+
| 1| Rafferty|
| 2| Jones|
| 3|Steinberg|
| 4| Robinson|
| 5| Smith|
| 6| John|
+---+---------+
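The shell session above can also be written as a standalone application. A minimal sketch, assuming the same Hive setup and the mysql-connector jar on the classpath; the object name HiveQueryDemo is hypothetical, the table emp1 is the one from this walkthrough, and the app must be run via spark-submit against the working metastore:

```scala
import org.apache.spark.sql.SparkSession

object HiveQueryDemo {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() tells Spark to use the Hive metastore
    // configured by the hive-site.xml copied into Spark's conf directory
    val spark = SparkSession.builder()
      .appName("HiveQueryDemo")
      .master("local[2]")
      .enableHiveSupport()
      .getOrCreate()

    // Same query as in the spark-shell session above
    spark.sql("select * from emp1").show()

    spark.stop()
  }
}
```

spark-shell builds an equivalent session automatically (exposed as `spark`), which is why the one-liner in step 7 works without any setup code.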