
spark: post archive by kevin_37723298
Debezium Captures the MySQL Binlog and Syncs It to Kafka; Spark Streaming Performs Real-Time Computation
Version info: Kafka (kafka_2.11-2.1.1), ZooKeeper (zookeeper-3.4.10), Spark (spark-2.1.0-bin-hadoop2.7), Debezium (debezium-connector-mysql-0.9.5.Final-plugin.tar.gz). Debezium download: https://repo1.maven.org/maven2/io/debezi… Original · 2020-09-02 10:46:28 · 1139 views · 0 comments
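The excerpt above is cut off before the connector setup. As a minimal sketch, a Debezium 0.9.x MySQL connector is typically registered with the Kafka Connect REST API using a JSON config along these lines; the hostname, credentials, database name, and topic names below are illustrative assumptions, not values from the post:

```json
{
  "name": "mysql-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "192.168.58.11",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz-pass",
    "database.server.id": "184054",
    "database.server.name": "dbserver1",
    "database.whitelist": "inventory",
    "database.history.kafka.bootstrap.servers": "192.168.58.11:9092",
    "database.history.kafka.topic": "dbhistory.inventory"
  }
}
```

Once registered, Debezium writes one change-event topic per table (e.g. `dbserver1.inventory.<table>`), which a Spark Streaming job can then consume like any other Kafka topic.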
Several Ways to Create Tables in SparkSQL
Sample data format: 7654,MARTIN,SALESMAN,7698,1981/9/28,1250,1400,30. Required imports: org.apache.spark.SparkConf, org.apache.spark.SparkContext, org.apache.spark.sql.catalyst.encoders.ExpressionEncoder, i… Original · 2018-11-25 20:42:46 · 16878 views · 0 comments
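The imports above suggest the reflection-based (case class) approach. A minimal sketch of the parsing step, assuming the sample row follows the classic EMP column order (empno, ename, job, mgr, hiredate, sal, comm, deptno) — an inference from the data, not something the excerpt states:

```scala
// Map one CSV line of the sample data onto a case class; inside a
// SparkSession, calling .toDF on an RDD of Emp then yields a table.
case class Emp(empno: Int, ename: String, job: String, mgr: Int,
               hiredate: String, sal: Int, comm: Int, deptno: Int)

def parseEmp(line: String): Emp = {
  val f = line.split(",")
  Emp(f(0).toInt, f(1), f(2), f(3).toInt, f(4),
      f(5).toInt, f(6).toInt, f(7).toInt)
}

// With Spark, this parser plugs in roughly as:
//   import spark.implicits._
//   val df = sc.textFile("emp.csv").map(parseEmp).toDF()
//   df.createOrReplaceTempView("emp")
```

The alternative the post's title hints at is the programmatic route: build a `StructType` schema and call `spark.createDataFrame(rowRDD, schema)` instead of relying on case-class reflection.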
Two Ways to Operate on Tables in SparkSQL
package com.kk.sparksql; imports: org.apache.spark.SparkConf, org.apache.spark.SparkContext, org.apache.spark.sql.catalyst.encoders.ExpressionEncoder, org.apache.spark.sql.Encoder, impor… Original · 2018-11-25 21:02:15 · 1276 views · 0 comments
Spark Streaming with Flume and MySQL (Flume Push Mode): Saving Data to MySQL in Real Time
Cluster layout: 192.168.58.11 spark01, 192.168.58.12 spark02, 192.168.58.13 spark03. Spark version: spark-2.1.0-bin-hadoop2.7; Flume version: apache-flume-1.7.0-bin. Flume launch command: bin/flume-ng agent -n a4 -f conf/a4.c… Original · 2018-11-30 16:58:58 · 311 views · 0 comments
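The a4.conf contents are truncated above. In push mode, Flume drives the data: the agent's Avro sink points at the host/port where the Spark Streaming receiver (`FlumeUtils.createStream`) is listening. A hedged sketch of such an agent config — the source type, spool directory, and port are assumptions, not values from the post:

```properties
# Hypothetical a4.conf for push mode
a4.sources = r1
a4.channels = c1
a4.sinks = k1

a4.sources.r1.type = spooldir
a4.sources.r1.spoolDir = /root/logs

a4.channels.c1.type = memory
a4.channels.c1.capacity = 10000

# Avro sink pushes events to the machine running the Spark receiver
a4.sinks.k1.type = avro
a4.sinks.k1.hostname = 192.168.58.11
a4.sinks.k1.port = 1234

a4.sources.r1.channels = c1
a4.sinks.k1.channel = c1
```

A caveat of push mode: the Spark receiver must already be running on that exact host/port before the agent starts, or Flume's sink will fail to connect.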
Spark Streaming with Flume and MySQL (Flume Pull Mode): Saving Data to MySQL in Real Time
Cluster layout: 192.168.58.11 spark01, 192.168.58.12 spark02, 192.168.58.13 spark03. Spark version: spark-2.1.0-bin-hadoop2.7; Flume version: apache-flume-1.7.0-bin. Flume launch command: bin/flume-ng agent -n a1 -f myagent/a… Original · 2018-11-30 16:59:52 · 331 views · 0 comments
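Pull mode inverts the flow: Flume writes into Spark's custom sink, `org.apache.spark.streaming.flume.sink.SparkSink`, which buffers events until the Spark job pulls them with `FlumeUtils.createPollingStream`. The spark-streaming-flume-sink jar (plus its Scala dependencies) must sit on Flume's classpath. A sketch with illustrative values, since the post's config is truncated:

```properties
# Hypothetical a1.conf for pull mode
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /root/logs

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Spark's sink buffers events; the streaming job polls this host/port
a1.sinks.k1.type = org.apache.spark.streaming.flume.sink.SparkSink
a1.sinks.k1.hostname = 192.168.58.11
a1.sinks.k1.port = 1234

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

Because the data waits in Flume until Spark asks for it, pull mode survives Spark restarts more gracefully than push mode and is generally the recommended transactional option of the two.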
Spark Streaming with Kafka and MySQL: Saving Data to MySQL in Real Time (Direct Approach)
Cluster layout: 192.168.58.11 spark01, 192.168.58.12 spark02, 192.168.58.13 spark03. Spark version: spark-2.1.0-bin-hadoop2.7; Kafka version: kafka_2.11-2.0.0. Spark Streaming program: package com.kk.sparkstreaming.kafka, import org… Original · 2018-12-03 10:42:31 · 5599 views · 3 comments
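In the direct approach there is no receiver: executors read Kafka partitions themselves, and the stream tracks offsets. A hedged sketch of the consumer configuration for Spark 2.1 with the kafka-0-10 integration; the broker address, group id, and topic are placeholders, and the Spark wiring is left in comments since it needs a running cluster:

```scala
// Consumer settings for KafkaUtils.createDirectStream (kafka-0-10 API).
// Deserializers are given as class-name strings so this snippet has no
// Kafka dependency; auto-commit is off so offsets move with the stream.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "192.168.58.11:9092",
  "key.deserializer" ->
    "org.apache.kafka.common.serialization.StringDeserializer",
  "value.deserializer" ->
    "org.apache.kafka.common.serialization.StringDeserializer",
  "group.id" -> "spark-demo",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

// val stream = KafkaUtils.createDirectStream[String, String](
//   ssc, PreferConsistent,
//   Subscribe[String, String](Seq("mytopic"), kafkaParams))
// stream.map(_.value).foreachRDD { rdd =>
//   rdd.foreachPartition { part =>
//     // open one JDBC connection per partition, insert rows into MySQL
//   }
// }
```

Opening the JDBC connection inside `foreachPartition` (not per record, and not on the driver) is the usual pattern when writing each micro-batch to MySQL.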
Spark Streaming with Kafka and MySQL: Saving Data to MySQL in Real Time (Receiver-Based Approach)
Cluster layout: 192.168.58.11 spark01, 192.168.58.12 spark02, 192.168.58.13 spark03. Spark version: spark-2.1.0-bin-hadoop2.7; Kafka version: kafka_2.11-2.0.0. Spark Streaming program: package com.kk.sparkstreaming.kafka, import org… Original · 2018-12-03 10:43:11 · 1004 views · 0 comments
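By contrast, the receiver-based approach runs a long-lived receiver that consumes through ZooKeeper and needs the write-ahead log enabled for reliability. A minimal sketch of its entry point; the ZooKeeper address, group, and topic name are placeholders, with the Spark call left commented for the same reason as above:

```scala
// Receiver-based consumption goes through the ZooKeeper quorum rather
// than the brokers directly; the map value is receiver threads per topic.
val zkQuorum = "192.168.58.11:2181"
val topics = Map("mytopic" -> 1)

// val lines = KafkaUtils
//   .createStream(ssc, zkQuorum, "spark-group", topics)
//   .map(_._2)  // the stream yields (key, message) pairs
```

The trade-off versus the direct approach: offsets live in ZooKeeper and the WAL duplicates data for fault tolerance, which is why the direct read is usually preferred on Spark 2.x.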