哆啦丶赫
Articles in this column
Offline customer recommendation module

package com.sdg.statistics

import java.text.SimpleDateFormat
import java.util.Date
import org.apache.spark.SparkConf
import org.apache.spark.sql.{Dataset, SparkSession}

object StatisticsRecommender {
  val MONGO_URI: String = "mongodb://exam1:27017/recom3"
  …

Original · 2020-06-08 22:36:06 · 302 reads · 0 comments
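A logic-only sketch of one statistic an offline job like this typically computes: the average rating per product. The real post runs its aggregations with Spark over a MongoDB-backed ratings collection (the MONGO_URI above); the truncated preview does not show the actual queries, so the (productId, score) pairs and the averaging step here are illustrative assumptions.

```scala
// Sketch: average rating per product from hypothetical (productId, score)
// pairs. The real job would read these from MongoDB via Spark SQL.
object StatisticsSketch {
  def averageScore(ratings: Seq[(Int, Double)]): Map[Int, Double] =
    ratings
      .groupBy(_._1)                               // group by productId
      .map { case (pid, rs) =>
        (pid, rs.map(_._2).sum / rs.size)          // mean of the scores
      }
}
```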
HBase insert, delete, and update

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;

public class HBaseUtil {
    // Create the connection object
    public static Connection connecti…

Original · 2020-05-12 11:26:13 · 210 reads · 0 comments
Kafka-To-Hbase util package: PropertiesUtil

import java.io.InputStream
import java.util.Properties

// Read entries from the configuration file
object PropertiesUtil {
  val is: InputStream = ClassLoader.getSystemResourceAsStream("hbase_consumer.properties")
  var properties = new Properties
  properties.load(is)
  // Look up the value for a given key
  def getPr…

Original · 2020-05-11 22:17:47 · 232 reads · 0 comments
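A minimal, self-contained sketch of the PropertiesUtil idea: load a `java.util.Properties` once and expose a lookup by key. Since the contents of `hbase_consumer.properties` are not shown in the post, the sketch inlines a hypothetical properties text instead of reading a classpath resource; the key names are assumptions.

```scala
import java.io.StringReader
import java.util.Properties

// Sketch of PropertiesUtil: parse the properties once at object init,
// then answer lookups. The inlined text stands in for the real
// hbase_consumer.properties file (keys/values are hypothetical).
object PropertiesUtilSketch {
  private val sample =
    """bootstrap.servers=exam1:9092
      |group.id=hbase_consumer
      |""".stripMargin

  private val properties = new Properties
  properties.load(new StringReader(sample))

  // Returns null for a missing key, matching java.util.Properties.
  def getProperty(key: String): String = properties.getProperty(key)
}
```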
Kafka-To-Hbase util package: HBaseUtil

import java.text.DecimalFormat
import java.util
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.{Admin, Connection, ConnectionFactory}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.{HColumnDescri…

Original · 2020-05-11 22:16:50 · 252 reads · 0 comments
Kafka-To-Hbase util class: ConnectionInstance

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}

// Provides the shared connection
object ConnectionInstance {
  private var conn: Connection = null
  def getConnection(conf: Configuration): Connection = {
    if (conn == nul…

Original · 2020-05-11 22:15:33 · 154 reads · 0 comments
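The pattern above is a lazily initialized singleton: build the expensive HBase connection on the first call and hand the same instance back afterwards. A runnable sketch of just that pattern, with a plain `String` standing in for the real `Connection` so no Hadoop classes are needed (in the real object the branch would call `ConnectionFactory.createConnection(conf)`):

```scala
// Sketch of the ConnectionInstance pattern: cache one "connection" and
// reuse it on every later call, regardless of the conf passed in.
object ConnectionInstanceSketch {
  private var conn: String = null
  var creations = 0 // counts how many times the "connection" was built

  def getConnection(conf: String): String = {
    if (conn == null) {            // build lazily on first request only
      creations += 1
      conn = s"connection-to-$conf"
    }
    conn                           // later calls return the cached value
  }
}
```

Note that, exactly as in the original, a second call with a different configuration still returns the first connection; the null check alone also makes this not thread-safe.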
Kafka-To-Hbase: sending the consumed data to HBase

// Avoid clashes when naming the package
import org.apache.kafka.clients.consumer.{ConsumerRecords, KafkaConsumer}
import java.util
import com.sdg.consumer.myutils.PropertiesUtil
import com.sdg.consumer.myhbase.HbaseDao
import scala.collection.JavaConversions._

/** KafkaToHbase writes the data…

Original · 2020-05-11 22:11:31 · 311 reads · 0 comments
Kafka-To-Hbase: the HBaseDao layer

import java.text.SimpleDateFormat
import java.util
import com.sdg.consumer.myutils.{ConnectionInstance, HBaseUtil, PropertiesUtil}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.…

Original · 2020-05-11 22:09:27 · 202 reads · 0 comments
HBaseSink test module (part 2)

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object HBaseSinkTest {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    …

Original · 2020-04-14 02:19:49 · 180 reads · 0 comments
HBaseSink test module

// Case class for the input records
case class Stu(id: Int, name: String, course: String, score: Int)

object HBseSinkTest {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.set…

Original · 2020-04-14 01:47:49 · 690 reads · 0 comments
A Flink JDBC-based Source example

object JDBCSourceTest {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val inputMysql: DataSet[Row] = MyJDBCRead(env)
    inputMysql.map(r => (r.getField(…

Original · 2020-04-14 00:31:47 · 1312 reads · 0 comments
CEP payment-monitoring module

// Case class for input events
case class OrderEvent(orderid: Long, eventType: String, eventTime: Long)
// Case class for the detection result
case class OrderResult(orderid: Long, resultMsg: String)

object OrderTimeout {
  def main(args: Array[String]): Un…

Original · 2020-04-13 22:55:20 · 312 reads · 0 comments
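A logic-only sketch of the payment-timeout check, without the Flink CEP runtime: for each order, look for a "create" event followed by a "pay" event within a timeout, and otherwise report the order as timed out. The case-class shapes follow the post; the event-type strings, result messages, and the 15-minute (900 s) timeout are assumptions, since the truncated preview does not show the actual pattern.

```scala
case class OrderEvent(orderid: Long, eventType: String, eventTime: Long)
case class OrderResult(orderid: Long, resultMsg: String)

// Batch sketch of the CEP rule: one result per order, checking whether
// "pay" followed "create" within timeoutSecs (eventTime in seconds).
object OrderTimeoutSketch {
  def check(events: Seq[OrderEvent], timeoutSecs: Long = 900): Seq[OrderResult] =
    events.groupBy(_.orderid).toSeq.sortBy(_._1).map { case (id, evs) =>
      val create = evs.find(_.eventType == "create")
      val pay    = evs.find(_.eventType == "pay")
      (create, pay) match {
        case (Some(c), Some(p)) if p.eventTime - c.eventTime <= timeoutSecs =>
          OrderResult(id, "payed successfully")
        case _ =>
          OrderResult(id, "order timeout")
      }
    }
}
```

In the real job the same rule is a `Pattern` with a `within(...)` clause, and the timed-out matches come out of a side output.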
Malicious-login monitoring: if the same user (possibly from different IPs) fails to log in twice in a row within 2 seconds, raise an alert.

// Case class for input events
case class LoginEvent(userid: Long, ip: String, eventType: String, eventTime: Long)
// Case class for the intermediate alert output
case class Warning(userid: Long, firstFailTime: Long, lastFailTime: Long, warning: String)

object LoginFai…

Original · 2020-04-13 20:15:51 · 564 reads · 0 comments
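A logic-only sketch of the rule, without the Flink runtime: scan each user's events in time order and emit a `Warning` whenever two consecutive "fail" events are at most 2 seconds apart. The IP is deliberately ignored, matching the "possibly from different IPs" requirement; the event-type string "fail" and the warning message are assumptions, since the preview is truncated.

```scala
case class LoginEvent(userid: Long, ip: String, eventType: String, eventTime: Long)
case class Warning(userid: Long, firstFailTime: Long, lastFailTime: Long, warning: String)

// Batch sketch of the 2-second double-failure rule (eventTime in seconds).
object LoginFailSketch {
  def detect(events: Seq[LoginEvent]): Seq[Warning] =
    events.groupBy(_.userid).toSeq.sortBy(_._1).flatMap { case (uid, evs) =>
      evs.sortBy(_.eventTime)
        .sliding(2)                          // adjacent event pairs per user
        .collect {
          case Seq(a, b)
            if a.eventType == "fail" && b.eventType == "fail" &&
               b.eventTime - a.eventTime <= 2 =>
            Warning(uid, a.eventTime, b.eventTime, "two login failures within 2s")
        }
        .toSeq
    }
}
```

The streaming version keeps the previous failure in keyed state (or uses a CEP pattern `fail next fail within 2s`) instead of sorting a finished batch.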
Real-time hot-item statistics from user-behavior logs: rank the hottest items over the last hour, refreshed every 5 minutes, with popularity measured by page views ("pv")

// Producer
object KafkaProducerTask {
  // Entry point
  def main(args: Array[String]): Unit = {
    writeToKafka("hotitem")
  }
  def writeToKafka(topic: String): Unit = {
    val pro = new Properties()
    pro.setProperty("bootstra…

Original · 2020-04-13 20:12:04 · 1074 reads · 0 comments
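A logic-only sketch of the ranking step, without Kafka or Flink: inside one window, count "pv" events per item and return the top-n items by view count. In the real job this runs per sliding window (1-hour size, 5-minute slide) over keyed state; the `(itemId, behavior)` input shape and the tie-break by smaller itemId are assumptions for the sketch.

```scala
// Sketch of the hot-items ranking for a single window's worth of events.
object HotItemsSketch {
  // events: (itemId, behavior) pairs that fall inside the current window
  def topN(events: Seq[(Long, String)], n: Int): Seq[(Long, Int)] =
    events
      .filter(_._2 == "pv")                        // popularity = page views only
      .groupBy(_._1)
      .map { case (item, evs) => (item, evs.size) }
      .toSeq
      .sortBy { case (item, cnt) => (-cnt, item) } // most viewed first
      .take(n)
}
```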
Collecting streaming data with Flume into Kafka, then consuming it with Flink to count phone numbers (variant 2)

object PhoneTest {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val pro = new Properties()
    pro.setProperty("bootstrap.serv…

Original · 2020-04-13 20:08:12 · 291 reads · 0 comments
Collecting streaming data with Flume into Kafka, then consuming it with Flink to count phone numbers

Start the Flume agent that ships the collected streaming data to Kafka:

bin/flume-ng agent --conf-file ./job/flume-kafka.conf -c conf/ --name a1 -Dflume.root.logger=DEBUG,console

case class Tel(iphone: Long, timestamp: Long)

object zuoye1 {
  def main(args:…

Original · 2020-04-13 20:03:04 · 677 reads · 0 comments
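A logic-only sketch of the count the Flink job performs after consuming the Flume-fed Kafka topic: map each `Tel` record to `(iphone, 1)` and sum per number, i.e. the classic `keyBy` + `sum` pipeline, here over a finished batch. The `Tel` shape is from the post; the phone numbers in the test are hypothetical.

```scala
case class Tel(iphone: Long, timestamp: Long)

// Sketch of the phone-number count over a batch of records.
object PhoneCountSketch {
  def countByPhone(records: Seq[Tel]): Map[Long, Int] =
    records
      .map(t => (t.iphone, 1))                                  // same shape as the map stage
      .groupBy(_._1)                                            // stands in for keyBy(_._1)
      .map { case (phone, hits) => (phone, hits.map(_._2).sum) } // sum the 1s
}
```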