Flume Learning (Part 3)

Flume usage examples

Example 1

Description

Listen for client connections on a port and print the received data.

Components

netcat source  ----> listens on a port
memory channel ----> the data channel
logger sink    ----> prints the received data

Configuration

  • Create an agent directory under conf
    mkdir -p conf/agent
  • Create example-net-mem-log.conf in the agent directory
    touch example-net-mem-log.conf
  • Configure the source, channel, and sink in example-net-mem-log.conf (as shown below)

    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    a1.sources.r1.type = netcat
    a1.sources.r1.bind = dev-hadoop-single.com
    a1.sources.r1.port = 4444

    a1.channels.c1.type = memory
    # maximum number of events the channel can hold
    a1.channels.c1.capacity = 1000
    # maximum number of events allowed in one transaction
    a1.channels.c1.transactionCapacity = 100

    a1.sinks.k1.type = logger

    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1
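
    Note: the memory channel buffers events in RAM only, so anything still in the channel is lost if the agent dies. If durability matters, a file channel can be swapped in with no other changes; a minimal sketch (the checkpoint/data paths are assumptions, any writable local directories work):

    a1.channels.c1.type = file
    # assumed paths -- any writable local directories work
    a1.channels.c1.checkpointDir = /opt/modules/flume-data/checkpoint
    a1.channels.c1.dataDirs = /opt/modules/flume-data/data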

Testing

  • Start Flume (note that --name must match the agent name used in the config file, a1 here)
    $ flume-ng agent --conf /opt/modules/apache-flume-1.5.0-cdh5.3.6-bin/conf/agent/ --conf-file /opt/modules/apache-flume-1.5.0-cdh5.3.6-bin/conf/agent/example-net-mem-log.conf --name a1 -Dflume.root.logger=INFO,console
  • Send messages with telnet

      $ telnet dev-hadoop-single.com 4444
      Trying 192.168.56.101...
      Connected to dev-hadoop-single.com.
      Escape character is '^]'.
      test
      OK
      message
      OK

      Flume output:

      16/10/19 18:40:23 INFO sink.LoggerSink: Event: { headers:{} body: 74 65 73 74 0D test. }
      16/10/19 18:40:26 INFO sink.LoggerSink: Event: { headers:{} body: 6D 65 73 73 61 67 65 0D message. }
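
      The logger sink prints each event body as raw hex bytes followed by a printable rendering; the trailing 0D is the carriage return telnet appends to each line. For illustration, the hex can be decoded back with xxd (not part of the original steps):

      $ echo "74 65 73 74 0D" | xxd -r -p | od -c
      0000000   t   e   s   t  \r
      0000005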

Example 2

Description

Nginx + Flume + HDFS

Components

exec source     x 1
memory channel  x 1
hdfs sink       x 1

Configuration

Flume configuration
agent.sources = r1
agent.channels = c1
agent.sinks = k1

# wiring
agent.sources.r1.channels = c1
agent.sinks.k1.channel    = c1

# exec source
agent.sources.r1.type          = exec
# tail -F (not -f) keeps following the file across Nginx log rotation
agent.sources.r1.command       = tail -F /home/hadoop/access.log
# replicating is the default selector; with a single channel it is a no-op
agent.sources.r1.selector.type = replicating


# memory channel
agent.channels.c1.type                = memory
agent.channels.c1.capacity            = 1000
agent.channels.c1.transactionCapacity = 100
#agent.channels.c1.byteCapacityBufferPercentage = 60
#agent.channels.c1.byteCapacity       = 12800000000
agent.channels.c1.keep-alive          = 60


# hdfs sink
agent.sinks.k1.type = hdfs
agent.sinks.k1.hdfs.path = hdfs://dev-hadoop-single.com:8020/flume/events-01/%Y-%m-%d
agent.sinks.k1.hdfs.fileType = DataStream
# file name prefix (default: FlumeData)
agent.sinks.k1.hdfs.filePrefix = log-spool
agent.sinks.k1.hdfs.fileSuffix = .log
# avoids premature file rolls on a single-node cluster where replication never reaches 3
agent.sinks.k1.hdfs.minBlockReplicas = 1
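
Two caveats worth knowing (the lines below are not in the original config; a hedged sketch). First, the %Y-%m-%d escapes in hdfs.path need a timestamp on each event; the exec source does not add one, so a timestamp interceptor or hdfs.useLocalTimeStamp is typically required. Second, the HDFS sink's default roll settings (rollInterval 30 s, rollSize 1024 bytes, rollCount 10 events) roll a new file every 10 events, which is what produces the flood of tiny files in the test output below:

# bucket by the agent's local time instead of an event timestamp header
agent.sinks.k1.hdfs.useLocalTimeStamp = true

# roll far less aggressively: every 10 minutes or 128 MB, never by event count
agent.sinks.k1.hdfs.rollInterval = 600
agent.sinks.k1.hdfs.rollSize     = 134217728
agent.sinks.k1.hdfs.rollCount    = 0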
Testing

Start Flume

$ flume-ng agent --conf /opt/modules/apache-flume-1.5.0-cdh5.3.6-bin/conf/agent/ --conf-file /opt/modules/apache-flume-1.5.0-cdh5.3.6-bin/conf/agent/example-exec-mem-hdfs.conf --name agent -Dflume.root.logger=INFO,console

Write data (append an archived Nginx log into the tailed file to simulate traffic)

$ cat access_log >> access.log
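
If no archived access_log is at hand, a loop can fake a steady stream of entries (a hypothetical stand-in for real Nginx traffic):

$ while true; do echo "127.0.0.1 - - [19/Oct/2016:19:03:00 +0800] \"GET / HTTP/1.1\" 200 612" >> /home/hadoop/access.log; sleep 0.1; done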

Test results

16/10/19 19:03:46 INFO hdfs.BucketWriter: Creating hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026338.tmp
16/10/19 19:05:36 INFO hdfs.BucketWriter: Closing hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026338.tmp
16/10/19 19:05:36 INFO hdfs.BucketWriter: Renaming hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026338.tmp to hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026338
16/10/19 19:05:36 INFO hdfs.BucketWriter: Creating hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026339.tmp
16/10/19 19:05:37 INFO hdfs.BucketWriter: Closing hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026339.tmp
16/10/19 19:05:37 INFO hdfs.BucketWriter: Renaming hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026339.tmp to hdfs://dev-hadoop-single.com:8020/flume/events-01/2016-10-19/log-spool.1476875026339
... (the same Creating/Closing/Renaming cycle repeats through log-spool.1476875026354: 17 files in roughly two minutes, the small-file symptom of the default roll settings noted above)
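
To verify the ingested data on HDFS (a usage sketch; the path follows hdfs.path above):

$ hdfs dfs -ls /flume/events-01/2016-10-19
$ hdfs dfs -cat /flume/events-01/2016-10-19/log-spool.1476875026338 | head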