Official download page: click to download
A Chinese-language documentation site: click here
Upload the archive to the server and extract it:

```
tar -xzvf logstash-5.5.1.tar.gz
```

Note: JDK 1.8 or later is required.
- 1. A simple stdin-to-stdout pipeline
Create a file named std_in_out.conf:
```
input {
  stdin {
  }
}
output {
  stdout {
  }
}
```
Start Logstash:

```
../bin/logstash -f std_in_out.conf
```

As the screenshot below shows, whatever you type on stdin is echoed straight back on stdout:
- 2. From a log file to stdout
First create a configuration file, tomcat.conf:
```
input {
  file {
    path => "/usr/java/elk/logstash-5.5.1/mytest/tomcat_test.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => "%{DATA:clientIp} - - \[%{HTTPDATE:accessTime}\] \"%{DATA:method} %{DATA:requestPath} %{DATA:httpversion}\" %{DATA:retcode} %{DATA:size} \"%{DATA:fromHtml}\" \"%{DATA:useragent}\""
    }
    remove_field => "message"
  }
  date {
    match => ["accessTime", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
```
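To see what fields the grok expression captures, here is a rough Python translation of it (the regex is hand-converted: `%{DATA}` becomes a non-greedy `.*?` and `%{HTTPDATE}` is approximated with a bracket-free character class, so treat it as a sketch rather than grok's exact behavior), applied to one of the sample access-log lines used below:

```python
import re

# Hand-translated approximation of the grok pattern above:
#   %{DATA}     -> .*?      (non-greedy)
#   %{HTTPDATE} -> [^\]]+   (rough stand-in: anything up to the closing bracket)
pattern = re.compile(
    r'(?P<clientIp>.*?) - - \[(?P<accessTime>[^\]]+)\] '
    r'"(?P<method>.*?) (?P<requestPath>.*?) (?P<httpversion>.*?)" '
    r'(?P<retcode>.*?) (?P<size>.*?) '
    r'"(?P<fromHtml>.*?)" "(?P<useragent>.*?)"'
)

line = ('192.168.80.123 - - [19/Oct/2017:13:45:29 +0800] '
        '"GET /mytest/rest/api01/v1.4.0?workIds=10086 HTTP/1.1" 200 78 '
        '"http://www.baidu.com/s?wd=www" '
        '"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"')

fields = pattern.match(line).groupdict()
print(fields["clientIp"], fields["method"], fields["retcode"], fields["size"])
```

The literal anchors between the capture groups (spaces, brackets, quotes) are what keep the non-greedy `DATA` captures from running into each other, which is also why the grok pattern works despite using the very loose `DATA` everywhere.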
Then create the file tomcat_test.log in the directory configured above.
Start Logstash and simulate log generation by redirecting lines such as the following into tomcat_test.log:

```
192.168.80.123 - - [19/Oct/2017:13:45:29 +0800] "GET /mytest/rest/api01/v1.4.0?workIds=10086 HTTP/1.1" 200 78 "http://www.baidu.com/s?wd=www" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"
192.168.80.123 - - [19/Oct/2017:13:50:29 +0800] "GET /mytest/rest/api02/v1.4.0?workIds=10086 HTTP/1.1" 200 78 "http://www.baidu.com/s?wd=www" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"
192.168.80.123 - - [19/Oct/2017:14:50:29 +0800] "GET /mytest/rest/api03/v1.4.0?workIds=10086 HTTP/1.1" 200 78 "http://www.baidu.com/s?wd=www" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)"
```
Start it:

```
../bin/logstash -f tomcat.conf
```

Redirect a line into the file:

```
echo "192.168.80.123 - - [19/Oct/2017:13:45:29 +0800] \"GET /mytest/rest/api01/v1.4.0?workIds=10086 HTTP/1.1\" 200 78 \"http://www.baidu.com/s?wd=www\" \"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)\"" >> tomcat_test.log
```

Check the standard output:
```
{
         "method" => "GET",
      "useragent" => "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
           "path" => "/usr/java/elk/logstash-5.5.1/mytest/tomcat_test.log",
     "@timestamp" => 2017-10-19T05:45:29.000Z,
           "size" => "78",
       "clientIp" => "192.168.80.123",
       "@version" => "1",
           "host" => "lijie",
    "httpversion" => "HTTP/1.1",
    "requestPath" => "/mytest/rest/api01/v1.4.0?workIds=10086",
     "accessTime" => "19/Oct/2017:13:45:29 +0800",
        "retcode" => "200",
       "fromHtml" => "http://www.baidu.com/s?wd=www"
}
```
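Note how the date filter turned the local accessTime (19/Oct/2017:13:45:29 +0800) into a UTC @timestamp (2017-10-19T05:45:29.000Z). The conversion can be double-checked in Python; the strptime format below is the hand-mapped equivalent of the Joda-Time pattern "dd/MMM/yyyy:HH:mm:ss Z" used in the config:

```python
from datetime import datetime, timezone

# Parse the zoned access time, then convert it to UTC,
# mirroring what Logstash's date filter does for @timestamp.
access_time = datetime.strptime("19/Oct/2017:13:45:29 +0800",
                                "%d/%b/%Y:%H:%M:%S %z")
utc = access_time.astimezone(timezone.utc)
print(utc.isoformat())  # 2017-10-19T05:45:29+00:00
```

13:45:29 at UTC+8 is 05:45:29 UTC, which matches the @timestamp in the rubydebug output above.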
- 3. From a file to Elasticsearch
Again, start by creating a configuration file, tomcat2es.conf:
```
input {
  file {
    path => "/usr/java/elk/logstash-5.5.1/mytest/tomcat_test.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => "%{DATA:clientIp} - - \[%{HTTPDATE:accessTime}\] \"%{DATA:method} %{DATA:requestPath} %{DATA:httpversion}\" %{DATA:retcode} %{DATA:size} \"%{DATA:fromHtml}\" \"%{DATA:useragent}\""
    }
    remove_field => "message"
  }
  date {
    match => ["accessTime", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
}
output {
  elasticsearch {
    hosts => "192.168.80.123"
  }
  stdout {
    codec => rubydebug
  }
}
```
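With only `hosts` set, the elasticsearch output uses port 9200 and writes events into a daily `logstash-%{+YYYY.MM.dd}` index. If you prefer a dedicated index, it can be set explicitly; a sketch (the index name here is just an example, not from the original config):

```
output {
  elasticsearch {
    hosts => "192.168.80.123"
    index => "tomcat-access-%{+YYYY.MM.dd}"  # hypothetical index name
  }
}
```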
Start Logstash:

```
../bin/logstash -f tomcat2es.conf
```

Then check Elasticsearch; here Kibana is used for viewing, as shown in the screenshot below:
- 4. From a file to Kafka
Start Kafka:

```
# start ZooKeeper
/usr/java/zookeeper/bin/zkServer.sh start
# start the Kafka broker
/usr/java/kafka_2.11-0.10.1.1/bin/kafka-server-start.sh -daemon /usr/java/kafka_2.11-0.10.1.1/config/server.properties
# create a topic
/usr/java/kafka_2.11-0.10.1.1/bin/kafka-topics.sh --create --zookeeper 192.168.80.123:2181 --replication-factor 1 --partitions 3 --topic logstash01
```
Edit the Logstash configuration file (log2kafka.conf):

```
input {
  file {
    path => "/home/hadoop/monitor/*.txt"
    discover_interval => 5
    start_position => "beginning"
  }
}
output {
  kafka {
    topic_id => "logstash01"
    codec => plain {
      format => "%{message}"
      charset => "UTF-8"
    }
    bootstrap_servers => "192.168.80.123:9092"
  }
}
```
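If you later want Logstash itself to read the topic back (for example, to forward those events on to Elasticsearch), the kafka input plugin can do it; a minimal sketch, assuming the same broker and topic as above:

```
input {
  kafka {
    bootstrap_servers => "192.168.80.123:9092"
    topics => ["logstash01"]
  }
}
```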
Run a Kafka console consumer:

```
/usr/java/kafka_2.11-0.10.1.1/bin/kafka-console-consumer.sh --zookeeper 192.168.80.123:2181 --topic logstash01 --from-beginning
```

Run Logstash:

```
/usr/java/elk/logstash-5.5.1/bin/logstash -f /usr/java/elk/logstash-5.5.1/execconf/log2kafka.conf
```
Result:
The monitored files already contained some data; append one more line with:

```
echo "hehe haha lala lele" >> log01.txt
```

Logstash picks up the new line and writes it to Kafka as well, where the console consumer prints it.