This post uses grep, awk, cut, and similar tools to compile statistics from and analyze nginx access logs, as follows:
1. List the IP addresses with the most hits for the day
cut -d- -f 1 /usr/local/nginx/logs/20160329/access_2016032913.log | sort | uniq -c | sort -rn | head -20
Note that uniq -c only collapses adjacent duplicate lines, so the input must be sorted first; the session below was captured without the initial sort, which is why the same IP appears under several separate counts.
[root@httpservera 20160329]# cut -d- -f 1 /usr/local/nginx/logs/20160329/access_2016032913.log |uniq -c | sort -rn | head -20
69 180.116.214.31
45 180.116.214.31
45 180.116.214.31
36 49.80.54.111
35 183.206.185.204
35 180.116.214.31
32 49.80.54.111
32 49.80.54.111
32 180.116.214.31
31 117.136.45.101
29 180.116.214.31
28 218.205.19.112
28 180.116.214.31
28 180.116.214.31
27 49.80.54.111
27 222.185.248.242
24 49.80.54.111
24 175.0.8.161
23 49.80.54.111
23 49.80.54.111
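If the log follows nginx's default combined format, the client IP is simply the first whitespace-separated field, so the same ranking can also be produced with awk instead of cut; a minimal sketch under that assumption:
awk '{print $1}' access_2016032913.log | sort | uniq -c | sort -rn | head -20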
2. Count how many times a specific page was accessed
[root@httpservera 20160329]# grep "/index.php" log_file | wc -l
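grep can also count matching lines itself via its -c flag, which drops the extra pipe:
grep -c "/index.php" log_file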
3. Count how many pages each IP requested, sorted by count
awk '{++S[$1]} END {for (a in S) print a,S[a]}' access_2016032913.log | sort -rn -k2 | more
(The uniq in the original pipeline is redundant, since awk already emits one line per IP, and sort -rn without -k2 sorts by the IP field rather than by the count; the session below was captured with the original pipeline, so its output is ordered by IP.)
[root@httpservera 20160329]# awk '{++S[$1]} END {for (a in S) print a,S[a]}' access_2016032913.log |uniq|sort -rn|more
223.94.229.51 148
223.73.166.191 1
223.68.252.103 156
223.68.167.66 2
223.68.106.138 43
223.67.99.72 7
223.67.153.173 12
223.66.93.152 15
223.66.38.31 103
223.65.191.181 1
223.65.191.135 11
223.65.190.71 13
223.65.141.78 3
223.64.63.71 31
223.64.63.229 7
223.64.62.242 59
223.64.62.23 27
223.64.62.216 1
223.64.62.160 40
223.64.61.136 28
223.64.60.80 13
223.64.60.21 12
223.64.237.37 187
223.64.209.247 2
223.64.158.4 15
Here, sort -rn sorts numerically in descending order, and uniq removes duplicate lines (adjacent ones only).
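A quick illustration of why sort has to run before uniq (the sample strings are arbitrary):
printf 'a\nb\na\n' | uniq -c          # 1 a / 1 b / 1 a: only adjacent duplicates merge
printf 'a\nb\na\n' | sort | uniq -c   # 2 a / 1 b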
4. See which pages a given IP accessed: grep ^xx.xx.xx.xx log_file | awk '{print $1,$7}'
[root@httpservera 20160329]# grep ^223.147.39.194 access_2016032913.log | awk '{print $1,$7}'
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //thirdpartyapi/appaction/app_action/action_send_batch.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //customer/customerInfo/getCustUnReadMsgInfo.json
223.147.39.194 //remind/redDot/checkRedDot.json
223.147.39.194 //remind/redDot/checkRedDot.json
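To see which pages that IP hits most often rather than listing every individual request, the per-URL counts can be rolled up with the same tools (field 7 is the request path, as in the examples above):
grep '^223.147.39.194 ' access_2016032913.log | awk '{print $7}' | sort | uniq -c | sort -rn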
5. Exclude search-engine spiders and count the day's distinct visitor IPs: awk '{print $12,$1}' access_2016032913.log | grep ^\"Mozilla | awk '{print $2}' | sort | uniq | wc -l
(Field 12 is the start of the quoted User-Agent string in the combined format; ordinary browsers identify themselves as Mozilla, while many spiders do not.)
[root@httpservera 20160329]# awk '{print $12,$1}' access_2016032913.log | grep ^\"Mozilla | awk '{print $2}' |sort | uniq | wc -l
35
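The inverse check, counting requests whose User-Agent field does not start with "Mozilla (a rough proxy for crawler traffic), keeps the same field-12 assumption:
awk '{print $12}' access_2016032913.log | grep -vc '^"Mozilla'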
6. Count how many distinct IPs visited within one hour:
[root@httpservera 20160329]# awk '{print $4,$1}' access_2016032913.log | grep 29/Mar/2016:13 | awk '{print $2}'| sort | uniq | wc -l
1926
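The hour can be pulled into a variable so the same line works for any window; HOUR is a hypothetical name used here for illustration:
HOUR='29/Mar/2016:13'
awk -v h="$HOUR" '$4 ~ h {print $1}' access_2016032913.log | sort -u | wc -l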
2. Traffic statistics (a single-pass variant combining items 1 and 2 is sketched after this list)
1. Count UV (unique visitors) by client IP
awk '{print $1}' access.log | sort | uniq -c | wc -l
2. Count PV (page views) from requested URLs
awk '{print $7}' access.log | wc -l
3. Find the most frequently requested URLs
awk '{print $7}' access.log | sort | uniq -c | sort -n -k 1 -r | more
4. Find the most frequent client IPs
awk '{print $1}' access.log | sort | uniq -c | sort -n -k 1 -r | more
5. View log entries within a given time window
cat access.log | sed -n '/14\/Mar\/2015:21/,/14\/Mar\/2015:22/p' | more
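PV and UV from items 1 and 2 can also be computed in one pass over the log; a minimal sketch assuming the client IP is field 1 and every line is a request:
# PV = total request lines; UV = number of distinct client IPs
awk '{pv++; ip[$1]} END {n=0; for (i in ip) n++; print "PV:", pv, "UV:", n}' access.log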
Note: nginx log rotation script
[root@iZ237lzm354Z logs]# vim /opt/shell/nginx_log.sh
#!/bin/bash
# Power by guojinbao
# Archive the current access log under a timestamped name,
# then signal nginx to reopen its log files.
date=`date +%Y-%m-%d-%H-%M-%S`
logfile="/guojinbao/nginx/logs/access.log"
logdir=/guojinbao/nginx/logs
pid=`cat /usr/local/nginx/logs/nginx.pid`
# Create the archive directory if it does not exist yet
if [ ! -d $logdir ]; then
    mkdir -p $logdir
fi
# Rename the live log, then send HUP so nginx reopens access.log
/bin/mv $logfile $logdir/access_${date}.log
kill -HUP $pid
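To run the rotation on a schedule, the script can be driven from cron; the hourly timing below is an assumption chosen to match the timestamped file names above:
# hypothetical crontab entry: rotate at the top of every hour
0 * * * * /bin/bash /opt/shell/nginx_log.sh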