Lately I have been digging deeper into systems operations: server configuration, tuning, maintenance, and writing all kinds of scripts. The goal of automated, intelligent ops is still a long way off.
This post analyzes the daily valid page views (PV) and the number of requests from each search-engine crawler out of the nginx access logs. The script is implemented with awk.
Function library file: stat_func.sh
```shell
#!/bin/bash
stat_log_path=/usr/local/qqsa/result

stat_nginx_log()
{
    local basename=`basename "$1"`
    local newfile="${stat_log_path}/${basename%%.*}.pv.`date +'%Y-%m-%d'`"
    # Split each log line on double quotes: in the combined log format,
    # $2 is the request line and $6 is the User-Agent string.
    awk -F'"' '
    {
        if( $2 ~ /css|txt|ico/ ) res["static"]++;
        else if ( $6 ~ /Baiduspider/ ) {res["baidu"]++;res["spider"]++;}
        else if ( $6 ~ /Googlebot/ ) {res["google"]++;res["spider"]++;}
        else if ( $6 ~ /Yahoo! Slurp/) {res["yahoo"]++;res["spider"]++;}
        else if ( $6 ~ /Sogou web spider/) {res["sogou"]++;res["spider"]++;}
        else if ( $6 ~ /Sosospider/) {res["soso"]++;res["spider"]++;}
        else if ( $6 ~ /YodaoBot/) {res["youdao"]++;res["spider"]++;}
        else if ( $6 ~ /msnbot/) {res["msnbot"]++;res["spider"]++;}
        else if ( $6 ~ /iaskspider/) {res["sina"]++;res["spider"]++;}
        else res["good"]++;
    }
    END {
        for(item in res) print item ":" res[item]
    }' "$1" > "$newfile"
}
```
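Why splitting on `"` works: in the default nginx combined log format, the request line and the User-Agent are the only double-quoted fields, so `-F'"'` puts them at predictable positions. A minimal sketch with a made-up log line (the IP, URL, and timestamp below are hypothetical):

```shell
# A hypothetical line in nginx "combined" log format:
line='1.2.3.4 - - [24/Feb/2011:03:00:00 +0800] "GET /index.html HTTP/1.1" 200 612 "-" "Baiduspider+(+http://www.baidu.com/search/spider.htm)"'

# With the double quote as field separator, $2 is the request line
# and $6 is the User-Agent string.
echo "$line" | awk -F'"' '{ print "request: " $2; print "agent: " $6 }'
```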
Driver script: stat_every_day.sh
```shell
#!/bin/bash
. ./stat_func.sh

date=`date +'%Y-%m-%d'`
# Process every access log backed up for today.
find /data/cs_log_backup/${date} -name "*.access.*" | \
while read file
do
    stat_nginx_log "$file"
done
```
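One shell detail worth knowing about the `find | while read` pattern: in bash (and most POSIX shells), the pipe runs the loop body in a subshell, so variable changes made inside it vanish when the loop ends. That is harmless here, because `stat_nginx_log` writes its results to a file, but it bites if you try to accumulate a counter across iterations:

```shell
# The loop body runs in a subshell because of the pipe, so the
# increment is invisible to the parent shell afterwards.
count=0
printf 'a\nb\nc\n' | while read -r x; do count=$((count+1)); done
echo "$count"    # prints 0, not 3
```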
Schedule it with `crontab -e` by appending one line at the bottom (run daily at 03:00, discarding all output):

```shell
00 3 * * * /usr/local/maintain/stat_every_day.sh > /dev/null 2>&1
```
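One syntax detail in that redirection: `2>&1` must be written without spaces. With a space before the `2`, the shell parses it as an argument to the command rather than as a file descriptor:

```shell
# Wrong: the space detaches "2" from the redirection, so it is passed
# to echo as a literal argument.
echo ok 2 >&1               # prints "ok 2"

# Right: stderr (fd 2) is duplicated onto stdout, which has already
# been pointed at /dev/null, so nothing is printed.
echo ok > /dev/null 2>&1
```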
References:
- http://tech.foolpig.com/2008/07/09/linux-shell-char/ — shell string slicing
- http://storysky.blog.51cto.com/628458/270671 — filtering specific values out of nginx logs with awk
- http://storysky.blog.51cto.com/628458/271560 — analyzing nginx logs with sed and awk
- http://book.douban.com/subject/1236944/ — the book "sed & awk"
Reposted from http://blog.csdn.net/LongMarch12/archive/2011/02/24/6204550.aspx