Nginx log rotation and analysis script examples

This article presents several shell scripts for nginx log rotation and log analysis: daily rotation, per-day analysis, 30-day transfer statistics for specific URLs, and simple QPS counting. Use them as a reference for your own nginx log handling.

Assume the nginx access log uses the following log_format:
 

log_format  access_format_with_resp_time  '$remote_addr - $remote_user [$time_local] "$request" '
              '$status $body_bytes_sent "$http_referer" '
              '"$http_user_agent" $http_x_forwarded_for  "$request_time" $request_length';

1. Nginx log rotation script
vi /root/cutlog.sh
 

Code example:

#!/bin/bash
I=`ps aux | grep nginx | grep root | grep -v 'grep nginx' | awk '{print $14}'`    # locate the nginx master process and pick out the binary path
if [ "$I" == "/usr/local/nginx/sbin/nginx" ];then
    ACCLOG=`cat /usr/local/nginx/conf/nginx.conf | grep ' access_log' | awk '{print $2}'`   # nginx is running, so read the access log path from the config file
    ERRLOG=`cat /usr/local/nginx/conf/nginx.conf | grep ^error | awk '{print $2}' | cut -d";" -f1`   # path of the error log
    ls $ACCLOG     # check that the access log exists
    if [ $? -eq 0 ];then    # if it does
        mv $ACCLOG $ACCLOG.`date -d "-1 day" +%F`   # rename the current logs
        mv $ERRLOG $ERRLOG.`date -d "-1 day" +%F`
        touch $ACCLOG    # create empty logs
        touch $ERRLOG
        chown nginx:root $ACCLOG   # fix the owner
        chown nginx:root $ERRLOG
        [ -f /usr/local/nginx/logs/nginx.pid ] && kill -USR1 `cat /usr/local/nginx/logs/nginx.pid`     # if the pid file exists, send USR1 so nginx reopens its logs and writes new entries into the files just created
        /mnt/logs/checklog.sh $ACCLOG.`date -d "-1 day" +%F`   # run the log analysis script (section 2)
        gzip $ACCLOG.`date -d "-1 day" +%F`   # compress the rotated logs
        gzip $ERRLOG.`date -d "-1 day" +%F`

        mv $ACCLOG.`date -d "-10 day" +%F`.* /mnt/history.nginx.log/   # move logs older than 10 days into an archive directory (change this to rm if you prefer to delete them)
        mv $ERRLOG.`date -d "-10 day" +%F`.* /mnt/history.nginx.log/
    fi
fi
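
The script is meant to run once a day. A minimal way to schedule it, assuming root's crontab and a midnight run (both are assumptions, adjust as needed):

# crontab -e (as root): rotate the nginx logs every day at 00:00
0 0 * * * /bin/bash /root/cutlog.sh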

2. Nginx log analysis script:
vi /mnt/logs/checklog.sh
 

Code example:

#!/bin/bash
echo -e "####################`date +%F`" >> /mnt/logs/400.txt
echo -e "####################`date +%F`" >> /mnt/logs/URL.txt
echo -e "####################`date +%F`" >> /mnt/logs/IP.txt
cat $1 | wc -l >> /mnt/logs/IP.txt   # total number of requests (header of the IP report)
cat $1 | awk -F'"' '{print $3}' | awk '{print $1}' | sort | uniq -c | sort -rn > /mnt/logs/CODE.txt   # tally of response status codes
cat $1 | awk '{print $1}' | sort | uniq -c | sort -rn | head -n20 >> /mnt/logs/IP.txt   # top 20 client IPs
N=`cat /mnt/logs/CODE.txt | wc -l`
for I in $(seq 1 $N)
do
    M=`head -n$I /mnt/logs/CODE.txt | tail -n1 | awk '{print $2}'`
    if [ $M -ge 400 ]
    then
        echo "#####FIND $M###############" >> /mnt/logs/400.txt   # report error responses (status >= 400)
        cat $1 | grep "\" $M " | grep -v ' "-" "-" - ' | sort | awk '{print $1 $2 $3 $6 $7 $8 $9 $10 $11 $12 $13 $14 $15 $16 $17 $18 $19 $20 $21}' | sort | uniq -c | sort -rn | head -n5 >> /mnt/logs/400.txt
    fi
done
cat $1 | grep -v ' "-" "-" - ' | awk -F'T' '{print $2}' | awk -F'?' '{print $1}' | sort | awk '{print $1}' | sed 's#\(/review/file/download/\).*#\1#g' | sort | uniq -c | sort -rn | head -n20 >> /mnt/logs/URL.txt   # top 20 requested URL paths (query strings stripped; /review/file/download/ URLs collapsed into one bucket)
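
For a quick test, the analysis script can also be run by hand against an already rotated log; the date in the file name below is only an example:

# analyze yesterday's rotated access log (hypothetical file name)
/mnt/logs/checklog.sh /mnt/logs/host.access.log.2020-10-29
# the results go to /mnt/logs/IP.txt, CODE.txt, 400.txt and URL.txt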

3. From 30 days of nginx logs, compute the average transfer speed of upload/download requests to specific URLs, and the average time per request
 

Code example:

#!/bin/bash
for I in {1..30}
do
    cat /mnt/logs/host.access.log.`date -d "-$I day" +%F` | grep /review/file/previewload/ | awk -F'"' '{print "previewload",$3,$8 }' >> ./sulv.txt
    cat /mnt/logs/host.access.log.`date -d "-$I day" +%F` | grep /review/file/pconvert/ | awk -F'"' '{print "pconvert",$3,$8,$9 }' >> ./sulv.txt
    cat /mnt/logs/host.access.log.`date -d "-$I day" +%F` | grep /review/file/download/ | awk -F'"' '{print "download",$3,$8 }' | grep 200 >> ./sulv.txt
    cat /mnt/logs/host.access.log.`date -d "-$I day" +%F` | grep /review/file/upload/ | awk -F'"' '{print "upload",$3,$8,$9 }' | grep 200 >> ./sulv.txt
done
# each line of sulv.txt is: tag, $status, $body_bytes_sent, $request_time [, $request_length]
cat ./sulv.txt | grep previewload | awk '{tottime+=$4; totsize+=$3} END{print tottime/NR, totsize/tottime}' | awk '{print "previewload average time:"$1, "average speed:"$2}'

cat ./sulv.txt | grep pconvert | awk '{tottime+=$4} END{print tottime/NR}' | awk '{print "pconvert average time:"$1}'
cat ./sulv.txt | grep download | awk '{tottime+=$4; totsize+=$3} END{print totsize/tottime}' | awk '{print "download average speed:"$1}'

cat ./sulv.txt | grep upload | awk '{tottime+=$4; totsize+=$5} END{print totsize/tottime}' | awk '{print "upload average speed:"$1}'
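
As a quick sanity check of the averaging logic, here is a tiny made-up input (two download lines with 10240/20480 bytes and request times 0.5/1.5 seconds) fed through the same awk expression:

# expected output: 15360, i.e. (10240+20480)/(0.5+1.5) bytes per second
printf 'download 200 10240 0.5\ndownload 200 20480 1.5\n' | awk '{tottime+=$4; totsize+=$3} END{print totsize/tottime}'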

4. QPS statistics from nginx (or similar) logs:
 

Code example:

#!/bin/bash

while true;do
tail /mnt/logs/host.access.log -f --pid=19139| grep `date '+%T'`|wc -l
done
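
A non-following alternative, not taken from the original script but offered as a sketch (it assumes the default $time_local format and an English locale), counts how many requests were logged during the current second:

# count access-log lines stamped with the current second
grep -c "`date '+%d/%b/%Y:%H:%M:%S'`" /mnt/logs/host.access.log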

5. QPS statistics from nginx (or similar) logs, per minute:
 

Code example:
#!/bin/bash
while true;do
tail /mnt/logs/host.access.log -f  -s 60 --pid=19139| grep `date '+%T'`|wc -l
done
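
In both snippets, --pid=19139 names the process tail should watch; tail exits once that process goes away. A small variant, sketched below under the assumption that the default pid file path from section 1 is in use, reads the PID from nginx.pid instead of hard-coding it:

#!/bin/bash
# same per-minute counter, but the PID tail watches is taken from nginx's pid file
PID=`cat /usr/local/nginx/logs/nginx.pid`
while true;do
tail /mnt/logs/host.access.log -f -s 60 --pid=$PID | grep `date '+%T'` | wc -l
done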