IMCAFS

security analysis skills of web log

Posted by punzalan at 2020-02-25

0x01 Web log

A web access log records raw information such as the requests a web server receives and processes, along with runtime errors. Security analysis of web logs lets us not only locate the attacker, but also reconstruct the attack path and find and fix the site's security vulnerabilities.

Let's look at an Apache access log:

127.0.0.1 - - [11/Jun/2018:12:47:22 +0800] "GET /login.html HTTP/1.1" 200 786 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.139 Safari/537.36"

From this single access log entry we can tell which IP address visited which page of the site, at what time, with which operating system and browser, and whether the request succeeded.
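A minimal sketch (not from the original post) of splitting one such entry into named fields with awk; the field positions assume the default Apache/nginx combined log format shown above:

```shell
# Sample combined-format entry, matching the one shown above.
log='127.0.0.1 - - [11/Jun/2018:12:47:22 +0800] "GET /login.html HTTP/1.1" 200 786 "-" "Mozilla/5.0"'
# Strip brackets and quotes, which makes awk re-split the fields cleanly,
# then print the visitor IP, timestamp, request path, and status code.
echo "$log" | awk '{gsub(/[]["]/,""); print "ip="$1, "time="$4, "path="$7, "status="$9}'
# -> ip=127.0.0.1 time=11/Jun/2018:12:47:22 path=/login.html status=200
```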

This paper introduces the thinking and common skills of Web log security analysis.

0x02 log analysis skills

When performing security analysis on web logs, two lines of thinking generally help us dig in step by step and reconstruct the whole attack process.

The first is to determine the time range of the intrusion and use it as a clue: find the suspicious log entries within that window, investigate further, and finally identify the attacker and reconstruct the attack process.
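The time-range approach can be sketched in the shell. Assuming a combined-format log named access.log and a hypothetical intrusion window of 12:00-13:00 on 11/Jun/2018 (both assumptions), an awk string comparison extracts that window; this works because the timestamp format is fixed-width within a single day:

```shell
# Split each line on [ and ] so $2 is the timestamp, then keep entries
# whose timestamp falls inside the assumed intrusion window.
awk -F'[][]' '$2 >= "11/Jun/2018:12:00:00" && $2 <= "11/Jun/2018:13:00:00"' access.log
```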

The second starts from the backdoor: after compromising a website, an attacker usually leaves a backdoor file (webshell) to maintain access and make it easy to get back in. If we can find that file, we can use it as a clue and work backwards through the logs.
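The backdoor-first approach often begins by hunting for recently modified web scripts. A minimal sketch, assuming a web root of /var/www/html and a 7-day window (both assumptions; adjust to your environment):

```shell
# List script files under the assumed web root that were modified in the
# last 7 days; a webshell often shows up as an oddly recent .php/.jsp/.aspx file.
find /var/www/html -type f \( -name '*.php' -o -name '*.jsp' -o -name '*.aspx' \) -mtime -7 -ls
```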

Common analysis tools:

Under Windows, EmEditor is recommended for log analysis; it handles large files well and searches efficiently.

Under Linux, log analysis is usually done directly in the shell, chaining commands such as grep, awk, sort, and uniq. The following are several common analysis and statistics techniques.

Apache log analysis tips:

1. List the IPs with the most visits today:
cut -d- -f 1 log_file | sort | uniq -c | sort -rn | head -20

2. Count how many distinct IPs visited today:
awk '{print $1}' log_file | sort | uniq | wc -l

3. Count how many times a specific page was visited:
grep "/index.php" log_file | wc -l

4. Count how many pages each IP visited:
awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file

5. Sort the per-IP page counts in ascending order:
awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n

6. See which pages a specific IP visited:
grep ^111.111.111.111 log_file | awk '{print $1,$7}'

7. Count today's visitor IPs, excluding search-engine spiders:
awk '{print $12,$1}' log_file | grep ^\"Mozilla | awk '{print $2}' | sort | uniq | wc -l

8. Count how many IPs visited during the hour of 14:00 on 21/Jun/2018:
awk '{print $4,$1}' log_file | grep 21/Jun/2018:14 | awk '{print $2}' | sort | uniq | wc -l

0x03 log analysis case

An example of web log analysis: requests are forwarded through an nginx reverse proxy to a server on the intranet. Many image Trojans (webshells disguised as images) were uploaded to a directory of a site on that intranet server. Although they could not be executed under IIS 7, we still want to find out who uploaded them, and through what path.

Here we hit a problem: because of the proxy forwarding configuration, the access log records only the proxy server's IP, not the visitor's IP. How, then, do we distinguish different visitors and identify the attack source?

This is a log misconfiguration on the administrator's part, but fortunately we can still distinguish access sources and reconstruct the attack path through the browser fingerprint (the User-Agent string).
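As an aside, this misconfiguration is avoidable. A minimal sketch of an nginx proxy configuration that passes the real client IP to the backend (the directives are standard nginx; the upstream address is an assumption for illustration):

```nginx
location / {
    proxy_pass http://10.0.0.5:8080;          # assumed intranet backend
    proxy_set_header X-Real-IP $remote_addr;  # real client IP
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```

The backend's log format can then record the X-Forwarded-For header instead of the proxy's IP.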

1. Locate the source of attack

First, we found only one log entry for the image Trojan. Since every access log entry records only the proxy IP, we cannot trace the attack path by IP; instead, we can use the browser fingerprint to locate the attacker.

Browser fingerprint:

Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.1;+WOW64;+Trident/7.0;+SLCC2;+.NET+CLR+2.0.50727;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30729;+.NET4.0C;+.NET4.0E)

2. Search related logs

By filtering the log for records matching this browser fingerprint, the attacker's attack path becomes clearly visible.
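A minimal sketch of that filtering step, assuming a log file named access.log and combined-format field positions (adjust both to your environment); grep -F treats the fingerprint as a fixed string so the + signs need no escaping:

```shell
# Filter the access log by the attacker's User-Agent fingerprint, then print
# timestamp, source IP, method, and requested path for each matching entry.
ua='MSIE+7.0;+Windows+NT+6.1;+WOW64;+Trident/7.0'
grep -F "$ua" access.log | awk '{print $4, $1, $6, $7}'
```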

3. Reading the access logs we found, the attacker's access path is as follows:

A. The attacker visited the home page and the login page
B. The attacker visited MsgSjlb.aspx and MsgSebd.aspx
C. The attacker visited Xzuser.aspx
D. The attacker sent multiple POST requests (suspected upload through a flaw in this page's upload module)
E. The attacker accessed the image Trojan

Opening the site and visiting Xzuser.aspx confirmed that the attacker uploaded the image Trojan through that page's upload function. At the same time, an unauthorized access vulnerability was discovered: by visiting a specific URL, an attacker could reach the admin interface without logging in. Log analysis thus located the site's vulnerabilities so they could be fixed.

0x04 log statistical analysis skills

Crawler statistics:

grep -E 'Googlebot|Baiduspider' /www/logs/access.2019-02-23.log | awk '{ print $1 }' | sort | uniq

Browser statistics (list non-mainstream user agents):

cat /www/logs/access.2019-02-23.log | grep -v -E 'MSIE|Firefox|Chrome|Opera|Safari|Gecko|Maxthon' | sort | uniq -c | sort -r -n | head -n 100

IP statistics:

grep '23/May/2019' /www/logs/access.2019-02-23.log | awk '{print $1}' | awk -F'.' '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -r -n | head -n 10
2206 219.136.134.13
1497 182.34.15.248
1431 211.140.143.100
1431 119.145.149.106
1427 61.183.15.179
1427 218.6.8.189
1422 124.232.150.171
1421 106.187.47.224
1420 61.160.220.252
1418 114.80.201.18

Network segment statistics:

cat /www/logs/access.2019-02-23.log | awk '{print $1}' | awk -F'.' '{print $1"."$2"."$3".0"}' | sort | uniq -c | sort -r -n | head -n 200

Domain statistics:

cat /www/logs/access.2019-02-23.log | awk '{print $2}' | sort | uniq -c | sort -rn | more

HTTP status statistics:

cat /www/logs/access.2019-02-23.log | awk '{print $9}' | sort | uniq -c | sort -rn | more
5056585 304
1125579 200
7602 400
5 301

URL statistics:

cat /www/logs/access.2019-02-23.log | awk '{print $7}' | sort | uniq -c | sort -rn | more

File traffic statistics:

cat /www/logs/access.2019-02-23.log | awk '{sum[$7]+=$10}END{for(i in sum){print sum[i],i}}' | sort -rn | more

grep ' 200 ' /www/logs/access.2019-02-23.log | awk '{sum[$7]+=$10}END{for(i in sum){print sum[i],i}}' | sort -rn | more

URL traffic statistics:

cat /www/logs/access.2019-02-23.log | awk '{print $7}' | egrep '\?|&' | sort | uniq -c | sort -rn | more

Script speed:

Find out the slowest scripts

grep -v 0$ /www/logs/access.2019-02-23.log | awk -F '\" ' '{print $4" "$1}' | awk '{print $1" "$8}' | sort -n -k 1 -r | uniq > /tmp/slow_url.txt

Real-time IP and URL extraction:

tail -f /www/logs/access.2019-02-23.log | grep '/test.html' | awk '{print $1" "$7}'

I have created a free knowledge-sharing circle and sincerely invite you to join and share knowledge.