{"id":43,"date":"2021-05-03T08:35:17","date_gmt":"2021-05-03T12:35:17","guid":{"rendered":"http:\/\/www.dev-notes.com\/blog\/?p=43"},"modified":"2021-05-03T16:34:43","modified_gmt":"2021-05-03T20:34:43","slug":"parsing-apache-logs-with-tail-cut-sort-and-uniq","status":"publish","type":"post","link":"https:\/\/www.dev-notes.com\/blog\/2021\/05\/03\/parsing-apache-logs-with-tail-cut-sort-and-uniq\/","title":{"rendered":"Parsing Apache Logs with tail, cut, sort, and uniq"},"content":{"rendered":"<p>A client experienced some intermittent website down time last week during the final few days of April 2021, and sent over that month&#8217;s Apache logs for me to see if there is anything out of the ordinary &#8211; excessive crawling, excessive probing, brute force password attacks, things of that nature.  Below are a few commands I have used that I thought would be nice to keep handy for future uses.  I am currently using Ubuntu 20.04 LTS.<\/p>\n<p>While unrelated, just to form a complete picture, my client sent the logs to me in gz-compressed format.  If you are not familiar on how to uncompress it, it is fairly straight forward:<\/p>\n<pre class=\"code\">\r\ngunzip 2021-APR.gz\r\n<\/pre>\n<p>Back on topic&#8230; I ended on parsing the file in three separate ways for me to get an overall view of things.  
I found that the final few days of April are represented in roughly the final 15,000 lines of the log file, so I decided to use the tail command as my main tool.<\/p>\n<p>First, I ran the command below to find which IP addresses hit the server the most:<\/p>\n<pre class=\"code\">\r\ntail -n 15000 filename.log | cut -f 1 -d ' ' | sort | uniq -c | sort -nr | more\r\n<\/pre>\n<p>Quick explanation:<\/p>\n<ul>\n<li>The tail command pulls the final 15,000 lines from the log file (the final few days of the month)<\/li>\n<li>The cut command splits each line on the space delimiter and returns the first field (the IP address)<\/li>\n<li>The sort command sorts the results so far<\/li>\n<li>The uniq command collapses adjacent duplicate lines and prefixes each with a count (uniq only collapses adjacent duplicates, which is why the sort must come first)<\/li>\n<li>The second sort command reverses the order numerically so the highest count is on top<\/li>\n<li>Finally, the more command paginates the output one screenful at a time so it&#8217;s easier to read<\/li>\n<\/ul>\n<p>There is always more than one way to do something in Linux, of course.  Just as an aside, the following functions very similarly, with awk doing the field extraction instead of cut:<\/p>\n<pre class=\"code\">\r\ntail -n 15000 filename.log | awk '{print $1}' | sort | uniq -c | sort -nr | more\r\n<\/pre>\n<p>Then, I thought it would be nice to get an idea of how many requests were made per hour.  This can be achieved with the command below.<\/p>\n<pre class=\"code\">\r\ntail -n 15000 filename.log | cut -f 4 -d ' ' | cut -f 1,2 -d ':' | sort | uniq -c | more\r\n<\/pre>\n<p>The main difference here is that I opted for the 4th (rather than 1st) field from the cut command, which gets me the timestamp element (rather than the IP address); a second cut command then splits it on the colon and returns the first (date) and second (hour) parts for grouping.<\/p>\n<p>Finally, I tweaked it a little bit more to get an idea of whether there were excessive requests within any one-minute span.  
This can be achieved by expanding the second cut command slightly, as shown below.<\/p>\n<pre class=\"code\">\r\ntail -n 15000 filename.log | cut -f 4 -d ' ' | cut -f 1,2,3 -d ':' | sort | uniq -c | more\r\n<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>A client experienced some intermittent website downtime last week during the final few days of April 2021, and sent over that month&#8217;s Apache logs for me to see whether there was anything out of the ordinary &#8211; excessive crawling, excessive probing, brute-force password attacks, things of that nature. Below are a few commands &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/www.dev-notes.com\/blog\/2021\/05\/03\/parsing-apache-logs-with-tail-cut-sort-and-uniq\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Parsing Apache Logs with tail, cut, sort, and uniq&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[25,30],"tags":[],"class_list":["post-43","post","type-post","status-publish","format-standard","hentry","category-apache","category-ubuntu"],"_links":{"self":[{"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/posts\/43","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/comments?post=43"}],"version-history":[{"count":4,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/posts\/43\/revisions"}],"predecessor-version":[{"id":397,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/posts\/43\/revisions\/397"}],"wp:attachment":[{"href":"https:\/\/
www.dev-notes.com\/blog\/wp-json\/wp\/v2\/media?parent=43"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/categories?post=43"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.dev-notes.com\/blog\/wp-json\/wp\/v2\/tags?post=43"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}