Sawing Linux Logs with Simple Tools

So there you are with all of your Linux servers humming along happily. You have tested, tweaked, and configured until they are performing at their peak of perfection. Users are hardly whining at all. Life is good. You may relax and indulge in a few nice rounds of TuxKart. After all, you've earned it.

Except for one little remaining chore: monitoring your log files. [insert horrible alarming music of your choice here.] You’re conscientious, so you know you can’t just ignore the logs until there’s a problem, especially for public services like Web and mail. Somewhere up in the pointy-haired suites, they may even be plotting to require you to track and analyze all sorts of server statistics.

Not to worry, for there are many ways to implement data reduction, which is what log parsing is all about. You want to slice and dice your logs to present only the data you're interested in, unless you wish to devote your entire life to manually analyzing log files. Even if you only pay attention to log files when you're debugging a problem, having some tools to weed out the noise is helpful.

Good Ole grep

The simplest method is a keyword search. Suppose you want to separate out the 404 errors in your Apache access log (called access.log here), and see if you have any missing files:

$ grep 404 access.log
... - - [30/Aug/2004:02:25:13 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "Pompos/1.3"
... - - [30/Aug/2004:10:32:26 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "msnbot/0.11 (+...)"
... - - [12/Aug/2004:06:49:11 -0700] "GET /favicon.ico HTTP/1.1" 404 - "-" "Opera/7.21 (X11; Linux i686; U) [en]"

These entries are typical. This site has no robots.txt or favicon, so any requests for these files generate a 404 error. The first two are Web bots. The third entry is probably some random surfer. You can ignore these. So let’s screen out robots.txt and favicon, and see what is left:

$ grep 404 access.log | grep -v -E "favicon.ico|robots.txt"
... - - [29/Aug/2004:20:59:27 -0700] "GET /images/142spacer.gif HTTP/1.0" 404 - "" "Mozilla/5.0 Galeon/1.2.7 (X11; Linux i686; U;) Gecko/20030131"
... - - [29/Aug/2004:21:00:08 -0700] "GET /email_crimes.html HTTP/1.0" 404 - "" "Mozilla/5.0 Galeon/1.2.7 (X11; Linux i686; U;) Gecko/20030131"

Now we’re getting somewhere. These two files — images/142spacer.gif and email_crimes.html — are referenced somewhere on the Web site, but they do not exist. This is something that should be fixed. How to find the URLs that refer to these files? grep can do this too. Suppose all the site files are in /var/www/bratgrrl:

$ grep -R "142spacer.gif" /var/www/bratgrrl
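When there are more than a couple of broken links, a frequency count shows which missing files matter most. Here's a minimal sketch using the same simple tools, with a fabricated access.log standing in for a real one (in the standard space-separated log format, field 7 is the request path):

```shell
# Fabricated sample entries standing in for a real Apache access log:
cat > access.log <<'EOF'
1.2.3.4 - - [30/Aug/2004:02:25:13 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "Pompos/1.3"
1.2.3.4 - - [30/Aug/2004:02:26:02 -0700] "GET /robots.txt HTTP/1.0" 404 - "-" "Pompos/1.3"
5.6.7.8 - - [29/Aug/2004:20:59:27 -0700] "GET /index.html HTTP/1.0" 200 5120 "-" "Mozilla/5.0"
EOF

# Pull the requested path (7th space-separated field) from each 404 line,
# then count duplicates, busiest first.
grep ' 404 ' access.log | cut -d' ' -f7 | sort | uniq -c | sort -rn
```

The most-requested missing file floats to the top, so you know which broken reference to fix first.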

Here’s another cool grep trick for Apache logs. You doubtless noticed that the above examples were referred from bratgrrl.com itself. When you’re checking to see where your traffic is coming from, you don’t care about local referrals. Weed them out with this:

$ cat access.log | fgrep -v bratgrrl | cut -d'"' -f4 | grep -v '^-'

Now you can see where traffic to your site is coming from, uncluttered by local references. Here’s how it works, piece by piece:

  • fgrep -v bratgrrl means “look for the literal string bratgrrl, then exclude lines that contain it.”
  • cut -d'"' -f4 means “using the quotation mark as the delimiter, print only the text in the fourth field.” The fourth field, the text between the third and fourth quotation marks, is the referrer.
  • grep -v ^- means “exclude lines that start with a hyphen.” Try running the command without this to see why.
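Put together, with the shell quoting spelled out, the pipeline looks like this. The file name and log entries below are fabricated for illustration:

```shell
# Fabricated access.log: one local referral, one external, one with no referrer.
cat > access.log <<'EOF'
1.2.3.4 - - [30/Aug/2004:10:00:00 -0700] "GET /a.html HTTP/1.0" 200 512 "http://bratgrrl.com/index.html" "Mozilla/5.0"
5.6.7.8 - - [30/Aug/2004:10:01:00 -0700] "GET /b.html HTTP/1.0" 200 512 "http://example.org/links.html" "Mozilla/5.0"
9.9.9.9 - - [30/Aug/2004:10:02:00 -0700] "GET /c.html HTTP/1.0" 200 512 "-" "Mozilla/5.0"
EOF

# Drop local referrals, keep the quoted referrer field, drop the empty "-" ones.
cat access.log | fgrep -v bratgrrl | cut -d'"' -f4 | grep -v '^-'
```

Only the external referrer survives all three filters, which is exactly the traffic source you wanted to see.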

More Simple Stuff

Crafting clever, complex regular expressions is quite fun, and a more worthy use of one’s time than comatose drooling in front of “Reality TV.” However, there are many simple searches that do the job just fine. You can search /var/log/auth.log quickly to see if anyone has made an inordinate number of failed login attempts. The -i option does a case-insensitive search:

$ grep -i "fail" /var/log/auth.log
Sep 13 16:26:34 server02 PAM_unix[27462]: authentication failure; (uid=0) -> root for ssh service
Sep 13 16:26:36 server02 sshd[27462]: Failed password for root from ... port 3210 ssh2
Sep 13 16:26:38 server02 PAM_unix[27464]: authentication failure; (uid=0) -> root for ssh service
Sep 13 16:26:40 server02 sshd[27464]: Failed password for root from ... port 3210 ssh2

Well well, someone came a’ knockin’ on the SSH (secure shell) door. Knowledge is power; at this point, you could fine-tune your iptables rules to drop packets from the originating IP, or you could do a little sleuthing to find the source, or you could create a nice honeypot and amuse yourself trapping the no-good person trying to get into your system. You can even count the number of attempts by grepping for the offending address (substitute the one from your own logs):

$ grep "10.0.0.99" /var/log/auth.log | wc -l

That’s a rather persistent little twit, I’d say.
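If more than one address is hammering the door, a per-source tally is handier than counting one address at a time. A minimal sketch with fabricated auth.log entries; the file name, addresses, and the field position are assumptions based on the standard sshd "Failed password ... from ADDR port ..." message shape:

```shell
# Fabricated auth.log lines; a real one carries the attacker's address after "from".
cat > auth.log <<'EOF'
Sep 13 16:26:36 server02 sshd[27462]: Failed password for root from 10.0.0.99 port 3210 ssh2
Sep 13 16:26:40 server02 sshd[27464]: Failed password for root from 10.0.0.99 port 3210 ssh2
Sep 13 16:27:02 server02 sshd[27466]: Failed password for root from 10.0.0.50 port 4411 ssh2
EOF

# The source address is the 4th field from the end ("from ADDR port NNN ssh2"),
# so pull it with awk, then tally per address, busiest first.
grep -i 'failed password' auth.log | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn
```

The noisiest address tops the list, ready to feed into an iptables rule or a whois lookup.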

Syslog, The Dumping Ground

The syslog — /var/log/syslog — is a dumping ground for log entries from all kinds of daemons, such as Samba and cron:

$ grep -i samba /var/log/syslog
Sep 13 08:50:47 windbag nmbd[1123]: become_logon_server_success: Samba is now a logon server for workgroup HOMENET on subnet ...
Sep 13 08:50:51 windbag nmbd[1123]: Samba server WINDBAG is now a domain master browser for workgroup HOMENET on subnet ...
Sep 13 08:51:06 windbag nmbd[1123]: Samba name server WINDBAG is now a local master browser for workgroup HOMENET on subnet ...

$ grep -i cron /var/log/syslog
Aug 18 21:18:01 windbag /USR/SBIN/CRON[1752]: (amavis) CMD (test -e /usr/bin/sa-learn && test -e /usr/sbin/amavisd-new && /usr/bin/sa-learn --rebuild >/dev/null 2>&1)

These two snippets demonstrate that you can verify that certain Samba functions are working correctly, and that your cron jobs are running when you want.

Also useful in /var/log/syslog are those strange-looking MARK messages:

Sep 13 19:10:30 windbag -- MARK --
Sep 13 19:30:30 windbag -- MARK --
Sep 13 19:50:30 windbag -- MARK --

This is where you find out if your system rebooted during the night when it wasn’t supposed to; the MARK sequence will be interrupted, and you’ll see shutdown and startup messages.
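A quick way to eyeball that cadence is to strip the MARK lines down to their timestamps so gaps stand out. This sketch uses a fabricated syslog excerpt; the 20-minute interval matches the sample above (it is sysklogd's default, tunable with syslogd's -m option):

```shell
# Fabricated syslog excerpt with heartbeat MARK lines every 20 minutes.
cat > syslog <<'EOF'
Sep 13 19:10:30 windbag -- MARK --
Sep 13 19:30:30 windbag -- MARK --
Sep 13 19:50:30 windbag -- MARK --
EOF

# -F matches the literal string; the bare -- tells grep the pattern follows,
# since the pattern itself starts with dashes. Keep only the timestamp fields.
grep -F -- '-- MARK --' syslog | cut -d' ' -f1-3
```

If two consecutive timestamps are more than one interval apart, something interrupted syslogd, and that is where to start reading.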

Next month’s Scripting Clinic will show how to set up automated email alerts, so when something nasty that requires your attention shows up in your logs, you won’t be left in the dark.


Resources

  • See the man pages for grep, cut, and wc.
  • Linux in a Nutshell, by Ellen Siever, is my #1 indispensable Linux command reference.
