War stories important to defence

Disclosure is a must so that the industry can learn from its mistakes.

The media is full of security horror stories about companies breached by attackers, yet we learn very little about the attacks themselves.

As an incident responder, I'm always looking for factual details on the attacks, rather than conjecture, hearsay or pure guesswork.

"Drinking that morning coffee while flicking through the highlights of systems should be part of the job description"

Back in April, Barracuda Networks was compromised and lost names and email addresses.

They disclosed the breach and, in an admirable step, published details of how it occurred, including screenshots of logs and the lessons learnt from the attack.

Such disclosure is an act I hope others unfortunate enough to be breached will follow, so the rest of us may learn from it and harden our own systems.

We need the security professionals at the coalface of these breached companies to step up and take the time to write up a breach post-mortem.

After all, the attackers share their tips and tricks, as anyone who has looked at the chat logs uploaded to public sites like Pastebin can attest.

If attacks are recorded in logs, it means we as defenders have done the right thing: if the attack couldn't be blocked, or simply wasn't, then being able to replay how a system was compromised is the only way to stop it occurring again.

Log review should be an intrinsic routine performed by everyone, daily if possible. You can review line by line, use a simple command-line tool such as grep, or have a top-notch Security Information and Event Management (SIEM) system parse the logs into an easy-to-read digest that a novice could understand.
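For the line-by-line approach, even a few lines of script go a long way. Here is a minimal Python sketch; the log path and the patterns are assumptions to be tailored to your own environment:

    import re

    # Hypothetical log path and patterns - tune both for your environment.
    LOG_FILE = "access.log"
    SUSPICIOUS = [
        re.compile(r"\.\./"),               # directory traversal attempts
        re.compile(r"(?i)union\s+select"),  # SQL injection probes
        re.compile(r"(?i)<script"),         # cross-site scripting probes
    ]

    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for lineno, line in enumerate(log, start=1):
            if any(p.search(line) for p in SUSPICIOUS):
                print(f"{lineno}: {line.rstrip()}")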

This should be part of the working day process for all levels of support and security staff; drinking that morning coffee while flicking through the highlights of systems should be part of the job description.

Log files need to be informative and easy to understand. As someone who works with huge Windows IIS log files, I can tell you that automation is your friend.
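As a sketch of what that automation can look like, the following parses IIS's W3C extended format, reading the column names from the #Fields directive each log carries, and flags server errors. The filename is a placeholder:

    # Hypothetical filename; IIS names its W3C logs along these lines.
    LOG_FILE = "u_ex110601.log"

    fields = []
    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for line in log:
            if line.startswith("#Fields:"):
                # The #Fields directive names the space-separated columns.
                fields = line.split()[1:]
            elif not line.startswith("#") and fields:
                entry = dict(zip(fields, line.split()))
                # Flag server errors (5xx) for closer inspection.
                if entry.get("sc-status", "").startswith("5"):
                    print(entry.get("date"), entry.get("time"),
                          entry.get("c-ip"), entry.get("cs-uri-stem"),
                          entry.get("sc-status"))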

Jason Fossen's Search_Text_Log script is a great starting point for scripters, while Microsoft's Log Parser is a more dynamic analysis tool that is well worth taking the time to master.

Microsoft has a great example of how easy it is to pull pertinent data from IIS logs, and a separate blog post that details how to produce visual trending of IIS data. If log analysis isn't your thing, then check out the Honeynet.org challenges.
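In that vein, a hedged sketch of a trending digest, counting requests per hour and the most common status codes from a W3C-format IIS log, might look like this:

    from collections import Counter

    LOG_FILE = "u_ex110601.log"  # hypothetical W3C-format IIS log

    by_hour = Counter()
    by_status = Counter()
    fields = []

    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for line in log:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]
            elif not line.startswith("#") and fields:
                entry = dict(zip(fields, line.split()))
                by_hour[entry.get("time", "??")[:2]] += 1    # bucket by hour
                by_status[entry.get("sc-status", "?")] += 1

    print("Requests per hour:", dict(sorted(by_hour.items())))
    print("Top status codes:", by_status.most_common(5))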

It's important that logging is enabled on your systems, and that the output is reviewed to ensure it produces useful information.
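A cheap sanity check is confirming the logs are actually being written. This sketch, with an assumed IIS log directory, warns if the newest file hasn't been touched in the past hour:

    import glob
    import os
    import time

    # Hypothetical IIS log directory - adjust for your environment.
    LOG_DIR = r"C:\inetpub\logs\LogFiles\W3SVC1"

    newest = max(glob.glob(os.path.join(LOG_DIR, "*.log")),
                 key=os.path.getmtime, default=None)
    if newest is None:
        print("No log files found - is logging enabled?")
    elif time.time() - os.path.getmtime(newest) > 3600:
        print(f"Newest log {newest} is over an hour old - check the service.")
    else:
        print(f"Logging looks healthy: {newest}")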

Multiple logs must use the same time source to make correlation easy, so make sure your environment is configured and logging correctly. It will save you heartache when it comes time to review an incident.
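Where clocks do differ, correlation is still possible if every timestamp is normalised to a common reference first. A minimal sketch, assuming simple timestamps and a known UTC offset per source:

    from datetime import datetime, timedelta, timezone

    # Hypothetical per-source offsets, taken from each server's configuration.
    SOURCE_OFFSETS = {
        "web01": timezone(timedelta(hours=10)),  # logs written in local time
        "db01": timezone.utc,                    # logs already in UTC
    }

    def to_utc(source, stamp):
        """Parse 'YYYY-MM-DD HH:MM:SS' and normalise it to UTC."""
        local = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        return local.replace(tzinfo=SOURCE_OFFSETS[source]).astimezone(timezone.utc)

    # Once both events are in UTC, the three-second gap between them is obvious.
    print(to_utc("web01", "2011-06-01 14:03:22"))
    print(to_utc("db01", "2011-06-01 04:03:25"))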

Chris Mohan is a researcher with the SANS Institute's Internet Storm Center.

Copyright © SC Magazine, Australia
