The Top Ten of File-Integrity Monitoring

The PCI DSS (Payment Card Industry Data Security Standard) specifies the following:

“Use file-integrity monitoring or change-detection software on logs to ensure that existing log data cannot be changed without generating alerts (although new data being added should not cause an alert)”

File- or host-integrity monitoring software can play a significant and distinct role in your security policy. It serves as another hurdle for an attacker to defeat and can provide the first indication of a break-in or a compromised host. When properly configured and deployed, this type of software is a powerful addition to the layers that defend your infrastructure in depth.

File-integrity monitoring is vitally important from a security standpoint for the following reasons:

1. File-integrity monitoring must always be combined with other practices such as event log analysis, anti-virus, firewalling and intrusion detection/protection systems, remote logging, and keeping your hosts up to date with security patches.

2. Host-based monitoring tools such as anti-virus and host intrusion protection systems (HIPS), which provide firewall and intrusion protection, give a granularity that makes attacks visible on the host on which they are installed. However, no one system or application by itself can be trusted with the task of providing assurance of host integrity. For instance, zero-day attacks, i.e. newly introduced security vulnerabilities that are either systemic (e.g. part of the host OS or an application) or delivered by malware, mean your AV and HIPS systems cannot always provide protection.

3. Whitelisting of processes is an approach that restricts the host to running only a pre-approved list of processes. Like AV and HIPS systems, this is an effective measure to protect your host systems, but it is not infallible: whitelists need to be maintained for all versions of all applications, which creates a management overhead, and in-house developed applications present a separate challenge (a minimal sketch of the whitelisting principle appears at the end of this article).

4. Host systems running secure application environments, as required for a PCI DSS estate, need to be ‘locked down’. File-integrity monitoring means that any new files introduced to, or removed from, the host are detected and alerted on. This provides protection from malware being introduced (e.g. a Trojan) or from any other modification to the host set-up that could introduce a vulnerability.

5. File-integrity monitoring provides protection not just from malware being introduced to the system, and not just from a hacker attack in which an application has been modified and a vulnerability unwittingly introduced, but also from the internal threat, where a trusted employee with administrator rights can bypass your AV and HIPS systems to introduce a backdoor, packet-sniffing software, or a SQL injection or cross-site scripting attack. Don’t think this could ever happen to you? Read about the Heartland Payment Systems breach and Albert Gonzalez here: http://en.wikipedia.org/wiki/Albert_Gonzalez

6. File-integrity monitoring can be used for both desktops and servers, although in a PCI DSS scenario the technology is typically aimed at servers handling cardholder data. As a minimum, the System32 folder should be monitored, as well as key application program folders.

7. It is important to verify all adds, changes and deletions of files, as any change may be significant in compromising the security of a host. The changes to monitor for include any change to file attributes and to the size of the file.

8. The hash of each file should also be verified as a unique identifier. A secure hash algorithm such as SHA1 is analogous to a DNA fingerprint of the file. This is important because an application can be changed programmatically while its file size is maintained. SHA1 produces a unique, 160-bit hash based on the contents of the file (see the hashing sketch at the end of this article).

9. What is the file-integrity baseline? Any file-integrity monitoring system works by comparing file attributes, file sizes and SHA1 hash signatures from one point in time to another. The assumption, therefore, is that the initial baseline is taken from a vulnerability-free, completely uncompromised host and application (the baseline-comparison sketch at the end of this article illustrates the principle).

10. Zero tolerance to unplanned changes is required, so any file-integrity change must be investigated and authorised as a matter of urgency. However, files will need to change on a regular basis – Windows updates appear to arrive at a rate of ten per week, every week, and anti-virus signatures can easily require daily updates. Therefore tightly managed Release Management and Change Management processes need to be in place, which is why these processes are also a key dimension of the PCI DSS, section 6.4:

“Follow change control procedures for all changes to system components. The procedures must include the following: Documentation of impact, Management sign-off by appropriate parties, Testing of operational functionality, Back-out procedures”

The ITIL Change Management process is an ideal framework to adopt.
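To make some of the points above concrete, the following sketches use Python. First, point 3: a minimal illustration of the process-whitelisting principle. The process names and the approved list are hypothetical, and a real application-control or HIPS product enforces its list inside the operating system rather than in a script like this.

```python
# Minimal sketch of process whitelisting. The approved list and the observed
# process names are hypothetical examples, not a real policy.
APPROVED_PROCESSES = {"lsass.exe", "svchost.exe", "w3wp.exe", "sqlservr.exe"}

def unapproved(running):
    """Return the observed process names that are not on the whitelist."""
    return set(running) - APPROVED_PROCESSES

if __name__ == "__main__":
    observed = ["svchost.exe", "w3wp.exe", "nc.exe"]  # e.g. gathered from tasklist/ps
    for name in sorted(unapproved(observed)):
        print(f"ALERT: unapproved process running: {name}")
```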
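Point 8 relies on the properties of a secure hash. This sketch computes a SHA1 ‘fingerprint’ of a file using Python’s standard hashlib module; the Windows path in the commented-out example is purely illustrative.

```python
import hashlib

def sha1_of_file(path, block_size=65536):
    """Compute the SHA1 digest of a file, reading it in chunks so that
    large files do not have to be loaded into memory in one go."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            digest.update(block)
    return digest.hexdigest()  # the 160-bit hash, as 40 hexadecimal characters

# Any change to a file's contents, however small, produces a completely
# different hash, even if the file size stays the same.
# print(sha1_of_file(r"C:\Windows\System32\kernel32.dll"))  # illustrative path
```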
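Finally, points 4, 7 and 9 come together in the baseline comparison itself. The sketch below uses a hypothetical baseline.json store and the System32 folder as an example target: take a trusted snapshot of file sizes and SHA1 hashes, then alert on anything that has been added, deleted or changed. It is deliberately simplified and is not a substitute for a purpose-built file-integrity monitoring product.

```python
import hashlib
import json
import os

def snapshot(folder):
    """Map every file under 'folder' to its size and SHA1 hash. Files are
    read whole here for brevity; a production tool would read in chunks and
    also record attributes such as permissions and ownership."""
    state = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue  # file locked or removed mid-scan
            state[path] = {"size": len(data),
                           "sha1": hashlib.sha1(data).hexdigest()}
    return state

def compare(baseline, current):
    """Report files added, deleted or changed since the baseline was taken."""
    added = set(current) - set(baseline)
    deleted = set(baseline) - set(current)
    changed = {p for p in set(baseline) & set(current) if baseline[p] != current[p]}
    return added, deleted, changed

if __name__ == "__main__":
    folder = r"C:\Windows\System32"   # or a key application program folder
    baseline_file = "baseline.json"   # hypothetical store for the trusted baseline
    if not os.path.exists(baseline_file):
        # First run: record the known-good, uncompromised state (point 9).
        with open(baseline_file, "w") as f:
            json.dump(snapshot(folder), f)
    else:
        with open(baseline_file) as f:
            baseline = json.load(f)
        added, deleted, changed = compare(baseline, snapshot(folder))
        for path in sorted(added):
            print("ALERT - file added:  ", path)
        for path in sorted(deleted):
            print("ALERT - file deleted:", path)
        for path in sorted(changed):
            print("ALERT - file changed:", path)
```

In practice a real file-integrity monitoring agent also protects the baseline and its alerts from tampering, for example by forwarding them to a central log server, so that an administrator-level insider (point 5) cannot simply re-baseline after making a change.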
