Community:Splunk for PCI Compliance

From Splunk Wiki


The Splunk PCI application offers a set of reports, saved searches, and dashboards, as well as corresponding alerts that you can use to satisfy PCI requirements such as secure remote access, file integrity monitoring, secure log collection, daily log review, audit trail retention, and PCI control reporting. Watch the Splunk for PCI Compliance video.


The application includes:

  • 57 reports
  • more than 91 saved searches
  • dashboards and corresponding control objective monitors (alerts)


PCI Screenshot 1

The Splunk PCI dashboard shows your current PCI compliance posture. Control objective monitors are used to monitor the environment and alert on each violation. The Overview dashboard then summarizes all the violations. Each of the requirement areas also has its own dashboard where the violations of that requirement are summarized. This makes it easy to quickly assess the impact of the violations.

PCI Screenshot 2

Splunk can easily show unsuccessful logins to cardholder systems to address PCI Requirement 10. This graph shows the top failed logins by username, with failed root logins occurring most frequently. The high number of failed root logins is alarming. This indicates a possible misconfiguration or a serious security attack which needs immediate investigation.

PCI Screenshot 3

Splunk's ability to correlate events by time provides powerful and concise views of activity on cardholder systems. This graph shows the failed logins over time to fulfill PCI Requirement 10. Clearly, there are individual spikes in failed root logins, which warrant investigation. The user names are also worrisome. Several of the account names should not show up, which indicates clear attack behavior.

PCI Screenshot 4

Splunk fully addresses requirements for file integrity monitoring (PCI Requirement 11.5) through continuous monitoring of critical system files and re-indexing files on every change. This graph reveals updates, additions or deletions to files on systems containing cardholder data, providing an audit trail for file access activities. This bar chart shows when file system changes were detected and delineates changes by host. Each file change should be mapped against a change request. If no change request exists, the file change should be regarded as suspicious.

PCI Screenshot 5

This is another view of the Overview dashboard, which summarizes all the violations raised by the control objective monitors.

Frequently Asked Questions about PCI DSS

Q: Requirement 10.5 is "Secure audit trails so they cannot be altered." There is no requirement for log collection to be centralized, so why do I need Splunk?

A: 10.5.3 instructs auditors to verify that current audit trail files are promptly backed up to a centralized log server or media that is difficult to alter.

Q: Per the audit procedure documentation, controls 10.5.1 and 10.5.2 can be achieved with file-level permissions and regular OS access controls. Why do I need Splunk?

A: That is true. However, once the data is on tape, it is not really usable. Splunk keeps your logs usable while still meeting the requirement for central and secure log storage. And if you want a tool that can manage and give visibility into your data, you have to secure the data within that tool, which Splunk does.

Q: How does Splunk meet requirement 10.6 (obtain and examine security policies and procedures to verify that they include procedures to review security logs. Verify that regular log reviews are performed for all system components)?

A: Requirement 10.6.a talks about the security policy; understandably, Splunk cannot affect your organization's security policy. However, requirement 10.6.b also asks the auditor to make sure that regular log review has been performed. Currently, auditors do not verify this very effectively or consistently: sometimes they simply interview the customer about whether log review has been performed, and only occasionally do some spot checking. With Splunk, you can indisputably prove that you have actually done log review. Splunk then takes it a level further: it does not just help you become compliant, it also gives you an operational tool that can yield benefits from this required log review. We see many customers who do just the minimum to pass the audit; from an operational perspective, that leaves a lot of unrealized value. By implementing the guidelines, you can close significant cost and exposure gaps.

Q: Why do I need Splunk for requirement 10.5.3 if it can be achieved by tape backups?

A: Same as the question above. Once the data is archived, it's not usable anymore.

Q: Requirement 10.7: Why is centralized log monitoring required for this?

A: Requirement 10.7 calls for retaining audit trail history for at least one year, with a minimum of three months immediately available for analysis. Centralization is one way of guaranteeing that.

Q: Requirement 7.1: It seems as though it could be satisfied by host & file access controls, right?

A: Correct. However, how do you verify that those controls are enforced? And if you cannot enforce them, what do you do? You must consider log files part of your cardholder-critical information, especially if transaction records show up in the logs. Therefore you must make sure these logs are secure, and you want to know who has accessed them and when. This is a loose interpretation, and Splunk definitely does not make the entire organization compliant with 7.1. However, Splunk does help implement compensating controls for this requirement.


Install the application

To install the Splunk PCI application, unpack the tarball inside $SPLUNK_HOME/etc/apps.
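For example, on a Unix-like host the installation can be sketched as follows. The tarball filename is a placeholder for whatever file you downloaded, and SPLUNK_HOME defaults to the conventional /opt/splunk; adjust both for your environment.

```shell
# Unpack the PCI app into Splunk's apps directory, then restart Splunk
# so it picks up the new app. "splunk-pci-app.tar.gz" is a placeholder.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
tar -xzf splunk-pci-app.tar.gz -C "$SPLUNK_HOME/etc/apps"
"$SPLUNK_HOME/bin/splunk" restart
```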

To feed data into the PCI application, make sure you install some of the other applications, such as Splunk for UNIX, Splunk for PIX, and so on. The data fed into the applications needs to comply with the common information model in order to drive all the reports. This means that events need to be tagged with the corresponding eventtype tags from the common information model.
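As an illustration, eventtype tagging is done in a tags.conf file. The eventtype name below is hypothetical, but the stanza format is the standard Splunk one:

```ini
# local/tags.conf -- tag an eventtype so common-information-model
# searches can find it. "nix_failed_login" is a hypothetical eventtype
# name; use the names your source applications actually produce.
[eventtype=nix_failed_login]
authentication = enabled
failure = enabled
```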

Configure the application

There are four issues you must consider when you configure this application: data sources, host tags, event types, and alerts.

Note: The PCI application may require additional services to be customized for your environment.

Each search is bound to the last day, unless indicated in the name of the search itself. You might want to change that to reflect the time frame you are interested in.
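A search's time window is set by its dispatch times in savedsearches.conf. A sketch of widening a search to the last seven days (the stanza name is a hypothetical search name, not one of the app's actual searches):

```ini
# local/savedsearches.conf -- widen the window from the last day to
# the last seven days. The stanza name below is hypothetical.
[PCI- Failed Logins]
dispatch.earliest_time = -7d@d
dispatch.latest_time = now
```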

There are two types of searches included:

  • Regular searches that are owned by admin.
  • Summary indexing searches (ones that start with "PCI-SI") are not scheduled by default. You need to manually enable all the searches that you need for your environment.

The dashboard included in the application splits searches into a group of active and a group of inactive searches. That way you can easily partition all of the searches into ones that you are interested in (and you have data for) and ones that you do not care about. To turn a search into an inactive one, change its name to start with "PCI - " instead of "PCI-".

Apply host tags

You must apply host tags that correspond to a host's duties where they intersect with the PCI standard. To apply these tags, do the following:

  1. Determine which of your hosts are subject to PCI compliance, and which work with cardholder data.
  2. For those hosts that are subject to PCI compliance, add the host tag pci.
  3. For those hosts that deal with cardholder information, add the host tags cardholder, cardholder-dest, and cardholder-src as appropriate.
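Host tags are applied in a tags.conf file, one stanza per host. A minimal sketch, using hypothetical host names:

```ini
# local/tags.conf -- hypothetical hosts; one stanza per host.
# web01 is in PCI scope and stores cardholder data.
[host=web01.example.com]
pci = enabled
cardholder = enabled

# db01 is in PCI scope and is a destination for cardholder data.
[host=db01.example.com]
pci = enabled
cardholder-dest = enabled
```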

The full list of host tags this application uses is as follows:

Tag Description
cardholder All systems that store or process cardholder data.
cardholder-dest Systems that are destinations for cardholder data.
cardholder-src Systems that are sources for cardholder data.
dns_server DNS servers.
domain-controller Domain controllers.
internal All machines on the internal network.
mail_server Mail servers.
pci All systems subject to PCI compliance.
network Layer 2 or layer 3 network appliances, such as firewalls and switches.
server All servers.
web_server Web servers.

Adjust event types

You may have to adjust some event types to ensure that they properly match your data. To edit these, open the application's eventtypes.conf file. Be sure to read the comments in the file that explain each group of stanzas and edit where needed to make the event types meet your own needs.
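For example, each event type stanza pairs a name with a search. A hedged sketch of overriding one to match your own data (the stanza name and search string below are illustrative, not the app's actual contents):

```ini
# local/eventtypes.conf -- illustrative override. Adjust the search so
# that it matches the failed-login events in your own data.
[failed_login]
search = sourcetype=linux_secure "Failed password"
```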

Set up alerts

Some of the saved searches in this application have alerts associated with them. All of the alerts are disabled by default; you need to enable the ones that you need. By default, these alerts send an email to an administrator. Change this address to the address of your system administrator(s). The easiest way to do this is to copy savedsearches.conf to a local file, such as $SPLUNK_HOME/etc/apps/local/savedsearches.conf, and edit that file as follows:

  • On lines starting with action_email, set the email address of the administrator who should receive the respective alerts.
  • Change enableSched=0 to enableSched=1 for the searches that you want to enable.
  • Update the schedule lines of the searches that you want to schedule, and configure the schedule that you need.
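Putting the three edits together, a sketch of one alert stanza. The search name and email address are placeholders; the key names are those described in the steps above:

```ini
# local/savedsearches.conf -- enable and schedule one alert.
# "PCI- Excessive Failed Logins" is a hypothetical search name.
[PCI- Excessive Failed Logins]
action_email = security-team@example.com
enableSched = 1
schedule = 0 6 * * *
```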

Adjust search owners

All the searches are owned by the user with ID 1, which is the admin user. You can update the savedsearches.conf file that you copied into your local directory (see "Set up alerts") by changing the lines containing userid to the user that should own the searches.

PCI on Windows

IMPORTANT If you are running on Windows, you have to change indexes.conf from:

  coldPath = $SPLUNK_DB/pci/colddb
  homePath = $SPLUNK_DB/pci/db
  thawedPath = $SPLUNK_DB/pci/thawedb

to:

  coldPath = $SPLUNK_DB\pci\colddb
  homePath = $SPLUNK_DB\pci\db
  thawedPath = $SPLUNK_DB\pci\thawedb

That is, use backslashes instead of forward slashes.
