Despite the growing interest in container technology, information is sparse about using containers with applications involving credit card data.

The latest version of the Payment Card Industry Data Security Standard, version 3.2, doesn’t mention containers at all, and the only mention of containers in its cloud computing guidelines refers to virtual machines.

Gartner analyst Joerg Fritsch, who has said that applications deployed in containers are more secure than applications deployed on the bare OS, also has noted that containers “add complexity” to compliance efforts. (The PCI Security Standards Council, which created the PCI standard, passed on our request to talk specifically about containers and PCI.)

It takes time for anyone but bleeding-edge early adopters to warm up to a new technology, and regulators and standards bodies tend to lag behind, notes Avi Deitcher, president of IT management consultancy Atomic Inc. The comfort level with containers, and especially with public clouds, for sensitive data is low, he said.

“I’ve seen absolutely zero people comfortable with pure separation using containers only. They’re not convinced that my container that’s sensitive and your container that is not that are in the same place can live together nicely. That will take some time,” he said.

He maintains that container technology, and the comfort level with it, must improve for this to happen. He points to kernel-level security, tooling and management, as well as reliability concerns, as current limitations.

Different Challenges

PCI DSS is designed to protect cardholder data and reduce credit card fraud. It was developed by the council, whose members include the major credit card brands, which require vendor compliance.

The standard applies to any company that stores, processes or transmits cardholder data. PCI DSS version 3.2, released last April, extended the requirement for multifactor authentication, revised sunset dates for Secure Sockets Layer (SSL) and early Transport Layer Security (TLS) protocols and called for greater focus on people, process and policy in card data security.
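In practice, meeting the TLS requirement means disabling SSL and early TLS on every in-scope server. A minimal sketch of what that looks like in an nginx configuration, with placeholder certificate paths:

    server {
        listen              443 ssl;
        ssl_certificate     /etc/ssl/example.crt;   # placeholder path
        ssl_certificate_key /etc/ssl/example.key;   # placeholder path
        ssl_protocols       TLSv1.2 TLSv1.3;        # SSL 3.0 and early TLS are never offered
    }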


Though the major cloud vendors have undergone independent audits to ensure they comply with PCI DSS, that might not extend to all their services. Amazon Web Services, for example, in touting itself as the first cloud service to attain version 3.2 compliance, announced that 26 of its services meet the standard, which doesn’t mean all of them do.

Basically, PCI-compliant cloud vendors enable customers to set up their own cardholder data environment, or CDE. However, using a PCI-compliant cloud vendor doesn’t equate to compliance for its customers; in the end, each organization is responsible for its own compliance. Small merchants that use a PCI-compliant service provider, however, might only be responsible for submitting a simple self-assessment questionnaire (SAQ), with the service provider responsible for demonstrating full compliance.

“Virtual machines are well understood from a security perspective, as well as from a PCI audit perspective. Virtual environments have been a part of PCI DSS measurement for a long time. Containers, being much newer, lack the same level of security and PCI scrutiny, which is why you see a lot of startups scrambling to solve container security and venture capitalists placing bets [on it],” said Andrew Nielsen, a former member of the PCI Security Standards Council and director of enterprise security at cloud information management vendor Druva.

Container security vendor Twistlock issued its own guide to containers and PCI compliance, and more recently a similar guide to containers and HIPAA, the healthcare privacy regulation. These guides go step-by-step through the requirements but largely focus on using Twistlock’s own solution.

Containers pose different challenges in the way people need to operate and secure them, but it’s really not PCI-specific, according to John Morello, Twistlock chief technology officer.

Typically, there are many more parts involved with containers, he said. An app that uses two to four virtual machines might use 20, 30 or more containers.


“One of the reasons people like containers is that they enable DevOps, the rapid iteration of software, but the environment and the software base that you’re managing change with a higher rate of frequency than it would with VMs,” he said.


And since the developer builds and hands over a sealed container image to run in production, the responsibility for securing the application sits much further upstream in the development lifecycle than it does with VMs. Securing and managing the application traditionally fall to the ops team; with containers, the developer has to secure the container image and update it as necessary.
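In practice, that upstream responsibility shows up in how the image itself is built. A hypothetical Dockerfile sketch of the kind of hardening a developer might bake into a PCI-scoped image (the base image, versions and file names are illustrative):

    # Pin the base image to a specific version rather than a floating tag.
    FROM node:18.19-alpine

    # Install only production dependencies; keep secrets out of the image.
    WORKDIR /app
    COPY package.json package-lock.json ./
    RUN npm ci --omit=dev
    COPY . .

    # Run as the image's unprivileged user instead of root.
    USER node
    CMD ["node", "server.js"]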

“So it’s not so much a flaw with containers, but you have to adopt a new way of operating to secure the application,” he said.

Not Just About Databases

While most PCI data is stored in databases, container use for databases so far has not become widespread, he said. It’s more common to see containers used for front-end applications and analytics tiers that use and store PCI-compliant data.

“PCI is not just about data at rest, it’s about data in motion, data being processed. It’s not really about a database, it’s about the entire application,” he said, adding that it can apply to the user interface and to tools used for fraud detection, performance monitoring, caching and transaction processing.

It even applies to logging, with Sumo Logic boasting of PCI 3.2 compliance as part of its multitenant SaaS security analytics solution.

Securing All the Components

The six goals of the standard are:

  • building and maintaining a secure network.
  • protecting cardholder data.
  • maintaining a vulnerability management program.
  • implementing strong access controls.
  • regularly monitoring and testing networks.
  • maintaining an information security policy.

The standard describes a baseline for securing system components in the cardholder data environment (CDE), which include systems that handle cardholder data, systems that segment the CDE, and systems that control access to it.

The first step in a PCI DSS assessment is identifying all those components, which could include a containerized Node.js app that serves an e-commerce website, a containerized MongoDB database that stores cardholder data to streamline checkout for current customers, or a containerized OpenLDAP service that authenticates access to hosts and applications in the CDE, the Twistlock guide points out.
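As a rough illustration of that scoping exercise, those same components might be declared side by side in a compose file, which makes it easier to see what falls inside the CDE (the service names, images and registry below are hypothetical):

    version: "3.8"
    services:
      storefront:     # containerized Node.js app serving the e-commerce site
        image: registry.example.internal/storefront:1.4.2
      orders-db:      # containerized MongoDB storing cardholder data
        image: registry.example.internal/orders-db:6.0
      directory:      # containerized OpenLDAP authenticating access to the CDE
        image: registry.example.internal/openldap:2.6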

The standard’s strict access controls can reduce points of entry for attackers, and network segmentation, though not required, can reduce your compliance scope, and with it the complexity, audit costs and overall compliance burden, according to Kurt Hagerman, chief information officer at cloud security vendor Armor.

The standard doesn’t prohibit the use of multitenant services, but stipulates, “Mechanisms to ensure appropriate isolation may be required at the network, operating system, and application layers; and most importantly, there should be guaranteed isolation of data that is stored.”


Containers ride on an operating system and common hardware, similar to what you see with a Type 2 hypervisor, and then provide multiple applications in a much smaller footprint than VMs, Nielsen explained. Given the transient nature of containers, organizations need a better understanding of the attack surface and how to secure these environments.

Because multiple containers live on a single host, perhaps one of the most container-specific PCI concerns would be Requirement 2.2.1: Implement only one primary function per server to prevent functions that require different security levels from co-existing on the same server.

A similar requirement in HIPAA means companies that deal with sensitive healthcare data on AWS, for instance, must have dedicated hosts, according to Deitcher.

Because a virtual machine is considered its own operating system entity, it would count as a viable level of segmentation, according to Morello.

“You want to keep containers of the same classification level grouped together. If you have a PCI environment, you would not want to have the customer-facing front end of the application on the Internet with typical web ports on the same host where you’re storing all the back-end data. You want to have some segmentation there,” he said.
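In a Docker Swarm environment, for example, that kind of grouping can be approximated with node labels and placement constraints; a sketch, using hypothetical node, service and image names:

    # Label the hosts dedicated to the data tier.
    docker node update --label-add tier=data data-node-1

    # Constrain the data-tier service so it only lands on those hosts.
    docker service create --name orders-db \
      --constraint 'node.labels.tier == data' \
      registry.example.internal/orders-db:6.0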

Twistlock contributed the access control capability used in the Docker authorization plugin framework to the open source Docker project, he said, and Twistlock’s commercial product uses that framework to provide granular role-based access controls.

For example, you could define a rule in Twistlock that would allow only containers labeled data tier to run on the data tier host.

“If someone tries to run a web-tier container on that host, it wouldn’t be allowed. You can also use that framework to prevent accidentally mixing dev and production, or running software that’s unauthorized for that environment,” he said. It allows users to define specifically which images are authorized and which registries can be used.
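To give a sense of what that framework looks like under the hood, below is a minimal sketch of a Docker authorization plugin, written against the plugin’s documented HTTP protocol, that only allows containers created from an approved registry. It illustrates the general mechanism rather than Twistlock’s implementation; the registry name and listen address are hypothetical, and a production plugin would normally listen on a Unix socket under /run/docker/plugins/.

    // A minimal Docker authorization plugin sketch: it approves every request
    // except container-create calls that reference an image outside an
    // approved registry. Registry name and listen address are hypothetical.
    package main

    import (
        "encoding/json"
        "net/http"
        "strings"
    )

    // authZRequest holds the fields of the daemon's authorization request
    // that this sketch inspects.
    type authZRequest struct {
        RequestMethod string `json:"RequestMethod"`
        RequestURI    string `json:"RequestURI"`
        RequestBody   []byte `json:"RequestBody"`
    }

    // authZResponse is the verdict sent back to the Docker daemon.
    type authZResponse struct {
        Allow bool   `json:"Allow"`
        Msg   string `json:"Msg,omitempty"`
    }

    const approvedRegistry = "registry.example.internal/" // hypothetical

    func main() {
        // Plugin discovery: advertise that this service implements "authz".
        http.HandleFunc("/Plugin.Activate", func(w http.ResponseWriter, r *http.Request) {
            json.NewEncoder(w).Encode(map[string][]string{"Implements": {"authz"}})
        })

        // Called by the daemon before it processes each API request.
        http.HandleFunc("/AuthZPlugin.AuthZReq", func(w http.ResponseWriter, r *http.Request) {
            var req authZRequest
            if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
                json.NewEncoder(w).Encode(authZResponse{Allow: false, Msg: "malformed request"})
                return
            }
            resp := authZResponse{Allow: true}
            // Only container-create calls carry the image name in the body.
            if req.RequestMethod == "POST" && strings.Contains(req.RequestURI, "/containers/create") {
                var body struct {
                    Image string `json:"Image"`
                }
                _ = json.Unmarshal(req.RequestBody, &body)
                if !strings.HasPrefix(body.Image, approvedRegistry) {
                    resp = authZResponse{Allow: false, Msg: "image not from an approved registry"}
                }
            }
            json.NewEncoder(w).Encode(resp)
        })

        // Called after the daemon processes the request; this sketch allows all responses.
        http.HandleFunc("/AuthZPlugin.AuthZRes", func(w http.ResponseWriter, r *http.Request) {
            json.NewEncoder(w).Encode(authZResponse{Allow: true})
        })

        http.ListenAndServe("127.0.0.1:8080", nil)
    }

The daemon is then started with the --authorization-plugin flag pointing at the plugin, and every Docker API call is checked before and after it is processed.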

Compliance comes down to good security no matter what technology is being used, Nielsen said.

He urges organizations using containers to focus on limiting unnecessary network traffic and connections to the systems that run containers, to leverage centralized authentication and authorization systems wherever possible, to implement the principle of least privilege for user accounts and processes, and to look for third-party tools that can scan and enforce security policies specifically on container technologies.
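At the level of an individual container, least privilege can also be expressed directly in run-time flags; a sketch, with a hypothetical image and network:

    # Immutable root filesystem, no Linux capabilities, no privilege escalation,
    # an unprivileged user, and only the network the application actually needs.
    docker run --read-only --cap-drop=ALL \
      --security-opt no-new-privileges \
      --user 1000:1000 --network payments-net \
      registry.example.internal/storefront:1.4.2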

Twistlock is a sponsor of InApps.

Feature Image: “Online Shopping Security” by Blue Coat Photos, licensed under CC BY-SA 2.0.

InApps is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Docker.