Don’t Let Big Data Become a Big Problem: How Encryption of Sensitive Data Can Address Apache Hadoop Security Concerns

Last updated: 16 May 2016


More meaningful business insights. More informed decision making. More revenue opportunity.

These are just a few ways companies are capitalizing on the big data opportunity.

However, mining value from big data is easier said than done. One popular solution is Apache Hadoop, an open source framework that provides scalable, cost-effective storage and fast processing of large data sets.

Free. Fast. Scalable. What more could you want from a big data management solution?

I can think of a pretty important one: security. According to the Breach Level Index report for that quarter, more than 200 million data records were stolen in the first quarter of 2014, a 233 percent increase over the same period in 2013.

Data breaches are clearly on the rise, yet enterprise-grade data security is not built into Hadoop. If your organization has deployed, or plans to deploy, Hadoop for big data management, you need a plan to protect your sensitive information from being mined by unauthorized and malicious users and services, particularly if you must meet strict compliance and regulatory mandates. Here’s why.

Hadoop distributes the data it stores across clusters of hundreds, and sometimes thousands, of data nodes. Each of those nodes is a potential entry point for a rogue insider or malicious outsider. If an unauthorized user or service gains direct access to a node, your company’s sensitive data-at-rest sits in clear view. That is a tremendous, and potentially costly, risk for your organization.
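To make the risk concrete: on an unencrypted cluster, HDFS stores each block as an ordinary file on a data node's local disk, readable by anyone with shell access to that machine. The Python sketch below illustrates the point with a hypothetical path (the real data directory is set by dfs.datanode.data.dir and varies by deployment):

```python
import glob

# Hypothetical DataNode data directory; the actual path is set by
# dfs.datanode.data.dir and differs per deployment.
DATA_DIR = "/hadoop/hdfs/data"

# On an unencrypted cluster, HDFS stores each block as an ordinary file
# named blk_<id> on the node's local disk. No Hadoop credentials are
# required to read them; OS-level filesystem access is enough.
for path in glob.glob(f"{DATA_DIR}/current/**/blk_*", recursive=True):
    if path.endswith(".meta"):
        continue  # skip checksum sidecar files, keep the data blocks
    with open(path, "rb") as block:
        print(path, block.read(64))  # first bytes of data-at-rest, in the clear
```

No Hadoop-level authentication or authorization stands between a local account on that node and those bytes, which is exactly why encrypting the data itself matters.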

But there’s good news: a new solution is available to address your Hadoop security concerns.

Today, we announced the extension of the SafeNet ProtectFile solution to include transparent, seamless encryption of sensitive data-at-rest in Apache Hadoop clusters. With ProtectFile, organizations can now secure high-value information without significantly impacting Hadoop performance or the end-user experience.
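ProtectFile's internals aren't detailed here, but the general technique it names, transparent file-level encryption of data-at-rest, is easy to sketch. The following is a minimal conceptual illustration using AES-GCM from the third-party cryptography package; the function names are placeholders, not ProtectFile's actual API:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Conceptual sketch only: a file-encryption layer seals data before it
# lands on disk, so data-at-rest is ciphertext while authorized readers
# still see plaintext. Here the sealing cipher is AES-256-GCM.
key = AESGCM.generate_key(bit_length=256)  # in practice held by a key manager, never on the data node

def encrypt_file(plain_path: str, sealed_path: str) -> None:
    nonce = os.urandom(12)  # must be unique per encryption
    with open(plain_path, "rb") as f:
        ciphertext = AESGCM(key).encrypt(nonce, f.read(), None)
    with open(sealed_path, "wb") as f:
        f.write(nonce + ciphertext)  # this is all an attacker sees on disk

def decrypt_file(sealed_path: str) -> bytes:
    with open(sealed_path, "rb") as f:
        blob = f.read()
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)
```

An intruder reading the sealed file straight off a data node's disk sees only a nonce and ciphertext; without the key, which lives outside the node, the contents are unreadable.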

Additional features of ProtectFile include:

  • Rapid deployment and implementation: Automation tools provide fast and easy roll-out and standard deployment to multiple data nodes in a Hadoop cluster.
  • No rearchitecting required: No changes are needed to your existing big data implementation.
  • Hardware-based centralized key and policy management: Maintain control of encryption keys for added security, and define tight access controls to guard against unauthorized or rogue mining of high-value data in a Hadoop cluster (a conceptual sketch of this pattern follows the list).
  • Compliance-ready capabilities: Support and enforce compliance mandates, such as HIPAA and PCI DSS, in your big data implementation.
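To picture how the centralized key management bullet above works in practice, a common pattern is envelope encryption: a master key stays inside a hardware-backed key manager and only wraps and unwraps short-lived per-file data keys, so long-term key material never touches a data node. The KeyManager class below is purely illustrative, not a SafeNet API:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

class KeyManager:
    """Illustrative stand-in for a hardware-backed key manager.
    The master key is created here and never leaves this component."""
    def __init__(self) -> None:
        self._master = AESGCM.generate_key(bit_length=256)

    def wrap(self, data_key: bytes) -> bytes:
        # Seal a per-file data key under the master key.
        nonce = os.urandom(12)
        return nonce + AESGCM(self._master).encrypt(nonce, data_key, None)

    def unwrap(self, wrapped: bytes) -> bytes:
        # A real key manager would enforce access-control policy here,
        # rejecting unauthorized users or services before unwrapping.
        return AESGCM(self._master).decrypt(wrapped[:12], wrapped[12:], None)

kms = KeyManager()
data_key = AESGCM.generate_key(bit_length=256)  # short-lived key used to encrypt one file
wrapped_key = kms.wrap(data_key)                # only this wrapped form is stored alongside the file
assert kms.unwrap(wrapped_key) == data_key      # the plaintext key is recoverable only via the key manager
```

Because every unwrap request passes through the key manager, access policy is enforced in one place rather than on every node in the cluster.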

There’s no doubt big data can provide big benefits to your business, and Hadoop is a powerful framework that can provide a faster, easier, more efficient way to manage your data. However, none of that matters if your sensitive data is compromised.

In addition to ProtectFile for Hadoop, we’ve also released an updated version of ProtectFile for Windows, which provides transparent, automated file-system-level encryption of server data across the distributed enterprise in Windows environments.

Read the press release to learn more about the latest ProtectFile offerings or download the solution brief, Securing Sensitive Data in Hadoop Clusters with SafeNet ProtectFile.
