Big Data Protector for Google Cloud Dataproc
As organizations leverage Big Data in the cloud to analyze ever larger quantities of data, effectively protecting sensitive data while maintaining its usability becomes increasingly challenging. Protegrity, the leading innovator of advanced data security solutions, offers the most comprehensive security package available for Google Cloud Dataproc, protecting assets and meeting regulatory compliance while preserving the performance and analytics vital to Big Data.
Protegrity Big Data Protector for Google Cloud Dataproc secures all sensitive data in Hadoop using advanced tokenization and encryption – at rest in the Hadoop Distributed File System (HDFS); in use during MapReduce, Hive, and Pig processing; and in transit to and from other data systems. This continuous protection keeps data secure throughout its life cycle, no matter where it resides or how it is used. Sensitive data is transparently protected with policy-based controls, while non-sensitive data can remain in the clear. This preserves maximum usability, enabling users and processes to continue mining the data for transformative decision-making insights.
- Apply comprehensive protection on sensitive data fields and files within Google Dataproc and HDFS
- Protect data in HDFS, MapReduce, Hive, Pig, Spark, Flume, and throughout the Dataproc ecosystem
- Utilize Protegrity Vaultless Tokenization, encryption, and masking for protection of the data itself
- Enable secure business analytics with transparent, high-performance protection, optimized for Google Cloud Dataproc
- Monitor and report on all activity on sensitive data throughout Dataproc
- Integrate Big Data protection into a centrally managed enterprise security solution for cloud, enterprise, or heterogeneous environments
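Protegrity's Vaultless Tokenization is proprietary, so the sketch below is only a conceptual illustration of the general class of technique: deterministic, format-preserving tokenization derived from a secret key alone, with no token vault to store or replicate. This toy version runs a balanced Feistel network over an even-length digit string (all names and parameters here are illustrative, not Protegrity's API):

```python
import hmac
import hashlib

def _prf(key: bytes, half: str, rnd: int, width: int) -> int:
    # Keyed pseudorandom function for one Feistel round.
    digest = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % (10 ** width)

def tokenize(key: bytes, digits: str, rounds: int = 10) -> str:
    # Balanced Feistel network over the two halves of an even-length digit
    # string: deterministic, reversible with the key alone, format-preserving.
    assert digits.isdigit() and len(digits) % 2 == 0
    h = len(digits) // 2
    left, right = digits[:h], digits[h:]
    for rnd in range(rounds):
        left, right = right, f"{(int(left) + _prf(key, right, rnd, h)) % 10 ** h:0{h}d}"
    return left + right

def detokenize(key: bytes, token: str, rounds: int = 10) -> str:
    # Run the Feistel rounds in reverse to recover the original value.
    h = len(token) // 2
    left, right = token[:h], token[h:]
    for rnd in reversed(range(rounds)):
        left, right = f"{(int(right) - _prf(key, left, rnd, h)) % 10 ** h:0{h}d}", left
    return left + right
```

Because the token keeps the length and character class of the original value, downstream jobs in Hive, Pig, or Spark can join, group, and analyze on tokens without ever seeing the clear data, and no lookup vault needs to be secured or synchronized. Production systems use standardized constructions such as NIST FF1 rather than this simplified sketch.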