When Peiter “Mudge” Zatko testified before a U.S. Senate committee last September about Twitter’s security shortcomings, he laid out a few concerns that should get the attention of businesses everywhere.
Zatko, Twitter’s former security lead, became known as the “Twitter Whistleblower” after he filed a complaint with the federal government listing a raft of security lapses at Twitter, saying the company willfully ignored them. He singled out two related issues that many companies probably face, even if they don’t know it: “They don’t know what data they have, where it lives or where it came from, so unsurprisingly they can’t protect it,” he warned. And further, he added: “Employees then have to have access to too much data in too many systems.”
In today’s business environment, enterprises have reams of data, terabytes and petabytes of information, sprawled across servers both on-premises and in the cloud, and any number of applications and storage solutions. The responsibility for safeguarding that information also gets fragmented across groups including security, privacy, and data governance, often working with different tools. But traditional solutions were not designed to monitor and control data in large-scale multi-cloud environments, nor to handle the rapidly evolving regulatory landscape. The result is a disconnected set of data analysis and control capabilities that leads to increased costs and complexity, as well as inconsistent data classification, lack of visibility, and limited scalability.
We can attack these issues by rethinking how to manage data and establishing a unified data controls framework based on a central view of all sensitive data that includes granular insights to support requirements for data security, privacy, governance, and compliance. This harmonized approach can help enterprises enforce controls and policies across distributed data sets, including applying zero-trust and least-privilege principles that can prevent unauthorized access. As Zatko told the Senate: “It doesn’t matter who has the keys if you have no locks on the doors.”
A unified data controls framework, which spans many technologies and systems, can improve security practices across the enterprise in the following ways:
- Deliver a consistent, accurate view of all sensitive data for use by data security, privacy, governance, and compliance teams.
- Abstract policies away from individual systems so they can be pushed into the various platforms that monitor and control access to sensitive data (see the sketch after this list).
- Automate data mapping, privacy assessments and other privacy requirements based on personal information.
- Automate compliance with privacy mandates, including data subject requests (DSRs) that have become a part of regulations such as GDPR and CCPA.
- Automate incident analysis and response before and after a breach.
- Track cross-border data handling issues to comply with multiple regulations.
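To make the “abstract policies” idea concrete, here is a minimal, hypothetical Python sketch of a single set of access rules evaluated against data assets regardless of which system holds them. The names here (Policy, DataAsset, check_access) are illustrative assumptions, not the API of any particular product; they simply show how centrally defined, least-privilege rules can be checked in one place.

```python
# Minimal sketch of an abstracted access policy evaluated outside any single
# data store. All names are illustrative, not a real product API.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataAsset:
    """A sensitive data set discovered in some system (cloud or on-prem)."""
    name: str
    system: str                  # e.g. "snowflake", "s3", "on_prem_oracle"
    classifications: frozenset   # e.g. {"pii", "financial"}


@dataclass(frozen=True)
class Policy:
    """One central rule: which roles may touch which classification."""
    classification: str
    allowed_roles: frozenset


def check_access(role: str, asset: DataAsset, policies: list) -> bool:
    """Least-privilege check: every classification on the asset must
    explicitly allow the requesting role, otherwise deny."""
    for cls in asset.classifications:
        matching = [p for p in policies if p.classification == cls]
        if not matching:
            return False  # data with no matching policy is denied by default
        if not any(role in p.allowed_roles for p in matching):
            return False
    return True


# The same policies apply no matter where the data lives.
policies = [
    Policy("pii", frozenset({"privacy_analyst", "dsr_service"})),
    Policy("financial", frozenset({"finance_analyst"})),
]

crm_table = DataAsset("customers", "snowflake", frozenset({"pii"}))
ledger = DataAsset("ledger", "on_prem_oracle", frozenset({"financial", "pii"}))

print(check_access("privacy_analyst", crm_table, policies))  # True
print(check_access("privacy_analyst", ledger, policies))     # False: no "financial" grant
```

The default-deny branch reflects the least-privilege principle mentioned earlier: data with no matching policy stays unreachable until someone explicitly grants a role access to it.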
How to implement unified data controls
A few steps can help build the foundation for a unified data controls framework:
- Know the company’s assets: Identify all data systems in multi-cloud environments, both cloud-native and non-native. This should include “dark data” assets generated by data systems that have been migrated to the cloud.
- Find and classify sensitive data: Discover and classify all sensitive and personal data, both structured and unstructured, including special categories defined by GDPR, CCPA, and other regulations (a rough sketch of this and the next two steps follows the list).
- Enrich the data: Add in all the metadata, labels and context needed for security, privacy, and governance use cases. This will enable the automated creation and enforcement of security and privacy policies and other governance activities.
- Map data: Map all personal data to the individual it relates to, so the organization can support individual privacy rights such as DSRs.
- Know user access points: Understand users and roles, and how their access entitlements map to sensitive data, so appropriate guardrails can be put in place.
- Manage data risk: Track risk across the organization’s data landscape and prioritize remediation.
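As a rough illustration of the classification, enrichment, and mapping steps above, the sketch below scans records with a couple of toy detectors, tags what it finds, and notes which individual the data relates to. The detectors, field names, and metadata keys are simplified assumptions for illustration; real discovery tools use far broader pattern libraries and machine learning.

```python
# Hypothetical sketch of "find and classify," "enrich," and "map data":
# scan records pulled from any store, tag detected personal data, and record
# which individual the data maps back to.
import re

# Toy detectors for two common personal-data categories.
DETECTORS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def classify_record(record: dict) -> dict:
    """Return the record enriched with classification metadata."""
    tags = set()
    for value in record.values():
        for label, pattern in DETECTORS.items():
            if isinstance(value, str) and pattern.search(value):
                tags.add(label)
    return {
        **record,
        "_classifications": sorted(tags),            # feeds security/privacy policies
        "_data_subject": record.get("customer_id"),  # maps data to its individual
    }


# Records as they might arrive from two different systems.
rows = [
    {"customer_id": "c-101", "contact": "ana@example.com", "note": "renewal due"},
    {"customer_id": "c-102", "contact": "555-01-2345", "note": "requested deletion"},
]

for row in rows:
    print(classify_record(row))
```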
Change always faces internal resistance, including financial arguments against spending on new technology and turf battles over projects whose costs cut across the budget lines of different organizational silos. To overcome those challenges, consider these best practices:
- Education: Every group involved needs to understand why this integration matters. Communicate how and when it will happen so each team can prepare.
- Architecture: Implement shared data processes that security, privacy, and data governance teams across the company can leverage.
- Integration: The team must integrate data, people and processes to improve efficiency and hold down costs.
- Financial: Communications should spotlight the financial benefits, such as offsetting tool costs, reducing liability for data incidents, lowering operational expenses, and enabling faster implementation of future use cases.
- Collaboration: Demonstrate how enhanced collaboration will open up channels of communication and information-sharing between groups internally.
New integration technologies make it possible to replace siloed data management approaches with a unified architecture purpose-built for today’s large-scale cloud environments. If the reaction in Congress to the Twitter whistleblower’s revelations is any indication, enterprises could soon face new requirements for data compliance and security policy enforcement. Now is the time to rein in disconnected data controls with a more rationalized model.
Mark Shainman, senior director, data governance practice area, Securiti