Omer Tene, VP & Chief Knowledge Officer, International Association of Privacy Professionals (IAPP)
Hollywood writers could not have scripted it better. Merely a month before the General Data Protection Regulation (GDPR) took effect in May this year, a data protection scandal roils the world. A whistleblower reveals the leakage of personal data from Facebook through Cambridge Analytica to malevolent actors aiming to influence the U.S. presidential election. What could better illustrate the crucial role of the GDPR in an age where data drives not only marketing and online commerce but also fateful issues for democracy and world peace?
Now, five months after the implementation date of the most fundamental legal reform of a generation, we can begin to assess what has changed and what still needs to improve. The main lesson of the Cambridge Analytica scandal is the key role of accountable data practices in an environment characterized by continuous, cross-border, multiparty information flows. As a result of the company's insufficient controls over the propagation of data through its chain of suppliers and customers, Facebook's brand and reputation suffered, and CEO Mark Zuckerberg was forced to testify before the U.S. Congress and the European Parliament.
Under the GDPR, data controllers are required to map, document, control, audit and oversee the flow of data through multiple layers of their value chain. It is no longer enough to simply sign an agreement with a customer or a vendor, like the one Facebook signed with a data researcher who went on to forward data to Cambridge Analytica, which monetized it in unwholesome ways. Companies now need to put in place additional technological, organizational and legal mechanisms to prevent wrongdoing further down the chain.
To account for and control data, which zips across the globe at the speed of light and can be copied and processed in different places at the same time, companies must put in place privacy management programs. Of course, Germany pioneered the role of data protection officers (DPOs) in organizations well before the onset of GDPR. But more is now required than this traditional, mid-level management role. In the U.S. and Europe, a new profession has emerged to staff privacy teams numbering dozens—and in some tech companies even hundreds—of employees. The most senior of them, Chief Privacy Officers, report to the highest levels of management, sometimes directly to the CEO, and communicate with boards of directors. They oversee teams of professionals distributed throughout the organization in different groups, including engineers and designers, and in various geographic locations.
These privacy professionals personify the new data economy. They require skills not only in law and policy but also in technology and business processes. It is no longer enough for a privacy professional to fill in registration forms or draft privacy policies; he or she also needs to deploy de-identification, understand blockchain and master encryption techniques. Importantly, as new technologies and business models proliferate, including AI, machine learning and the Internet of Things, privacy professionals need to venture beyond the law into ethics. What is the right process for robots to make fateful decisions about the lives of drivers and pedestrians? How should society's interest in advancing science and research be weighed against individuals' privacy concerns?
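To give a concrete flavor of what "deploying de-identification" can mean in practice, here is a minimal sketch of one common building block: pseudonymization of a direct identifier with a keyed hash. The field values and key below are hypothetical, and real programs layer this with key management, access controls and re-identification risk assessment.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain hash, a keyed hash resists dictionary attacks on
    common identifiers (emails, national ID numbers) as long as the
    key is kept secret and stored separately from the data set.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical example: pseudonymize an email address before sharing records.
key = b"example-secret-key"  # in practice, a securely managed secret
token = pseudonymize("alice@example.com", key)
```

The same input always maps to the same token, so records can still be linked across data sets for analysis without exposing the underlying identity.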
As the GDPR matures, so will the new profession of privacy experts, modern-day Renaissance women and men who occupy a critical focal point in our digital economy.