With Privacy: Do or Do Not, There Is No ‘Try’
Making the technological leap from privacy policy to enforcement guarantees
A recent Associated Press poll indicates that most Americans think their personal information is vulnerable online. What’s more, 71% of Americans believe that individuals’ data privacy should be treated as a national security issue.
In other words, the American people get it: data privacy and security are sadly lacking across the digital ecosystem and consumers are suffering the consequences. Now they want more than lip service about the respect companies afford their privacy. Consumers are demanding meaningful privacy guarantees from digital purveyors of all types. Here’s how industry can deliver.
The Problem Isn’t the Policy
Look at nearly any prominent digital provider and you can find a detailed privacy policy. Facebook has one, yet the social media giant has lurched from the Cambridge Analytica scandal to April's leak of over half a billion user records, all without serious repercussions. Google has privacy policies of its own, but they did not stop the company from violating the privacy of WhatsApp users or from monitoring users' internet traffic while their browsers were in so-called 'private' mode.
These are only a few examples of a widespread problem. Even as privacy advocates have made great strides in defining privacy expectations for a digital world (Privacy By Design, or PBD, being an excellent embodiment), privacy infringement has run rampant.
How can that be?
The fact is that PBD, effective as it is in establishing privacy goals, makes no specific technical recommendations about how those goals are to be achieved. The same can be said of the EU's General Data Protection Regulation, the California Consumer Privacy Act, the Biden Administration's guidance on Zero Trust architectures, and the variety of other regulatory, audit, and certification regimes under which organizations operate today.
The remedy here isn't more policy. The most robust law, regulation, or company statement cannot in itself protect privacy. As mere words, policies are little more than a promise that an organization will 'try' to avoid privacy infringements. Although these promises aren't necessarily empty (they may reflect good intentions), they can only suggest best practices and prescribe after-the-fact penalties for intentional or negligent failures. They don't actually guarantee privacy. For that, we need technology.
Software Can ‘Lock’ Data
Daily life provides any number of commonsense examples illustrating the difference between a policy and a guarantee. A convicted drunk driver is not permitted by law to operate a motor vehicle while under the influence of alcohol but may, when impaired, still choose to do so. Because the law alone does not protect the public from repeated offenses, we take extra, technological steps. Cars equipped with breathalyzer ignition locks, for instance, are designed to eliminate the option to drive drunk by barring an individual from starting their vehicle unless their blood alcohol level falls below a certain threshold.
Fortunately, data is just as much a physical entity as a car, and similar 'locks' can be applied to it. When a name, password, Social Security number, photo, or email thread, any information at all, is digitized, it becomes physical matter. It might take the form of pits on a DVD, magnetized particles on a hard drive, or pulses of light traveling over fiber-optic cable. Data may change form or be replicated in additional forms, but it remains physical all the while. And the only thing that enables humans to act on data is software.
That means the software developers behind every digital product we use can exert control over every byte of data their applications create or process. They can build into software the privacy rules that companies say they will follow, that governments require in their regulations, and that consumers are demanding with increasing urgency. Correctly integrated into the software, these rules become impossible not to follow.
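To make the idea concrete, here is a minimal sketch of a rule that is "impossible not to follow" because the only code path to the data runs through the check. The names (`ProtectedRecord`, `allowed_purposes`) are inventions for illustration, not a real API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProtectedRecord:
    """A value that can only be read for a purpose its owner permitted."""
    value: str
    allowed_purposes: frozenset  # e.g. frozenset({"billing"})

    def read(self, purpose: str) -> str:
        # The rule is evaluated on every access; no code path returns
        # the value without passing this gate.
        if purpose not in self.allowed_purposes:
            raise PermissionError(f"purpose {purpose!r} not permitted by owner")
        return self.value

email = ProtectedRecord("alice@example.com", frozenset({"billing"}))
print(email.read("billing"))   # permitted purpose: value is released
# email.read("ad-targeting")   # any other purpose raises PermissionError
```

In a real system the gate would sit in front of decryption keys rather than a plaintext field, but the principle is the same: the policy is enforced by the software, not merely described in a document.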
Privacy By Default and By Design
The model we advocate is one in which applications programmatically manufacture data with clear handling instructions that software can enforce throughout the data's lifecycle, so that, by default, information cannot be used by unauthorized people or for unintended purposes. Data then acts as its owner's proxy, and software applications automatically and reliably enforce the owner's wishes wherever the data goes.
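One common way to realize this model is to package the payload together with its policy, so the instructions travel with the data and any conforming software must evaluate them before revealing the contents. The sketch below is purely illustrative; the toy XOR "cipher" stands in for real encryption, and the field names are assumptions, not a standard format:

```python
import time
from itertools import cycle

def _xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption; do not use for actual protection.
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

def seal(payload: str, key: bytes, *, recipients: list, expires: float) -> dict:
    """Bind a policy to the data itself; both travel as one package."""
    policy = {"recipients": recipients, "expires": expires}
    return {"policy": policy, "ciphertext": _xor(payload.encode(), key)}

def unseal(package: dict, key: bytes, *, requester: str, now: float) -> str:
    """Release the payload only if the embedded policy is satisfied."""
    policy = package["policy"]
    if requester not in policy["recipients"]:
        raise PermissionError("requester not authorized by the data owner")
    if now > policy["expires"]:
        raise PermissionError("access window has expired")
    return _xor(package["ciphertext"], key).decode()

key = b"demo-key"
pkg = seal("alice@example.com", key,
           recipients=["billing-service"], expires=time.time() + 3600)
print(unseal(pkg, key, requester="billing-service", now=time.time()))
```

Because the policy is embedded in the package rather than recorded in some central system, every copy of the data carries its owner's instructions with it, which is what lets enforcement follow the data beyond the originating organization.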
This is a better alternative to current practice, in which software routinely unleashes massive quantities of unprotected data into the digital wild and the humans working in cybersecurity are left to somehow find and safeguard it later. Their task is time-consuming, expensive, and, once data proliferates beyond the originating system or organization, often impossible.
The reality is that no after-the-fact data protections will be universally effective. If personal data is not inherently protected, it is vulnerable to unpermitted access, misuse, and theft — in other words, to privacy violation.
It is worth noting that to demand application-level privacy protection is not to call for an end to privacy policy. Far from it. Application developers will continue to rely on policy experts to establish, clarify, and enhance the rules to be built into their software and will need intermediaries to help ‘translate’ such policy statements into code specifications.
In an era of increasing consumer concern over privacy, however, solutions with privacy built in, by default and by design, will win out in the marketplace. Consumers will turn to the Yodas of the digital economy, those who know that when it comes to privacy, they must ‘do or do not, there is no try.’