The Physicality of Data and the Road to Inherently Safer Authentication
By David Kruger (featured on Forbes.com)
Two different classes of identifiers must be tested to reliably authenticate things and people: assigned identifiers, such as names, addresses and social security numbers, and some number of physical characteristics. For example, driver’s licenses list assigned identifiers (name, address and driver’s license number) and physical characteristics (picture, age, height, eye and hair color and digitized fingerprints). Authentication requires examining both the license and the person to verify the match. Identical things are distinguished by unique assigned identifiers, such as serial numbers. For especially hazardous or valuable things, we supplement authentication by checking provenance: proof of origin and proof that tampering hasn’t occurred.
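To make the two-class check concrete, here is a minimal sketch in Python. The Credential record, the registry lookup and the height tolerance are hypothetical simplifications; a real system would compare measured biometrics rather than strings and would add a provenance check where warranted.

```python
from dataclasses import dataclass

@dataclass
class Credential:
    # Assigned identifiers (issued by an authority)
    name: str
    license_number: str
    # Physical characteristics recorded on the credential
    eye_color: str
    height_cm: int

def authenticate(credential: Credential, registry: dict[str, Credential],
                 observed_eye_color: str, observed_height_cm: int) -> bool:
    """Both identifier classes must check out; either alone is insufficient."""
    on_file = registry.get(credential.license_number)
    assigned_ok = on_file is not None and on_file.name == credential.name
    physical_ok = (credential.eye_color == observed_eye_color and
                   abs(credential.height_cm - observed_height_cm) <= 3)
    return assigned_ok and physical_ok
```

The design intent is the point: a matching name or a matching appearance on its own proves nothing; both classes of identifiers must agree before the person is accepted.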
Hazardous Data
In previous articles of this series, we established that computers are miniaturized manufacturing plants that produce physical data objects. Data objects representing confidential information were equated to highly hazardous chemicals (HHCs). If control of either HHCs or physical data objects is lost, great harm can result. Additionally, poor authentication in computing was equated to letting people with murky intentions into your shipping, processing and storage areas. Are these equivalencies accurate?
Thought Exercise
Imagine that HHC manufacturing plants are everywhere. HHC plant employee gates are opaque boxes; persons are admitted based on an employee’s name and number, but guards can’t match the person to their badge. Plants have numerous shipping terminals. Inbound raw material shipments arrive after a cursory, partial authentication and no provenance check. Finished HHCs are routinely shipped to a name and address without verifying who the addressee is or what they plan to do with the HHCs.
Given this scenario, would you be surprised if thieves and terrorists frequently impersonated employees? If HHCs were regularly stolen or ransomed? If sabotage via tainted raw materials or compromised process controls were common? If bad guys used purchased and stolen HHCs for bad purposes? Nope.
Imagine such HHC plants were reality. Hordes of furious citizens, litigators and legislators would be rightfully demanding that plant operators and owners “fix it or face the consequences.”
To verify equivalency, in the scenario above, substitute uncontrolled data for HHCs, computers for manufacturing plants, usernames and passwords for employee names and numbers, and internet connections and portable media for shipping methods. I’ve just described the quality of authentication in most of computing, as evidenced by the following:
- Cyberattackers frequently use stolen credentials to take control of software, computers or networks.
- Cyberattackers frequently exploit poor or absent authentication of inbound data such as links, patches, updates and new apps to onboard malware.
- Uncontrolled data stolen by malware or malicious insiders can be shipped, copied and reshipped to anyone, anywhere, and used for any purpose.
Authentication Deficit
To do work, computers require three physical actors working in unison: first, a user instructing; second, an instance of software managing; and third, an instance of hardware executing. Each of these actors, like all physical things, can be authenticated by a combination of assigned values, physical characteristics and, when required, provenance. Since co-opting only one actor can result in losing control of data, it’s logical to assume that authenticating all three is the norm, but, unbelievably from a safety perspective, it’s rare.
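A minimal sketch of that three-actor requirement, assuming hypothetical verifier functions (a biometric match for the user, a verifiable identity for the software instance, an attestation for the hardware instance): work is refused unless all three authenticate.

```python
from typing import Callable

def authorize_work(verify_user: Callable[[], bool],
                   verify_software: Callable[[], bool],
                   verify_hardware: Callable[[], bool]) -> bool:
    """Because co-opting any single actor can lose control of data,
    all three actors must authenticate before work proceeds."""
    return verify_user() and verify_software() and verify_hardware()

# Stolen credentials alone are not enough: the attacker's software and
# hardware instances still fail to authenticate, so the request is refused.
allowed = authorize_work(
    verify_user=lambda: True,       # credentials and biometrics matched
    verify_software=lambda: False,  # unknown software instance
    verify_hardware=lambda: True,   # hardware identity attested
)
print(allowed)  # False
```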
Inherently Safer Authentication
Inherently safer design (ISD) prescribes building in safety, not bolting it on. Why? To assure consistent safe operation and reduce costs. Per ISD, software (inclusive of apps, operating systems and firmware) and hardware would provide authenticatable identities and provenance as required, and would require the users, hardware and software they interact with to reciprocate. What if this were the norm?
- Remote cyberattackers possessing stolen credentials couldn’t log in because the software and hardware on the computer staging the attack couldn’t authenticate. Cyberattackers possessing a stolen computer and credentials couldn’t log in because biometrics wouldn’t match. Stolen credentials couldn’t enable access.
- When “operative” data is received (data capable of functioning independently or of modifying or instructing existing software, such as apps, patches or malware), it cannot operate unless it authenticates. Malware, if present, couldn’t function.
- When confidential self-protecting, self-directing data is sent, the data carries its own authentication requirements; it’s unusable unless the recipient authenticates (a rough sketch follows this list). Stolen data objects couldn’t yield usable information.
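Here is one rough illustration of the self-protecting data in the last bullet, using the third-party cryptography package: the confidential payload travels encrypted, and the key is released only when the recipient authenticates, so a stolen copy yields no usable information. The release_key_if_authenticated policy is a hypothetical stand-in for whatever authentication the data object demands.

```python
from typing import Optional
from cryptography.fernet import Fernet

key = Fernet.generate_key()
protected_object = Fernet(key).encrypt(b"confidential payload")  # shipped form

def release_key_if_authenticated(recipient_authenticated: bool) -> Optional[bytes]:
    # Hypothetical policy: release the key only to an authenticated recipient.
    return key if recipient_authenticated else None

def open_object(obj: bytes, recipient_authenticated: bool) -> Optional[bytes]:
    released = release_key_if_authenticated(recipient_authenticated)
    if released is None:
        return None  # stolen or misdirected copies stay unreadable
    return Fernet(released).decrypt(obj)

print(open_object(protected_object, recipient_authenticated=True))   # b'confidential payload'
print(open_object(protected_object, recipient_authenticated=False))  # None
```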
Is this possible? Of course it is. Biometric user authentication is broadly available, and calls to use it are frequent. Authenticatable identities for individual instances of software and hardware are less common but do exist. There are software license management tools that authenticate instances of software, and there are makers that prevent malware insertion by authenticating and checking the provenance of patches and upgrades. There are hardware makers that provide authenticatable identities. So the technology for inherently safer authentication exists, but its application is spotty and inconsistent. Why?
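One common way makers do the patch-provenance checking mentioned above is with digital signatures. A minimal sketch, again using the cryptography package and assuming the maker’s public key is already trusted on the receiving computer: the patch is refused unless its signature verifies.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Maker's side: sign the patch with a private key kept under the maker's control.
maker_key = Ed25519PrivateKey.generate()
patch = b"binary contents of a patch or upgrade"
signature = maker_key.sign(patch)

# Receiving computer's side: the maker's public key is known in advance.
maker_public_key = maker_key.public_key()

def apply_if_authentic(data: bytes, sig: bytes) -> bool:
    """Operative data that cannot authenticate is never allowed to operate."""
    try:
        maker_public_key.verify(sig, data)
    except InvalidSignature:
        return False  # tampered or unsigned: refuse to apply it
    return True

print(apply_if_authentic(patch, signature))                    # True
print(apply_if_authentic(b"tampered patch bytes", signature))  # False
```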
Not My Job
The first reason is that people don’t ask software and hardware manufacturers to build in authenticatable identities because they don’t know to ask. The second reason is that, unlike HHC plants, software and hardware makers aren’t typically held accountable for the harms that absent or poorly designed authentication causes. How do they pull that off? First, their license agreements usually require users to waive the right to recover damages as a condition of use, something no HHC plant could get away with.
Unnewsworthy
Second, uncontrolled data objects are hazardous and harmful (about $1 trillion of losses in 2020), but they are invisible. There are no compelling videos of toxic clouds of uncontrolled data or its victims filling hospitals, and computers don’t explode when their data is stolen, ransomed, destroyed or corrupted. Third, breaches are so common we’re all suffering from breach fatigue. Thus, even though the world is arguably far more at risk from uncontrolled data than from uncontrolled HHCs, there are no hordes of people demanding solutions — yet.
Be The Horde
If you’re reading this series, you’ll have noticed that each article calls leaders to action. Why? Because the cybersecurity pandemic is curable, but it won’t be cured without business and government leaders gaining new insights (thus these articles) and acting on them.