When it comes to fixing a root cause, there are two questions. The first is “Who is able to apply the fix?” The second is “Who is responsible for applying the fix?”
This article explains what we get wrong about cybersecurity, how and why we get it wrong, and what it’s going to take to fix it. Fair warning: it’s going to be a long and bumpy ride. Those bumps include a healthy dose of counterintuitive assertions, cybersecurity heresy, and no mincing of words.
Last year, the Biden administration issued an executive order, and later additional guidance, aimed at improving the nation’s cybersecurity. Federal agencies are now required to deploy Zero Trust architectures by the end of fiscal year 2024. As things go in government, so they tend to go in the private sector. Zero Trust is, therefore, the cybersecurity buzzword of the day.
Organizations are continually striving to assess and mitigate their cybersecurity risk, working to minimize the likelihood that their brand name will be splashed across newspapers nationwide because they’ve fallen prey to a high-profile hack. The trouble is that in cybersecurity, at least the way it is currently practiced, risk is not quantifiable.
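To make the point concrete, here is the conventional calculation most risk frameworks prescribe; the figures below are invented purely for illustration:

```python
# The textbook formula: risk = likelihood x impact.
# The problem: in cybersecurity, both inputs are guesses, not measurements.
likelihood = 0.30   # "30% chance of a breach this year" -- an estimate
impact = 4_000_000  # projected cost of a breach in dollars -- another estimate

risk = likelihood * impact
print(f"Annualized risk estimate: ${risk:,.0f}")  # $1,200,000
```

The output looks precise, but it is only as reliable as the two guesses that produced it.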
We’re pretty mission-driven here at Absio. We believe there is a real problem in cybersecurity, one that reaches back to the first computers. We’re eager to help organizations resolve the issues that arise when sensitive data created or processed by software doesn’t enjoy full-lifecycle protection. A big part of the solution to today’s seemingly endless cybersecurity breaches and privacy infringements is to reengineer applications to adequately, reliably, and automatically protect data, by default and by design.
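As one illustration of what protecting data by default and by design can look like at the application layer, here is a minimal sketch using Python’s widely available `cryptography` library. The key handling and record contents are simplified assumptions for illustration, not Absio’s implementation:

```python
from cryptography.fernet import Fernet

# In a real system the key would come from a managed key store,
# not be generated inline alongside the data.
key = Fernet.generate_key()
fernet = Fernet(key)

def create_record(plaintext: str) -> bytes:
    """Encrypt data at the moment of creation, so it is never stored in the clear."""
    return fernet.encrypt(plaintext.encode("utf-8"))

def read_record(ciphertext: bytes) -> str:
    """Only code holding the key can recover the plaintext."""
    return fernet.decrypt(ciphertext).decode("utf-8")

record = create_record("patient SSN: 000-00-0000")  # dummy value
print(read_record(record))
```

The design choice that matters here is that encryption happens inside the application at creation time, rather than being bolted on later by the database or the network layer.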
A recent Associated Press poll indicates that most Americans think their personal information is vulnerable online. What’s more, 71% of Americans believe that individuals’ data privacy should be treated as a national security issue. In other words, the American people get it: data privacy and security are sadly lacking across the digital ecosystem and consumers are suffering the consequences.
As digital solutions have become nearly ubiquitous, few terms have taken a more central place in our conversations than “data privacy” and “data security.” Consumers, businesses, and organizations of all types are tiring of the barrage of data breaches and process failures that result in unauthorized distribution of their sensitive information.
Two different classes of identifiers must be tested to reliably authenticate things and people: assigned identifiers, such as names, addresses, and social security numbers, and some number of physical characteristics. For example, driver’s licenses list assigned identifiers (name, address, and driver’s license number) and physical characteristics (picture, age, height, eye and hair color, and digitized fingerprints). Authentication requires examining both the license and the person to verify the match. Identical things are distinguished by unique assigned identifiers, such as serial numbers. For especially hazardous or valuable things, we supplement authentication with checking provenance: proof of origin and proof that tampering hasn’t occurred.
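A hedged sketch of that two-class check in code; the record fields and the height tolerance are illustrative assumptions, not a standard:

```python
from dataclasses import dataclass

@dataclass
class LicenseRecord:
    # Assigned identifiers
    name: str
    address: str
    license_number: str
    # Physical characteristics
    height_cm: int
    eye_color: str

def authenticate(record: LicenseRecord, observed_height_cm: int,
                 observed_eye_color: str, claimed_license_number: str) -> bool:
    """Pass only if BOTH classes match: the assigned identifier on the
    document AND the physical characteristics of the person presenting it."""
    assigned_ok = record.license_number == claimed_license_number
    physical_ok = (abs(record.height_cm - observed_height_cm) <= 3
                   and record.eye_color == observed_eye_color)
    return assigned_ok and physical_ok

on_file = LicenseRecord("A. Smith", "12 Main St", "D123-456",
                        height_cm=178, eye_color="brown")
print(authenticate(on_file, observed_height_cm=177,
                   observed_eye_color="brown",
                   claimed_license_number="D123-456"))  # True
```

Either class alone is forgeable; requiring both is what makes the check reliable.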
In previous blogs, we discussed the fact that data is physical and inherently controllable. Much like I can move a candy bar from the left side of my keyboard to the right, leave it there in anticipation, and slap away a hand intent on stealing it, it’s possible to physically control where data goes, where it remains at rest, and who can access it. What does this say about data ownership? Quite a bit, as it turns out.
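In code terms, physically controlling data might look like binding every copy to an owner-set policy. A minimal sketch, with hypothetical policy fields:

```python
from dataclasses import dataclass, field

@dataclass
class ControlledData:
    payload: bytes
    owner: str
    allowed_readers: set[str] = field(default_factory=set)

    def read(self, requester: str) -> bytes:
        # The owner decides who can access the data -- the digital
        # equivalent of slapping away the hand reaching for the candy bar.
        if requester != self.owner and requester not in self.allowed_readers:
            raise PermissionError(f"{requester} is not authorized by the owner")
        return self.payload

doc = ControlledData(payload=b"quarterly forecast", owner="alice",
                     allowed_readers={"bob"})
print(doc.read("bob"))   # permitted by the owner
# doc.read("mallory")    # raises PermissionError
```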
In every field of engineering, there is a grace period during which the engineers doing the heroic work of making a complex and highly valuable new technology work can escape liability for poor performance, failures, or damages caused by what they build. That grace period erodes as the technology becomes commonplace. Eventually, usually through a combination of litigation, legislation, regulation, and evolving insurance requirements, liability and responsibility for failure start being pinned to the engineers who designed and built the failed system.