The PivotNine Blog

The IT Tyrant Test

15 February 2022
Justin Warren

IT has a security problem, but why is our default solution more authoritarian control?

Why do we demand our employees give up their privacy in order to secure our IT systems when we resist this impulse by governments? Are other solutions not possible, or are they simply more challenging, and we too lazy to pursue them?

Do we care so little for our fellow workers that we are willing to place ourselves above them as somehow inherently more trustworthy?

What if we're wrong?

What if, instead of securing our company's systems, we are building a handy way for a skilled adversary to take over? What if we are systematically removing any way for other employees to help us defend against invaders? What if we are creating more and more work for ourselves in a futile search for the perfect system, impervious to all attack, yet uniquely vulnerable to an attack from within, an attack from us?

As I asked in an article last year (ostensibly about DNS monitoring and security): What if we become the baddies?

The Tyrant Test

I turned my mind to this idea after reading an article in the Georgetown Law Journal: Surveillance and the Tyrant Test by Professor Andrew Guthrie Ferguson of American University Washington College of Law. I'd just spent several weeks deep in the public policy weeds of Australia's electronic surveillance regime, and the article was timely. Its focus was on surveillance by governments, the State, but if you view a company as a State there are some useful parallels, I think.

As Professor Ferguson says, “The tyrant lens assumes that governmental power, including police power, must be checked (and checked again) because the government will misuse it against the less powerful.” Let us swap out the notion of government and police power for “IT power”, including the power of automated systems. For as we automate systems, we also need to consider what happens if they behave in ways we did not anticipate, and guard against it.

The power nexus of centrally planned IT systems makes it much easier to crack open the whole thing if you can get the domain administrator password. If you manage to elevate your privileges to backup admin, you can overwrite every file.

Terraform The Tyrant

What if Terraform became a tyrant? Authoritarian control might be more efficient, but it's also quite brittle. What happens if the king goes mad?

We have this situation today, with malware and ransomware. Systems designed to do our bidding turn against us, and we are rendered powerless to stop them when the very tools we use to stop the tyrant are the same tools the tyrant uses to oppress us.

These are not new issues, and though we are seeing some movement towards addressing them with the idea of Zero Trust, we are still approaching the problem from a perspective built on centralised, authoritarian control. I think we need to interrogate that impulse a little, because I'm not convinced its position as the default option is evidence-based and properly justified.

Our systems are built on trust, but swapping out our misguided trust in poorly secured infrastructure for trust in the breathless claims of vendor marketing seems ill-advised.

Imagining an Alternative

What if, instead of building a better tyrant, we were to design our systems to be tyrant-resistant?

This would mean giving up some of our power and control, and entrusting more people with a role in defending the system as a whole. It would mean giving users the tools and power to better protect themselves. It would mean designing systems that inherently resist blindly doing as they're told. Just following orders would no longer be acceptable. Instead of “trust, but verify” we would need to design a system in which trust was earned, and re-earned, constantly.

Mutually untrusting systems, constantly checking each other. “Is Dave evil today?” would become our starting position, requiring an affirmative verification that “No, Dave is not evil… yet” each time Dave asks us to acquiesce.
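To make that concrete, here is a minimal sketch, in Python, of what earned-and-re-earned trust might look like. Nothing in it is a real product or API; the names (affirm_not_evil_today, handle_request, the “policy-engine” verifier) are hypothetical. The point is simply that every request triggers a fresh, short-lived, affirmative check, rather than relying on a credential that was verified once and trusted forever.

```python
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical short-lived attestation: who vouched for the caller and
# for how long. Trust decays; it is never granted permanently.
@dataclass
class Attestation:
    subject: str
    verified_by: str
    expires_at: float

def affirm_not_evil_today(subject: str) -> Optional[Attestation]:
    """Ask an independent verifier to affirmatively vouch for the caller.

    In a real system this might consult device health, recent behaviour,
    or a second system's opinion. Here it simply issues a 60-second
    attestation for a known subject so the sketch is self-contained.
    """
    if subject != "dave":
        return None  # the default is "unverified", not "probably fine"
    return Attestation(subject=subject,
                       verified_by="policy-engine",
                       expires_at=time.time() + 60)

def handle_request(subject: str, action: str) -> str:
    # Re-verify on every request: no cached "Dave was fine last week".
    attestation = affirm_not_evil_today(subject)
    if attestation is None or attestation.expires_at < time.time():
        return f"denied: no current attestation for {subject!r} to {action!r}"
    return f"allowed: {action!r}, vouched for by {attestation.verified_by}"

if __name__ == "__main__":
    print(handle_request("dave", "restore last night's backup"))
    print(handle_request("mallory", "overwrite every file"))
```

The sketch deliberately inverts the usual default: denial is the resting state, and permission exists only for as long as the attestation does.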

It might not be possible, or not worth the trouble, but surely we should at least consider an alternative to the methods that have already failed us?