Sun Tzu wouldn’t like the cybersecurity industry
Cybersecurity loves its Sun Tzu quotes. What could feel more self-aggrandizing than imagining yourself as the commander of a stalwart army in a battle of erudition and cunning? We fancy ourselves as the “good guys” defending a battlefield of green and gold.
The military fetish soaks into the traditional cybersecurity discourse and doesn’t help our reputation as bellicose obstructionists. In my new book, I shun metaphors of warfare and violence in favor of nature, nurturing, and nourishing to vanquish these aggressive tendencies.
But I also recognize that cybersecurity humans are often stubborn and so if they love Sun Tzu quotes, then I’ll show them how much their beloved would scorn traditional cybersecurity programs.
Whether they intend it or not, many security leaders are the “weak” generals Sun Tzu disdained, the wimpish princes he spoonfed “Baby’s First Warfare Advice.” When we obstruct our organization, we are “the bad guys,” the weak-willed generals who strip their own city of liveliness and liberty as defense against an approaching foe.
In The Art of War, Sun Tzu propounds three ways a ruler can bring misfortune on their army:
- “…being ignorant of the fact that [the army] cannot obey. This is called hobbling the army.”
- “…being ignorant of the conditions which obtain in an army. This causes restlessness in the soldiers’ minds.”
- “…through ignorance of the military principle of adaptation to circumstances.”
All three of these pervade traditional cybersecurity. Let’s examine each in turn.
1. Ignorance of feasibility
Cybersecurity teams create policies and define procedures their colleagues cannot obey without hobbling their own productivity. The organization would flounder and lose competitive advantage if people actually followed those rules 100% of the time.
2. Ignorance of conditions
Cybersecurity programs are ignorant of conditions “on the ground.” They aren’t curious about work being done; they care only about work as they imagine it done. They don’t do user research. They blame “human error” without considering context.
3. Ignorance of adaptation
Cybersecurity programs stick with status quo ways, proposing container firewalls and slandering “shadow IaC” rather than adapting to evolving conditions. Rather than seizing new opportunities for better defense from modern software tooling, infrastructure, and practices, they cling to what they know – their cognitive comforts – and claim it won’t work for them because of their “unique” business contexts (which mostly amounts to being emotionally resistant to change).
Flinging victory away
These three mistakes, as Sun Tzu says, cause “the army” – or our colleagues, as is the case in cybersecurity – to become “restless and distrustful.” It is “flinging victory away.” Cybersecurity flings victory away every day. Better systems security is possible, but it requires a remodeling of both principles and practices – and a willingness and ability to learn how software and systems work in our modern era.
Elsewhere, Sun Tzu says that:
If you know the enemy and know yourself, you need not fear the result of a hundred battles.
In cybersecurity, we don’t even know ourselves1, and we maintain a menagerie of mildewy myths about attackers. We fumble in self-imposed darkness, clumsily grasping at how software and systems work; we can neither understand our own terrain nor the attackers’ calculus. The number of security tools with hooks into everything, running as root on critical systems, betrays this naiveté.
Cybersecurity programs stumble their way through each year, spirits sagging under the burdens of these three blunders. They spiral around the battlefield, plodding to the insidious percussion kept by vendors, research analysts, and journalists.
And it is insidious. The most remunerative way to sell out in cybersecurity is convincing security leaders that they can overcome each of Sun Tzu’s three ignorances to achieve victory (if you can accomplish this via generative AI then congrats on your seed funding!). It is a soothing message: you don’t have to change your thinking or methods – convenience is what you deserve and we’ll help you make it appear like security is still being done – ascend amongst the cardboard cutout fleshlings with your cardboard cutout security stack – blame the fast, ever-evolving winds of change for your failures.
We architect illusions; like Sun Tzu advised, deception is key to victory, but the victory in this case is avoiding accountability rather than successfully outmaneuvering attackers. If being a CISO is the “hardest job in corporate America” (it is not) when your success metrics are imaginary friends like “% of risk coverage” or “time to detect” (because who cares about actually acting?) then it is worth asking whether we are our own source of hardship.
We deceive ourselves – the most human endeavor of all – to maintain this convenience. Security leaders wail that more modern ways are impossible for “real” businesses, despite similar businesses – including those with legacy systems – transforming. It already feels so difficult; how much more difficult will it be once tangible outcomes are required? Will we still be relevant? Who are we if we are not needed?
This dynamic is creating a significant divide in security efficacy, an iteration (admittedly a rather esoteric one) on the “Two Americas” theme. If you feel that you cannot keep up with the business, technology trends, or software velocity, you are not alone; but you and your compatriots are languishing while others march onward towards victory. And it is this stagnancy that presents an opportunity for SRE and platform engineering teams to swoop in, better versed in software and execution alike.
How do we stop flinging victory away? We dismantle our ignorance to avoid these three mistakes:
1. Understanding of work-as-done
We understand how work is actually performed and how our systems actually behave, so we can craft security solutions that, by design, eliminate hazards or reduce hazardous methods and materials.
2. Understanding of local conditions
We research the local context of the users we care about, like those interacting with a system (whether developers, accountants, or anyone else). We never implement a solution – especially an administrative control, like a policy or training – without performing user research to understand the experiences of those who will be subjected to our prescribed solution.
3. Understanding of adaptation
We cultivate adaptive capacity, prioritizing investment in our ability to learn and drive change. We expend effort on iterative improvement rather than attempting perfect prevention. We embrace speed and all the practices, patterns, and tools that can help us quickly adapt to evolving conditions.
Transforming towards resilience
All three of these strategies can be summarized as transforming towards resilience. To connect the final stars in our concept constellation, I’ll close with a definition of Security Chaos Engineering to reveal why I’m pretty sure Sun Tzu would think it slaps.
Security Chaos Engineering (SCE) is a socio-technical transformation that enables the ability to gracefully respond to failure and adapt to evolving conditions. It embraces resilience as its philosophical foundation: that our digital world will constantly evolve and we must therefore “change to stay the same.” For that reason, I also describe SCE as “platform resilience engineering.”
What matters in the SCE world of resilience is continuously refined mental models of our systems; a learning culture and feedback loops; and willingness and capacity to change. This is precisely what Sun Tzu espoused: mental and strategic flexibility to seize opportunities while outmaneuvering adversaries to reduce their opportunities.
This isn’t something brand new I divined out of the ether. SCE is steeped in decades of scholarship and practice from other complex systems domains in the realm of resilience engineering and, as Sun Tzu shows, it draws on ancient wisdom (not just in warfare, but in matters of socioecological systems, too).
Such a transformation is entirely within the grasp of any organization; but the wild problem looming ahead is whether you’ll pursue resilience or convenience. Many humans prefer playing pretend, performing ritualistic motions as scaffolding for an otherwise aimless life. Craving convenience is only natural for most humans.
If that is your choice, I wish you well – but please do kindly step aside for the rest of us so we may sate our hunger to achieve real security outcomes and sustain systems resilience. A life of checkbox compliance may suit your tastes, but we demand more from life.
Enjoy this post? You might like my book, Security Chaos Engineering: Sustaining Resilience in Software and Systems, available at Amazon, Bookshop, and other major retailers online.
-
This made me think of the iconic clip from Mariah Carey: https://www.youtube.com/watch?v=-lposG3n5u4 ↩︎