A couple of weeks ago, my wife and I faced a decision that every technology executive knows intimately: choosing between two paths when both could lead to disaster, armed with incomplete information and running out of time.
Our cat, Mister Pickles, was dying. The immediate decision seemed binary: fight for his life or help him die comfortably. But the real torment wasn’t the choice itself; it was the uncertainty. If we knew he would recover, we’d fight. If we knew he would die, we’d focus on comfort. But we didn’t know, and that’s precisely when decision-making becomes excruciating.
Sound familiar? Replace “dying pet” with “failing core system,” “aggressive treatment” with “emergency migration,” and “comfortable death” with “controlled shutdown,” and you have the technologist’s dilemma during any major crisis.
The Paralysis of Imperfect Information
In that veterinary clinic, I experienced what every technology leader faces during a critical incident: the paralyzing weight of making irreversible decisions with incomplete data. You know the scenario: Your primary database is showing signs of corruption. Do you take it offline now, disrupting business operations but potentially saving data? Or do you keep it running while you gather more information, risking catastrophic failure?
Like Pickles’ condition, system failures rarely announce their intentions clearly. They give us symptoms, not certainties. We see performance degradation but not root causes. We detect anomalies but not their trajectories. We measure impact but not ultimate outcomes.
The hard truth I learned in those final days: It’s not just having information that matters; it’s believing it. How many times have we ignored early warning signs because they didn’t align with our preferred narrative? How often do we delay action because we’re optimizing for the wrong outcome?
The Certainty Trap
Katherine Collins observed, “We are fearful creatures and go to great lengths to preserve our sense of certainty, even when we know it to be false.” This perfectly captures what I experienced, and what I’ve seen in countless technology war rooms.
We cling to false certainties because they’re comfortable:
- “The system has survived similar loads before”
- “Our vendor assured us this patch was tested”
- “We’ve never had a breach through that vector”
Just as I told myself “Pickles bounced back before; he’ll do it again,” we create narratives that preserve our illusion of control. But high-stakes decisions demand that we abandon comfortable fictions for uncomfortable truths.
Beyond Human Error
The book Accelerate notes something profound about how organizations handle failures: “Accident investigations that stop at ‘human error’ are not just bad but dangerous. Human error should, instead, be the start of the investigation.”
When Pickles died, my first instinct was to find fault. Should we have taken him to the emergency vet sooner? Did we miss warning signs? But like system failures, death in complex adaptive systems rarely has a single cause. Our decisions weren’t wrong; they were made within a system that provided inadequate information flow.
The same applies to technology failures. When your cloud migration fails or your security is breached, the question isn’t “Who screwed up?” but “How did our information systems fail to provide better signals for decision-making?”
A Framework for Uncertainty
Here’s what I’ve distilled from this experience: a framework for making high-stakes decisions when you can’t wait for perfect information:
- Acknowledge the Fog of War: Accept that you’re operating with incomplete information. The clarity you’re waiting for may never come, or it may come too late to matter.
- Define Your True Optimization Function: Are you optimizing for system uptime or data integrity? For customer experience or security? For immediate relief or long-term health? We thought we were optimizing for Pickles’ survival, but we were really optimizing for our own hope.
- Set Decision Triggers, Not Timelines: Instead of “We’ll decide by 5 PM,” establish “We’ll act if we see X, Y, or Z.” This removes the artificial pressure of arbitrary deadlines while ensuring you don’t wait indefinitely.
- Separate Reversible from Irreversible: Some decisions can be unwound; others cannot. Focus your limited certainty on the irreversible choices.
- Build Better Information Flows: Most failures aren’t prediction problems; they’re detection problems. Invest in systems that surface weak signals early rather than confirming strong signals late.
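The decision-trigger idea lends itself to a concrete sketch: encode the conditions that demand action as explicit rules evaluated against live metrics, so the system tells you when the decision is made rather than a clock. The metric names and thresholds below are hypothetical, a minimal illustration rather than a production alerting setup:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Trigger:
    """A named condition that, when true, demands action now."""
    name: str
    condition: Callable[[dict], bool]

# Hypothetical triggers for a degrading database: act when the evidence
# crosses a line you chose in advance, not when a deadline expires.
TRIGGERS = [
    Trigger("replication lag", lambda m: m["replica_lag_s"] > 300),
    Trigger("checksum errors", lambda m: m["checksum_errors"] > 0),
    Trigger("error rate", lambda m: m["error_rate"] > 0.05),
]

def fired_triggers(metrics: dict) -> list[str]:
    """Return the names of all triggers that currently demand action."""
    return [t.name for t in TRIGGERS if t.condition(metrics)]

# If any trigger fires, the decision has been made for you;
# otherwise you keep watching, without an artificial countdown.
metrics = {"replica_lag_s": 42, "checksum_errors": 3, "error_rate": 0.01}
print(fired_triggers(metrics))  # → ['checksum errors']
```

The design choice matters: writing the triggers down before the crisis forces the team to agree, calmly, on what evidence would change their minds, which is far harder to do honestly at 2 AM in a war room.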
The Decision You Can’t Unmake
On that Friday evening, we made the decision to let Pickles fight through the night at home rather than rushing him to the emergency vet. By Saturday morning, it was too late. I’ll never know if that decision mattered, and that’s the point.
Every technologist has their own version of this story: the migration that corrupted data, the patch that brought down production, the security alert that was dismissed as a false positive. The details differ, but the underlying challenge remains: How do we make good decisions when we can’t know enough?
The answer isn’t to eliminate uncertainty; it’s to make peace with it. To build systems and processes that help us act decisively despite incomplete information. To recognize that in high-stakes situations, perfect information is a luxury we rarely have, and waiting for it is often the worst decision of all.
Pickles taught me that the hardest decisions aren’t between good and bad options; they’re between two goods (fight vs. comfort) or two bads (system failure vs. business disruption) when you can’t know which is which. That’s the reality of leadership in complex systems, whether biological or technological.
The question isn’t whether you’ll face these impossible decisions. You will. The question is whether you’ll have built the frameworks, processes, and emotional resilience to make them well.
Because unlike in our personal lives, in technology leadership, there’s always another crisis coming. And the next time you’re in that war room, facing that impossible choice, you’ll need more than hope and hindsight. You’ll need a way to act decisively in the face of radical uncertainty.
That’s the real lesson Pickles left me with: Sometimes there is no “right” decision waiting to be discovered. There are only the decisions we make with incomplete information, guided by our values, and the grace to accept that we did our best with what we knew at the time.