The Report that Wasn't There
Cyclic Disruption: A Series on Technology and Human Nature
The Situation
In the mid-1990s, I joined a telecom startup in pre-launch mode. Everything was new: networks being built, stores opening, acquisitions negotiated, revenue streams projected but not yet realized. The pace was frantic, and everyone—from operations to finance to marketing—demanded one thing above all: insight.
I was part of a decision-support team responsible for making that insight possible. In 1994, graphical user interfaces were still novel. Excel was barely used outside finance departments, and most executives relied on static reports delivered in three-ring binders or spreadsheets printed on green-and-white tractor paper.
We decided to push further. Instead of waiting a week for data to be compiled, why not build a system that could deliver performance metrics in near real time? The concept was called an Executive Information System, or EIS, and it promised a single window into the health of the business: network uptime, point-of-sale transactions, customer acquisition rates, revenue performance.
The idea was simple. The execution was not.
We spent months designing and building, convinced we were creating something transformative. If executives could see the business unfolding in real time, they could move from reactive planning to dynamic decision-making. No more waiting for Monday’s meeting to discuss Friday’s numbers. Decisions could happen in the flow of work.
When the system was finally ready, we unveiled it to senior leadership. Monitors lit up with dashboards and charts, showing the heartbeat of the business as it happened. We demonstrated drill-downs, real-time updates, interactive filters. The technology worked flawlessly.
We waited for excitement. Instead, we were met with polite nods, muted interest—and then a familiar question:
“Where’s the report?”
The Pattern Repeats
At first, we thought we had simply missed the mark on design. This was the era before agile methods, when development meant long cycles of heads-down building followed by a big reveal. We hadn’t engaged executives enough along the way.
But that wasn’t the whole story. The deeper issue wasn’t the interface. It was the process it disrupted.
Executives were used to decision-making that followed a specific rhythm: reports delivered at the end of the week, meetings scheduled to review them, planning cycles measured in months. The EIS didn’t fit into that cadence. It demanded a new way of working—one that was faster, more fluid, and less formal than what leaders were comfortable with.
The data was accurate. The technology worked. The adoption didn’t happen.
Twenty-five years later, I watched the same pattern unfold with generative AI. Organizations racing to adopt ChatGPT, Claude, and other systems, only to discover that the technology wasn’t the barrier—culture was. Teams asking for AI-generated insights but wanting them formatted as traditional reports. Leaders requesting automation but insisting on manual approval at every step. The capability was there. The readiness was not.
I was struck by the realization that this wasn’t a new challenge at all. It was the same story, playing out with different technology. The tools had evolved. The human hesitation had not.
From Executive Information Systems in the ’90s to business intelligence platforms in the 2000s to AI assistants today, the pattern has been the same: capability arrives, expectation lags, and adoption comes only once the form becomes familiar enough to trust.
And here’s the deeper truth: we don’t just resist new tools. We resist new formats. We build systems that deliver insight in revolutionary ways, then ask where the three-ring binder went. We create the future, then wait for it to look like the past.
A Familiar Reflex
History makes this pattern clear.
When Gutenberg’s press made books cheap and abundant, many scholars dismissed them as inferior to hand-copied manuscripts—not because the content was wrong, but because the form felt cheap. Early telephones were mistrusted for serious business because a phone call lacked the permanence of a written memo. Even the first spreadsheets were met with skepticism by finance leaders who trusted their ledgers more than glowing cells on a screen.
It seems that each generation experiences its disruption as a format violation. What repeats isn’t the novelty of the capability, but the familiar human reflex to trust what we recognize over what works better.
That reflex makes sense. For most of human history, novelty carried genuine risk. The same neural circuitry that helped our ancestors hesitate at the edge of a dark cave now fires when we face unfamiliar dashboards or AI-generated summaries. It buys us time to assess danger.
But when the danger is imagined and the opportunity real, hesitation becomes self-sabotage. What was once protection can become paralysis. For leaders, the challenge is to distinguish between format and function—to create space for caution without allowing familiarity to veto progress.
Why This Series
Four decades of working inside adoption cycles have shown me this pattern from multiple angles: as a developer building systems that leaders wouldn’t use, as a consultant watching organizations struggle to adopt tools they’d paid for, and as an executive navigating the gap between what technology promised and what culture could absorb.
What stands out isn’t just the resistance itself, but the way we misdiagnose it. We blame the technology for failing when culture isn’t ready. We confuse unfamiliar format with insufficient capability. We forget that adoption isn’t about the tool—it’s about whether we can recognize value when it arrives in an unexpected form.
This series, Cyclic Disruption, explores those moments. Each essay connects a story from my career to the dilemmas of today’s AI era. Not to predict what’s coming—but to recognize the recurring pattern that determines whether we’re ready when it arrives.
Because the future isn’t decided by the tools we create. It’s decided by whether we can recognize them when they don’t look like what we expected.
The Next Encounter
The next time you face a new tool or system and find yourself hesitating, pause and ask: Is it the capability I don’t trust, or just the form it takes? In my experience, the most advanced systems rarely fail on function. They fail when we’re not ready to see them for what they are.
Better tools will come. Whether we recognize them depends on whether we can see past the format. Sometimes the report you’re waiting for doesn’t exist. Sometimes the insight is already here—just not in the format you expected.
Previous post: The Machine that Learned
Next in this series: “The Fear We Forget” — Why every generation convinces itself its disruption is unprecedented.
About this series: Cyclic Disruption explores patterns in how humans adapt to transformative technology, drawn from four decades of experience in AI development, enterprise consulting, and leadership. Each essay examines a moment when capability meets hesitation—and what we can learn when we stop treating our own creations as strangers.