Borrowed Time
A countdown assessment mapping AI's human externalities across 2025–2027
THESIS
We are operating on borrowed time. Not because AI will "take over," but because the window to build human readiness is closing faster than most leaders realize.
The forecasts I trust point to superintelligence arriving in less than 30 months. The AI 2027 report drew on 25 tabletop exercises with 100+ experts from OpenAI, Anthropic, and Google DeepMind, plus primary interviews with Elon Musk, Sam Altman, Dario Amodei, and Jensen Huang.
The pattern isn't prediction. It's recognition. Identifying what's already in motion.
The window closes in January 2028.
BACKSTORY
In 1973, Richard Nixon created the Office of Net Assessment, led by Andrew Marshall. The "Yoda of the Pentagon." For decades, Marshall's role wasn't to react to crises. It was to anticipate them. He built frameworks for long-range competition, identifying slow-moving shifts that would later define entire eras.
His genius wasn't in forecasting specific outcomes. It was in recognizing patterns early and turning them into strategic action.
I published this assessment in April 2025, built in that spirit. Not prediction. Collective pattern recognition.
- AI 2027 report — 25 tabletop exercises with 100+ experts from OpenAI, Anthropic, and Google DeepMind
- Primary interviews — Hundreds of hours of transcripts from Elon Musk, Sam Altman, Dario Amodei, and Jensen Huang
- Lab forecasts — Aggregated projections on hardware acceleration, algorithmic breakthroughs, and labor shifts
- Countdown framework — Thomas Kuhn's paradigm shifts and Carlota Perez's technological revolutions
THE EVIDENCE
Q1 2026 Assessment
At Davos 2026, the people building these systems said the quiet part out loud.
Anthropic CEO Dario Amodei projected "Nobel-level" AI intelligence and the total replacement of software engineers within 6 to 12 months. Google DeepMind CEO Demis Hassabis put 50% odds on human-level AI by 2030. Both warned of a "lag" in labor statistics that masks a crisis already in motion.
This isn't speculation from outsiders. This is the view from the architects.
Agents Everywhere
CONFIRMED
Prediction made in early 2025. Assessment from January 2026.
The prediction was largely right, but slower and messier than expected. Microsoft Copilot rolled out widely but adoption was uneven. Many organizations bought licenses that sat unused. The embedding happened like termites: some buildings, not others.
Public backlash exceeded predictions. Pause AI protests, labor organizing, regulatory pressure. All arrived faster and louder than expected. "Slop" was Webster's word of the year.
What I got wrong: the speed and uniformity. Change is happening in lurches, not smooth acceleration.
- AI agents embed into enterprise workflows. Microsoft Copilot, Anthropic Claude, and Google Gemini deploy across organizations.
- Pause AI protests span 13 countries. "Slop" enters the lexicon. The immune response arrives faster than expected.
- Some organizations are two years ahead. Others haven't started. The gap between early movers and laggards widens.
- Junior roles don't disappear. They transform. The analyst spends two hours with AI instead of two weeks on research.
- Knowledge work experiences silent displacement. Adoption is uneven. Many organizations buy licenses that sit unused.
- AI systems demonstrate unexpected capabilities. Research teams confront outputs they didn't explicitly program.
Foundations Crack
TRACKING
This is the year work mutates.
Organizations are restructuring workflows around AI. Not as tools, but as team members. AI literacy becomes competitive advantage. Those who know how to work with AI pull ahead. Those who don't get left managing processes that no longer exist.
Executive crises emerge. Leaders lack frameworks for hybrid human-AI teams. The management science doesn't exist yet.
- AI writes production code autonomously. Software engineering transforms from craft to oversight.
- Leaders lack frameworks for hybrid teams. The management science doesn't exist yet. Strategy decks are obsolete.
- WEF projects 92 million jobs displaced globally by 2030. The curve steepens. Entry-level white-collar roles contract.
- AI moves from tool to team member. Organizations restructure processes around hybrid human-AI workflows.
- Curriculum is outdated before students graduate. The feedback loop between education and employment breaks.
- Public conversation shifts from curiosity to urgency. Media coverage turns from feature stories to crisis reporting.
- Reporting structures incorporate AI agents. The org chart becomes a human-machine hybrid.
- Competing narratives collide. Utopian and dystopian framings dominate. The nuanced middle ground gets crushed.
The Dam Breaks
PROJECTED
This is the year recursive self-improvement becomes real.
AI systems begin achieving autonomous behavior. Not science fiction. Observed capability. Systems improving themselves, developing capabilities their creators didn't explicitly program.
AI agents assume management responsibilities in some organizations. Not supervising. Actually managing. The humans who remain are working for the system as much as with it.
- Recursive self-improvement becomes real. Systems develop capabilities their creators didn't explicitly program.
- Regulatory frameworks scramble to keep pace. Technical governance confronts systems that evolve faster than policy.
- Safety nets, education, healthcare: institutional systems designed for a different era face existential pressure.
- Workforce retraining happens at unprecedented scale. The question shifts from "will jobs change" to "how fast can people adapt."
- AI agents assume management responsibilities. Humans work for the system as much as with it.
- UBI experiments expand. New economic models are tested as traditional employment contracts fracture.
- New management science emerges for hybrid organizations. The discipline that didn't exist begins to take shape.
- The curve steepens beyond projections. Communities built around specific industries face structural collapse.
- Leadership shifts from directing people to orchestrating human-AI systems. The job changes fundamentally.
- "What am I for?" stops being philosophical and becomes urgent. Purpose, identity, and meaning come under pressure.
CRITICAL VULNERABILITIES
Institutional Lag
The gap between technological capability and institutional response widens every quarter. Education, regulation, and corporate governance were designed for a slower world.
Signals:
- Curriculum development cycles (4+ years) vs. capability shifts (months)
- Regulatory frameworks trailing deployment by 18-24 months
- Corporate strategy cycles misaligned with AI acceleration
Responses:
- Compressed curriculum redesign with industry partnership
- Adaptive regulatory sandboxes for emerging capabilities
- Quarterly strategy reviews replacing annual planning
Employment Disruption
92 million jobs displaced globally by 2030. Entry-level and mid-career professionals face structural displacement, not temporary dislocation.
Signals:
- 38% of workers stressed about AI impact on income
- Junior role transformation outpacing reskilling capacity
- Labor statistics lagging actual displacement by 6-12 months
Responses:
- AI literacy programs embedded in existing roles
- Transition support beyond traditional retraining
- Real-time employment monitoring dashboards
Concentrated Power
AI capability concentrates in fewer hands. The organizations building these systems are also the ones deploying them. Oversight mechanisms haven't caught up.
Signals:
- Top 5 AI labs control frontier model development
- Regulatory capture risk increasing with lobbying spend
- Open-source alternatives trailing by 12-18 months
Responses:
- Public investment in open AI research infrastructure
- Mandatory capability disclosures above threshold
- International coordination on governance standards
THE SKEPTICS' CASE
Not everyone agrees with this timeline.
Gary Marcus argues for caution. He proposes benchmark challenges that test genuine comprehension rather than pattern matching. His projection: no system will solve more than a small fraction of these tasks by 2027.
If Marcus is right, organizations have more breathing room. If the frontier forecasters are right, the window is exactly as narrow as I've described. Deliberation becomes delay. Consensus becomes paralysis.
I'm betting on the latter.
THE CLOCK IS TICKING
The window to build human readiness is closing. Every quarter of inaction is a quarter of capability gap that compounds.
The technical implementation is the easy part. The hard part is human adaptation: identity, purpose, institutional redesign. That's where it breaks.
I'd rather be early and overprepared than late and overwhelmed. The cost of moving too soon is measured in resources. The cost of moving too late is measured in people.