This is the second of three pieces on how AI is restructuring the foundations of how we work, organize, and compete. Part 1 covers the mechanics — what AI actually does to speed and scale. Part 2 examines what breaks when those mechanics meet the structures we've built. Part 3 addresses what to invest in and protect.
I've spent most of my career inside or adjacent to various types of organizations — government, corporate, startup. Every one of them was built on the same basic assumption: humans are limited, so build structures to compensate. Corporations organize work. Governments organize society. And startups organize to disrupt and find market share. Hierarchies exist because no single person can hold all the information, make all the decisions, execute all the tasks, or exploit all of the opportunities. The structures are scaffolding for human limitation.
AI doesn't replace the human. It removes the limitations the scaffolding was built for.
The Corporation Was a Workaround
The modern corporation exists to solve a coordination problem. You need to organize labor, allocate resources, aggregate information, and make decisions faster than a loose collection of individuals could. The whole apparatus — management layers, departments, reporting structures, the Board — consolidates agency in the hands of a few because distributing it was too expensive and too slow.
That math is changing. When a single person with the right AI infrastructure can access information at the scale of an enterprise research team, synthesize it at the speed of a strategy department, and take action with the precision of a well-coordinated operations unit — the rationale for the corporation as an organizing structure starts to thin.
I'm not arguing corporations disappear. I'm arguing the case for them weakens in ways that will reshape who builds what and how. Individual agency is no longer bounded by how many people you can hire or how much capital you can raise to organize them. A person with an AI stack, domain expertise, and good judgment can now marshal resources that used to require a 50-person team — and that reach keeps growing as we learn more about the methods and potential AI offers. The economics of the one-person or three-person operation aren't a lifestyle choice anymore; they're a competitive structure.
The corporation's remaining advantage is coordination at scale for physical things: supply chains, manufacturing, large-scale infrastructure. For knowledge work, information work, analytical work? The individual is becoming more powerful than the institution. That should make every executive ask: what am I actually organizing, and does it still need to be organized this way? (See Jack Dorsey's post on how he is thinking about this opportunity.)
Governments and the Geography Problem
Governments were built around geography. You protect borders. You tax economic activity within them. You regulate industries that operate inside your jurisdiction. You provide services to people who live in a defined physical space. The entire model assumes that the things worth governing — people, money, risk, information — are geographically bounded.
AI (and its siblings in crypto and distributed systems) is dissolving those boundaries. Risk is now digital — a threat actor in one country can compromise critical infrastructure in another without crossing a border. Economic activity increasingly happens in spaces that don't map to jurisdictions. Information flows at the speed of light and doesn't stop at customs. (Check out Balaji Srinivasan's Network State idea for where he thinks this is all heading.)
The structural problem for governments is that they were designed to operate at a tempo and within a scope that no longer matches where risk and value actually live. A city government dealing with cybersecurity threats to its transit system — something I work on directly — is facing adversaries who operate at machine speed across international boundaries, while the city's procurement cycle runs on 18-month budget horizons. The mismatch isn't a policy failure. It's architectural.
The same acceleration that empowers the individual also empowers the adversary. And the institutions designed to protect us are structurally too slow to keep up, as they were built for a different physics of information and action.
The Human Remainder
Here's where I part ways with both the AI optimists and the doomers.
There is a facet of humanity that this technology cannot touch. Not "hasn't yet" — cannot. The thing that drives art, curiosity, love, the experience of being alive in a body with a consciousness that can wonder about itself. AI can generate a painting, but it cannot experience the impulse to paint. It can compose music, but the ache or joy that makes a human reach for an instrument doesn't exist in the model.
This isn't a sentimental argument. It's a structural one.
If AI compresses effort and amplifies output — which it does, as I argued in Part 1 — then the activities that cannot be compressed or amplified by AI become relatively more valuable. Not in an economic sense (though that too), but in a human sense. The promise of AI, if we build it right, is that more people get to spend more of their time on the things that make life worth living rather than the operational scaffolding that used to consume most of it.
AI doesn't replace the human — it replaces the workarounds humans built because they couldn't operate at the speed and scale the world demanded. Corporations were one such workaround. Bureaucracies were another. Entire professions built around information aggregation and synthesis were workarounds.
The human remains. The scaffolding shifts.
But that future only materializes if people are safe enough, economically stable enough, and educated enough to reach for it. "AI frees you to be more human" is a hollow promise if it also eliminates your livelihood and nobody builds the bridge to the other side. The augmentative case is real, but it's conditional. It doesn't happen by default.
What Holds, What Breaks
AI affects all systems — digital, physical, human. The digital ones are obvious: software, data, automation. The physical ones are next: supply chains, manufacturing, infrastructure where AI-driven optimization meets real-world constraints. The human systems are where it gets hard and where most organizations aren't yet looking.
The structures that hold are the ones built around judgment, relationships, and context that can't be fully digitized. The ones that break are the ones built primarily to compensate for human limitations in speed, scale, and information processing — because those limitations are being removed.
If your organization is mostly scaffolding, AI is coming for your architecture. If it's mostly judgment and human capital, AI is about to make you very dangerous.
The question then becomes: what do you actually invest in? What assets appreciate when the scaffolding collapses?
That's Part 3.