According to Fortune, Sir Martin Sorrell, founder and executive chairman of S4 Capital, declared at the Fortune Global Forum in Riyadh that meaningful AI regulation may already be impossible. “The cat is out of the bag. We’ve missed the Oppenheimer moment,” Sorrell stated, comparing the situation to nuclear weapons control. He argued that tech companies and figures like Elon Musk have become too large and influential for governments to regulate effectively, citing Apple’s $4 trillion valuation and Musk’s compensation plan geared toward a $10 trillion Tesla valuation. Sorrell predicted that synthetic people in advertising will become commonplace within 2-3 years, driven by 80-90% reductions in production costs. His remarks came as Nvidia became history’s first $5 trillion company, a milestone that followed comments from President Donald Trump about meeting with CEO Jensen Huang.
When Corporations Become Nation-States
Sorrell’s comments highlight a fundamental shift in global power dynamics that’s been accelerating for decades. When a single company’s market capitalization approaches the GDP of all but two countries, traditional regulatory frameworks become almost meaningless. The advertising veteran understands this better than most—he built WPP into a global communications empire and witnessed firsthand how multinational corporations can operate across borders with agility that nation-states can’t match. What’s different now is the sheer scale and concentration of power in the hands of tech companies that control not just markets, but the fundamental infrastructure of modern society—from AI models to cloud computing to the chips that power everything.
The Coming Synthetic Media Revolution
Sorrell’s prediction about synthetic people in advertising reveals how deeply AI will transform creative industries. The 80-90% cost reduction he cites isn’t just about efficiency—it represents a complete restructuring of how content gets made. We’re moving from an era in which production required physical sets, crews, and talent to one in which entire campaigns can be generated algorithmically. This goes far beyond simple CGI or deepfakes: fully synthetic personas with backstories, personalities, and audience relationships. The economic incentive is powerful enough that resistance looks futile. Brands will be able to create perfect, customizable influencers who never age, never cause scandals, and work for the cost of compute cycles.
Why Self-Regulation Is Inevitable
The uncomfortable truth Sorrell articulates is that traditional regulation requires leverage, and governments are rapidly losing what little leverage they had. When a company’s valuation rivals the GDP of Germany or Japan, fines become rounding errors. When the same companies control critical infrastructure that governments depend on, the power dynamic flips. We’re already seeing this play out in the EU’s struggles to enforce the Digital Markets Act and the Digital Services Act: the very companies being regulated often provide the technical infrastructure needed for compliance, a paradox in which regulators depend on the regulated to implement regulations. The executive leadership at these tech giants understands this dynamic perfectly.
The Oppenheimer Comparison That Matters
While Sorrell invokes J. Robert Oppenheimer in the context of missed regulatory opportunities, the more instructive parallel may be the nuclear industry’s subsequent development. Like atomic energy, AI offers tremendous benefits alongside existential risks. But unlike the Manhattan Project, AI development isn’t centralized under government control—it’s happening simultaneously across multiple companies and countries in a competitive race where slowing down means losing. The first-mover advantages are so substantial that ethical considerations become secondary, producing what economists call a “race to the bottom” on safety standards, where the most aggressive developers set the pace for everyone else.
The Economic Earthquake Ahead
The cost reductions Sorrell mentions for advertising production are just the tip of the iceberg. When creative work that previously required teams of specialists can be accomplished by algorithms, entire industries face restructuring. The advertising industry that Sorrell helped build will look radically different in five years. But more importantly, the concentration of wealth and power in the hands of a few tech companies creates systemic risks. When a handful of companies control both the AI infrastructure and the content creation tools, they effectively become gatekeepers for entire sectors of the economy. The business landscape is shifting toward a winner-take-all dynamic that makes previous tech monopolies look modest by comparison.
The Unregulated Future
Looking ahead, we’re likely to see the emergence of what I’d call “corporate sovereignty”—where tech companies operate as de facto independent entities with their own governance structures, economic systems, and even cultural influence. The synthetic people Sorrell predicts are just one manifestation of this trend. As these companies develop their own AI ethics frameworks, content policies, and community standards, they’re effectively creating parallel legal systems. The real question isn’t whether governments can regulate AI, but whether any single entity can control technology that’s becoming increasingly decentralized and autonomous. Sorrell may be right that the regulatory moment has passed—but the consequences of that reality are only beginning to unfold.