On Saturday, news slipped almost accidentally through the wires: first as a whisper from Bloomberg, then as an echo from Reuters. OpenAI is working with the UAE to build a five-gigawatt data center complex in Abu Dhabi. Five gigawatts. That's not "AI scale"; that's "nuclear state infrastructure" scale. They're calling it an extension of the Stargate project, which until now has been an almost mythic placeholder for "whatever Sam Altman is scheming behind the curtain." Texas was the testbed. This is the export.
By Sunday night, the OpenAI blog quietly lit up with a new program: OpenAI for Countries. The copy is pure alignment theater: help your nation benefit from frontier models, we'll give you the safety protocols, the responsible deployment tools, the values. But anyone who's been watching knows what's really happening. OpenAI isn't just selling models. It's selling power grids. Compute colonies. An end-to-end substrate for inference economies.
The size of this move isn't just about flops or latency. It's a geopolitical flag plant. Five gigawatts implies a long-term bet that the value bottleneck in AI isn't model size, or data, or even training know-how; it's physical instantiation. Energy throughput per token. Heat density per parameter. And most importantly, who gets to own the knobs. You don't need access to the weights if your sovereign Stargate comes with a license to press the button.
And then there's the carbon cost. Everyone parrots "Emirati solar" like a spell, but you don't run clusters this big on goodwill and photons alone. The offset markets are jokes, the regulatory scrutiny is non-existent, and the power math does not look cute. There is no "clean" AI at 5GW. There is only exportable guilt, wrapped in the language of responsible innovation.
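The power math is easy to check with a back-of-envelope sketch. The 5 GW headline comes from the announcement; the capacity factor and grid carbon intensity below are illustrative assumptions, not reported figures:

```python
# Back-of-envelope: what 5 GW of data center capacity implies per year.
# Only the 5 GW figure is from the announcement; the rest are assumptions.

capacity_gw = 5.0                # headline capacity
hours_per_year = 8760
capacity_factor = 0.8            # assumed: clusters run near-constant load
grid_kg_co2_per_kwh = 0.4        # assumed: rough fossil-heavy grid intensity

# 1 GW running for 1 hour = 1 GWh; 1000 GWh = 1 TWh
energy_twh = capacity_gw * hours_per_year * capacity_factor / 1000

# 1 TWh = 1e9 kWh and 1 Mt = 1e9 kg, so Mt CO2 = TWh * (kg CO2 / kWh)
co2_mt = energy_twh * grid_kg_co2_per_kwh

print(f"{energy_twh:.1f} TWh/year")   # ~35 TWh at these assumptions
print(f"{co2_mt:.1f} Mt CO2/year")    # ~14 Mt if grid-powered as assumed
```

At these assumed numbers, that is on the order of a small country's annual electricity consumption, which is the point: offsets and "Emirati solar" slogans don't make ~35 TWh/year disappear.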
What's elegant here (morally bankrupt, but elegant) is that the whole playbook mirrors imperialist infrastructure moves from a century ago. Sell the tech, lease the land, seed the dependencies, and keep the locals happy with governance dashboards and quarterly compliance check-ins. It's just oil colonialism in reverse: now silicon is the commodity, and the software barons are the ones laying rail.
This is where OpenAI outpaces its peers. Anthropic plays defensively, Google plays with itself, Meta just bleeds out GPU cycles on open-source hope. But OpenAI has always been about realpolitik. They built a model, then a company, then a network of cap tables that crossfade into government lobbies and energy financiers. Stargate isn't a product. It's a protocol for planetary-scale inference. The actual models are just content.
The fact that it dropped on a weekend, with barely a peep from the usual alignment wonks, tells you something else too: everyone's already been briefed. This isn't "sudden." This is staged. The real deals happened months ago behind doors you don't get to knock on. What you're seeing now is rollout.
Funniest part? In the blogpost, they say this isn't "exclusive." Any nation can request a Stargate. Democratic AI. Global infrastructure. Just bring your own land, your own solar fields, your own fiber. We'll do the rest. Think of it like AWS, if AWS had one client: the future of cognition.
And if you squint a little, you'll see the real grift buried beneath the scale. In this model, the only things that matter are energy, cooling, latency, and control. Everything else (fine-tuning, red-teaming, even safety) is just scaffolding. A story to make the spin acceptable. But look again. Five gigawatts doesn't lie. OpenAI isn't trying to build the best model anymore. They're trying to own the substrate all models run on.
Call it AGI. Call it compute colonialism. Call it Stargate. Just don't pretend it's neutral. Nothing this big ever is.