
Jobs and capital are becoming scarce as Google, Microsoft, Meta and Amazon trash their reputations, beg for tax breaks, and kick their stated values to the curb. They’re desperate for the cash needed to be first in line for Nvidia chips and the cream of AI talent.
The values that built tech, like honesty, fairness, even market competition and democracy, are being junked because it’s assumed AI is a game that can have only one winner.
The leading edge of technology operates by the 90-9-1 rule. The winner gets 90% of the profit. The company that finishes second gets 9%. Everyone else fights for scraps. It was like this from the 1950s to the 2000s.
But there are now four Cloud Czars, Microsoft, Google, Amazon, and Meta, each drawing sustenance from owning and renting networks of cloud data centers.
Yet Oracle isn’t hurting, despite having gotten into the game a decade late. Apple isn’t hurting, and it’s not even playing the game.
Could the central assumption in the Great Game of AI be a lie?
Hardware vs. Software

But those dynamics, the scarce chips and the enormous capital demands, are “Moore’s Second Law” effects.
Moore’s Second Law holds that, as chips grow more complex and circuits are packed closer together, the cost to start producing them grows exponentially, even while the cost per circuit declines. This was implied by Gordon Moore’s original article and has held true for over 50 years. In short, the rising cost of developing chips squeezes out competition.
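The dynamic can be sketched in a few lines. The doubling and halving rates below are purely hypothetical, chosen only to show the shape of the curves, not actual fab economics:

```python
# Toy model of Moore's Second Law (illustrative numbers only):
# assume that with each process generation the cost of building a
# new fab doubles, while the cost per transistor is cut in half.
fab_cost = 1.0            # relative cost of a leading-edge fab, generation 0
cost_per_transistor = 1.0 # relative cost per circuit, generation 0

for generation in range(1, 6):
    fab_cost *= 2             # barrier to entry grows exponentially
    cost_per_transistor /= 2  # per-circuit cost keeps falling
    print(f"gen {generation}: fab cost x{fab_cost:.0f}, "
          f"cost/transistor x{cost_per_transistor:.5f}")
```

Both curves are exponential, but they point in opposite directions: circuits get cheaper for buyers while the price of admission for makers explodes, which is exactly the squeeze on competition the law describes.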
Huang’s Law, Jensen Huang’s insight that accelerates Moore’s Law, doesn’t violate Moore’s Second Law. Nvidia simply sidesteps it, dealing with issues like power usage and chip size that Intel began facing even before the late Andy Grove retired.
But that’s hardware. That’s not software.
AI is software.

In their race to hire a small number of AI experts, algorithm designers, and brain scientists whose insights can accelerate development, the Cloud Czars have forgotten that software can have many winners. Every advance in software creates new niches, not just at the base of the stack but throughout it. You can say it’s like a tree, but it’s more like the giant Armillaria fungus in Oregon or the Pando aspen clone in Utah.
Note that all the Cloud Czars’ AI efforts are, like those of xAI and OpenAI, and even the Chinese labs, built around Large Language Models. This could prove to be a dead end. As Gary Marcus and others have argued, LLMs don’t get much better as they get bigger. They just hallucinate more, compounding their errors, polluting the data that acts as their water supply.
To create true AI, and not just another software niche, we need to understand more about how minds work. We’ve just modeled the brain of a fruit fly, but we’re nowhere near modeling human minds. Getting there will take new techniques and insights, combining biology, chemistry, even physics, transformed into algorithms. At every step, new software niches can be created, new opportunities opened.
The Winner is Clear

That’s not the way Nvidia sees it, either. Nvidia isn’t building just LLMs, and it’s not just involved in generative AI. Its work in robotics isn’t just about the mechanics of robots, but all their interactions with the real world. Modeling each of our senses is a separate science, opening new worlds of invention. The idea that working robots must look like people is also asinine.
While there will be many winners in many different niches within AI, the race may have already been run, the winner decided. And it’s not one of the Cloud Czars. It’s not one of the AI wannabes.
It’s Nvidia.
