The second Cold War will divide the world in two down to the silicon it runs on
It’s been clear for some time that the second Cold War has started between China and the US. Europe and most of the democratic capitalist countries are in the US camp, while the new Axis powers are China, Russia, Iran, and North Korea.
The new geopolitical rivalry is already well established, but it is less well understood how far-reaching the implications will be when the battleground is the most powerful technology the world has ever developed, one that is likely to power most of society going forward.
The FT writes: "The latest purchasing rules represent China's most significant step yet to build up domestic substitutes for foreign technology and echo moves in the US as tensions increase between the two countries. Washington has imposed sanctions on a growing number of Chinese companies on national security grounds, legislated to encourage more tech to be produced in the US and blocked exports of advanced chips and related tools to China."
During the Cold War, the atomic bomb was a deterrent on both sides: its existence ensured it would not be used. AI is different. AI will be used in everything we do, and the world will see two very different versions of it, from use cases down to the silicon it runs on. The world is entering uncharted territory in geopolitical rivalry because no one can predict how fast and how far AI will progress, or what it will enable, good and bad.1

Expect the race to only intensify. Soon, the battle will expand to energy production as AI usage explodes and it powers new domains. Ireland is the canary in the coal mine: data centers already consume around 18% of Ireland's total metered electricity, almost as much as all urban households combined, and that share is rising fast, with projections of 29% by 2028 and as much as 70% by 2030.
So far, with Transformer-based models, quality has scaled smoothly and predictably with model size. In turn, we can train and run inference on ever larger models by adding compute, which means that AI gets smarter as we add more compute. Practically everyone assumed this relationship would not hold nearly as far as it has, so we don't know what lies ahead as we build more fabs and add ever more compute. This assumes that we keep designing ever more powerful chips à la Moore's Law and don't hit the limits imposed by the laws of physics.
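To make the scaling intuition concrete, here is a minimal sketch of the power-law relationship reported in the scaling-law literature (the constants roughly follow Kaplan et al., 2020, and are used purely for illustration, not as this article's claim): loss keeps falling smoothly as parameter count grows, which is why adding compute keeps buying measurable quality.

```python
# Illustrative sketch of a Kaplan-style scaling law: L(N) = (N_c / N) ** alpha.
# The constants below are approximate values from the scaling-law literature
# and are shown only to demonstrate the shape of the curve.

def loss_from_params(n_params: float, n_c: float = 8.8e13, alpha: float = 0.076) -> float:
    """Predicted loss as a power law in non-embedding parameter count."""
    return (n_c / n_params) ** alpha

if __name__ == "__main__":
    for n in [1e8, 1e9, 1e10, 1e11, 1e12]:
        # Each 10x increase in parameters keeps lowering the predicted loss.
        print(f"{n:.0e} params -> loss ~ {loss_from_params(n):.3f}")
```

The exact exponent matters less than the shape: each order-of-magnitude increase in model size (and the compute to train it) has so far delivered a further, predictable improvement, which is the relationship the paragraph above says nobody expected to hold this long.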