AI’s $600B Question
🌈 Abstract
The article examines the current state of the AI bubble and the challenges it faces, including the widening gap between the revenue expectations implied by the AI infrastructure build-out and actual revenue growth, the easing GPU supply shortage and growing GPU stockpiles, OpenAI's outsized share of AI revenue, and the potential for further investment incineration.
🙋 Q&A
[01] The AI Bubble and Revenue Expectations
1. What is the key question the author is trying to answer? The author asks, "Where is all the revenue?" There is a significant gap in the AI ecosystem between the revenue expectations implied by the AI infrastructure build-out and actual revenue growth.
2. How has the "AI's $200B Question" evolved over time? The author's previous analysis showed a "$125B hole that needs to be filled for each year of CapEx at today's levels." The author now estimates that "AI's $200B Question" has become "AI's $600B Question" because of the following changes:
- The supply shortage of GPUs has subsided, but GPU stockpiles are growing.
- OpenAI still has the lion's share of AI revenue, and it's unclear how many AI products consumers are really using.
- The "$125B hole" has now become a "$500B hole" that needs to be filled.
3. What are the key factors contributing to the growing "hole" in AI revenue? The key factors include:
- The assumption that major tech companies such as Google, Microsoft, Apple, and Meta will each generate $10B in new AI-related revenue, and that other large companies will generate $5B each, is likely too optimistic (see the arithmetic sketch after this list).
- The upcoming release of Nvidia's B100 chip, which promises significant performance improvements, may trigger another cycle of surging demand and supply shortages.
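To make the gap concrete, here is a minimal back-of-envelope sketch that reconciles the figures quoted above: it treats the "hole" as the implied revenue requirement (the $200B and $600B of the article titles) minus the optimistic revenue assumptions. The count of "other companies" earning $5B each is an assumption for illustration only, since the summary does not name or number them.

```python
# Back-of-envelope reconciliation of the revenue "hole" described above.
# The $200B / $600B figures are the implied annual AI revenue requirements
# from the article titles; the per-company revenue assumptions come from the
# text above. The number of "$5B each" companies is an assumption for
# illustration -- the summary does not say how many there are.

BIG_FOUR_REVENUE = 4 * 10              # Google, Microsoft, Apple, Meta at $10B each
N_OTHER_COMPANIES = 6                  # assumption: six other large players
OTHER_REVENUE = N_OTHER_COMPANIES * 5  # $5B each

optimistic_ai_revenue = BIG_FOUR_REVENUE + OTHER_REVENUE  # ~$70B total

for implied_requirement in (200, 600):  # $B, per "AI's $200B/$600B Question"
    hole = implied_requirement - optimistic_ai_revenue
    print(f"Implied requirement ${implied_requirement}B "
          f"- assumed AI revenue ${optimistic_ai_revenue}B "
          f"= hole of roughly ${hole}B")

# The results land near the $125B and $500B holes quoted above, which is why
# the question grew from $200B to $600B as the infrastructure build-out scaled.
```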
[02] Comparing AI Infrastructure to Physical Infrastructure
1. How does the author view the analogy of "GPU CapEx is like building railroads"? The author agrees with the analogy up to a point, but raises several important caveats:
- Lack of pricing power: Unlike physical infrastructure such as railroads, GPU computing is increasingly priced as a commodity, leaving operators with little pricing power.
- Investment incineration: Even in the case of railroads, speculative investment frenzies led to high rates of capital incineration, a pattern that tends to repeat with new technologies.
- Depreciation: Semiconductor technology improves rapidly, so older-generation GPUs depreciate far faster than long-lived physical infrastructure.
2. What are the potential implications of these differences for the AI infrastructure build-out? The author suggests that these factors make investment incineration more likely, with the resulting harm falling primarily on investors rather than translating into sustainable value. However, the author also notes that declining prices for GPU computing are good for long-term innovation and for startups.
[03] The Road Ahead for AI
1. What is the author's overall outlook on the future of AI? The author believes a huge amount of economic value will be created by AI and that companies focused on delivering value to end users will be rewarded. However, the author cautions against the "delusion" that everyone will get rich quick, and emphasizes that the road ahead will be long, with ups and downs, but ultimately worthwhile.
2. What advice does the author have for those building in the AI space? The author encourages builders in the AI space to reach out, saying they would love to hear from them, and suggests that staying level-headed through this moment of speculative frenzy makes it possible to build extremely important companies.