558: Compute Per Capita, OpenAI, TikTok, Perplexity, Oil Masterclass, Alberta Natural Gas, xAI, Anthropic, Jensen Huang Interview, TSMC 2nm, and Hans Zimmer
"We've built planetary-scale applications"
Cowardice causes blindness.
—Oswald Spengler
🏇 You may think of knowledge as separate from action.
But they're bound together like a horse and its rider.
One guides while the other provides the power.
💾💻📲🎮🤖I’ve been thinking about the concept of “compute per capita” and how much my daily processing power budget has grown from the time I was a kid to today. 🗓️📈
As a young kid in the 1980s, my daily FLOPS (floating-point operations per second) had to be relatively modest. My Atari (🕹️) had a 1.19 MHz chip, the original Nintendo ran at 1.79 MHz, and the Super Nintendo had a speedy 3.58 MHz chip! My dad later had a computer that I played games on (at first a 386 DX/25 MHz). We also had a CD player which did some compute every time we played a CD (💿), but most other things were largely analog, including the TV and car.
In the 1990s, computers got faster, and video games became more complex and required a lot more compute (3D graphics, texture mapping, etc.). The internet came along, and I could now use both local and third-party computers to do math on my behalf, but it was still relatively modest. Most webpages were static, images were low-resolution, and there was little if any audio or video. Remember when waiting 30 seconds for a single JPEG to load seemed reasonable? 😬
In the 2000s, my daily compute budget exploded. Not only did my computers reach the gigahertz range and gain multiple CPU cores, but everything online also got way more complex and dedicated GPUs became more common. Websites became dynamic, encryption (which involves significantly more math) spread, and all kinds of apps started using audio (MP3s!) and eventually video too. Google became central to the internet, and every time you typed something in there, fleets of servers did a lot of math for you to find what you were looking for.
In the 2010s, smartphones became mainstream, as did streaming video (YouTube, Netflix) and cloud computing generally (☁️). The most popular apps (Facebook, Instagram, Twitter) became highly personalized to each user, with recommender algorithms using massive compute resources to generate millions of customized content feeds and targeted ads. Having always-connected computers in our pockets meant that we spent a lot more time interacting with computers — it wasn’t something you had to go sit down somewhere to do anymore, it became a reflexive habit or a background activity.
The analog cars of my youth have turned into computers on wheels, containing hundreds of semiconductor chips. Even the humans driving them are now largely guided by compute: complex GPS routing software connected to satellites and to the internet through cell towers that use a tremendous amount of compute to squeeze as much usable bandwidth as possible out of the radio spectrum.
The 2020s have brought us into entirely new territory with AI (🤖). So far, my own daily compute usage is many, many orders of magnitude higher than even just a decade ago, thanks to generative AI. Many queries that previously went into Google now go into Perplexity, Claude, ChatGPT, Gemini, and others, triggering incredibly powerful GPUs — each with thousands of compute cores — to perform trillions upon trillions of operations to answer my questions. Even vanilla Google searches now include generative AI snippets.
Asking an AI to answer ONE question uses more compute than my entire digital life consumed in the 1990s.
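For the curious, here’s a rough back-of-envelope version of that comparison. Every number below is an assumption picked for illustration, not a measurement; with conservative assumptions the two totals end up surprisingly close, and long reasoning traces or bigger models push the per-query figure well past the 1990s total.

```python
# Back-of-envelope only; all figures below are assumptions, not measurements.
# 1990s side: a mid-90s desktop at ~100 MFLOPS sustained, used ~2 hours/day for a decade.
# AI side: a ~100B-parameter model generating ~1,000 tokens, using the common
#          ~2 FLOPs per parameter per generated token approximation for inference.

pc_flops_per_sec = 100e6                 # ~100 MFLOPS (assumed)
seconds_of_use = 2 * 3600 * 365 * 10     # 2 hours/day for 10 years
nineties_total = pc_flops_per_sec * seconds_of_use

params = 100e9                           # model size (assumed)
tokens_generated = 1_000                 # answer length (assumed)
one_query = 2 * params * tokens_generated

print(f"Entire 1990s:  {nineties_total:.1e} FLOPs")   # ~2.6e+15
print(f"One AI query:  {one_query:.1e} FLOPs")        # ~2.0e+14
print(f"Ratio: {one_query / nineties_total:.2f}x")
# A reasoning model generating tens of thousands of tokens, or a much larger
# model, pushes the per-query number well past the 1990s total.
```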
What will the 2030s bring? 🔮
🏛️⚔️🏺 Friend-of-the-show Alex Petkas (we recorded a podcast together here) recently finished his series on Cato and it is EXCELLENT.
I recommend it, check out the first part (and if you like it, keep going):
The stories about the maneuvering and power struggles between these great men (Cato, Caesar, Pompey, Cicero) are just *chef’s kiss* 👨🏻🍳
💚 🥃 🙏☺️ If you’re a free sub, I hope you’ll decide to become a paid supporter in 2025:
🏦 💰 Business & Investing 💳 💴
🇨🇳 📱 ⌛ TikTok, Tic Toc 🕰️ (Impact on Meta?)
So wtf is going on with TikTok? There are so many plotlines running concurrently these days that it’s easy to forget some of the past season arcs…
Whatever happens, in theory it should happen soon. April 5th is the divestiture deadline for ByteDance to sell TikTok's U.S. operations or face a ban.
The White House said that a deal would be struck before the deadline and that there’s “tremendous interest”:
Blackstone is discussing joining ByteDance's existing non-Chinese shareholders, led by Susquehanna International Group and General Atlantic, in contributing fresh capital to bid for TikTok's U.S. business. The group has emerged as the front-runner.
We're likely to see a last-minute deal materialize that allows everyone to claim victory.
ByteDance will get a big check and retain some economic interest (the law allows for 20%) while surrendering operational control, U.S. politicians will declare a national security win, and 170 million American users (!!!) will continue their endless scroll.
But what precedent is this setting? Could France force Meta to do surgery and amputate Facebook and Instagram’s French operations at some point because they got in a fight with the company? Or maybe the EU as a whole? How about Turkey? India? Indonesia? Brazil? 🤔
We've built planetary-scale applications that remain vulnerable to divergence in national interests and geopolitical tensions. The TikTok situation might just be the first of many.
It’s also worth noting in the “very long shot” category: Perplexity is bidding for TikTok, even creating a presentation to describe their vision for it. They probably lack the firepower to make it happen, especially since it’s likely to come down to political favors (which makes Oracle well-positioned to get some juicy cloud deals).
But Perplexity’s ideas for TikTok are interesting:
Building a Transparent Algorithm: Complete rebuild of TikTok's algorithm in American data centers with domestic oversight. The "For You" feed would become open source, transforming TikTok into the most neutral and trusted platform by eliminating manipulation risks through transparency.
AI Infrastructure Upgrade: Nvidia Dynamo-powered infrastructure that would scale recommender models 100x with faster inference speed, dramatically improving TikTok's recommendation capabilities to make them the best in the world.
Enhanced Trust Features: Real-time citation systems for videos, combined with a context system leveraging both community feedback and AI capabilities. Similar to @AskPerplexity on X, this would provide users with reliable information verification while watching content.
Advanced Search Capabilities: Integration of Perplexity's answer engine with TikTok's vast video library.
User Experience Improvements: Shift focus from pure engagement to user satisfaction, with cross-platform integration between Perplexity and TikTok accounts. This would enable better personalization by leveraging user interests and behaviors across both platforms.
AI-Powered Content Enrichment: Enhancement of videos with automatic translation, annotation, and contextual information. Users could initiate in-depth research queries directly from videos, creating a seamless bridge between casual browsing and deep learning.
Platform Integration: Fusion of TikTok's creative community with Perplexity's knowledge infrastructure to create the world's greatest platform for creativity and knowledge discovery, offering users both entertainment and educational value in a unified experience.
Regardless of who buys it, the first and third ideas (the transparent, open-source algorithm and the citation/context features) are good ones.
🛢️Oil Masterclass: How Much Do You Know about Oil? 🛢️
A great primer on oil by friend-of-the-show Mark Nelson (we’ve done four podcasts together, here’s the most recent one).
They talk about:
The geology and chemistry of oil
Drilling technology and the fracking revolution (it made me curious about how drillers actually *know* where the drill bit is underground, and how they steer it. Maybe I’ll write more about this some other time, it’s fascinating technology)
The infrastructure to transport oil
Strategic petroleum reserves
Oil market dynamics and boom-bust cycles (and when oil went negative during COVID)
Oil’s environmental and geopolitical implications
The historical development of the oil industry
And if this whets your appetite for more, I recommend the Pulitzer-winning book ‘The Prize’ by Daniel Yergin. After reading that, you’ll know way more than you ever wanted to know about the viscous stuff.
🍁 Alberta’s Natural Gas Reserves Are Nearly 6x Larger Than Previously Thought 😮
Canada has A LOT of oil and A LOT of gas.
But apparently, it has so much more gas than we thought that it fundamentally changes the country's energy position on the world stage.
After the latest assessment of reserves, the country now has the ninth-largest natural gas reserves in the world, up from 15th previously.
New data suggests Alberta has proven natural gas reserves of 130 trillion cubic feet (TCF), compared to a previous provincial estimate of 24 TCF.
Adding in probable gas reserves — which encompasses supply with a reasonable likelihood of recovery — puts the overall figure at 144 TCF, according to the study commissioned by the Alberta Energy Regulator (AER) and conducted by McDaniel & Associates Consultants Ltd.
The new gas reserve estimate more than doubles Canada’s overall total [...]
Alberta Energy Minister Brian Jean put it this way: “We’re Texas-sized as far as gas goes; we’ve got some big numbers.”
The province is hoping to lure manufacturing, petrochemicals and data centers with the promise of abundant and relatively cheap natural gas. “Long term, we are the best place to invest in the world because as other resources dry up, Alberta is not going anywhere for a long time.”
On the oil side, the preliminary new estimates show proven oil reserves of 167 billion barrels, up from the last official estimate of 159 billion (final numbers are expected soon, once an audit of all basins has been completed). 🛢️🛢️🛢️🛢️🛢️🛢️🛢️🛢️
By comparison, Texas’ proven reserves in 2023 were 20 billion barrels (EIA numbers).
There’s a growing consensus in Canada that new pipelines and export terminals need to be built. Right now, about 90% of Canada’s oil is sold to the U.S. at an $11-15/barrel discount to WTI, and it’s very possible that a better price could be realized with more customers.
Canada exported about 4 million barrels/day to U.S. refiners in 2024, accounting for nearly half of total U.S. crude imports.
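To put a rough number on what that discount costs (back-of-envelope only, using the figures above; part of the gap reflects crude quality and transport costs rather than a captive customer, so treat this as an upper bound):

```python
# Rough math on the WTI discount, using the figures cited above (illustrative only).
barrels_per_day = 4_000_000          # exports to U.S. refiners in 2024
for discount in (11, 15):            # $/bbl discount to WTI
    annual = barrels_per_day * discount * 365
    print(f"${discount}/bbl discount -> ~${annual / 1e9:.0f}B/year")
# ~$16B to ~$22B per year; even narrowing the discount by a few dollars
# would be worth billions annually, which is the economic case for new pipelines.
```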
🐦 xAI + Twitter (oops, I mean X) 🤖💰
Musk merged two of his companies, selling X to himself in an all-stock transaction that values xAI at $80 billion and X at $33 billion ($45 billion minus X’s $12 billion of debt).
My immediate thoughts are:
His goal is probably for Tesla to eventually buy the whole enchilada in a stock transaction, as a way to bring Elon’s ownership of Tesla to where he said he wanted it (around 25%). This self-marked, over-valued currency would allow him to do that.
If xAI is worth $80bn, what tf is Anthropic worth? Their last round valued it at $61.5bn with $1.4bn ARR. Almost all of xAI’s revenue (estimated at around $100m ARR) comes from that licensing deal with Twitter (oops, I did it again) — a kind of mirror image to how a lot of Twitter’s current revenue probably comes from xAI paying it for access to its “real-time data” using some of the money it raised from VCs. One hand washes the other. 🤝
I get that xAI was able to spin up a massive training cluster faster than anyone else, but in a world where frontier models are increasingly commoditized and GPT-4.5 and Gemini 2.5 caught up with Grok 3 within weeks, I’m not sure if that’s enough to deserve an 800x revenue multiple 🤔
🤖🎨🏞️ OpenAI’s Viral Moment & $40bn Raise + Every Time an AI Company Throttles Users, Jensen Smiles 😎⛏️💰
OpenAI’s recent image model launch was the biggest viral moment for AI since the original ChatGPT in 2022. Sam Altman tweeted that the company’s servers were on fire and they had to throttle users and push back product releases.
100k GPU chunks! 🤯
Simultaneously, OpenAI raised $40 BILLION at a $300bn valuation 🤯
They expect revenue to triple to $12.7 billion by the end of 2025. Masa and SoftBank are, of course, the most aggressive and are leading the round with $30bn. 💰💰💰💰
Every time an AI company hits capacity constraints, one man is happy: Jensen 😎
High demand + fresh round of financing = lots of picks & shovels ⛏️⚒️🛠️
In the last couple of days, OpenAI, Anthropic and xAI have each posted on X that they’re running out of AI chips to power their chatbots or APIs and that they have to limit the usage of their services in one way or another.
I’ve long noticed that Anthropic’s usage limits are rather stingy, and Claude 3.7 Sonnet’s reasoning mode is tuned to conserve tokens (at least in the app; in the API, you can set the thinking budget manually). On the same query, Grok will often “reason” for minutes and generate pages and pages of tokens, while Sonnet 3.7 will often just do a paragraph or two.
I can’t help but think that if Anthropic had more inference capacity, they could tune their model to think longer on average (and not bury the reasoning button where most users won’t notice it) and their models would do better.
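For what it’s worth, the API already exposes that knob. Here’s a minimal sketch using Anthropic’s Python SDK of what setting the thinking budget yourself looks like (the model ID and parameter shape follow their documentation for Claude 3.7 Sonnet at the time of writing; check the current docs before copying this):

```python
# Minimal sketch: choosing the extended-thinking budget yourself via the API,
# rather than taking the app's conservative default. Requires the `anthropic`
# package and an ANTHROPIC_API_KEY in the environment.
import anthropic

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-7-sonnet-20250219",
    max_tokens=16_000,
    # Allow up to ~8k tokens of "thinking" before the visible answer.
    thinking={"type": "enabled", "budget_tokens": 8_000},
    messages=[{"role": "user", "content": "Walk me through this problem step by step."}],
)

# With extended thinking enabled, the reply contains separate thinking and text blocks.
for block in response.content:
    print(block.type)
```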
They may also be able to start being more competitive with Google on context window size. Gemini 2.5 has caught up with Claude 3.7 on coding AND has a much bigger context window.
Infrastructure matters 🏗️
🧪🔬 Science & Technology 🧬 🔭
😎 Casual Interview with Jensen Huang 🗣️
This was a good one, very different from the more polished media hits.
His explanation of scaling up vs. scaling out was great (scaling up: making each machine bigger and more tightly coupled; scaling out: networking more machines together), as was his description of why Nvidia’s full-stack optimization (co-designing software, algorithms, and hardware simultaneously) is unique and hard for competitors to match.
It’s also interesting to hear him think through the trade-offs they weigh when allocating die space to various types of compute units.
👩🏻🏫🤖 China to Launch AI Courses for Primary & Secondary School 🇨🇳 🚸🚌
The AI-native generation:
Schools in Beijing will introduce AI courses and teaching methods into the primary and secondary school curriculum starting September, to nurture young talent and boost growth in the advancing sector. [...]
AI tools will be used to “explore new scenarios of future classrooms”, which would see AI being incorporated into teaching methods - using AI learning research and teaching assistants for personalised learning.
Schools must provide at least eight hours of AI education a year, officials said, and can run courses independently or integrate them into current curriculum like IT and science classes. [...]
At the primary school level, courses will mainly focus on “enlightening students’ AI thinking”.
As students advance to the secondary school level, practical courses will be introduced, focusing on strengthening AI application capabilities and innovation methods.
✍️🤖 GPT-4o Gets Better at Writing…
The latest version of GPT-4o is pretty good at writing.
I gave it a simple prompt: give me some wonder. This is what it came up with (I’m sure it could do better with more iteration and a more specific prompt):
Above us, in the ink of night, galaxies spin like spilled silver, and comets carve signatures across the sky too vast to read.
Somewhere in that velvet sprawl, a star dies with a sigh, birthing elements that will one day become the iron in your blood, the calcium in your bones.
We are not beneath the cosmos—we are of it.
To look up is not to seek escape but to return to origin, to remember that your existence is a sentence in an infinite, ongoing epic written in plasma and gravity and improbable chances.
Wonder isn't something we chase—it’s the quiet recognition that we are walking miracles, wearing skin stitched from stardust, breathing the exhalations of ancient suns.
Wonder isn't escapism. It’s homesickness for the infinite.
The ink of the night, spinning like spilled silver, comets carving signatures across the sky. That’s some nice imagery! 👀
…and OpenAI will (finally!) Open Source an LLM 🕊️🔓
Altman also announced: “we are excited to release a powerful new open-weight language model with reasoning in the coming months, and we want to talk to devs about how to make it maximally useful”.
This is likely a reaction to DeepSeek and would be their first open model since GPT-2.
He also poked Zuckerberg: “we will not do anything silly like saying that you cant use our open model if your service has more than 700 million monthly active users. we want everyone to use it!” 🦙
🐜🇹🇼 TSMC to Ramp Up 2nm Chips in the Second Half of 2025
This is machine-translated from Chinese:
TSMC’s 2nm will enter mass production in the second half of 2025 as scheduled, customers are queuing up to place orders, and monthly production capacity is expected to reach 30,000 [wafers] by the end of this year [...]
According to the supply chain, the first 2nm mass-production plant is Fab 20 in Baoshan (Hsinchu Science Park), which had a monthly capacity of 3,000 wafers in mid-2024, is currently at about 8,000 wafers, and is estimated to reach 22,000 wafers by the end of the year. The Kaohsiung Fab 22 P1 plant is ahead of schedule and is now ready to start mass production, for a combined monthly capacity of about 30,000 wafers by year-end.
This is a big deal. Node shrinks are now rarer and more expensive, but they matter.
Compared to 3nm, 2nm is reported to provide a 10-15% speed increase at the same power consumption, or a 25-30% reduction in power at the same speed.
Customers are lining up, with Apple at the front of the queue, as usual:
Industry rumors suggest that the customer base that can afford 2nm is similar to the 3nm generation’s, with Apple’s iPhone still the main customer. The iPhone 17 series’ high-end Pro models are rumored to use 2nm processors, but given limited capacity and high costs, the chips for other models will stay on 3nm.
Intel, AMD, Qualcomm, MediaTek, and Broadcom are expected to adopt 2nm one after another. NVIDIA, the leader in AI GPUs, will still use TSMC’s 3nm process for the Rubin platform in the second half of 2026 and is not in a hurry to move to the 2nm generation.
🎨 🎭 The Arts & History 👩🎨 🎥
🎼 Hans Zimmer: Great In-Depth Conversation 🎶 🎞️
Hans Zimmer: “The job is not to listen to the director telling you what music he wants, because if he knows what music he wants, then he can do it himself. Right? My job is to listen to him tell me the story and then do the thing that he can't even imagine. That's the job.”
In this in-depth interview, Zimmer opens up about his creative process and the evolution of his career in the film industry. He talks about how his uniquely styled studio, designed to have a certain vibe and encourage collaboration, has been a key element and is very different from most other big, open, airy, sterile studios.
He also talks about behind-the-scenes stories from working on films like Inception, Gladiator, Dunkirk, and Interstellar. He nerds out about old-school synths and being an early innovator of sampling and electronic music.
It’s great stuff, especially if you enjoy learning about the creative process.
Beato's YouTube channel is one of the best. If you like rock, you should search his "what makes this song great" playlist and see if there's an artist you know/enjoy: you'll hear songs like you've never heard them before --> https://www.youtube.com/playlist?list=PLW0NGgv1qnfzb1klL6Vw9B0aiM7ryfXV_