Liberty’s Highlights

615: Nvidia Q4, Ben Evans on AI Economics, Meta + Google TPUs, China Solar at 1TW/Year, Anthropic vs DoD, AWS Outages, METR, Sora, and Marty Supreme

"none of that guarantees the AI labs will capture much of that value"

Liberty
Feb 27, 2026
∙ Paid

Life is so unlike theory.

—Anthony Trollope

👨‍🍳🔪🫑🥕🍅 Anything that makes cooking more fun is likely to make you cook more, so it’s worth a try.

I got this Chinese vegetable cleaver (also known as a Chinese Chef’s knife) to see if I like that form factor. I think I saw someone use one in a cooking video, and it looked fun and badass, so why not? ¯\_(ツ)_/¯

So far I’m really enjoying it. I had to get used to a slightly different technique than with the Western-style chef’s knives, but for veggies, the straighter and flatter edge works great. The blade is very thin — scary sharp!! — but unlike most thin-grind Western knives, there’s enough mass behind it to do some of the work for you.

I’m not ready to get rid of my Western-style knives, but it’s a nice addition to my kitchen.

If you’re curious, the model I got is the SHI BA ZI ZUO 7 Inch (steel: 40CR). It gets good reviews and is an excellent value.

🌡️🥵👨‍👧‍👦 I decided that my boys were old enough to try our sauna.

Of course, I didn’t go in unprepared. I read up on the risks and learned that children are not as good at heat regulation as adults (they don’t sweat much), what precautions to take, and what to watch for to avoid dehydration.

Reading up on what they do in Finland (🇫🇮) was particularly helpful. I knew they were sauna fanatics over there, but I didn’t realize many kids there start using the sauna before they’re 1 year old. Not for long, and on lower benches where it’s much cooler, and with constant adult supervision. But still!

The first time, I only took my oldest boy. It was a great moment together, 20 minutes in a quiet box, talking, joking.

I made sure that he drank a ton of water and took breaks on the lower benches or even outside, but his heat tolerance turned out to be quite good, and he just loved it.

In fact, he loved it so much that he asked to go again for the next 6 days in a row!

After a few days, his younger brother joined us, and he loved it too. He took more breaks, but to him, that was part of the fun. He loved going out in the snow and then coming back in the heat. ❄️🔥

💌📬💚 🥃 I told you about OSV Field Notes when it launched. It’s a weekly, high‑signal curation of things worth your time (sound familiar?).

We’ve hit our first milestone: 10 issues.

What's inside?

Across fifty pieces in those ten issues, my OSV colleagues and I have covered everything from how Freud's nephew weaponized psychology to reshape American democracy, to a cancelled sci-fi series from 2017 that's one of the best shows nobody has seen, to the Dutch engineers who had to build an artificial sun (☀️) to keep Moore's Law alive.

There’s also a vinyl deep-diver on YouTube, a Roman Republic epic novel that reads like Game of Thrones, a scientific paper on how your roommate's genes shape your gut bacteria, and a personal essay on the cutting-edge sleep drug that finally worked after decades of failure. And many more.

The variety is the point.

It’s a weekly reminder that interesting things are happening in every corner, not just the ones the algorithm has decided you care about, and that it’s worth digging into the past for evergreen gems. There’s nothing like a friend’s recommendation to cut through the slop deluge.

I think you’ll really like it. You can check out past issues here:

  • Issue 1, Issue 2, Issue 3, Issue 4, Issue 5, Issue 6, Issue 7, Issue 8, Issue 9, Issue 10.

Subscribe to get future issues (it’s free).


🏦 💰 Business & Investing 💳 💴

Meta 🤝 Google: Meta Adds TPUs to Its Compute Stack (As Internal ASIC Efforts Stumble)

Zuck was clearly talking to everyone, hiring everyone, building everything... just trying to brute force his way back!

Right after announcing deals with both AMD and Nvidia, it’s now Google’s turn to become an official compute partner for Meta:

The multi-year deal is worth billions of dollars, said a person who was briefed about it. Meta has also been talking to Google about buying TPUs for its data centers as soon as next year, though the status of that discussion couldn’t be learned. (Source)

This isn’t just a deal for inference, which should make Nvidia incrementally more nervous:

Indeed, the fact that Meta plans to use TPUs for AI training is notable because most analysts, doubting that anyone could compete technically with Nvidia on training, have said they believe the biggest opportunities for challenging Nvidia lie in using chips for inference, which doesn’t require large, interconnected clusters of servers.

Meta is also continuing to develop its own chips for AI inference to save on costs and diversify away from Nvidia’s chips.

Meta won’t be the first large customer of TPUs. Anthropic last year agreed to spend about $20 billion buying TPUs from Broadcom, which co-designs the chips with Google and oversees TSMC’s manufacturing of them.

But it’ll be interesting to see how competitive the TPU is with Nvidia’s GPUs by the time this all happens.

By many accounts, the current version of the TPU is the most competitive against Nvidia we’ve seen, but the next one could be significantly less so, as Google made more conservative design decisions just as Nvidia became more aggressive.

On top of that, the math is different for Google using its own chips internally and for a third party like Meta, which has to pay some margin to Google AND make changes to its AI stack to optimize it to run on the different architecture.

It can still be worth it for Meta, as there are multiple benefits in diversifying its compute suppliers and making its AI more flexible, but the trade-offs are real.
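To make the margin-stacking argument concrete, here’s a back-of-envelope sketch. Every number is hypothetical (the margin, the utilization figures, the function itself are illustrative assumptions, not reported figures); the point is only the shape of the math: an external buyer pays a supplier margin on top of the chip AND gets less useful work out of each chip until its software stack is tuned for the new architecture.

```python
# Hypothetical illustration: why TPU math differs for Google (internal use)
# vs. a third-party buyer like Meta. ALL numbers are made up.

def effective_cost_per_unit(base_cost, supplier_margin, utilization):
    """Cost per unit of *useful* compute, after supplier margin and
    software-stack efficiency (utilization) are accounted for."""
    return base_cost * (1 + supplier_margin) / utilization

# Google running its own chips: no external margin, mature internal stack.
internal = effective_cost_per_unit(base_cost=1.0, supplier_margin=0.0, utilization=0.60)

# Meta buying TPUs: pays Google a margin, and eats a porting/optimization
# penalty on utilization until its AI stack is tuned for the architecture.
external = effective_cost_per_unit(base_cost=1.0, supplier_margin=0.30, utilization=0.45)

print(f"internal: {internal:.2f}, external: {external:.2f}")
```

Under these made-up assumptions the external buyer pays roughly 1.7x per unit of useful compute, which is why the deal still has to clear that bar through supplier diversification and pricing leverage against Nvidia.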

Making chips is hard, as Meta is finding out firsthand:

Meta last week scrapped the most advanced chip it was developing for training AI models, after struggling with the chip’s design, and shifted its focus to a less complicated version, the people said. (Source)

Both Meta and Microsoft are struggling there. Amazon seems to be having more success with the Trainium, but they’re not out of the woods yet, and every generation has to earn its keep all over again vs the competition from Nvidia, AMD, and increasingly Google. Nobody gets to coast.

🔥 Thoughts on Nvidia Q4 Earnings 🤖

Demand for AI compute is so high these days that multiple sources, including Nvidia, report that Hopper and even the 6-year-old Ampere GPUs are sold out in the cloud, with rising spot prices.

Part of that is no doubt demand from agentic models that can spawn a bunch of sub-agents, use tools, and write lots of complex code, generating WAY more tokens than a chatbot user.
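A rough sketch of why that multiplies so fast, with entirely hypothetical counts (the sub-agent, tool-call, and per-call token figures below are illustrative assumptions): agentic workloads are multiplicative, not additive, in token consumption.

```python
# Back-of-envelope: agentic workloads vs. a chat session.
# ALL numbers below are hypothetical, for illustration only.

chat_tokens = 2_000            # a typical chatbot back-and-forth exchange

# An agentic run: an orchestrator spawns sub-agents, each making tool
# calls, and each tool call burns tokens on context re-reads plus output.
sub_agents = 8
tool_calls_per_agent = 12
tokens_per_tool_call = 3_000

agent_tokens = sub_agents * tool_calls_per_agent * tokens_per_tool_call
print(agent_tokens, agent_tokens // chat_tokens)
```

Even with modest per-step counts, the product lands two orders of magnitude above a chat session, which is consistent with older GPUs being pulled back into service to soak up inference demand.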

Nvidia’s Q4 report showed remarkable numbers by any standard.

