501: Elliott vs Texas Instruments, Nvidia Regrets, Elon Musk's xAI, Mercedes & Polestar EVs, Fiber, Noise-Cancelling, China Semiconductors, and Disney
"You can’t live life looking backward!"
It is easy to crush an enemy outside oneself but impossible to defeat an enemy within.
—Eiji Yoshikawa
🌧️☁️🔍 Clouds are heavy!
An average 1x1 kilometer cumulus cloud weighs about 1.1 million pounds.
That’s about 500,000 kilograms or 550 tons.
How do they float if they’re so heavy?
"They're falling very, very, very slowly," Lamers said. Anything falling to Earth reaches what's known as terminal velocity, or its fastest possible speed as it falls freely.
Terminal velocity occurs when drag force from the air perfectly counters gravity. The water droplets are so light that their terminal velocity is super slow — between 60 and 120 feet per hour (18 to 36 meters per hour) for a droplet with a 5- to 10-micron radius. Because clouds are usually thousands of feet high in the atmosphere, this small downward shift isn’t noticeable to the eye.
But something counteracts that slow descent. Updrafts of rising air keep the coalesced droplets suspended, even as they gradually fall. [...]
Cloud formation requires warm, moist air. Warm air is more buoyant than cold air, so it rises into the atmosphere and then condenses into a cloud as it cools. The cloud is less dense than the air below it
I vaguely knew this, but not the detail. TIL!
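As a quick sanity check on those numbers, here’s the back-of-the-envelope version (a rough sketch that assumes a typical fair-weather cumulus holds about 0.5 grams of liquid water per cubic meter and is roughly a kilometer on each side):

```python
# Rough check of the "1.1 million pounds" cloud factoid.
# Assumptions: ~0.5 g of liquid water per m^3, cloud ~1 km x 1 km x 1 km.
cloud_volume_m3 = 1_000 * 1_000 * 1_000      # one cubic kilometer
liquid_water_g_per_m3 = 0.5                  # typical cumulus water content

mass_kg = cloud_volume_m3 * liquid_water_g_per_m3 / 1_000
print(f"~{mass_kg:,.0f} kg")                 # ~500,000 kg
print(f"~{mass_kg * 2.20462:,.0f} lb")       # ~1.1 million pounds
```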
🏎️ 🖥️ New project: Upgrading my internet to fiber.
For the past 20 years, my internet service provider has been Teksavvy, one of the few independent ISPs in Canada.
I like that they offer good service, more flexibility for those who know what they’re doing and want to use their own equipment, and don’t play games with contracts. You pay month to month and that’s it.
They’re not the cheapest or the fastest, but I need my internet to be rock solid, and over the years I’ve come to trust them. I also like that they fight internet censorship in Canada and provide competition to the big incumbents.
Until recently, they only offered cable speeds up to 120 Mbps in my area. I know, compared to many places in the world, that’s pretty pathetic. But I just found out that they now offer fiber-to-the-home at 1.5 Gbps (aka 1,500 Mbps, 12.5x faster than my current connection!).
It’s funny how internet bandwidth works.
I found each speed level I’ve had fast enough… but every time I upgraded to something faster, it was soooooo much better and I felt like I could never go back.
Technically, there’s nothing I’m doing that I couldn’t do with 30 Mbps. Even streaming 4K video can work at 15 Mbps (although you may have a bad time if multiple devices are streaming concurrently).
Once in a while, I like to go up to the next plan and marvel every time I download a long podcast or a large software update, or do a full-drive cloud backup. The euphoria lasts a while each time, so I decided to go up one step at a time rather than going straight to the fastest plan (to maximize subjective satisfaction!).
1,500 Mbps sounds like overkill, but Teksavvy doesn’t offer anything between 120 and 1,500 where I am (it’s a long story; there’s a lot of regulatory capture in Canadian telecom). But it’s okay to go a little overboard on something you use all the time. Friction can take you out of a flow state.
The Wi-Fi 6 standard can’t even dish out that much data to a single device, which is why I’m starting to plan out how to eventually fish an Ethernet cable up to the attic and then back down to my office so that my desktop can have a wired connection (yes, my primary computer isn’t a laptop — I’m a dying breed!). That’ll be part 2 of this project…
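To make that 12.5x jump concrete, here’s a rough download-time comparison (idealized math that assumes the connection actually delivers its full advertised rate, which real-world Wi-Fi and server limits rarely allow):

```python
# How long would a 60 GB download take at each speed?
file_gb = 60                              # e.g., a big game or a chunk of a cloud backup
file_bits = file_gb * 8 * 1e9             # gigabytes -> bits

for name, mbps in [("cable @ 120 Mbps", 120), ("fiber @ 1,500 Mbps", 1500)]:
    minutes = file_bits / (mbps * 1e6) / 60
    print(f"{name:>18}: {minutes:5.1f} minutes")   # ~66.7 min vs. ~5.3 min
```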
💚 🥃 🚢⚓️ If you’re getting value from this newsletter, it would mean the world to me if you become a supporter to help me to keep writing it.
You’ll get access to paid editions and the private Discord.
If you only get one good idea per year, it’ll more than pay for itself. If it makes you curious about new things, that’s priceless.
🏦 💰 Liberty Capital 💳 💴
🐜 Elliott goes activist on Texas Instruments 🤔
Announcing a $2.5bn stake in TI, Elliott sent a letter to management asking them to modify their capital expenditure plans in order to increase free cash flow.
I don’t know if this is really what we traditionally think of as “activism” — it’s rare to see with a stock that is at all-time highs, and the letter is mostly praise — but maybe it’s “nudge-tivism”.
Here are some highlights:
In 2022, TI announced a significant expansion of its manufacturing capacity with a plan that ultimately called for six new 300-millimeter fabrication facilities in the U.S. By spending $5 billion per year through 2026 and several billion dollars annually thereafter, TI would nearly triple its internal manufacturing capacity by 2030.
At the time, the benefits of this plan seemed clear and compelling: TI would extend its scale advantage by transitioning to 90% internal wafer capacity, of which 80% would be cost-advantaged 300-mm capacity. Strategically, TI would own more geopolitically dependable analog capacity than any other company in the world by far.
So what’s the problem?
However, since announcing the substantial ramp in capacity in 2022, TI’s free cash flow per share, “the best measure to judge a company’s performance,” has declined by more than 75%.
More importantly, shareholders have been left with limited visibility or guidance from TI about when free cash flow per share will return to its historical trend line.
Critically, TI appears to be building capacity far in excess of expected demand, with targeted revenue capacity of $30 billion in 2026. This level represents 50% excess capacity above consensus revenue expectations (TI is building to similar levels of excess capacity by 2030 as well).
This is the magnitude of the investment cycle that they object to:
As it has in the past, TI is investing through a downcycle:
TI has remained committed to this level of spend even as the analog market has suffered from one of the largest down cycles in the last decade. When TI first announced its capital-investment plan in 2022, consensus expectations for 2026 revenue were $26 billion. Today, expectations have declined by 24% to $20 billion.
Do you know what “consensus” forecasts and expectations do, especially in a cyclical industry?
They change. Wildly.
You can’t modify your long-term capital plan every time they bounce around…
Elliott’s solution?
we believe TI should adopt a dynamic capacity-management approach and flex its capacity buildout in a manner consistent with its historical practices. By executing on this approach, we believe TI can generate $9.00+ per share in free cash flow in 2026 [...] ~40% above current investor expectations.
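For reference, here’s the quick arithmetic behind those figures, pieced together from the numbers quoted above (my own sketch, not anything from the letter itself):

```python
# Implied consensus FCF/share, if $9.00 is "~40% above current investor expectations"
elliott_target = 9.00
premium = 0.40
print(f"Implied consensus FCF/share for 2026: ~${elliott_target / (1 + premium):.2f}")

# The capacity math: $30B of targeted revenue capacity vs. ~$20B of expected revenue
capacity, consensus_revenue = 30e9, 20e9
print(f"Excess capacity vs. consensus: {capacity / consensus_revenue - 1:.0%}")
```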
What do I think?
Well, it’s hard to know for sure as an outsider. TI has a good track record of investing at a time when others aren’t and coming out well-positioned on the other side. But I also don’t want to over-index on the past. They picked up assets on the cheap in 2009. Is today’s version of this getting government subsidies from the CHIPS Act? Would they lose subsidies if they reduce their plans too much?
Is Elliott just looking for a short-term gain by convincing the market that FCF/share expectations should be revised upward? At least they’re not asking them to slash capex to the bone.
Is this down-cycle thinking? In a downturn, all investment seems mad, which is exactly why it’s the time to invest: when the up-cycle comes and all those forecasts get revised upward, having the capacity ready to go will be very valuable.
If supply from China is ever disrupted, will all that capacity suddenly become even more valuable? Is it a kind of call option?
TI’s analog chips are very different from leading-edge digital semis. They have a very long shelf life, and the number of SKUs the company makes is ginormous, so they can adjust internal allocation. When inventories don’t depreciate quickly, the threat of overcapacity isn’t the same as it would be at, say, Intel.
Nvidia Regrets 🤔
I wonder how often Masa Son thinks about how he sold his 4.9% stake in Nvidia in 2019 for $3.6bn (after Nvidia's stock price fell from $292 per share in September 2018 to $153 per share in February 2019).
I wonder how often Cathie Wood thinks about how ARKK sold its Nvidia position in January 2023.
You can’t live life looking backward!
But once in a while, I take a brief moment to look at stocks I sold that went on to do very well and think “Oh well” ¯\_(ツ)_/¯
🇺🇸 How much of each US state is federal land 👇
Looking at this, I wondered why there was such a stark difference between the West and the rest of the country. Here’s what I could find (that may not be the whole story, but it’s still useful context):
As the U.S. expanded westward in the 19th century through purchases, treaties, and conquest, the federal government took possession of vast tracts of western lands.
In the late 1800s, the U.S. government began retaining more western land in federal ownership for conservation and resource management purposes, shifting away from its earlier policy of disposing land to states, railroads, and settlers.
Western lands were seen as less economically valuable and thus drew fewer homesteaders than lands in the East under the Homestead Act and other land disposal laws. Unclaimed lands remained in federal ownership.
The geography and climate of the arid West were less conducive to farming and settlement than the wetter, more fertile eastern states. With less productive land, more acreage stayed under federal control.
Today, around 92% of federally owned land is located in 12 western states, with Nevada having the highest percentage at 84.9% federal ownership. In contrast, the federal government owns 4.1% of land in states outside the West.
Some key federal land agencies like the Bureau of Land Management and U.S. Forest Service administer mostly western lands, as that is where the majority of the federal estate is located. National parks and monuments are also concentrated in the West.
Here’s a more granular view:
🗣️🔌🚘🔋 Interview: Polestar and Mercedes CEOs on EVs
There’s a lot of noise surrounding electric cars these days. If you’re interested in the field, it’s worth taking a moment to listen to these two interviews with the chief executives at Polestar (the EV-only offshoot from Volvo) and Mercedes:
They both discuss their companies’ plans with EVs, the market in general, and where they see things going in the coming decade+.
EVs were affected by the pandemic bubble much like e-commerce and SaaS, and compared to that period, things can seem to be going in the wrong direction. But if we step back, it’s pretty clear that the secular trend is still playing out, even if not in a straight line.
The number of EVs sold is still growing nicely, the number of models coming to the market is increasing, batteries keep getting better, more charging stations are being built everywhere, and there’s plenty of innovation, especially in the Chinese market where the competition is more intense than anywhere else.
💰💰💰💰💰💰 Elon Musk Raises $6bn for xAI, Wants to Build 100k GPU Training Cluster 🤖
xAI just raised $6bn, giving it a post-money valuation of $24 billion, which is impressive considering that… well, they don’t really have a business and their model isn’t leading-edge.
Friend-of-the-show and Extra-Deluxe supporter Byrne Hobart (💚💚💚💚💚 🥃 ) has a good take on this:
the right way to think about the [...] valuation of Elon Musk's xAI. It's not the capitalized value of what the company has done so far, or even the value of Elon Musk's fame as a way to differentiate from other AI businesses. But it does make sense as a proxy for the net present value of Tesla, SpaceX, and Twitter talent that might be quietly repurposed to work on AI instead.
It’s also a bet on Musk and his ability to think big and *somehow* find a way:
Elon Musk has said publicly that his artificial intelligence startup xAI will need a whopping 100,000 specialized semiconductors to train and run the next version of its conversational AI Grok—or what he’s calling a “gigafactory of compute.”
Gigafactory of compute.
Once he’s found a brand he likes, he keeps re-using it!
In a May presentation to investors, Musk said he wants to get the supercomputer running by the fall of 2025 and will hold himself personally responsible for delivering it on time. When completed, the connected groups of chips—Nvidia’s flagship H100 graphics processing units—would be at least four times the size of the biggest GPU clusters that exist today, such as those built by Meta Platforms to train its AI models, he told investors.
That’s a moving target. Microsoft and others are already working on larger clusters.
But however you measure it, 100k GPUs would be *MASSIVE*.
XAI could partner with Oracle on the supercomputer. It has been talking to Oracle executives about potentially spending $10 billion to rent cloud servers over a period of years. The startup already rents servers with about 16,000 H100 chips from Oracle, where it is the largest customer of such chips.
It seems to me like Oracle was the default choice.
Microsoft is aligned with OpenAI, Google does its own thing, and AWS is mostly betting on Anthropic and is likely to start taking its own in-house efforts more seriously now that it has a new CEO.
Oracle is large enough to be able to handle this scale, but small enough that it doesn’t have a horse in the foundational model race. 🐎
S&P 500: Sales Growth vs. Market Cap Growth (TTM)
Speaking of lots of GPUs, as I said in Edition #500, Nvidia is just bonkers right now.
🧪🔬 Liberty Labs 🧬 🔭
🎧🤫 How Noise-Cancelling Headphones Work
It may seem a bit like magic if you don’t know how it works, but I find it even cooler knowing how the trick is done.
The basic idea is:
Sound is the air vibrating. The sound waves are a bit like waves in water.
Waves that are mirror images of each other cancel each other out (i.e., the peaks and valleys are inverted). It’s a bit like Yin and Yang balancing each other out, or having the exact quantity of gravel needed to fill a pothole, making the road flat again.
Noise-canceling headphones have microphones that listen to the ambient noise and analyze it using digital signal processing (DSP) chips.
They generate an opposite sound wave in as close to real-time as possible and play it through the headphone speakers.
When the anti-noise wave meets the ambient noise wave, there’s a process called destructive interference that significantly reduces the perceived ambient noise.
In theory, it’s possible to cancel a wide spectrum of sounds. However, in practice, current tech works best with consistent, low-frequency noises like engine rumbles or air conditioner hums. It's less effective against sudden, higher-frequency sounds like a dog barking.
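Here’s a minimal sketch of that last point (using NumPy, with a made-up 0.2 ms processing delay standing in for the mic-to-DSP-to-speaker lag): the same tiny lag barely dents the cancellation of a low rumble, but it can actually make a higher-pitched sound louder.

```python
import numpy as np

fs = 48_000                        # sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)      # 100 ms of audio

rumble = np.sin(2 * np.pi * 100 * t)     # engine-like 100 Hz hum
whine = np.sin(2 * np.pi * 2000 * t)     # 2 kHz whine

def residual(noise, delay_samples):
    """Play back the inverted signal, late by `delay_samples`,
    and return how much of the original noise is left (RMS ratio)."""
    anti_noise = -np.roll(noise, delay_samples)
    leftover = noise + anti_noise
    return np.sqrt(np.mean(leftover**2)) / np.sqrt(np.mean(noise**2))

delay = 10   # 10 samples at 48 kHz ~= 0.2 ms of latency (made-up figure)
print(f"100 Hz rumble left over: {residual(rumble, delay):.0%}")   # ~13%
print(f"2 kHz whine left over:  {residual(whine, delay):.0%}")     # ~193% (louder!)
```

That’s the core reason ANC shines on steady, low-frequency drones: at 100 Hz, a fixed delay is a tiny fraction of the wave’s period, so the anti-noise still lines up almost perfectly; at 2 kHz, the same delay is a big chunk of a period and the “cancellation” arrives out of phase.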
🇨🇳🤝🐜 China injects another $47.5bn into its semiconductor industry 💰💰💰💰
This is the third time they’ve done this, but it’s the biggest headline number yet:
China has set up its third planned state-backed investment fund to boost its semiconductor industry, with a registered capital of 344 billion yuan ($47.5 billion) [...]
The first phase of the fund was established in 2014 with registered capital of 138.7 billion yuan, and the second phase followed in 2019 with 204 billion yuan.
The question always is: how efficient is the spending — how much actually improves things and how much is wasted?
As an aside, there was a typo in a translated story of this (it originally said $475bn), and Dan Nystedt explained why:
There is no word in Chinese for the English "billion".
Instead, they say "10 hundred millions" 十億
The word for 10 million in Chinese is:
1,000 ten thousands. 千萬
1 million = 100 ten thousands 百萬
Etc., Etc.
TIL!
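For the curious, here’s a tiny sketch of how that base-10,000 grouping plays out with the fund’s headline number (just illustrative arithmetic on my part):

```python
# Western numbers group by thousands (10^3); Chinese groups by 萬 (10^4) and 億 (10^8).
amount_yuan = 344_000_000_000            # the fund's registered capital

print(f"{amount_yuan:,} yuan")                               # 344,000,000,000
print(f"= {amount_yuan / 1e8:,.0f} 億 (hundred-millions)")    # 3,440 億
print(f"= {amount_yuan / 1e4:,.0f} 萬 (ten-thousands)")       # 34,400,000 萬
# ...which is why "one billion" comes out as 十億: ten hundred-millions.
```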
🔌⚡️🇺🇸🤖 U.S. electricity demand is projected to increase 2.7% this summer 💡💾 📈
This may not seem like a big number if you’re used to seeing stocks move around wildly, but this is infrastructure and electricity demand has been mostly flat for decades, so there’s a lot of inertia in the system:
The EIA estimates that electricity sales to ultimate customers, i.e. electricity consumption, will be 1,487 terawatt hours (TWh) this summer… forecast to be 39.75 TWh, or 2.7%, higher in summer 2024 than last summer... total electricity consumption is expected to be 62.1 TWh (4.4%) higher this summer than the average over the past five summers.
What about demand from data centers?
Data centers are key contributors to the recent growth in electricity demand due to digitalization and data-driven technologies. The rapid growth of emerging technologies like Artificial Intelligence (AI)/Machine Learning (ML) has fueled growth in data centers over the past decade and continues to do so. [...]
data center development is also expanding in over 20 metro areas nationwide. Nationwide, data center demand is expected to reach 35 GW by 2030, up from 17 GW in 2022, and has been one of the major drivers behind the sharp increase in electricity demand in 2023
To put it another way, the data center load would increase by the equivalent of 18 large nuclear power plants in the span of 8 years (i.e., not enough time to build 18 nuclear power plants at the rate at which we’re building these days).
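Here’s the back-of-the-envelope version of that comparison (a sketch; the one-gigawatt-per-reactor figure is my own rough assumption):

```python
# Projected data center load growth from the FERC assessment
demand_2022_gw = 17
demand_2030_gw = 35
growth_gw = demand_2030_gw - demand_2022_gw        # 18 GW of new load

large_reactor_gw = 1.0                             # assumption: ~1 GW per large reactor
print(f"New load: {growth_gw} GW ≈ {growth_gw / large_reactor_gw:.0f} large nuclear reactors")

# If that new load ran flat out all year:
print(f"≈ {growth_gw * 8_760 / 1_000:.0f} TWh/year of additional consumption")
```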
More details in the FERC’s ‘2024 Summer Energy Market and Electric Reliability Assessment’.
🎨 🎭 Liberty Studio 👩🎨 🎥
🐭💸 Why Disney’s $400m Star Wars Hotel FAILED
Ok, I admit I didn’t watch all of it because it’s 4 hours long (even at 2x, it’s a lot), but in the parts I saw, there’s an incredible amount of detail on how to think about product design.
It’s not just about this specific spaceship-hotel; it’s about how to analyze all kinds of products and services.
Super cool re: noise cancellation.
I had assumed that noise cancelling headphones / earbuds were designed to be good physical barriers to outside noise. Mind blown 🤯.
Not to be nit-picky - OK, just a little - the fiber optic diagram mis-identifies the cladding, which is a glass layer surrounding the fiber core and has a slightly different refractive index. The diagram has it outside the metal strength layer. With the naked eye, you can't distinguish between the core and cladding - it's just a tiny string of glass.
The cool thing about fiber is that the ultimate bandwidth is only limited by the electronics that feed it. The frequency of laser light is around 5×10^14 Hz. With two cycles needed to define a bit, that would be about 150,000 gigabits/second. And then multi-mode fibers can transmit hundreds of different frequencies at the same time. Gives you some insight on how InfiniBand works in those Nvidia data centers. Jensen: "240 Terabytes/second" Yikes.
The combination of AI and Nvidia's accelerated computing is like discovering fusion reactors that could power a single house or car or scale up to power factories orders of magnitude larger than possible today - all for the gasoline equivalent of one cent per gallon. Unlimited upside.