35: Fred Liu's Good Year, Nvidia's 10x Video Compression AI, GPUs Colonizing the Cloud, Claude Shannon Letdown, and Minion Capital Interview
"she keeps almost four or five different studios running at the same time"
The wolf keeps the caribou strong. —Inuit saying
→ If you’re a US citizen, please vote. ←
Investing & Business
Interview: Shomik “Minion Capital” Ghosh
They cover a bunch of stuff, including Shomik’s philosophy of investing in public markets and how that differs from VC investing, how to think about high valuations, etc. The last part is mostly about Shopify and Tobi Lütke.
I listened to it while taking a walk, so I don’t have detailed notes.
Video above, or audio here.
Mackey Offered Whole Foods to Warren Buffett
Warren Buffett turned down the chance to buy Whole Foods in 2017, CEO and cofounder John Mackey reveals in his new book, "Conscious Leadership."
"We contacted Warren Buffett to see if he had any interest in possibly buying Whole Foods," the boss of the premium grocery chain said. "He responded that it wasn't a good fit for him." (Source)
Obviously, it was a better strategic fit for Bezos, who needed the scale in perishables to make his grocery ambitions work. I doubt that Buffett would’ve paid up what Mackey wanted anyway (Bezos was ready to pay more, because it was worth more to Amazon, not necessarily because he’s less disciplined on price).
Fred Liu Having a Good Year (So Far…)
At the end of September, Hayden Capital reported that it was up 164.1% year-to-date.
You can read Hayden’s public investment letters here.
Microsoft Rebrands Bing, Google Rebrands G Suite as Google Workspace
Microsoft is rebranding Bing search as "Microsoft Bing". It reminds me of the "Instagram by Facebook" rebrand. They’re also putting the Microsoft logo front and center, rather than the "B" Bing logo.
As for G Suite, now Google Workspace, you can see the new look here. On one hand, I like colorful icons, it helps with visual memory on a phone’s screen.
But it seems like they’ve made them more abstract than before, which may cause more confusion than necessary with a lot of people. Especially since all the colors are similar (clearly what happens when designers are told to “make sure they all have the exact same brand colors, so people know they’re a family of apps, y’know”).
Imagine all your Google icons next to each other in a folder: how quickly can your visual cortex tell them apart? I don’t know, maybe it won’t be that bad once you’re used to them… ¯\_(ツ)_/¯
Science & Technology
Nvidia’s AI-Based Video Call Compression: ~10x Less Bandwidth
This is really really cool, using generative adversarial networks (GANs) to basically create a model of your face and then re-animate it using only some key points rather than a whole flood of pixels. I love this kind of AI research that actually delivers a very practical result that could have a significant impact on billions of people.
The NVIDIA Maxine platform dramatically reduces how much bandwidth is required for video calls. Instead of streaming the entire screen of pixels, the AI software analyzes the key facial points of each person on a call and then intelligently re-animates the face in the video on the other side. This makes it possible to stream video with far less data flowing back and forth across the internet.
Using this new AI-based video compression technology running on NVIDIA GPUs, developers can reduce video bandwidth consumption down to one-tenth of the requirements of the H.264 streaming video compression standard.
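To get a feel for the orders of magnitude involved, here’s a rough back-of-the-envelope sketch. All the numbers in it are my own illustrative assumptions (landmark count, bitrates), not Nvidia’s published figures:

```python
# Back-of-the-envelope: a keypoint stream vs. a typical video-call stream.
# All figures are illustrative assumptions, not Nvidia's numbers.

KEYPOINTS_PER_FRAME = 68    # a common facial-landmark count
BYTES_PER_KEYPOINT = 2 * 4  # (x, y) coordinates as 32-bit floats
FPS = 30

# Bits per second needed to send raw keypoints every frame.
keypoint_bps = KEYPOINTS_PER_FRAME * BYTES_PER_KEYPOINT * FPS * 8
h264_bps = 1_000_000        # ~1 Mbps, a plausible 720p call bitrate

print(f"keypoint stream: {keypoint_bps / 1000:.0f} kbps")
print(f"H.264 stream:    {h264_bps / 1000:.0f} kbps")
print(f"ratio: ~{h264_bps / keypoint_bps:.1f}x")
```

Even this naive, uncompressed encoding of the keypoints lands in the same ballpark as the ~10x reduction Nvidia quotes, and the keypoint data could presumably be compressed further still.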
The video above is worth watching if you want a better idea of how it works. Nothing like seeing it in action.
And when they do side-by-side comparisons at equal bitrate, there’s simply no contest at very low rates. This can make a big difference in quality on mobile connections or in places where bandwidth is scarce.
Of course there’s a trade-off: saving bandwidth like this will increase computing loads. But those costs are falling rapidly (faster than bandwidth costs), and there are ways to automatically balance what makes the most sense. For example, on a connection with plentiful bandwidth, you may want to use a regular codec like H.264 or HEVC, while on a more bandwidth-constrained one, you could switch over to this. It also depends on whether you’re running on battery (mobile) or don’t mind using a bit more power.
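That balancing act could be as simple as a small decision rule. Here’s a hypothetical sketch — the thresholds and codec names are made up for illustration, not anyone’s actual implementation:

```python
def choose_codec(bandwidth_kbps: float, on_battery: bool) -> str:
    """Pick a video-call codec (illustrative thresholds only)."""
    # Plenty of bandwidth: a conventional codec is cheap to encode/decode.
    if bandwidth_kbps >= 1500:
        return "H.264"
    # Constrained link but on battery: HEVC saves some bandwidth
    # without the extra compute cost of neural reconstruction.
    if on_battery:
        return "HEVC"
    # Constrained link and plugged in: spend compute to save bandwidth.
    return "AI-keypoint"

print(choose_codec(3000, on_battery=True))   # plenty of bandwidth -> H.264
print(choose_codec(800, on_battery=False))   # scarce bandwidth, plugged in -> AI-keypoint
```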
But I suspect that the end point of all this is that sometime in the future, almost all video will be compressed using some of this tech, with models trained on millions of hours of all kinds of video, not just faces, because it’ll result in better quality at lower size/latency, even for 4k movies.
There’s more here about it, and on other features that Nvidia has been working on, like using animated avatars and automatically changing people’s gaze and face angle so they appear to be looking at the camera.
Nvidia’s GPUs are Rapidly Colonizing the Cloud
Speaking of Nvidia, they estimate that the aggregate inference compute capacity of Nvidia GPUs in the cloud now exceeds the inference capacity of all CPUs in the cloud (note the Y axis is logarithmic!):
Also, Microsoft Word’s Grammar Suggestion engine is getting a dose of machine learning.
Microsoft Editor’s AI model for grammar checking in Word on the web alone is expected to handle more than 500 billion queries a year.
Wow. They’re accelerating it with Nvidia GPUs in the Azure cloud:
NVIDIA Triton’s dynamic batching and concurrent model execution features, accessible through Azure Machine Learning, slashed the cost by about 70 percent and achieved a throughput of 450 queries per second on a single NVIDIA V100 Tensor Core GPU, with less than 200-millisecond response time. (Source)
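The core idea behind dynamic batching is simple: hold incoming queries for a few milliseconds so the GPU can run many of them in one pass, amortizing the per-invocation overhead. A toy sketch of the concept (not Triton’s actual implementation — the function and parameter names here are my own):

```python
import time

def dynamic_batcher(request_queue, run_model, max_batch=32, max_wait_s=0.005):
    """Toy dynamic batcher: collect requests until the batch is full or a
    small deadline passes, then run them through the model all at once."""
    batch = []
    deadline = time.monotonic() + max_wait_s
    while len(batch) < max_batch and time.monotonic() < deadline:
        if request_queue:
            batch.append(request_queue.pop(0))
    return run_model(batch) if batch else []

# Usage: three queued grammar-check requests become one model call.
queue = ["q1", "q2", "q3"]
results = dynamic_batcher(queue, run_model=lambda b: [f"ans:{q}" for q in b])
print(results)
```

Larger batches keep the GPU’s parallel units busy, which is how you get both higher throughput and lower cost per query.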
And note that the V100 is the previous generation GPU. It’d no doubt do much better on the Ampere-based A100.
Why We Need More Vertical Gov’t Software Vendors
A day after the UK government announced its highest number of new coronavirus cases in England, the reason behind the drastic rise has reportedly been revealed. According to multiple sources, a Microsoft Excel spreadsheet containing laboratory results reached its maximum size, meaning that as many as 15,841 cases between September 25th and October 2nd were not uploaded to the UK government’s COVID-19 dashboard. (Source)
So much of the world runs on Excel, it’s amazing. I guess in many cases it’s a big improvement over paper processes or old green screen COBOL programs from the 70s, but still, we can do better for important public health data.
Next Gen RAM: DDR5
The specs on the next generation of computer memory are pretty impressive:
DDR5 is the next stage of platform memory for use in the majority of major compute platforms. The specification (as released in July 2020) brings the main voltage down from 1.2 V to 1.1 V, increases the maximum silicon die density by a factor 4, doubles the maximum data rate, doubles the burst length, and doubles the number of bank groups. Simply put, the JEDEC DDR specifications allows for a 128 GB unbuffered module running at DDR5-6400. RDIMMs and LRDIMMs should be able to go much higher, power permitting. [...]
For bandwidth, other memory manufacturers have quoted that for the theoretical 38.4 GB/s that each module of DDR5-4800 can bring, they are already seeing effective numbers in the 32 GB/s range. This is above the effective 20-25 GB/s per channel that we are seeing on DDR4-3200 today. (Source)
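That 38.4 GB/s theoretical figure falls straight out of the module’s transfer rate times its 64-bit bus width. A quick sanity check on the arithmetic:

```python
def peak_bandwidth_gbs(megatransfers_per_sec: int, bus_width_bits: int = 64) -> float:
    """Theoretical peak bandwidth of one module: MT/s x bus width in bytes."""
    return megatransfers_per_sec * (bus_width_bits / 8) / 1000

print(peak_bandwidth_gbs(4800))  # DDR5-4800: the quoted 38.4 GB/s
print(peak_bandwidth_gbs(6400))  # DDR5-6400
print(peak_bandwidth_gbs(3200))  # DDR4-3200
```

The quoted “effective” numbers (32 GB/s vs. 20–25 GB/s) are what workloads actually achieve after protocol overhead, which is why they sit below these theoretical peaks.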
U.K. Accelerating Wind Power Investments
"As Saudi Arabia is to oil, the UK is to wind - a place of almost limitless resource, but in the case of wind without the carbon emissions and without the damage to the environment... We believe that in 10 years’ time, offshore wind will be powering every home in the country... the government [is] raising its target for offshore wind power capacity by 2030 from 30 gigawatts to 40 gigawatts" (Source)
Current U.K. wind capacity is 10.4 GW. Average demand for electricity in 2014 was 34.42 GW.
Of course, the capacity factor of wind isn’t as high as some other sources, so you need more installed capacity than average demand. But what matters in the end is cost per kWh and whether the grid can be made to handle the higher variability (which would require other investments in storage, or interconnects to Norway, where hydro reservoirs can act as a giant battery, or things like that).
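To put rough numbers on the capacity-factor point — assuming ~40% for offshore wind, which is my assumption, not a figure from the article:

```python
def avg_output_gw(nameplate_gw: float, capacity_factor: float) -> float:
    """Average delivered power = installed (nameplate) capacity x capacity factor."""
    return nameplate_gw * capacity_factor

# The 40 GW 2030 target at an assumed ~40% capacity factor:
print(avg_output_gw(40, 0.40))   # ~16 GW average, vs. ~34 GW average demand
```

So even hitting the 40 GW target would yield average output around half of average demand, which is why the storage and interconnect investments matter.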
CDC Flu Stats for past 10 Years
According to the CDC, an estimated 22,000 people died from the 2019-2020 seasonal flu. The deadliest flu season since 2010 was in 2017-2018, with an estimated 61,000 deaths, according to the CDC.
The 2011-2012 flu season is estimated to have killed 12,000 in the US, so the range is fairly wide, but it doesn’t go anywhere near COVID-19, especially considering that for COVID-19 we took drastic measures (shutdowns, masks, social distancing, hand-washing, self-quarantine, contact tracing, temperature checks, travel restrictions, WFH, ordering things online, etc.) and the numbers would’ve been much higher otherwise.
Speaking of viruses, contact-tracing is an important tool to contain epidemics, and it’s very under-used in many countries. Carrie Rudzinski writes about New Zealand:
Americans enamoured with New Zealand's handling of COVID don't even know the extent of how good it is: our most recent small cluster of cases was followed in such detailed contact tracing and gene swabbing that they traced 2 cases to a trash can lid & an elevator button.
In New Zealand, you have to stay in managed isolation when you arrive in the country, which I did in June: I stayed in a hotel for 2 weeks, received 2 free COVID tests, free accommodation, free meals, free transport to my hotel, 24hr nursing staff.
To me it sounds really really cheap compared to an out-of-control epidemic that ravages a country’s economy and causes lots of deaths and suffering and long-term damage…
The Arts & History
How Many Studios Do You Need?
On Creative Process
when Beyoncé records an album, one of the things that she does, which is just remarkable, is she keeps almost four or five different studios running at the same time in a city.
She uses different musicians, different producers and she actually goes from room to room: brainstorming ideas, trying different things, working on different songs. Whenever the moment leaves, she will go to the next studio and do the same thing. I'm not sure if it's a predetermined schedule or if it's more spontaneous, like when she's in a vibe, but the process is essentially not a singular thing. It's something that she does in multiple parts.
I don’t know what the source of that information is — maybe it’s apocryphal? — but it certainly rings true to me.
One reason why I’ve been reading multiple books in parallel for years is that sometimes I feel like putting one down, but I can pick another up and keep going for a while. Getting tired of reading one particular book doesn’t mean I’m tired of reading in general.
I’m also always researching different things at the same time and working on different projects in parallel. There’s some inefficiency in jumping from one thing to another, but there are also benefits that are hard to pinpoint but feel a bit like the Beyoncé story.
Different contexts will trigger different things in you, and sometimes just going somewhere different or working on something else can keep the flow of ideas going or the flame of interest burning.
Claude Shannon Documentary… Letdown
Started the Claude Shannon documentary 'The Bit Player' (2018, on Prime Video here in Canada), but stopped when I realized they had a mock interview with actors playing Shannon & his wife (without disclosing it up front).
I was really confused for a few moments, like, didn’t Shannon die almost 20 years ago? Is this old footage? Everybody seems weird and why is Shannon narrating this B-roll, did the interview really flow that way? But then I realized… Disappointing.
I really liked the Jimmy Soni and Rob Goodman biography, ‘A Mind at Play’ (2017), and I recommend it. Shannon is an intellectual giant and one of the builders of the modern age, and isn’t nearly as well known as he deserves to be.