8: Metaverse Masterclass, What Makes TikTok Tick, and Minecraft Inception
"A man who has committed a mistake and doesn't correct it, is committing another mistake."
"A man who has committed a mistake and doesn't correct it, is committing another mistake." —Confucius
I’ve decided that when I reach 1,000 subscribers, I’ll celebrate by letting you in on one of my past hobbies. I’ll do a special section on it in the edition after hitting that (arbitrary, but very pleasing) milestone.
Investing & Business
-Eugene Wei Deep-Dive on What Makes TikTok Tick
It’s all-TikTok all the time lately… Eugene has an in-depth and insightful piece (can he do anything else?):
Algo-mania:
They say you learn the most from failure, and in the same way I learn the most about my mental models from the exceptions. How did an app designed by two guys in Shanghai manage to run circles around U.S. video apps from YouTube to Facebook to Instagram to Snapchat, becoming the most fertile source for meme origination, mutation, and dissemination in a culture so different from the one in which it was built? [...]
the Chinese market has been largely impenetrable to the U.S. tech companies because of the Great Firewall, both the software instance and the outright bans from the CCP. But in the reverse direction, America has been almost as impenetrable to Chinese companies because of what might be thought of as America’s cultural firewall. [...]
It turns out that in some categories, a machine learning algorithm sufficiently responsive and accurate can pierce the veil of cultural ignorance. Today, sometimes culture can be abstracted. [...]
Bytedance has an absurd proportion of their software engineers focused on their algorithms [...] Prior to TikTok, I would’ve said YouTube had the strongest exploit algorithm in video but in comparison to TikTok, YouTube’s algorithm feels primitive [...]
TikTok’s algorithm is the Sorting Hat from the Harry Potter universe [...] TikTok’s algorithm sorts its users into dozens and dozens of subcultures
It’s a bit long, but worth reading the whole thing.
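If the “exploit algorithm” framing is new to you, here’s a minimal toy sketch of the underlying explore/exploit idea (an epsilon-greedy bandit over candidate videos). To be clear, this is purely illustrative: the videos, engagement numbers, and epsilon value are invented, and none of it reflects how TikTok’s actual system works.

```python
import random

# Toy epsilon-greedy recommender: mostly "exploit" the videos with the best
# observed engagement, occasionally "explore" a random one to learn more.
# All data here is invented for illustration -- not TikTok's actual system.

engagement = {"video_a": [], "video_b": [], "video_c": []}  # observed watch-completion rates

def mean(xs):
    return sum(xs) / len(xs) if xs else 0.0

def pick_video(epsilon=0.1):
    if random.random() < epsilon:                 # explore: try something we know little about
        return random.choice(list(engagement))
    return max(engagement, key=lambda v: mean(engagement[v]))  # exploit: best known performer

def record_feedback(video, completion_rate):
    engagement[video].append(completion_rate)

# Simulate a short session with made-up feedback
for _ in range(20):
    v = pick_video()
    record_feedback(v, random.random())           # stand-in for real user signal
print({v: round(mean(r), 2) for v, r in engagement.items()})
```

The point of the sketch is only the feedback loop: the faster and more accurately that loop runs, the less the system needs to “understand” the culture it’s operating in.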
The inimitable Matt Levine also had a good piece on the whole “a very substantial portion of that price is going to have to come into the Treasury of the United States, because we’re making it possible for this deal to happen” insane thing:
“It is completely unorthodox for a President to propose that the U.S. take a cut of a business deal, especially a deal that he has orchestrated. The idea also is probably illegal and unethical,” says some poor law professor who was called to comment on this dumb, dumb stuff. [...]
“America: We will expropriate the assets of foreign companies if someone pays us a big enough bribe” is a rule that might maximize short-term revenue, but there are problems in the long run.
This is all as dumb as it is possible for a thing to be
And if that wasn’t enough for you and you haven’t reached TikTok saturation point yet, there’s a good piece at The Atlantic that covers ByteDance and many of the security issues at the center of all the current attention on the company.
-Metaverse Masterclass: Matthew Ball Interview
This is a great interview. Patrick O’Shaughnessy’s conversation with Matthew Ball (read his excellent essays) is quite dense with insights on all types of media and the coming metaverse.
I love that his default mode seems to be thinking about the big picture and strategy. Too few people are like that. And fascinating stuff on Epic and Tim Sweeney’s vision.
-Cyber-Criminal Business Model Innovation
George Kurtz: "there are ransomware groups that actually have a rev share model. I mean that's how organized they are. They'll actually give you their platform, and they take a 20% cut on your success."
-Russia Being Left Behind, and a Thought Experiment
Here’s a thought-experiment:
What if Russia, India and China had a system relatively similar to the US for all of the 20th century, up to today?
I'm not even thinking of a perfect system, just something roughly similar to the liberal democracy with a market system and the rule of law that the US has enjoyed, with a culture that encourages entrepreneurship and social mobility a decent amount.
How much further along would the world be? If we just multiply these countries’ populations by the current US GDP per capita, we get about $192 trillion. That’s real money, even compared to the US GDP ($21.4 trillion in 2019).
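For the curious, here’s the back-of-envelope math spelled out. The population and GDP-per-capita figures below are rough 2019 approximations, so treat the output as ballpark only:

```python
# Back-of-envelope: what if Russia, India, and China had US-level GDP per capita?
# Population and GDP-per-capita figures are rough 2019 approximations.
populations = {"Russia": 146e6, "India": 1.37e9, "China": 1.40e9}
us_gdp_per_capita = 65_000  # USD, approximate 2019 figure

hypothetical_gdp = sum(populations.values()) * us_gdp_per_capita
print(f"${hypothetical_gdp / 1e12:.0f} trillion")  # ~$190 trillion, same ballpark as above
```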
Of course, it’s just a thought-experiment and I’m hand-waving a lot of inter-dependent variables. In such a world, wealth creation would probably be impacted positively in some ways, but negatively in others. For example, some global companies that are now in the US may be elsewhere instead, and for some things, that would be zero-sum.
But the interactions between the wealthier countries (commerce, science, the arts, travel, etc.) would no doubt have created a lot of value in non-zero-sum ways, and I bet the total GDP per person in such a world would be much higher than it is today. As a civilization, we’d no doubt be much further along the tech tree, with many more engineers and scientists and entrepreneurs working to solve problems, using a lot more physical and intellectual capital, etc…
Science & Technology
-Can GPT Scale in Performance with More Data and Compute?
Some great sarcasm by Gwern:
its scaling continues to be roughly logarithmic/power-law, as it was for much smaller models & as forecast, and it has not hit a regime where gains effectively halt or start to require increases vastly beyond feasibility. That suggests that it would be both possible and useful to head to trillions of parameters (which are still well within available compute & budgets, requiring merely thousands of GPUs & perhaps $10–$100m budgets assuming no improvements which of course there will be) […]
GPT-3 is an extraordinarily expensive model by the standards of machine learning: it is estimated that training it may require the annual cost of more machine learning researchers than you can count on one hand (~$5m), up to $30 of hard drive space to store the model (500–800GB), and multiple pennies of electricity per 100 pages of output (0.4 kWH). Researchers are concerned about the prospects for scaling: can ML afford to run projects which cost more than 0.1 milli-Manhattan-Projects⸮ Surely it would be too expensive, even if it represented another large leap in AI capabilities, to spend up to 10 milli-Manhattan-Projects to scale GPT-3 100× to a trivial thing like human-like performance in many domains⸮ Many researchers feel that such a suggestion is absurd and refutes the entire idea of scaling machine learning research further, and that the field would be more productive if it instead focused on research which can be conducted by an impoverished goatherder on an old laptop running off solar panels. Nonetheless, I think we can expect further scaling. (10×? No, 10× isn’t cool. You know what’s cool? 100–1000×.)
More on how performance of language models scales here.
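If “roughly power-law” feels abstract, here’s a tiny sketch of the shape of that relationship: loss falling as a power law in parameter count. The constants below are of the rough magnitude reported in the scaling-laws literature, but treat them as illustrative stand-ins rather than exact fitted values:

```python
# Illustrative power-law scaling: loss L(N) ~ (N_c / N) ** alpha.
# N_c and alpha below are rough, illustrative stand-ins, not exact fitted values.
def loss(n_params, n_c=8.8e13, alpha=0.076):
    return (n_c / n_params) ** alpha

for n in [1.5e9, 175e9, 1e12, 10e12]:   # GPT-2-ish, GPT-3, 1T, 10T parameters
    print(f"{n:>9.2e} params -> loss ~ {loss(n):.2f}")
```

The takeaway is the shape of the curve: each 10× in parameters keeps buying a predictable improvement rather than hitting a wall, which is Gwern’s whole point.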
-Podcast on the Space Barons: SpaceX and Blue Origin
If you’re into space exploration and rocket technology but haven’t yet read the book ‘The Space Barons: Elon Musk, Jeff Bezos, and the Quest to Colonize the Cosmos’ (2018), you may enjoy this podcast episode that is largely based on the book and gives a good overview of Musk and Bezos’ space ventures.
-How Big is the Sahara Desert?
The Arts
-Computer Game Inception, Minecraft Edition
It’s now possible to run a mod that allows you to run a Windows 95 computer within Minecraft, and to play Doom on it.
And here’s Minecraft running on a computer emulated inside Minecraft, for the full inception experience:
Even more impressive are the users building computers inside the game, using only game elements to painstakingly construct the logic required for computation. Behold:
After over two years of development, I am finally here to show you guys the Redstone Computer v5.0 [...]
Computer specifications:
- 8-bit architecture and interface
- Register-based data model
- Four cores (which consist of an ALU + control unit + cache + a few other things) that are truly independent from each other
- Each core can run at separate speeds (variable clock speed integration)
- 32 bytes (31 accessible) of dual-read ERAM
- Each core has 128 lines of program memory (768 bytes per core, 3.072Kbytes of total program memory)
- Runs custom instruction set
- Can write, compile, upload, and execute programs on this computer that are written with the latest revisions of the ARCISS language specification with the DRCHLL Compiler Project
- Enhanced suite of IO to allow for better connectivity with other peripherals
- And much more! (you'll have to watch the video to find out :) )
Actually seeing how complex even such a simple computer is should give you some appreciation for what’s on your desk and in your pocket/purse.
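If you want a feel for what “register-based, custom instruction set” means without wiring redstone for two years, here’s a toy 8-bit register machine in a few lines of Python. It’s a loose software analogue only; the opcodes below are invented for illustration and have nothing to do with the actual ARCISS language or DRCHLL compiler mentioned above.

```python
# Toy 8-bit register machine -- a rough software analogue of a redstone computer.
# This is NOT the ARCISS instruction set; opcodes and semantics are invented for illustration.
registers = [0] * 4          # four 8-bit general-purpose registers
ram = [0] * 32               # 32 bytes of RAM, echoing the build's 32 bytes of ERAM

def run(program):
    pc = 0                                   # program counter
    while pc < len(program):
        op, a, b = program[pc]
        if op == "LOADI":   registers[a] = b & 0xFF                              # load immediate
        elif op == "ADD":   registers[a] = (registers[a] + registers[b]) & 0xFF  # 8-bit wraparound
        elif op == "STORE": ram[b] = registers[a]                                # register -> RAM
        elif op == "JNZ" and registers[a] != 0:                                  # jump if non-zero
            pc = b
            continue
        pc += 1

# Sum 1..5 into register 0, then store the result at RAM address 0
run([
    ("LOADI", 0, 0),   # r0 = 0 (accumulator)
    ("LOADI", 1, 5),   # r1 = 5 (loop counter)
    ("LOADI", 2, 255), # r2 = -1 in two's complement (for decrementing)
    ("ADD",   0, 1),   # r0 += r1
    ("ADD",   1, 2),   # r1 -= 1
    ("JNZ",   1, 3),   # loop back while r1 != 0
    ("STORE", 0, 0),   # ram[0] = r0
])
print(registers[0], ram[0])   # 15 15
```

Now imagine implementing each of those lines as hand-placed blocks of redstone, and the two years of development starts to make sense.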