What I Read This Week…
U.S. AI companies are facing electrical grid constraints, Chinese startup DeepSeek releases DeepSeek-V3, and scientists are working to overcome limitations in cryopreservation
Watch the All-In Holiday Spectacular
Read our Deep Dive: A Primer on Cryptocurrencies
Caught My Eye…
U.S. AI companies are facing electrical grid constraints as their computing needs outstrip existing power and data center capacity. What's going on? Training large AI models requires massive computing power – for example, training GPT-4 reportedly used more electricity than 5,000 U.S. homes consume in a year. This growing demand is straining both the grid's transmission capacity and the availability of data centers with sufficient power supply, leading to voltage fluctuations in areas where AI computing clusters are concentrated. In response, U.S. AI companies are pushing for new power infrastructure initiatives, including dedicated "AI economic zones" with streamlined permitting for data centers, a national electrical transmission network to move power where it's needed, and expanded power generation capacity. OpenAI, for its part, has tripled its Washington policy team to 12 people, focusing less on AI safety concerns and more on working with utilities, energy companies, and lawmakers to secure a reliable electricity supply for its operations.
DeepSeek, a Chinese AI startup, has released DeepSeek-V3, an open-source LLM that matches the performance of leading U.S. models while costing far less to train. The model uses a mixture-of-experts architecture with 671B total parameters, of which only 37B are activated for any given token. This selective activation lets the model process text at 60 tokens per second, three times faster than its predecessor. In benchmark tests, DeepSeek-V3 outperforms Meta's Llama 3.1 and other open-source models, matches or exceeds GPT-4o on most tests, and shows particular strength in Chinese-language and mathematics tasks; only Anthropic's Claude 3.5 Sonnet consistently beats it on certain specialized tasks. The company reports spending $5.57 million on training, thanks to hardware and algorithmic optimizations, compared with an estimated $500 million to train Llama 3.1.
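For readers curious what "only 37B of 671B parameters are activated" means in practice, here is a toy sketch of top-k mixture-of-experts routing. All sizes, weights, and the routing rule are illustrative placeholders, not DeepSeek-V3's actual architecture or configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer: many experts exist, but a router
# selects only a few per token, so most parameters stay idle.
NUM_EXPERTS = 16      # total experts in the layer (illustrative)
TOP_K = 2             # experts activated per token (illustrative)
D_MODEL = 8           # hidden size (illustrative)

# Each expert is a simple linear map (d_model -> d_model).
expert_weights = rng.standard_normal((NUM_EXPERTS, D_MODEL, D_MODEL))
router_weights = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def moe_forward(x):
    """Route one token vector through its top-k experts only."""
    logits = x @ router_weights            # score every expert
    top = np.argsort(logits)[-TOP_K:]      # keep only the k best
    gates = np.exp(logits[top])
    gates /= gates.sum()                   # softmax over chosen experts
    out = np.zeros_like(x)
    for g, e in zip(gates, top):
        out += g * (x @ expert_weights[e]) # only k experts do any work
    return out, top

token = rng.standard_normal(D_MODEL)
out, used = moe_forward(token)
print(f"experts used: {sorted(used.tolist())}, "
      f"active parameter share: {TOP_K / NUM_EXPERTS:.0%}")
```

In this toy layer only 2 of 16 experts run per token, so roughly an eighth of the expert parameters do any computation – the same principle by which DeepSeek-V3 activates 37B of its 671B parameters.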
Scientists are working to overcome size limitations in cryopreservation, as they can successfully freeze and restore embryos but not organs. How does this work? When freezing an embryo, the small size allows rapid and even cooling throughout, preventing ice crystals from forming that could damage cells. But with organs, the freezing process happens unevenly – outer layers freeze before inner parts, creating damaging ice crystals and temperature differences that tear tissues apart. Organs also contain many different types of cells that each need specific conditions to survive freezing, while embryos have simpler, more uniform cell structures. Scientists are testing several approaches to solve these problems. One promising method uses magnetic nanoparticles to heat organs from the inside during thawing, helping maintain even temperatures. Scientists are also developing new protective chemicals that prevent ice formation while being less toxic to cells. While they haven't yet succeeded with full organs, these new techniques are helping scientists gradually scale up from small tissue samples to larger structures. If successful, this work would extend organ preservation from the current few hours to several months, allowing more efficient matching between donors and recipients and reducing waste in the transplant system.
Other Reading…
After Another Bad Year for Bonds, Investors Lose Faith in a Turnaround (Wall Street Journal)
The U.S. Needs a Productivity Miracle (Wall Street Journal)
TSMC’s Arizona Plant to Start Making Advanced Chips (IEEE Spectrum)
How We Fell Out of Love with Dating Apps (Financial Times)
Ask HN: Are You Unable to Find Employment? (Hacker News)
Crypto Boom Draws in Wall Street Banks (Financial Times)
The Paper Passport Is Dying (WIRED)
Modern Life is Drowning in a Sea of Verbiage (Financial Times)
On X…
If anyone wants to know more about DeepSeek and its founder, Liang Wenfeng, here is an in-depth interview with him.
https://xyzlabs.substack.com/p/in-depth-behind-deepseeks-success
Hello Chamath,
In a recent episode of the All-In Podcast, you mentioned lithium supply chains. Could you please elaborate on that?