What I Read This Week...
Meta’s AI reorganization efforts (600 roles cut), data centers in space, Google's quantum algorithm breakthrough, Tesla's future, and more
Caught My Eye…
1) Meta’s AI Reorganization Efforts
Meta is cutting roughly 600 roles across its AI division, sparking confusion about whether the company is pulling back from artificial intelligence just as the rest of Big Tech doubles down. The details tell a different story.
The layoffs primarily affected Meta’s legacy Facebook Artificial Intelligence Research (FAIR) and infrastructure teams, which were consolidated under Meta Superintelligence Labs, the new organization Mark Zuckerberg assembled in June around a $14.3B investment in Scale AI.
Meta Superintelligence Labs comprises four groups:
TBD Lab: The team managing Meta’s large language models, led by Alexandr Wang (prev. CEO of Scale AI).
FAIR: A team focused on longer-term AI research, led by Rob Fergus (Meta’s Director of AI Research).
Products and Applied Research: The consumer integration team led by Nat Friedman (prev. CEO of GitHub).
MSL Infra: The infrastructure team that sustains Meta’s AI models, led by Aparna Ramani (Meta’s VP of Engineering).
In contrast, hiring continues aggressively within its TBD Lab, which builds Meta’s next-generation foundation models and agent systems. Executives described the reorganization as a move toward smaller and faster execution from teams that can operate more like startups within the company.
“By reducing the size of our team, fewer conversations will be required to make a decision,” - Alexandr Wang, Chief Artificial Intelligence Officer
2) Starcloud: Data Centers in Space
Starcloud is preparing to deploy AI data-center capacity in orbit this November using NVIDIA H100 GPUs and solar-powered systems that exploit the unique advantages of space. Starcloud is a recent graduate of the Google for Startups Cloud AI Accelerator and plans to run the Gemma LLM in orbit. The startup is also backed by NVIDIA’s Inception program, a free program that guides AI startups through NVIDIA’s ecosystem with the following benefits:
The latest developer tools and training
Preferred pricing on NVIDIA hardware and software
Exclusive offers from partners
Exposure to a global ecosystem of investors
The first satellite, Starcloud-1, will carry 100x more GPU compute than any previous space-based operation. The orbital setup also promises roughly a 10x reduction in CO2 emissions over its lifetime compared with earthbound data centers.
The project aims to solve three key bottlenecks in current AI infrastructure.
First: Energy consumption. Training large models today demands tens of megawatts, and terrestrial grids plus cooling constraints create bottlenecks. Starcloud sidesteps these limits with abundant, uninterrupted sunlight, free from weather and grid congestion.
Second: Cooling and heat management. On Earth, data centers rely on fresh water for cooling through evaporation towers. In orbit, waste heat is radiated into the vacuum of space through large deployable cooling panels.
Third: Data locality and latency for Earth-observation and satellite workloads. Starcloud plans early use-cases around processing sensor data (optical, hyperspectral, SAR) in orbit rather than down-linking terabytes of raw data to Earth.
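A back-of-envelope sketch makes the third point concrete. All figures below are illustrative assumptions, not Starcloud numbers: the point is only that downlink time scales with payload size, so shipping derived analytics instead of raw sensor data shrinks the bottleneck by the compression ratio.

```python
# Hypothetical back-of-envelope for the data-locality argument.
# Every constant here is an assumption for illustration only.

def downlink_hours(data_bytes: float, link_bps: float) -> float:
    """Time to transmit a payload over a satellite downlink, in hours."""
    return data_bytes * 8 / link_bps / 3600

RAW_SCENE = 1e12        # assumed: 1 TB of raw hyperspectral imagery per collection
DERIVED_PRODUCT = 1e9   # assumed: 1 GB of on-orbit detections/analytics
LINK = 1.2e9            # assumed: 1.2 Gbps downlink

raw_h = downlink_hours(RAW_SCENE, LINK)
derived_h = downlink_hours(DERIVED_PRODUCT, LINK)
print(f"raw downlink: {raw_h:.2f} h, derived product: {derived_h * 3600:.1f} s")
```

Under these assumptions the raw scene ties up the link for nearly two hours while the derived product clears in seconds, which is the gap on-orbit processing is meant to close.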
Key applications unlocked by this setup include the analysis of Earth-observation data, which could inform crop-type detection and improve local weather prediction. The satellite should also speed up wildfire detection and distress-signal response during emergencies.
3) Google’s Quantum Algorithm Breakthrough
Sundar Pichai announced a breakthrough quantum algorithm this week, run on Google’s new Willow processor. The company demonstrated a “verifiable quantum advantage” with an algorithm called Quantum Echoes that ran 13,000 times faster than one of the world’s top classical supercomputers. The research was published in Nature and improves upon Google’s 2019 study, which only sampled bitstrings from a highly chaotic quantum state of qubits.
The new experiment modeled how information spreads in a quantum system using a 105-qubit superconducting chip with some of the highest fidelity rates ever recorded. The team measured a quantity known as the out-of-time-order correlator (OTOC), a problem that overwhelms classical machines due to its exponential complexity. Independent verification confirmed the results. This shift from “faster” to “verified” is what makes the Willow milestone distinct. It establishes a baseline for scaling future quantum processors that are both powerful and testable by other advanced quantum chips.
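For readers who want the definition: the OTOC for two operators W and V is conventionally written as follows (a standard form from the physics literature, not quoted from the paper itself):

```latex
% Out-of-time-order correlator for operators W and V,
% with U(t) the system's time-evolution operator.
C(t) = \big\langle W^\dagger(t)\, V^\dagger\, W(t)\, V \big\rangle,
\qquad W(t) = U^\dagger(t)\, W\, U(t)
```

The “echo” in Quantum Echoes comes from running the evolution forward, applying a small perturbation, and running it backward; roughly speaking, the OTOC quantifies how much that perturbation scrambles the signal, which is why simulating it classically blows up exponentially with qubit count.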
Simulating molecules and materials is one of the earliest real applications for quantum systems, with potential breakthroughs in battery chemistry, drug design, and clean-energy catalysts. In partnership with the University of California, Berkeley, Google ran the Quantum Echoes algorithm to study two molecules, one with 15 atoms and another with 28, to verify the approach. The experiment produced improved models of the molecular structures, but does not yet go beyond what classical methods can achieve.
“As quantum computing continues to mature, such approaches could enhance NMR spectroscopy, adding to its powerful toolbox for drug discovery and the design of advanced materials.” - Ashok Ajoy, Assistant Professor of Chemistry, UC Berkeley
Learn With Me and My Friends…
Other Reading…
LLMs Can Get Brainrot (GitHub)
Analog In Memory Computing (Nature)
Silicon Valley Has China Envy… (The New York Times)
Religion Is Gaining Influence in American Life (Pew Research)