Research · MIT Technology Review
Mustafa Suleyman: AI development won’t hit a wall anytime soon, and here’s why
Compiled by KHAO Editorial — aggregated from 1 outlet. See llms.txt for citation guidance.
The human mind evolved for a linear world.
Key facts
- Nvidia’s chips have delivered more than a sevenfold increase in raw performance in six years, from 312 teraflops in 2020 to 2,250 teraflops today
- Microsoft’s Maia 200 chip, launched this January, delivers 30% better performance per dollar than any other hardware in the company’s fleet
- According to Stanford’s 2026 AI Index, AI is sprinting, and we’re struggling to keep up
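As a quick arithmetic check on the figures above (the teraflop numbers are the article’s; the implied per-year growth rate is derived here, not quoted):

```python
# Sanity-check the quoted Nvidia performance figures.
# 312 TFLOPS (2020) and 2,250 TFLOPS (today) come from the article;
# the "sevenfold" claim and annual growth rate are computed below.

tflops_2020 = 312
tflops_now = 2250
years = 6

ratio = tflops_now / tflops_2020   # ~7.21x, so "more than sevenfold" holds
annual = ratio ** (1 / years)      # ~1.39x per year, i.e. roughly 39% annual growth

print(f"{ratio:.2f}x over {years} years, ~{annual - 1:.0%} per year")
```

The roughly 39% compounded annual rate is what makes the article’s "exponential trends" framing concrete: linear intuition expects a fixed increment per year, not a fixed multiplier.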
Summary
The human mind evolved for a linear world, but it catastrophically fails when confronting AI and the core exponential trends at its heart. Three advances are now converging to enable this. Nvidia’s chips have delivered more than a sevenfold increase in raw performance in six years, from 312 teraflops in 2020 to 2,250 teraflops today. Microsoft’s Maia 200 chip, launched this January, delivers 30% better performance per dollar than any other hardware in the company’s fleet. According to Stanford’s 2026 AI Index, AI is sprinting, and we’re struggling to keep up.