
🔆 What is up with AI slowing down?

Is AI reaching a plateau? Sweden and Greece present their AI roadmaps, and OpenAI's Sora is leaked by artists.


🗞️ Issue 47 // ⏱️ Read Time: 5 min

Hello 👋

We’ve gotten used to the explosive growth of GenAI, especially Large Language Models (LLMs), over the past few years. People speak of the “before” and “after” of ChatGPT because of the profound shift it triggered. Yet precisely because the technology has been growing exponentially, we should expect a plateau at some point. So, are we approaching an AI plateau, or are we just getting started?

In this week's newsletter

What we’re talking about: The complex dynamics of AI development, and what scaling laws tell us about the field’s potential limitations and opportunities. The ways AI labs will try to advance their models over the next five years likely won’t resemble the last five.

How it’s relevant: As AI labs report diminishing returns from traditional scaling methods, we're witnessing a pivotal moment that could reshape the entire industry's trajectory. Every Big Tech company has basically gone all in on AI. Nvidia, which supplies the GPUs all these companies train their models on, is now the most valuable publicly traded company in the world.

Why it matters: Markets are priced on the expectation that this exponential scaling will continue at the same pace. Understanding its limits helps us make better predictions about AI's future and guides strategic decisions in technology investment and development.

Big tech news of the week…

🏷️ Uber enters the AI labelling business with its new division, Scaled Solutions, which provides data annotation and AI training services through a global network of gig workers. 

🎨 A group of artists leaked OpenAI’s Sora video AI model to Hugging Face, highlighting the tension between AI developers and creatives over compensation and control. 

🇬🇷 Greece published its Blueprint for AI Transformation to prepare its citizens for the AI transition and strengthen Greece’s innovation potential.

🇸🇪 In Sweden, the AI Commission submitted its final report, proposing how to strengthen the country’s AI capabilities. The otherwise technologically progressive country has been falling behind in AI adoption.

📊 Anthropic introduced the Model Context Protocol (MCP), an open-source standard designed to connect AI assistants with diverse data sources. This would replace writing custom integrations for every data source, which could fundamentally change how data scientists and engineers build AI-powered tools (see the sketch below).
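To make MCP concrete, here's a minimal sketch of a server exposing one data-source tool, based on the Python SDK's documented FastMCP interface (the `mcp` package). The `lookup_customer` tool and its in-memory data are hypothetical stand-ins for a real data source; treat the exact SDK names as assumptions to verify against the current docs.

```python
# Minimal MCP server sketch (assumes the `mcp` Python SDK's FastMCP API).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("customer-data")  # server name shown to connected assistants

# Hypothetical in-memory "data source"; in practice this would be a
# database, API, or file system the assistant needs to reach.
CUSTOMERS = {"42": {"name": "Ada", "plan": "pro"}}

@mcp.tool()
def lookup_customer(customer_id: str) -> dict:
    """Return basic account details for a customer ID."""
    return CUSTOMERS.get(customer_id, {"error": "not found"})

if __name__ == "__main__":
    mcp.run()  # serves the tool over MCP's default stdio transport
```

Once a server like this is running, any MCP-aware assistant can call `lookup_customer` without a bespoke integration, which is exactly the duplication the protocol is meant to remove.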

The Natural Laws of Growth

Just as physics governs how objects move through space, scaling laws govern how AI systems improve with more resources. But here's the catch: exponential growth inevitably plateaus as the underlying resources deplete. Whether it's rabbits running out of food, viruses running out of hosts, or AI running out of quality training data, nature has a way of imposing limits.
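A toy simulation makes the pattern visible. The sketch below (illustrative numbers only, not a model of any real AI metric) contrasts unconstrained exponential growth with logistic growth, where a finite carrying capacity plays the role of the depleting resource:

```python
# Illustrative only: exponential vs. logistic growth under a resource cap.
def simulate(steps: int = 20, rate: float = 0.5, capacity: float = 1000.0):
    exp, log = 1.0, 1.0
    for t in range(steps):
        exp *= 1 + rate                            # unconstrained growth
        log += rate * log * (1 - log / capacity)   # growth slows near the cap
        print(f"t={t:2d}  exponential={exp:10.1f}  logistic={log:8.1f}")

simulate()
# The exponential column keeps compounding; the logistic column flattens
# as it approaches the carrying capacity of 1000.
```

Early on, the two curves are nearly indistinguishable, which is why a plateau is so hard to call from inside the growth phase.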

Scaling laws are not laws of nature, physics, math, or government. Nothing and no one guarantees that they will continue at the same pace. One of the best-known scaling laws is Moore's law, which states that the number of components on a single chip doubles roughly every two years at minimal cost. The observation was made by Intel co-founder Gordon Moore in 1965. Strictly speaking, Moore's law is no longer valid; if you want to know more about this, read this brilliant piece in Fortune.
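The arithmetic behind that doubling is simple but compounds dramatically. A quick sketch, using the roughly 2,300 transistors of the 1971 Intel 4004 as a starting point (the strict two-year doubling is the idealized assumption, not measured data):

```python
# Back-of-the-envelope Moore's law: double the component count every 2 years.
count, year = 2_300, 1971   # roughly the Intel 4004's transistor count
while year <= 2001:
    print(f"{year}: ~{count:,} components")
    count *= 2
    year += 2
# Fifteen doublings turn ~2,300 components into ~75 million, which is
# why even a modest slowdown in the doubling rate matters enormously.
```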

"We can't ensure that even reliable components will continue to work reliably when combined. This is one of the most important principles we need to keep in mind when scaling any system." - Barbara Liskov

This observation comes from Barbara Liskov, who pioneered fundamental principles of programming languages and distributed computing. It captures something crucial about the “scaling laws” of artificial intelligence: while machines can execute increasingly complex operations, they remain bound by physical and computational limits. You need exponentially more resources for linear gains in performance.
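Neural scaling laws make that trade-off explicit: empirically, test loss tends to fall as a power law in compute, roughly loss ≈ (C0 / C)^α with a small exponent α. The sketch below inverts an illustrative power law to show how the compute bill grows for equal steps of improvement; the exponent and constants are made-up stand-ins, not measured values from any published paper:

```python
# Illustrative power law: loss(C) = (C0 / C) ** ALPHA (made-up constants).
ALPHA, C0 = 0.05, 1.0

def compute_for_loss(loss: float) -> float:
    """Invert loss = (C0 / C) ** ALPHA  =>  C = C0 * loss ** (-1 / ALPHA)."""
    return C0 * loss ** (-1 / ALPHA)

base = compute_for_loss(2.0)  # normalize: the first point costs 1 unit
for loss in (2.0, 1.9, 1.8, 1.7, 1.6):
    rel = compute_for_loss(loss) / base
    print(f"loss {loss:.1f} -> relative compute {rel:6.1f}")
# Equal 0.1 steps down in loss cost roughly 3x more compute each time:
# about 1, 2.8, 8.2, 26, and 87 units in this toy parameterization.
```

The exact numbers depend entirely on the chosen exponent, but the shape is the point: linear improvement, multiplicative cost.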
