
🔆 The Power of Compute: In Chips We Trust

A newsletter by Team Lumiera for crisp AI insights

🗞️ Issue 4 // ⏱️ Read Time: 6 min

Hello 👋

How much time do you spend thinking about hardware? For most of us, that thought process probably begins and ends with the desire for the latest phone or laptop. We have a tendency to separate technology from the physical machinery powering it. But there is so much more going on under the hood that is worth our extra attention.

If you have ever seen a sticker on your laptop that says “Intel”, you are already familiar with one of the biggest names in the business. Intel is best known for developing the microprocessors (super small and super fast engines, if you will) found in most of the world's personal computers, and it is a major player in the power behind AI.

Intel has recently been overshadowed by another industry giant. NVIDIA is the number one producer and supplier of graphics processing units (GPUs), and currently the sixth most valuable corporation on Earth. These GPUs, part of a larger hardware family known as AI chips, pack thousands of cores, enabling something called parallel processing, which is what has made the exponential acceleration of AI possible. Parallel processing increases speed: it breaks a problem down into thousands or sometimes millions of tasks and works on them simultaneously. This allows complex algorithms, like the ones used for Large Language Models (LLMs), to compute faster.
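The divide-and-combine idea behind parallel processing can be sketched in a few lines of Python. This is a toy illustration only (threads on a CPU, not thousands of GPU cores, and the function names are our own invention): a dot product, the basic operation inside LLM math, is split into independent chunks that are worked on simultaneously and then combined.

```python
from concurrent.futures import ThreadPoolExecutor

def dot_chunk(pair):
    # Multiply-accumulate one slice of the two vectors: one "core's" share of the work.
    a, b = pair
    return sum(x * y for x, y in zip(a, b))

def parallel_dot(a, b, workers=4):
    """Toy parallel dot product: break the problem into independent chunks,
    work on them simultaneously, then combine the partial results."""
    size = max(1, len(a) // workers)
    chunks = [(a[i:i + size], b[i:i + size]) for i in range(0, len(a), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(dot_chunk, chunks))

print(parallel_dot([1, 2, 3], [4, 5, 6]))  # → 32
```

A GPU does essentially this, but in hardware and at enormous scale: each chunk lands on its own core, so the whole computation finishes in roughly the time of the slowest chunk rather than the sum of all of them.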

These industry giants are engaged in what is being referred to as the GPU race, and NVIDIA is giving Intel a run for its money. The rule the AI industry is bound by is this: the more computing power, the faster the model. The faster the model, the more efficient the product or service. OpenAI’s ChatGPT, for example, runs on thousands of NVIDIA chips, and NVIDIA’s position as a leader in the space has skyrocketed along with ChatGPT. NVIDIA’s CEO recently announced that they are no longer a graphics company, but an AI company.

❝

 Should industry leaders be pushing for government investment in AI chips?

The Lumiera Question of the Week

So why do these chips matter? GPUs require infrastructure to be used efficiently; they produce a lot of heat and require a lot of electricity to perform optimally. Because of this overhead, NVIDIA has positioned itself as the go-to solution through its dominance in data centers. Think about how hot your laptop gets when it is overrun, and imagine the fans needed to keep these centers regulated! As you might imagine, all this power is expensive. NVIDIA’s latest chips cost upwards of $25,000, which is one reason why some companies choose to rent chips on demand instead. Big players such as Meta and Amazon are looking at decreasing their dependency on NVIDIA chips. In 2023, governments started looking at investing in chips for various reasons, even getting into geopolitical friction over them. This is a space to watch as the public sector increases its AI knowledge.

Do you want your laptop to have AI features?


What we are excited about:

Ambassadors of EU member states adopted the world’s first comprehensive rulebook for AI. We are keeping an eye on the implementation process as European capitals, caught off guard in 2023, try to develop their AI understanding in 2024.

The Nordic State of AI indicates the AI maturity level of Nordic organisations and the readiness of the Nordic states globally. We are not surprised to see that the Nordics need to step it up to stay competitive.

🤖 AI too expensive to replace most jobs
Does AI make it easier for us to do our job, or will AI take our jobs? An MIT study looking at productivity and worker displacement has found that around 23% of worker compensation would be cost-effective to automate, mainly because of the large upfront costs of AI systems. In other words, don’t worry about technological unemployment but do make sure to make the most out of AI solutions by automating repetitive tasks.

🍰 Everyone wants a piece of the pie
AMD sits between NVIDIA and Intel on the podium, and is predicted to dominate 80% of the market within the next 4 years. Google has been designing its own units, called tensor processing units (TPUs), and Normal Computing is exploring stochastic processing units (SPUs), which leverage thermodynamics.

🚩 Using AI in extractive industries - who benefits?
KoBold, a venture backed by Bezos and Gates, now says it wants to use AI-powered technology to create a “Google Maps” of the Earth’s crust. The technology, KoBold says, can locate resources that may have eluded more traditional geologists and can help miners decide where to acquire land and drill. There is so much to say about extractive industries, and introducing AI into the technical side of things will not make the sustainability and human rights side any less complicated. More on this another time.

Until next time.
On behalf of Team Lumiera

Lumiera has gathered the brightest people from the technology and policy sectors to give you top-quality advice so you can navigate the new AI Era.

Follow the carefully curated Lumiera podcast playlist to stay informed and challenged on all things AI.