(Santa Clara, Calif.) In the cavernous room of a single-story building here, two-meter-tall white machines hum quietly. They make up a new supercomputer that has been running for a month.

The supercomputer, unveiled Thursday by Silicon Valley start-up Cerebras, runs on specialized chips designed to power artificial intelligence products. These chips are distinguished by their size – each is about as large as a dinner plate, some 56 times bigger than the chips commonly used for AI. Each Cerebras chip has the computing power of hundreds of ordinary chips.

Cerebras built the supercomputer for G42, an AI company that says it plans to use the machine to build and power AI products for the Middle East.

“We are proving that it is possible to build a very large AI supercomputer,” says Andrew Feldman, CEO of Cerebras.

The demand for computing power and AI chips is skyrocketing this year, propelled by the global AI boom. Tech giants like Microsoft, Meta, and Google, along with myriad startups, have been rushing to launch AI products since the success of the chatbot ChatGPT, which generates eerily human-like prose.

But AI products require enormous computing power and specialized chips, fueling a frantic race to develop those technologies. In May, Nvidia, the leading AI chipmaker, said its quarterly sales would exceed Wall Street estimates by more than 50%, so strong is the demand for its graphics processing units, or GPUs.

For the first time, AI is driving a huge increase in demand for computing power, says Ronen Dar, a co-founder of Run:AI, a small firm in Tel Aviv, Israel, that helps companies develop AI models. That is "creating huge demand" for specialty chips, and companies are scrambling for vendors who can produce them.

To meet their needs, some big tech companies – including Google, Amazon, Advanced Micro Devices and Intel – have developed their own AI chips. Startups like Cerebras, Graphcore, Groq and SambaNova have also entered the race to break into a market dominated by Nvidia.

Chips will play such a crucial role in AI that they could reshape the hierarchy of tech companies, and even of countries. The Biden administration, for instance, is considering restricting sales of AI chips to China, as some US officials believe the chips could bolster Beijing's military and security apparatus and thereby threaten US national security.

AI supercomputers have already been built, notably by Nvidia, but such machines are rarely within the reach of smaller companies.

Cerebras, based in Sunnyvale, Calif., was founded in 2016 by Andrew Feldman and four other engineers, with the goal of building hardware that accelerates the development of AI. Over the years, the company has raised $740 million, including from OpenAI CEO Sam Altman and venture capitalists like Benchmark. Cerebras is valued at US$4.1 billion.

Because the chips typically used to power AI are small – the size of a postage stamp – it takes hundreds, if not thousands, of them to run a complex AI model. In 2019, Cerebras unveiled what it claims is the largest computer chip ever built.

Abu Dhabi-based G42 began working with Cerebras in 2021. In April, it used a Cerebras system to train an Arabic version of ChatGPT.

In May, G42 asked Cerebras to build a network of supercomputers in different parts of the world. Talal Al Kaissi, CEO of G42 Cloud, a subsidiary of G42, says the cutting-edge technology will allow his company to create AI-powered chatbots to analyze genomic data and suggest preventative care.

But the scarcity of AI chips made building a supercomputer difficult. Cerebras' technology was both available and cost-effective, says Al Kaissi, thanks to its oversized chips. Cerebras built the G42 supercomputer in just 10 days, says Feldman.

“The production time has been reduced considerably,” says Al Kaissi.

Within a year, Cerebras plans to deliver two more supercomputers to G42 – one in Texas and the other in North Carolina – and, after that, six more distributed around the world. Cerebras calls this network Condor Galaxy.