UK to spend $130M on AI chips amid scramble to buy up computing power

British Prime Minister Rishi Sunak is set to spend $130 million (100 million pounds) to buy thousands of computer chips to power artificial intelligence, amid a global shortage and race for computing power.

The Telegraph reported on Aug. 20 that the United Kingdom aims to build an “AI Research Resource” by mid-2024 as part of Sunak’s plan to make the country an AI tech hub.

The government is reportedly sourcing chips from makers NVIDIA, Intel and AMD. The science funding body UK Research and Innovation, which is leading the effort, is understood to be in the late stages of ordering 5,000 NVIDIA graphics processing units (GPUs).

However, while $130 million has been allocated to the project, the funds are reportedly seen as insufficient to match Sunak’s AI hub ambition, meaning government officials could push for more funding at the AI safety summit planned for November.

It follows a recent report that found many companies are struggling to deploy AI due to a lack of available resources and technical obstacles.

In March, an independent review of the country’s AI computing capabilities said investment in the space is “seriously lagging” behind international counterparts in the United States and European Union.

At the time, fewer than 1,000 NVIDIA chips were available for researchers to train AI models. The review panel recommended the U.K. make at least 3,000 top-quality chips available to meet immediate needs.

Related: US and China AI-tech standoff shows signs of spreading to other countries

On Aug. 16, S&P Global released its global AI trends report, which found that many firms are not ready to support AI, citing insufficient computing power along with challenges in managing data and security concerns.

While it is still early days for AI, S&P senior research analyst Nick Patience said the ability to support AI workloads will be a deciding factor in which firms lead the space.

AI Eye: Apple developing pocket AI, deep fake music deal, hypnotizing GPT-4
