The surge of interest in AI is creating a massive demand for computing power. Around the world, companies are trying to keep up with the vast number of GPUs needed to power increasingly advanced AI models. While GPUs are not the only option for running an AI model, they have become the hardware of choice because they can efficiently handle many operations in parallel, a critical feature when developing deep learning models.
But not every AI startup has the capital to invest in the huge numbers of GPUs now required to run a cutting-edge model. For some, it’s a better deal to outsource that compute. This has led to the rise of a new business: GPU-as-a-Service (GPUaaS). In recent years, companies like Hyperbolic, Kinesis, Runpod, and Vast.ai have sprung up to remotely offer their clients the processing power they need.
While tech giants like Amazon and Microsoft, which offer cloud computing services, own their own infrastructure, smaller startups like Kinesis have developed techniques to make the most of existing idle compute.
“Businesses need compute. They need the model to be trained or their applications to be run; they don’t necessarily need to own or manage servers,” says Bina Khimani, co-founder of Kinesis.
Studies have shown that more than half of existing GPUs are not in use at any given time. Whether we’re talking about personal computers or colossal server farms, a lot of processing capacity sits under-utilized. Kinesis identifies idle compute, both GPUs and CPUs, in servers worldwide and aggregates it into a single source of computing for companies to use. Kinesis partners with universities, data centers, companies, and individuals who are willing to sell their unused computing power. Through software installed on their servers, Kinesis detects idle processing units, preps them, and offers them to its clients for temporary use.
“At Kinesis, we have developed technology to pool together fragmented, idle compute power and repurpose it into a serverless, auto-managed computing platform,” says Khimani. Kinesis customers can even choose where they want their GPUs or CPUs to come from.
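Kinesis has not published how its detection software works, but the first step it describes, spotting GPUs that are sitting idle on a host, can be illustrated with a short sketch. The Python script below is a hypothetical example, not Kinesis’s actual software: it polls GPU utilization through the standard nvidia-smi command-line tool and flags devices that stay below a utilization threshold for a sustained period, the kind of signal a pooling service could use before offering a device to a remote client. The threshold, polling interval, and idle window are all assumed values.

```python
# Hypothetical sketch of idle-GPU detection; not Kinesis's actual software.
import subprocess
import time

UTIL_THRESHOLD = 5   # percent utilization below which a GPU counts as idle (assumed)
IDLE_SECONDS = 300   # how long a GPU must stay idle before it is flagged (assumed)
POLL_INTERVAL = 30   # seconds between utilization checks (assumed)


def gpu_utilization():
    """Return a list of (gpu_index, utilization_percent) read from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    readings = []
    for line in out.strip().splitlines():
        idx, util = line.split(",")
        readings.append((int(idx), int(util)))
    return readings


def watch_for_idle_gpus():
    """Track how long each GPU has been below the threshold and flag it once idle."""
    idle_since = {}  # gpu_index -> timestamp when it first dropped below threshold
    while True:
        now = time.time()
        for idx, util in gpu_utilization():
            if util < UTIL_THRESHOLD:
                idle_since.setdefault(idx, now)
                if now - idle_since[idx] >= IDLE_SECONDS:
                    # A real service would register the device with a pooling
                    # coordinator here; this sketch just prints it.
                    print(f"GPU {idx} idle for {IDLE_SECONDS}s, eligible for pooling")
            else:
                idle_since.pop(idx, None)  # busy again, reset the idle timer
        time.sleep(POLL_INTERVAL)


if __name__ == "__main__":
    watch_for_idle_gpus()
```

A production system would also need to package the idle device (isolation, scheduling, billing) and hand it to a coordinator that matches it with customer workloads, which is where the “serverless, auto-managed” layer Khimani describes would come in.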
AI Is Growing Faster Than Servers Can Keep Up
GPUaaS is filling a growing gap in the AI industry. As AI models get more sophisticated, they need more computing power and infrastructure that can process information faster and faster. In other words, without a sufficient number of GPUs, big AI models cannot operate, let alone improve. In October, OpenAI’s CEO, Sam Altman, admitted that the company was not releasing products as often as it wished because it was facing “a lot of limitations” on its computing capacity.
Also in October, Microsoft’s CFO, Amy Hood, told the company’s investors on a conference call that demand for AI “continues to be higher” than their “available capacity.”
The biggest advantage of GPUaaS is economical. By removing the need to purchase and maintain the…
The post “AI Demand Leads to ‘GPU-as-a-Service’ Industry” by Juan Pablo Perez was published on 01/20/2025 by spectrum.ieee.org.