
As businesses look for less expensive GPUs, alternative clouds are expanding.

Alternative clouds have never been more in demand.

A case in point is CoreWeave, the GPU infrastructure provider that began life as a cryptocurrency mining operation and last week secured $1.1 billion in fresh capital from investors including Altimeter Capital, Fidelity, and Coatue. The round brings its total raised in debt and equity to $5 billion, a stunning sum for a company less than ten years old, and values it at $19 billion post-money.

And CoreWeave is far from the only one.


Months after closing a $320 million Series C round, Lambda Labs, which also offers an array of cloud-hosted GPU instances, secured a “special purpose financing vehicle” of up to $500 million in early April. Last October, Voltage Park, a nonprofit backed by cryptocurrency billionaire Jed McCaleb, said it would invest $500 million in GPU-powered data centres. And in March, Together AI, a cloud GPU host that also conducts generative AI research, landed a $106 million round led by Salesforce.

So why all the fervour, and why is so much money flooding into the alternative cloud space?

The answer, as you might guess, is generative AI.

As the generative AI boom continues, so does demand for the hardware needed to run and train generative AI models at scale. With thousands of cores that can work in parallel through the linear algebra operations that make up generative models, GPUs are the obvious choice for training, fine-tuning, and running them.
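To make the parallelism point concrete, here is a minimal, illustrative sketch (not from the article) that times the same large matrix multiplication, the core operation inside generative model layers, on CPU and on GPU. It assumes PyTorch is installed and a CUDA-capable GPU is available.

```python
# Illustrative only: timing a large matrix multiplication on CPU vs. GPU.
# Assumes PyTorch is installed and a CUDA-capable GPU is present.
import time
import torch

def time_matmul(device: str, n: int = 4096) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup has finished
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the GPU kernel to complete
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")
```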

Installing GPUs on-premises is costly, however, which is why most developers and companies turn to the cloud instead.

Cloud computing giants like Google Cloud, Microsoft Azure, and Amazon Web Services (AWS) offer no shortage of GPU and specialised hardware instances tailored to generative AI workloads. But for at least certain models and applications, alternative clouds can work out cheaper and offer better availability.

The Nvidia A100 40GB, a popular option for model training and inference, costs $2.39 per hour on CoreWeave, which works out to $1,200 per month. The same GPU runs $2,482 per month on Azure and $3.67 per hour on Google Cloud.

And since generative AI workloads typically run on clusters of GPUs rather than single cards, those cost differences add up quickly.
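As a back-of-the-envelope illustration of how that compounds (the eight-GPU cluster size and 30-day month are assumptions for the example, not figures from the article), here is the monthly cost at the hourly rates quoted above:

```python
# Back-of-the-envelope cost comparison using the hourly rates quoted above.
# The 8-GPU cluster size and 30-day month are assumptions for illustration.
HOURS_PER_MONTH = 24 * 30
rates_per_gpu_hour = {"CoreWeave": 2.39, "Google Cloud": 3.67}  # USD per GPU-hour
cluster_size = 8

for provider, rate in rates_per_gpu_hour.items():
    monthly = rate * cluster_size * HOURS_PER_MONTH
    print(f"{provider}: ${monthly:,.0f}/month for {cluster_size} x A100 40GB")
```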

“Companies like CoreWeave participate in a market we call speciality ‘GPU as a service’ cloud providers,” Gartner VP of Cloud Services and Technology Sid Nag told Eltrys. “Given the great demand for GPUs, they offer an alternative to the hyperscalers, where they have taken Nvidia GPUs and provided another route to market and access to those GPUs.”

Even some of the biggest tech companies have begun leaning on alternative cloud providers as they run up against compute capacity constraints, Nag notes.

CNBC reported in June of last year that Microsoft had signed a multibillion-dollar deal with CoreWeave to ensure that OpenAI, the maker of ChatGPT and a close Microsoft partner, would have enough compute to train its generative AI models. Nvidia, which supplies the bulk of CoreWeave’s chips, appears to view this as a desirable trend, perhaps for leverage reasons; it has reportedly given some alternative cloud providers preferential access to its GPUs.

Principal analyst Lee Sustar believes that cloud providers like CoreWeave are prospering in part because they are not burdened with the infrastructure “baggage” that incumbent providers must manage.

“Given hyperscaler dominance of the overall public cloud market, which demands vast investments in infrastructure and a range of services that make little or no revenue, challengers like CoreWeave have an opportunity to succeed with a focus on premium AI services without the burden of hyperscaler-level investments overall,” he said.

However, can this expansion continue?

Sustar is not so sure. In his view, how far alternative cloud providers can go depends on whether they can keep bringing GPUs online in high volume at relatively low prices.

Over time, as incumbents like Google, Microsoft, and AWS increase their expenditures on bespoke hardware to run and train models, price competition may grow more difficult. Google provides its TPUs; Microsoft just introduced Azure Maia and Azure Cobalt, two bespoke processors; and AWS offers Trainium, Inferentia, and Graviton.

Sustar noted that while Nvidia will be looking to CoreWeave and other GPU-centric AI clouds, hyperscalers will use their own hardware to lessen their reliance on the company.

Then there is the fact that, while many generative AI workloads require GPUs, not all of them do, particularly if they are not time-sensitive. CPUs can perform the necessary calculations, though they are typically slower than GPUs and specialised hardware.
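As a small illustration of that point (a sketch assuming the Hugging Face transformers package and the small open gpt2 model, neither of which is mentioned in the article), text generation can run entirely on CPU for jobs that are not latency-sensitive:

```python
# Minimal sketch: running text generation entirely on CPU.
# Assumes the `transformers` package and the small open `gpt2` model;
# device=-1 pins the pipeline to CPU, which is fine for batch jobs
# that are not time-sensitive.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2", device=-1)
result = generator("Alternative clouds are", max_new_tokens=20)
print(result[0]["generated_text"])
```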

More existentially, there is the risk that the generative AI bubble bursts, leaving providers with piles of GPUs and not nearly enough customers who need them. In the near term, however, both Sustar and Nag see bright prospects, and both expect a steady stream of upstart clouds.

“GPU-orientated cloud startups will give [incumbents] plenty of competition, especially among customers who are already multi-cloud and can handle the complexity of management, security, risk, and compliance across multiple clouds,” Sustar added. “Those kinds of cloud consumers are at ease experimenting with a new AI cloud if it has reliable financial support, reputable leadership, and GPUs that never wait.”
