Walrus, IO bring GPUs and decentralized storage to AI builders

This is a segment of The Drop newsletter. Subscribe to read full editions.

Decentralized cloud computing provider io.net, which has its own IO token, is partnering with the Walrus team to let startups train, run and store their own custom AI models.

Under the deal, io.net provides a network of GPUs for AI training and fine-tuning, while Walrus handles AI model storage. The integration will be available on a pay-per-use basis, meaning builders are only charged for the storage and computing power they actually use.

The bring-your-own-model (BYOM) offering lets AI agent developers and AI app builders develop and run AI models without having to set up their own data centers or hardware.

io.net says it has more than 10,000 GPUs and CPUs worldwide.

The BYOM offering from Walrus and io.net will have to compete with other AI developer cloud services, though, such as those from Bittensor, Lambda, Spheron, Akash, Gensyn, Fast AI and Google's Vertex products.

“Traditional centralized cloud models are not only expensive — they also come with considerable privacy risks and limited composability options that pose a challenge for developers who prioritize decentralization,” said Rebecca Simmonds, managing executive at the Walrus Foundation.

“By using our decentralized data storage solution, io.net can offer the computing power needed for advanced AI and ML development without any of the drawbacks of traditional models, making this a clear win for developers, users and the entire web3 industry,” the executive continued.

The internet can feel very centralized when giant cloud services experience outages, as happened with Google last week (which affected Cloudflare and a range of other apps and sites). That is one obvious reason centralized AI setups may not be ideal.


Walrus' mainnet launched in March, with programmable, decentralized storage as its main feature. The Walrus Foundation announced a $140 million raise that same month.

More broadly, demand for AI computing power is expected to keep growing every year. McKinsey researchers project that data centers worldwide will need $6.7 trillion in investment to keep up with demand by 2030 (though that is one of a range of possible scenarios they modeled).

Credit: cryptonews.net