Mismatch Between the Demand and Supply of GPUs Stifling AI Innovation — Tory Green

The mismatch between the demand and supply of graphics processing units (GPUs) is potentially stifling artificial intelligence (AI) innovation, Tory Green, the COO of the decentralized GPU cloud service provider io.net, has said. According to the COO, this mismatch stems from the fact that manufacturers cannot build new supply quickly enough to match the growth in demand.

Mismatch Between Demand for GPUs and Their Supply

To highlight the extent of the problem, Green points to compute requirements for machine learning (ML) training, which have grown 10x every 18 months since 2010. Over the same interval, available computing power is believed to have only doubled, a gap Green attributes primarily to the “long lead time to build new supply.”
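
For readers who want to see how quickly that gap compounds, the short Python sketch below projects both curves forward from a common baseline, reading the cited figures as 10x demand growth and 2x supply growth per 18-month interval; the starting value of 1.0 is an arbitrary illustration, not a measured quantity.

# Illustration of the demand/supply gap implied by the figures above.
# Assumption: ML training compute demand grows 10x and available compute
# supply grows 2x per 18-month interval; the 1.0 baseline is arbitrary.

DEMAND_GROWTH_PER_PERIOD = 10.0
SUPPLY_GROWTH_PER_PERIOD = 2.0
MONTHS_PER_PERIOD = 18

def demand_to_supply_ratio(months: float) -> float:
    """How many times demand exceeds supply after `months`, starting at parity."""
    periods = months / MONTHS_PER_PERIOD
    return (DEMAND_GROWTH_PER_PERIOD / SUPPLY_GROWTH_PER_PERIOD) ** periods

for years in (3, 6, 9, 12):
    ratio = demand_to_supply_ratio(years * 12)
    print(f"After {years:2d} years, demand outstrips supply by roughly {ratio:,.0f}x")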

To limit the impact of GPU shortages on AI growth and innovation, Green, an investor and entrepreneur, proposes using a decentralized physical infrastructure network (DePIN). According to Green, under this kind of network, “the physical computing resources are spread across multiple locations and owned by various entities.”

Unlike traditional cloud providers, which are “limited by geography and physical capacity,” a decentralized network, according to the io.net COO, “can scale almost infinitely” by tapping into the collective power of a global network of nodes.

Meanwhile, when asked about the possibility of DePINs also experiencing scarcity issues, Green, in his written answers sent to Bitcoin.com News via Telegram, pointed to how things like geographical distribution and community-driven solutions prevent such a scenario from playing out.

Below are Tory Green’s answers to all the questions sent.

Bitcoin.com News (BCN): The soaring demand for GPUs recently propelled the market capitalization of the global GPU leader Nvidia to $1 trillion. Why do you think there is such high demand for these chips and such limited supply at the same time, and what are the consequences for AI innovation?

Tory Green (TG): The rise of AI has exponentially increased the demand for GPU computing power; deep learning and transformer models in particular require vast amounts of processing power to train, optimize and deploy.


As these models evolve in complexity, demand surges. For instance, compute requirements for ML training have grown 10x every 18 months since 2010, while compute power has only doubled in the same period.

Given the long lead time to build new supplies, it’s nearly impossible for manufacturers to keep up. This leads to numerous problems for engineers, including:

  • Long wait times: It can often take weeks to get access to GPUs using cloud services like AWS, GCP or Azure.
  • Limited choice: Users have little choice in terms of GPU hardware, location, security level, etc.
  • High costs: Getting good GPUs is extremely expensive (for instance, OpenAI spends $700K per day running ChatGPT).

Ultimately, these problems stifle AI innovation.

BCN: Besides the usual centralized cloud service providers, who else — both in Web2 and Web3 space — owns significant GPU compute power?

TG: The vast majority of GPUs are hosted outside of the major cloud providers. Major sources of GPU compute power include:

  • Data centers: There are thousands of independent data centers in the US alone, and their average utilization rate is only 12% to 18%.
  • Crypto miners: Miners have suffered significant losses with Ethereum’s switch to Proof-of-Stake, and are looking for an alternative way to deploy their supply of GPUs.
  • Consumer GPUs: Consumer GPUs account for 90% of the total supply, yet the majority of these resources lie latent in consumer households.

By aggregating supply from these sources, io.net is able to provide nearly unlimited computing power at a fraction of the cost of traditional cloud providers.

BCN: What is a decentralized physical infrastructure network (DePIN) and how does it ease the problem of compute power scarcity for startups building AI solutions?

TG: A DePIN is a distributed system where physical computing resources are spread across multiple locations and owned by various entities, rather than being centralized in one place.

Because decentralized networks tap into the collective power of a global network of nodes, they can scale almost infinitely. This stands in stark contrast to traditional cloud providers, which are limited by geography and physical capacity.
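
To make that aggregation model concrete, here is a minimal, purely illustrative sketch of the idea Green describes: independent owners register GPU nodes, and the pool’s capacity is simply whatever the network attracts. The GPUNode and DePINPool names are hypothetical and do not reflect io.net’s actual software or API.

from dataclasses import dataclass

@dataclass
class GPUNode:
    """One independently owned GPU resource (data center, ex-miner, or home rig)."""
    owner: str
    region: str
    gpu_model: str
    gpu_count: int

class DePINPool:
    """Toy registry: total capacity grows as more owners contribute nodes."""

    def __init__(self) -> None:
        self.nodes: list[GPUNode] = []

    def register(self, node: GPUNode) -> None:
        # Permissionless join: any owner, anywhere, can add supply to the pool.
        self.nodes.append(node)

    def total_gpus(self, region: str | None = None) -> int:
        return sum(n.gpu_count for n in self.nodes
                   if region is None or n.region == region)

pool = DePINPool()
pool.register(GPUNode("data-center-17", "us-east", "A100", 512))
pool.register(GPUNode("ex-eth-miner-03", "eu-west", "RTX 3090", 48))
pool.register(GPUNode("home-rig-alice", "ap-south", "RTX 4090", 2))
print(pool.total_gpus())           # 562 GPUs across all regions
print(pool.total_gpus("eu-west"))  # 48 GPUs in one region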


BCN: Your company io.net reportedly aims to make GPU computation accessible, flexible, and readily available. That puts you in direct competition with traditional players like AWS and Azure. Why would startups rely on DePIN rather than the so-called trusted traditional platforms?

TG: There are several inherent advantages of a DePIN model:

  • Massive Computing Power: Decentralized networks harness the combined computational power of all of their nodes, increasing as more entities join, while the traditional cloud has physical and geographic constraints.
  • Cost Efficiency: DePIN systems reduce costs by eliminating middlemen and optimizing resource use across the network, whereas traditional cloud services charge excessive rents.
  • Scalability: DePINs can organically and seamlessly scale as more participants add resources, whereas cloud services require more intricate scaling procedures.
  • Higher Security & Reliability: A DePIN’s distributed nature makes it harder for malicious attacks to compromise the entire system, offering inherent redundancy, whereas centralized clouds present a single point of potential failure.
  • Accessibility: Decentralized networks are inherently permissionless. This stands in stark contrast to services like AWS, which often require long-term contracts and significant KYC.

In addition, users can create a distributed cluster on DePINs like io.net virtually instantaneously (vs. weeks for cloud services). Perhaps most importantly, decentralized physical infrastructure networks have the ability to disrupt the oligopoly that currently dominates the cloud market. By expanding supply, companies like io.net can create structural arbitrage – simultaneously lowering prices for consumers by 10x and increasing profits for GPU suppliers by 10x.
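
The “structural arbitrage” point is easiest to see with numbers. The figures below are purely illustrative assumptions, not actual io.net or hyperscaler pricing: if a GPU that currently sits mostly idle is rented out through a DePIN at a tenth of the traditional cloud rate, the renter pays far less while the owner earns far more than before.

# Purely illustrative numbers -- not actual io.net, AWS, or Azure pricing.
hyperscaler_price = 3.00   # assumed $/GPU-hour on a traditional cloud
depin_price = 0.30         # assumed $/GPU-hour on the decentralized network
owner_share = 0.90         # assumed fraction of the DePIN price paid to the GPU owner
idle_earnings = 0.02       # assumed effective $/GPU-hour an underutilized GPU earns today

renter_saving = hyperscaler_price / depin_price
owner_uplift = (depin_price * owner_share) / idle_earnings

print(f"Renter pays about {renter_saving:.0f}x less per GPU-hour")
print(f"Owner earns about {owner_uplift:.1f}x more than leaving the GPU idle")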

BCN: Web3 infrastructure is still at a nascent stage. Do you believe that DePIN and other AI-allied technologies, could ride the AI wave to go mainstream? If yes, can you give reasons why you say this?

TG: Yes. While DePINs are a very compelling technology, they have yet to reach mainstream adoption. This is primarily due to insufficient demand. A DePIN for AI, however, is solving a true “hair on fire” problem. It:

  • Solves the global shortage in GPU resources.
  • Provides a better, faster and cheaper experience for customers.
  • Allows the owners of underutilized computing resources to significantly increase their profitability.

In short – AI provides the “perfect storm” for DePINs. It’s the catalyst that may allow them to cross the chasm and reach mainstream adoption.

BCN: The GPU scarcity is due to the demand far exceeding the supply. Is there a possibility of the same happening with DePIN and how would this be overcome if it were to happen?

TG: It’s unlikely that a DePIN would face the same issue of scarcity, and here’s why:

  • Diverse Hardware Contribution: Unlike our current cloud infrastructure, which is dominated by a handful of centralized players, a DePIN is decentralized and can harness computing power from diverse sources.
  • Resource Optimization: DePIN systems, by their nature, can be designed to optimally utilize resources across the network. This ensures that available compute power is used efficiently, reducing the overall demand on any single node (a simplified scheduling sketch follows at the end of this answer).
  • Inherent Scalability: As demand grows in a DePIN, it incentivizes more participants to join and contribute resources, organically scaling the available infrastructure.
  • Geographical Distribution: The decentralized nature of DePIN allows for global participation. This geographic distribution can offset regional scarcities or supply chain disruptions.
  • Community-driven Solutions: A decentralized community can come up with innovative solutions and strategies to handle resource scarcity, unlike traditional markets driven primarily by regulations and bureaucracy.

In essence, the decentralized and diverse nature of a DePIN inherently avoids the bottlenecks and scarcity issues present in the current, centralized system.
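
As a concrete (and deliberately simplified) illustration of the resource-optimization point above, the sketch below assigns each incoming job to the least-loaded node, spreading work so that no single node becomes a bottleneck. This is a generic least-loaded heuristic, not a description of how io.net actually schedules jobs.

import heapq

def assign_jobs(node_names, jobs):
    """Greedy least-loaded scheduling: each job goes to the node carrying the
    smallest load so far. `jobs` is an iterable of (job_id, gpu_hours) pairs."""
    heap = [(0.0, name) for name in node_names]  # (current load, node name)
    heapq.heapify(heap)
    placement = {}
    for job_id, gpu_hours in jobs:
        load, node = heapq.heappop(heap)   # node with the lightest load so far
        placement[job_id] = node
        heapq.heappush(heap, (load + gpu_hours, node))
    return placement

jobs = [("train-a", 8.0), ("train-b", 2.0), ("finetune-c", 4.0), ("infer-d", 1.0)]
print(assign_jobs(["node-us", "node-eu", "node-ap"], jobs))
# e.g. {'train-a': 'node-ap', 'train-b': 'node-eu', 'finetune-c': 'node-us', 'infer-d': 'node-eu'}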

What are your thoughts on this interview? Let us know what you think in the comments section below.



