Decentralized Training Cuts AI Energy Costs

Decentralized training of AI models can help reduce the energy costs associated with artificial intelligence. This approach leverages existing computing power instead of building new energy-intensive data centers.



How does decentralized training reduce AI energy costs?

Decentralized training uses existing computing power across independent nodes instead of building new data centers. This approach lowers the energy needed for AI model training by leveraging idle servers and smaller data centers.

  • Summary: Decentralized AI training distributes workloads across networks of existing computers, reducing reliance on energy-intensive data centers.
  • Why it matters: It can decrease energy consumption and costs associated with AI training by optimizing underused resources.
  • Key point: Technologies like Nvidia's Spectrum-XGS and platforms like Akash Network enable efficient decentralized AI training using dispersed computing power.

Decentralized AI Training Utilizes Existing Computing Power

Researchers and companies are working to implement decentralized training for AI models, in which training runs across a network of independent nodes ranging from idle servers in research labs to computers in solar-powered homes. Drawing on these existing resources reduces the need to construct new data centers, which often require substantial electrical infrastructure.
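The core idea can be illustrated with a small sketch. This is a minimal, hypothetical example of data-parallel training (not any vendor's actual system): each simulated "node" computes a gradient on its own shard of data, and a coordinator averages the gradients before updating the shared model. All function names and the toy dataset are invented for illustration.

```python
# Illustrative sketch of decentralized (data-parallel) training:
# each node computes a gradient on its local data shard, and the
# averaged gradient updates a shared model weight.

def local_gradient(w, shard):
    """Gradient of mean squared error for the model y = w * x on one node's shard."""
    n = len(shard)
    return sum(2 * (w * x - y) * x for x, y in shard) / n

def decentralized_step(w, shards, lr=0.01):
    """Average the gradients reported by all nodes, then update the weight."""
    avg_grad = sum(local_gradient(w, s) for s in shards) / len(shards)
    return w - lr * avg_grad

# Toy data (generated from y = 3x) split across three simulated nodes.
shards = [[(1, 3), (2, 6)], [(3, 9), (4, 12)], [(5, 15), (6, 18)]]
w = 0.0
for _ in range(200):
    w = decentralized_step(w, shards)
print(round(w, 2))  # converges toward 3.0
```

In a real deployment the shards live on separate machines, and the gradient averaging happens over the network (the step that technologies like Nvidia's Spectrum-XGS Ethernet are designed to make efficient across distant sites), but the arithmetic is the same.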

Training AI models is an energy-intensive process, and major tech companies are now seeking more sustainable solutions. Nvidia has developed Spectrum-XGS Ethernet to enable efficient training across geographically dispersed data centers. Cisco has also launched a router designed to connect AI clusters in different locations. Additionally, Akash Network has created a platform where users can rent out unused GPU computing power from smaller data centers, making it easier to utilize underused resources.

Implications for the U.S. Tech Landscape

Decentralized AI training offers U.S. developers a practical way to optimize existing computing resources, potentially lowering both costs and energy consumption in AI projects. Platforms like Akash Network could encourage American companies to explore innovative business models for distributed computing power.

Source: IEEE Spectrum
