DataCentreNews India - Specialist news for cloud & data centre decision-makers

NeuReality says AI networking could cut rack emissions

Thu, 23rd Apr 2026

NeuReality has released research on AI networking efficiency and data centre emissions, arguing that better network utilisation could cut annual carbon emissions by about 200 tons of CO2 for each AI rack.

The study focuses on energy wasted by data movement in distributed AI systems rather than compute alone.

Under standard industry calculations, a single high-performance AI rack running continuously can generate about 600 tons of CO2 a year. NeuReality's analysis found that improving system utilisation by 33% through more efficient networking could reduce that total by roughly one third while also increasing output from existing equipment.
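The arithmetic behind those figures can be sketched directly. A minimal illustration, using only the numbers reported by NeuReality (600 tons of CO2 per rack per year as the baseline, a roughly one-third reduction); the variable names and the 50-rack estate size are illustrative, not from the research:

```python
# Emissions arithmetic as reported: 600 t CO2/year baseline per rack,
# roughly one-third saved through better network utilisation.
# Figures come from the article; everything else is illustrative.

BASELINE_TCO2_PER_RACK_YEAR = 600.0  # continuously running high-performance AI rack
REDUCTION_FRACTION = 1 / 3           # claimed saving from more efficient networking

saving_per_rack = BASELINE_TCO2_PER_RACK_YEAR * REDUCTION_FRACTION
print(f"Saving per rack: {saving_per_rack:.0f} t CO2/year")

# Scaled across a hypothetical estate of racks:
racks = 50
print(f"Estate of {racks} racks: {saving_per_rack * racks:.0f} t CO2/year")
```

At roughly 200 tons saved per rack, the total scales linearly with estate size, which is why the study frames utilisation rather than hardware count as the lever.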

The argument centres on a common problem in large AI installations. As workloads spread across more systems, delays in data transfer and coordination can leave accelerators idle while still drawing power, raising electricity use and emissions.

Idle Resources

As AI deployments grow, network bottlenecks have become a significant source of hidden waste, according to the research. Rather than adding more hardware to meet demand, it points to higher utilisation of existing infrastructure as a way to reduce both emissions and cost.

NeuReality said the gains from better networking efficiency include lower carbon per token, less idle energy waste, and higher throughput from current infrastructure. It also argued that making better use of existing capacity could, in some cases, delay or avoid new data centre construction.

The research ties performance, cost, and sustainability to the same issue: how efficiently infrastructure is used. That framing reflects a broader debate in the AI sector, where concern has grown over the power requirements of training and inference systems as companies deploy larger models and serve more users.

While much of the public discussion has focused on the energy demands of chips and servers, the findings draw attention to the network layer inside AI systems. In distributed environments, the speed at which data moves between compute resources can affect how much productive work a rack delivers for the power it consumes.

NeuReality links that issue to its own products, including the NR2 AI-SuperNIC and the NR-NEXUS inference operating system, which it said can be deployed separately or together. The company described them as part of a system-level approach to moving data more efficiently and orchestrating distributed inference workloads.

Moshe Tanach outlined the company's view in comments accompanying the research.

"If your AI infrastructure isn't optimized for efficient data movement, you're not only leaving performance on the table, you're also burning energy unnecessarily," said Moshe Tanach, Chief Executive Officer, NeuReality. "It's time to stop chasing compute to meet demand. The real opportunity is unlocking idle capacity, up to 2× more output, by fixing system-level inefficiencies with an inference operating system like NR-NEXUS to lower cost and carbon per token at scale."

Efficiency Debate

The figures underline how changes in utilisation compound when applied across many racks in large facilities. For enterprises and hyperscale operators running AI infrastructure at scale, a reduction of 200 tons of CO2 per rack would add up quickly across a wider estate.

That emphasis on utilisation also reflects a commercial pressure facing operators. AI demand has pushed many companies to invest heavily in new infrastructure, but bottlenecks elsewhere in the system can limit the return on that spending if expensive accelerators are left waiting for data.

NeuReality's position is that network optimisation is one of the most immediate ways to address that imbalance. It said fixing inefficiencies can increase output without additional power or hardware, a claim likely to appeal to operators managing constrained power supplies and rising energy costs.

A second quote from Tanach framed the issue in broader terms.

"Sustainability doesn't require trade-offs," said Tanach. "When you fix system-level inefficiencies, performance, cost, and carbon all move in the same direction. What's good for the business is good for the planet."