SCX.ai debuts sovereign, eco-efficient AI node in Sydney
SCX.ai has launched an AI inferencing node in Sydney that it says is Australia's first sovereign AI infrastructure node and delivers ten times the efficiency of traditional GPU-based systems.
The deployment runs at Equinix's SY5 International Business Exchange data centre in Sydney. SCX.ai said the system operates without water-intensive cooling and records the lowest carbon output per AI token of any facility currently operating in Asia-Pacific.
The company positioned the node as an option for enterprises and government agencies that want locally hosted AI processing and data residency controls. It said all processing occurs within Australian borders and data does not transit to offshore facilities.
"This represents a fundamental shift in how Australia can deploy AI at scale," said David Keane, Founder and CEO, SCX.ai.
"We're delivering enterprise-grade AI infrastructure that doesn't require massive water consumption for cooling and operates at a fraction of the energy cost of GPU-based systems. For Australian organisations handling sensitive data, this means they can finally run advanced AI workloads onshore without compromise," said Keane.
Sovereignty focus
SCX.ai said the node targets industries with strict data protection and data residency requirements, including financial services, healthcare and government. It framed the offer as an alternative to offshore AI services for organisations that want onshore processing for sensitive workloads.
The company also highlighted latency. It said the node's location in Sydney's main digital corridor supports faster response times for real-time AI applications than offshore processing can offer.
SCX.ai said the infrastructure avoids water dependency for cooling. It compared this with hyperscale AI facilities that use large volumes of water for cooling and heat rejection.
Technology choice
SCX.ai said the node uses an ASIC-accelerated architecture under a partnership with SambaNova Systems. It said the approach provides higher inference throughput per watt than GPU-based platforms.
The company described the architecture as designed for sustained production workloads, and said the design allows higher-density deployment than traditional data centres.
Equinix described SY5 as designed for next-generation compute workloads, with connectivity and interconnection services for enterprise deployments. The data centre operator also pointed to the presence of cloud and AI ecosystems on its platform.
"AI is fundamentally reshaping digital infrastructure requirements," said Guy Danskine, Managing Director, Australia, Equinix.
"Sovereignty has become a critical consideration for organisations. Equinix's global and high-performance AI ecosystem enables organisations to develop AI in Australia while remaining seamlessly connected to global clouds, partners, and innovation, without compromising security or control," said Danskine.
Model plans
SCX.ai said it will use the infrastructure for customer AI applications and for Project MAGPiE, its Australian-focused large language model. The company said the model has shown competitive benchmark performance in localised evaluations.
SCX.ai said the model aims to better understand Australian context, terminology and cultural nuances than offshore alternatives. It did not provide technical details on the model architecture or the benchmark suite it used.
Local services
Servers Australia, which SCX.ai described as a 100% Australian-owned IT infrastructure provider, will deliver local technical support and managed services for the deployment. SCX.ai linked the arrangement to domestic employment growth in technical roles tied to infrastructure operations and support.
The company did not disclose commercial terms for the deployment at Equinix, including capacity, pricing models, or customer commitments. It also did not disclose power usage figures or methodology behind the carbon per token claim.
Network rollout
SCX.ai said the Sydney node is the first in a planned national network. It said it expects additional locations during 2026 to expand capacity and reduce latency for regional users.
The company said demand for domestically hosted AI infrastructure is increasing among organisations that weigh data governance, performance requirements and environmental impact when selecting AI services.