Neuralwatt, the software company helping the AI industry get more compute from the power it already has, today unveiled its end-to-end AI power optimization platform. As part of the launch, the company introduced Neuralwatt Cloud, a hosted inference service featuring the industry's first energy-based pricing model. Founded by Microsoft veterans Chad Gibson and Scott Chamberlin, Neuralwatt's software increases compute capacity, reduces energy costs, and lowers carbon emissions without new hardware, power infrastructure, or code changes. In production environments, the company demonstrated a 33 percent increase in AI compute output within the same power footprint.
AI's rapid growth has outpaced the infrastructure behind it, and the technology is increasingly defined by what it costs the grid, local communities, and the environment. U.S. data center power demand is expected to more than triple by 2030, yet new power infrastructure takes years to build. Compounding matters, roughly 70 percent of U.S. grid infrastructure is approaching end-of-life, and residential electricity prices are rising at more than double the rate of inflation in regions with heavy data center development. Rather than forcing grid operators and data centers into an adversarial cycle of demand and constraint, Neuralwatt helps both parties achieve more compute within existing power agreements.
"We're huge believers that AI has the potential to do a lot of good," said Chad Gibson, co-founder and CEO of Neuralwatt. "But that promise is being held back by power constraints that don't need to exist. There's so much capacity sitting inside existing facilities that's never been unlocked. While the industry races to build more power, we're focused on harnessing more from the infrastructure our partners already have."
Proven Results
Neuralwatt’s software optimizes how AI infrastructure uses power, turning power-constrained facilities into higher-performing ones without building anything new. It deploys in days, requires zero code changes, and works with existing hardware. In production on Crusoe Cloud, running NVIDIA H100 clusters, Neuralwatt demonstrated:
- 33 percent increase in AI inference throughput while consuming less peak power
- 33 percent improvement in GPU density, running eight GPUs in the power envelope that normally supports six
- More than 40 percent reduction in idle GPU power draw, from 125 watts down to 73 watts
These results are powered by three integrated platform capabilities:
- Neuralwatt Optimize is the company’s core power optimization engine for data center operators. The software acts as an intelligent layer between AI workloads and GPUs, continuously tuning power consumption in real time with less than 0.1 percent performance overhead.
- Neuralwatt Cloud is an all-new hosted inference service for organizations that want high-performance AI without the energy footprint. It is the first AI inference service to offer energy-based pricing, charging a flat rate of $5 per kilowatt-hour across all models and replacing opaque, model-specific token billing with a single transparent alternative. The service also provides real-time energy data with full observability into how many milliwatt-hours each call consumes, giving organizations the ability to compare efficiency across models. Teams that prefer traditional per-token billing can access competitive rates across all models, with subscription plans starting at $20 per month. The service is OpenAI-compatible. Mercury Computing, a grid flexibility platform connecting data centers and utilities to unlock underutilized power capacity, is among the first partners integrating with Neuralwatt Cloud.
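
To illustrate how energy-based billing works in practice, the sketch below converts per-call energy readings into cost at the published $5 per kilowatt-hour rate. The per-call energy figures are hypothetical assumptions for illustration, not Neuralwatt measurements:

```python
# Sketch of energy-based billing at a flat $5/kWh rate.
# The per-call energy figures below are illustrative assumptions.

RATE_PER_KWH = 5.00  # USD, flat across all models

def call_cost(milliwatt_hours: float) -> float:
    """Cost of a single inference call billed by energy consumed."""
    kilowatt_hours = milliwatt_hours / 1_000_000  # mWh -> kWh
    return kilowatt_hours * RATE_PER_KWH

# Hypothetical per-call energy for two models of different sizes.
small_model_mwh = 150.0    # assumed: a lightweight model
large_model_mwh = 2_500.0  # assumed: a much larger model

for name, mwh in [("small", small_model_mwh), ("large", large_model_mwh)]:
    print(f"{name}: {mwh} mWh -> ${call_cost(mwh):.6f} per call")
```

Because the rate is identical across models, the per-call energy reading alone determines the price, which is what lets organizations compare model efficiency directly.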
- Neuralwatt Deploy brings the company's energy optimization directly into a partner's own data center, giving operators full control over their hardware, security, and power consumption.
"Utilities want to serve data center load today, but lack the tools needed to unlock underutilized grid capacity. Mercury solves that problem, enabling utilities and data centers to turn load flexibility into more power capacity," said Mark Gately, CEO of Mercury Computing. "Partnering with Neuralwatt adds another lever because when a data center can flex its workloads, it becomes a better grid partner, and we can connect it to the grid even faster."
Neuralwatt has been selected to the Winter 2026 Plug and Play Accelerator program and named to the Cleantech Group's Cleantech 50 to Watch. The company has also partnered with Hugging Face to co-develop the AI Energy Benchmarks powering the AI Energy Score, an emerging industry standard for measuring the real-world energy cost of AI.
About Neuralwatt
Neuralwatt is an AI power optimization software company that helps AI infrastructure do more with the power it already has. Founded by Microsoft veterans Chad Gibson and Scott Chamberlin, the company's platform optimizes how AI workloads use energy, increasing compute capacity, reducing costs, and lowering carbon emissions without new hardware or infrastructure changes. Neuralwatt has demonstrated 33 percent compute gains in production environments. For more information, visit neuralwatt.com.