Analytics & Measurement
Smarter, Cheaper, Greener: The Efficiency Revolution in AI
For much of its early history, artificial intelligence carried three stubborn labels: expensive, energy-hungry, and complex to use. The most powerful models were locked behind paywalls, the environmental footprint raised eyebrows, and even experienced users struggled to make sense of model choices. That story is changing fast.
12 September 2025
8 min read
We’re now in the middle of what can only be described as an efficiency revolution: one where breakthroughs in cost, energy use, and accessibility are transforming AI from a luxury tool into everyday infrastructure.
The Fall in Costs
Only a year ago, generating text with GPT-4 carried a price tag of around $50 per million tokens. That might sound abstract, but in practice it meant meaningful AI interaction was prohibitively costly at scale.
Fast-forward to GPT-5, and the difference is dramatic. GPT-5 Nano now runs at roughly $0.14 per million tokens. That’s more than a 350x drop in cost, a fall so steep it fundamentally rewrites what’s possible. Suddenly, high-volume use cases like education, customer service, and creative exploration can run at almost negligible cost.
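The scale of that drop is easy to check from the figures above. A back-of-the-envelope sketch (using the article’s cited per-million-token prices, not live API pricing, and a hypothetical 10-billion-token monthly workload):

```python
# Back-of-the-envelope cost comparison using the figures cited above.
gpt4_price = 50.00      # USD per million tokens (GPT-4, roughly a year ago)
gpt5_nano_price = 0.14  # USD per million tokens (GPT-5 Nano)

ratio = gpt4_price / gpt5_nano_price
print(f"Cost reduction: ~{ratio:.0f}x")  # ~357x, i.e. "more than 350x"

# Illustrative high-volume workload: 10 billion tokens per month.
monthly_tokens_millions = 10_000
print(f"GPT-4 era:  ${gpt4_price * monthly_tokens_millions:,.0f}/month")
print(f"GPT-5 Nano: ${gpt5_nano_price * monthly_tokens_millions:,.0f}/month")
```

At those prices, the same workload falls from half a million dollars a month to about fourteen hundred, which is the difference between a budget line item and a rounding error.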
Cleaner, Leaner Prompts
AI has not only become cheaper. It’s become vastly more efficient. Google recently reported a 33-fold improvement in energy efficiency per prompt. Put another way:
A single AI query now consumes just ~0.0003 kWh of electricity.
That’s about the same energy as 10 seconds of Netflix streaming in 2008, or running a Google search from that era.
Even water consumption, which raised early concerns, has been mapped and benchmarked. Current estimates suggest 0.25 to 5 mL of water per prompt, depending on cooling assumptions. While not trivial, the trajectory is clearly downward as hardware and datacenter designs continue to improve.
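The streaming comparison above can be sanity-checked with a quick sketch. The ~0.1 kWh-per-hour figure for 2008-era video streaming is an assumption chosen here purely for illustration, not a number from the article:

```python
# Sanity-check the energy comparison cited above.
kwh_per_prompt = 0.0003  # ~kWh per AI query (the article's figure)

# Assumed: ~0.1 kWh per hour for 2008-era video streaming (illustrative only).
streaming_kwh_per_hour = 0.1
streaming_seconds_equiv = kwh_per_prompt / streaming_kwh_per_hour * 3600
print(f"Equivalent streaming time: ~{streaming_seconds_equiv:.0f} seconds")  # ~11 s

# The reported 33x efficiency gain implies the same prompt once drew roughly:
print(f"Pre-improvement energy per prompt: ~{kwh_per_prompt * 33:.4f} kWh")
```

Under that assumption the maths lands close to the "10 seconds of streaming" claim, and it also shows why the 33x gain matters: the same query would have cost about thirty times that a short while ago.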
Why Efficiency Matters
The knock-on effects of these efficiency gains are enormous:
Lower costs per user mean that free or ad-supported models are sustainable at scale.
Environmental savings reduce one of the loudest criticisms of AI adoption.
Ease of access ensures that AI isn’t just the domain of enterprises or experts—it’s rapidly becoming an everyday utility.
Taken together, these shifts signal a tipping point. AI is moving from rarefied technology to ambient infrastructure, as common and accessible as search or streaming once were.
The Next Chapter
The efficiency revolution doesn’t just make AI smarter, cheaper, and greener; it makes it universal. The barriers that once kept AI exclusive are collapsing. What was once the reserve of researchers, big tech firms, or deep-pocketed businesses is now in the hands of billions.
That universality will shape everything: from how students learn, to how brands connect with consumers, to how we build and measure sustainable technologies.
The question is no longer whether AI can scale. It’s how quickly we can adapt to a world where it already has.