
AI Compute Costs Drive Shift To Usage-Based Software Pricing

1 month 1 week ago
The software-as-a-service industry is undergoing a fundamental transformation, abandoning the decades-old "per seat" licensing model in favor of usage-based pricing structures. This shift, Business Insider reports, is primarily driven by the astronomical compute costs associated with new "reasoning" AI models that power modern enterprise software. Unlike traditional generative AI, these reasoning models execute multiple computational loops to check their work -- a process called inference-time compute -- dramatically increasing token usage and operational expenses. OpenAI's o3-high model reportedly consumes 1,000 times more tokens than its predecessor, with a single benchmark response costing approximately $3,500, according to Barclays. Companies including Bolt.new, Vercel, and Monday.com have already implemented usage-based or hybrid pricing models that tie costs directly to AI resource consumption. ServiceNow maintains primarily seat-based pricing but has added usage meters for extreme cases. "When it goes beyond what we can credibly afford, we have to have some kind of meter," ServiceNow CEO Bill McDermott said, while emphasizing that customers "still want seat-based predictability."
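The pricing mechanics described above reduce to simple arithmetic: a usage-based plan bills in proportion to the tokens a customer's AI requests consume, while a hybrid plan layers a usage meter on top of a seat-based fee. The sketch below illustrates both; every rate, allowance, and function name is a hypothetical example for illustration, not any vendor's actual pricing.

```python
# Minimal sketch of usage-based and hybrid pricing tied to token consumption.
# All rates and allowances are hypothetical, not real vendor pricing.

def usage_based_cost(tokens_used: int, rate_per_million_tokens: float) -> float:
    """Bill in direct proportion to the tokens consumed."""
    return tokens_used / 1_000_000 * rate_per_million_tokens

def hybrid_cost(seats: int, seat_price: float,
                tokens_used: int, included_tokens: int,
                overage_rate_per_million: float) -> float:
    """Seat-based base fee plus a usage meter for consumption beyond an allowance."""
    overage_tokens = max(0, tokens_used - included_tokens)
    return seats * seat_price + overage_tokens / 1_000_000 * overage_rate_per_million

if __name__ == "__main__":
    # A reasoning model that burns 1,000x the tokens of its predecessor
    # multiplies a purely usage-based bill by the same factor.
    base_tokens = 50_000
    print(usage_based_cost(base_tokens, rate_per_million_tokens=10.0))          # 0.5
    print(usage_based_cost(base_tokens * 1_000, rate_per_million_tokens=10.0))  # 500.0

    # Hybrid plan: 100 seats at $30 plus metered overage past 50M included tokens.
    print(hybrid_cost(seats=100, seat_price=30.0,
                      tokens_used=80_000_000, included_tokens=50_000_000,
                      overage_rate_per_million=10.0))                           # 3300.0
```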

Read more of this story at Slashdot.

msmash

Even the US Government Says AI Requires Massive Amounts of Water

1 month 1 week ago
A Government Accountability Office report released this week reveals that generative AI systems consume staggering amounts of water, with 250 million daily queries requiring over 1.1 million gallons -- all while companies provide minimal transparency about resource usage. The 47-page analysis [PDF] found that cooling data centers -- which demand between 100 and 1,000 megawatts of power -- accounts for 40% of their energy consumption, a figure expected to rise as global temperatures increase. Water usage varies dramatically by location, with geography significantly affecting both water requirements and carbon emissions. Meta's Llama 3.1 405B model has generated 8,930 metric tons of carbon emissions, compared with Google's Gemma 2 at 1,247.61 metric tons and OpenAI's GPT-3 at 552 metric tons. The report confirms that generative AI searches cost approximately ten times more than standard keyword searches. The GAO also pointed to persistent transparency problems across the industry, noting these systems remain "black boxes" even to their designers.
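As a quick sanity check on the headline figures quoted above, 250 million daily queries consuming over 1.1 million gallons works out to roughly 0.0044 gallons (about 17 mL) per query. The short script below derives that; the per-query figure and the liter conversion are computed here for illustration and do not appear in the report itself.

```python
# Back-of-envelope arithmetic from the GAO figures cited above:
# 250 million daily queries consuming over 1.1 million gallons of water.

DAILY_QUERIES = 250_000_000
DAILY_GALLONS = 1_100_000
LITERS_PER_GALLON = 3.785

gallons_per_query = DAILY_GALLONS / DAILY_QUERIES
milliliters_per_query = gallons_per_query * LITERS_PER_GALLON * 1_000

print(f"{gallons_per_query:.4f} gallons per query")   # ~0.0044
print(f"{milliliters_per_query:.1f} mL per query")    # ~16.7
```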

Read more of this story at Slashdot.

msmash