
Some US Electricity Prices are Rising -- But It's Not Just Data Centers

1 week 4 days ago
North Dakota experienced an almost 40% increase in electricity demand "thanks in part to an explosion of data centers," reports the Washington Post. Yet the state saw a 1% drop in its per-kilowatt-hour rates. "A new study from researchers at Lawrence Berkeley National Laboratory and the consulting group Brattle suggests that, counterintuitively, more electricity demand can actually lower prices..."

Between 2019 and 2024, the researchers calculated, states with spikes in electricity demand saw lower prices overall. Demand growth, in other words, was not the main driver of rising rates. Instead, they found that the biggest factors were the cost of poles, wires and other electrical equipment, as well as the cost of safeguarding that infrastructure against future disasters... [T]he largest costs are fixed costs — that is, maintaining the massive system of poles and wires that keeps electricity flowing. That system is getting old and is under increasing pressure from wildfires, hurricanes and other extreme weather. More power customers, therefore, mean more ways to divvy up those fixed costs. "What that means is you can then take some of those fixed infrastructure costs and end up spreading them around more megawatt-hours that are being sold — and that can actually reduce rates for everyone," said Ryan Hledik [principal at Brattle and a member of the research team]...

[T]he new study shows that the costs of operating and installing wind, natural gas, coal and solar generation have been falling over the past 20 years. Since 2005, generation costs have fallen by 35 percent, from $234 billion to $153 billion. But the costs of the huge wires that transmit that power across the grid, and of the poles and wires that deliver it to customers, are skyrocketing. In the past two decades, transmission costs nearly tripled; distribution costs more than doubled. Part of that trend comes from the rising cost of parts: the price of transformers and wires, for example, has far outpaced inflation over the past five years.
At the same time, U.S. utilities have fallen behind on replacing power poles and lines, and are now trying to catch up. According to another report from Brattle, utilities are already spending more than $10 billion a year replacing aging transmission lines. And finally, escalating extreme-weather events are knocking out local lines, forcing utilities to spend big on fixes. Last year, Hurricane Beryl decimated Houston's power grid, requiring months of costly repairs. The threat of wildfires in the West, meanwhile, is pushing utilities to spend billions on burying power lines. According to the Lawrence Berkeley study, about 40 percent of California's electricity price increase over the last five years was due to wildfire-related costs.

Yet the researchers tell the Washington Post that prices could still increase if utilities have to quickly build more infrastructure just to handle data centers. But their point is that "this is a much more nuanced issue than just, 'We have a new data center, so rates will go up.'" As the article points out, "Generous subsidies for rooftop solar also increased rates in certain states, mostly in places such as California and Maine... If customers install rooftop solar panels, demand for electricity shrinks, spreading those fixed costs over a smaller set of consumers."
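The mechanism the researchers describe is simple division: the average retail rate is total cost (largely fixed) spread over the energy sold. A toy sketch with made-up numbers (the fixed and variable costs below are hypothetical, not figures from the study) shows why rising sales can lower the rate while shrinking sales can raise it:

```python
# Toy illustration of fixed-cost spreading (all dollar figures are
# invented for the example, not taken from the Berkeley/Brattle study).

def rate_cents_per_kwh(fixed_cost, variable_cost_per_mwh, mwh_sold):
    """Average retail rate: total annual cost divided by energy sold.

    fixed_cost: annual poles-and-wires cost in dollars (doesn't scale with sales)
    variable_cost_per_mwh: generation cost in dollars per MWh
    mwh_sold: annual sales in MWh
    Returns the rate in cents per kWh.
    """
    total_dollars = fixed_cost + variable_cost_per_mwh * mwh_sold
    return total_dollars / (mwh_sold * 1000) * 100  # $/kWh -> cents/kWh

FIXED = 500_000_000   # hypothetical annual fixed grid cost, $
VARIABLE = 40         # hypothetical generation cost, $/MWh

baseline = rate_cents_per_kwh(FIXED, VARIABLE, 10_000_000)
with_data_center = rate_cents_per_kwh(FIXED, VARIABLE, 14_000_000)  # +40% demand
with_solar = rate_cents_per_kwh(FIXED, VARIABLE, 8_000_000)         # -20% sales

print(f"baseline:    {baseline:.1f} c/kWh")
print(f"+40% demand: {with_data_center:.1f} c/kWh")
print(f"-20% sales:  {with_solar:.1f} c/kWh")
```

Because the fixed cost stays constant while the denominator grows, the data-center scenario yields a lower per-kWh rate than the baseline, and the shrinking-sales scenario a higher one — the same arithmetic behind both the North Dakota result and the rooftop-solar effect described above.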

Read more of this story at Slashdot.

EditorDavid

Does Generative AI Threaten the Open Source Ecosystem?

1 week 4 days ago
"Snippets of proprietary or copyleft reciprocal code can enter AI-generated outputs, contaminating codebases with material that developers can't realistically audit or license properly." That's the warning from Sean O'Brien, who founded the Yale Privacy Lab at Yale Law School.

ZDNet reports: Open source software has always counted on its code being regularly replenished. As part of the process of using it, users modify it to improve it. They add features and help guarantee usability across generations of technology. At the same time, users improve security and patch holes that might put everyone at risk.

But O'Brien says, "When generative AI systems ingest thousands of FOSS projects and regurgitate fragments without any provenance, the cycle of reciprocity collapses. The generated snippet appears originless, stripped of its license, author, and context." This means a downstream developer can't meaningfully comply with reciprocal licensing terms, because the output cuts the human link between coder and code. Even if an engineer suspects that a block of AI-generated code originated under an open source license, there's no feasible way to identify the source project. The training data has been abstracted into billions of statistical weights, the legal equivalent of a black hole.

The result is what O'Brien calls "license amnesia." He says, "Code floats free of its social contract and developers can't give back because they don't know where to send their contributions..."

"Once AI training sets subsume the collective work of decades of open collaboration, the global commons idea, substantiated into repos and code all over the world, risks becoming a nonrenewable resource, mined and never replenished," says O'Brien. "The damage isn't limited to legal uncertainty. If FOSS projects can't rely on the energy and labor of contributors to help them fix and improve their code, let alone patch security issues, fundamentally important components of the software the world relies upon are at risk."

O'Brien says, "The commons was never just about free code. It was about freedom to build together." That freedom, and the critical infrastructure underlying almost all of modern society, is at risk because attribution, ownership, and reciprocity are blurred when AI systems siphon up everything on the Internet and launder it (the money-laundering analogy is apt), obscuring the code's provenance.

Read more of this story at Slashdot.

EditorDavid