
Batteries double the cost of solar for a 24-hour storage cycle. If you want to store energy for 48 hours, the cost doubles again. 96 hours? Doubles again. Want to store a mere 10% of US electrical generation from summer to use in winter? Literally 52 trillion (yes, trillion) dollars if you use batteries. A bit expensive, even if you spread the cost over 25 years. Pumped hydro, CAES, and hydrogen stored in salt caverns are much cheaper at that scale.


Only if you're failing to build enough solar. There is zero reason to ramp up storage rather than reduce the need for it. Overproduction is cheaper than any other option and reduces the need for longer-term storage, as daily solar output never drops to zero.

Solar is cheap enough that it's still cost competitive even if you assume significant power production is wasted. Further, battery systems are designed to avoid full discharge; design for 99.5% of the time and you have spare capacity for the last 0.5%.


> daily solar output never drops to zero

True, but it is also almost never 'rated power'. Typically a solar installation will deliver somewhere between 30 and 50% of rated power on average; depending on your latitude it might be even lower. The times when solar panels work at their best are clear winter days, and when they are capable of two-axis adjustment. Because of the cost of such an installation, the typical choice is to install more panels running at lower efficiency.

And you'd need to size your batteries for those worst-case scenarios of an obscured sun, in which case your panels might only output between 5 and 10% of their nominal capacity.

For reference, my 1600W array is outputting 15.9 Watts right now, earlier, at the best time of the day it was making about 300W due to the sky being overcast. If I had to rely on battery power to make up the difference I would not be able to make it on solar alone.

A healthy mix uses both solar and wind power, plus hydro if you have access to it (it is by far the most stable of the three and capable of providing base load power without any tricks).


The economics of residential solar are heavily tied to local conditions. But for grid solar there is little reason to build it in the north rather than just move electricity from the south. We are already below 2c/kWh for solar power in some areas, while the average consumer price is 13.27 cents per kilowatt-hour in the US. That's a lot of wiggle room.

Let's assume 50% overproduction; that's 4c/kWh, backed up by 60h worth of batteries and transmission, and you're still cheaper than nuclear while covering base load and peaking power needs. Further, at national grid scale, 50% overproduction and 60h of batteries is crazy overkill.
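The overproduction arithmetic can be sketched in a few lines (all inputs are this thread's own assumed figures, not vetted data):

```python
# Back-of-envelope for the overproduction argument. Assumed figures from
# the thread: 2 c/kWh generation cost in the cheapest regions, half of all
# output curtailed ("50% overproduction"), 13.27 c/kWh average retail price.
generation_cost = 0.02   # $/kWh produced
useful_fraction = 0.5    # half the output is wasted/curtailed
retail_price = 0.1327    # average US consumer price, $/kWh

delivered_cost = generation_cost / useful_fraction
print(f"cost per delivered kWh: {delivered_cost * 100:.0f} c")            # 4 c
print(f"headroom vs retail: {(retail_price - delivered_cost) * 100:.2f} c")
```

Even throwing away half the power, the delivered cost stays well under the average retail price, which is the "wiggle room" referred to above.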


> But, for grid solar there is little reasons to build it in the north rather than just move electricity from the south.

That's true, and if you're going to ship it, then HVDC lines running east/west can add a couple of hours of overlap as well. All this focus on rooftop solar is nice because it is decentralized, but quite a few parts of a structural solution will not come from rooftop solar but from very large grid-scale installations.


> We are already below 2c/kWh for solar power in some areas

Yes, the power is very cheap exactly when nobody needs it.

> Let's assume 50% overproduction; that's 4c/kWh, backed up by 60h worth of batteries and transmission, and you're still cheaper than nuclear while covering base load and peaking power needs.

Can I see your calculations? Last time I did the math, building enough batteries to store 48 hours worth of US energy use was simply not viable.


Note: as this is crazy overkill, I am ignoring the battery and lithium shortages that would result from actually trying to do this. Any such transition would aim to minimize such costs. This is also not the kind of thing you do in a weekend; long-term pricing is required.

Batteries are predicted to cost $62/kWh, or $62 billion per TWh, in 2030 assuming no supply shortages. We can't build anywhere near that many batteries by then, so it seems like a reasonable baseline long term. Further, by having vastly more batteries than required, they end up with fewer and longer discharge cycles, which extends lifespan. Current systems are also designed for much shorter lifespans because of rapid battery price drops. Based on that and further tech progress, a 20-30 year lifespan seems reasonable; I will use 25. Capacity is also going to drop over time, and installation and maintenance costs are > 0, so I am going to add 10% 'other' and ignore the 7% of US grid electricity generation that is hydroelectric.

US annual electricity usage is ~4,000 TWh; 4,000 TWh / 365.24 days / 24h * 60h ≈ 27.4 TWh of storage, which at $62 billion per TWh is ~$1.7 trillion. That's a lot of money, but the US can easily borrow on that scale for cheap. Further, over 25 years that's 4,000 TWh * 25 = 100,000 TWh of generation for that $1.7 trillion, or ~1.7c/kWh; * 1.1 for 'other' and call it 2c/kWh. Most of this power would of course come directly from solar generation, but having that many batteries is what transforms intermittent solar into peaking power.
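The paragraph above compresses several steps; as a quick script, using only the comment's own assumptions ($62/kWh batteries, 60 h of storage, a 25-year life):

```python
# Reproducing the comment's arithmetic. All inputs are the thread's own
# assumptions, not vetted figures.
annual_use_twh = 4_000                # US electricity, TWh/year
storage_h = 60                        # hours of storage to build
price_per_kwh = 62                    # $/kWh, projected 2030 battery price

storage_twh = annual_use_twh / 365.24 / 24 * storage_h    # ~27.4 TWh
capital = storage_twh * 1e9 * price_per_kwh               # ~$1.7 trillion
lifetime_kwh = annual_use_twh * 25 * 1e9                  # 100,000 TWh, in kWh
cents_per_kwh = capital / lifetime_kwh * 100

print(f"storage: {storage_twh:.1f} TWh, capital: ${capital / 1e12:.2f}T")
print(f"amortized: {cents_per_kwh:.2f} c/kWh "
      f"(x1.1 overhead ≈ {cents_per_kwh * 1.1:.1f} c/kWh)")
```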

Add 4c/kWh for crazy overkill on solar and you're at 6c/kWh to have both base load and peaking power. Though you still need to add significant new transmission, and would have significant losses associated with charging and discharging batteries, etc. Still, add another 2c/kWh for all the costs I am not including and you're still cheaper than nuclear, though not by as much.

PS: At grid scale, wind and hydro further reduce battery storage requirements. Combined with excess, though more reasonable, generation (say 1.5x) and averaging across huge geographic areas, 15h of battery power is probably significant overkill.


> Batteries are predicted to cost 62$/kWh (...)

Who predicts that? The National Renewable Energy Laboratory[1] summarizes the predictions, and the average prediction is $200/kWh in 2030, which is already more than triple your estimate.

> or 62 Billion$ per tWh in 2030 assuming no supply shortages.

It is convenient to assume perfectly elastic supply, yes.

> US annual electricity usage is ~4,000TWh

Why do you consider only electricity? It's only just above a third of US energy use. The US uses about 101 quads[2] of energy annually, which is just under 30,000 TWh. (BTW, the US Energy Information Administration believes that we're using 11,000 TWh of electricity annually, which is again nearly triple your number.)

So, by the numbers of US government agencies, we're at (30,000 TWh / 365 days) * 3 days ≈ 246 TWh, which, again using government estimates, is 246 TWh * $200/kWh, which is about $50 trillion, or 250% of US GDP. So, assuming the lifetime of the battery installation is 20 years, 12.5% of US GDP would need to be spent on maintaining battery infrastructure, in perpetuity. That's already 2.5 times larger than the current energy sector, and that's only the cost of battery construction; we haven't even started talking about generation or grid maintenance!
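The counter-estimate follows the same pattern with different inputs (30,000 TWh/yr total energy, 3 days of storage, $200/kWh, GDP taken as roughly $20T; all as assumed in this comment):

```python
# The opposing battery estimate, using this comment's inputs as given.
annual_use_twh = 30_000                      # total US energy, TWh/year
storage_twh = annual_use_twh / 365 * 3       # 3 days' worth, ~246 TWh
capital = storage_twh * 1e9 * 200            # $200/kWh -> ~$49 trillion
gdp = 20e12                                  # assumed US GDP, ~$20T

print(f"storage: {storage_twh:.0f} TWh, capital: ${capital / 1e12:.0f}T "
      f"({capital / gdp * 100:.0f}% of GDP)")
print(f"over 20 years: {capital / 20 / gdp * 100:.1f}% of GDP per year")
```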

In comparison, the two new US nuclear reactors in Georgia, which are already extremely expensive due to cost overruns (the US cannot build stuff, and you should expect cost overruns for battery plants too), are estimated to cost $23 billion and planned to deliver 2.2 GW, and, unlike renewables, they'll actually produce just as much day and night. 30,000 TWh/year is about 3,500 GW of continuous power, so we'd need about 2,000 nuclear plants like the one in Georgia. 2,000 * $23 billion is $46 trillion; amortized over 50 years, that is less than a trillion a year. But of course, if we could build reactors as cheaply as, say, South Korea can today, we'd only need to spend a fifth of that, so with ~$200B/year of construction costs we could satisfy all our energy use with nuclear, not even needing solar/wind for peaks.
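A rough check of those nuclear figures (assumptions from this comment: $23B per 2.2 GW plant, 30,000 TWh/yr demand; note a strict division gives roughly 1,560 plants, so the 2,000 above includes rounding-up margin):

```python
# Nuclear build-out sanity check, on the comment's own assumptions.
annual_use_twh = 30_000
demand_gw = annual_use_twh * 1e12 / (8760 * 1e9)   # average power, ~3,425 GW
plants = demand_gw / 2.2                           # 2.2 GW each, ~1,557
capital = plants * 23e9                            # $23B per plant

print(f"average demand: {demand_gw:.0f} GW, plants needed: {plants:.0f}")
print(f"capital: ${capital / 1e12:.1f}T, "
      f"per year over 50 y: ${capital / 50 / 1e9:.0f}B")
```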

[1] - https://www.nrel.gov/docs/fy19osti/73222.pdf

[2] - https://www.eia.gov/energyexplained//us-energy-facts/


$23 billion for 2.2 GW is just the building and some equipment; you still need workers, fuel, repairs, decommissioning, etc. Further, nuclear only has a ~90% capacity factor; plants can't operate 24/7/365 for 50 years without turning off. Nuclear also has the opposite problem: it needs to scale up production during the day to cover peak demand, and even more to reach peak seasonal demand. Run the numbers assuming exactly the right amount of power based on an annual average and you're shifting energy across months. At a 90% capacity factor nuclear costs ~12c/kWh; at a 45% capacity factor that jumps to about 24c/kWh, and as peak instantaneous demand is over twice average demand, you would still need batteries.
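The capacity-factor point is just fixed costs spread over fewer kWh; a minimal sketch, assuming the 12 c/kWh at 90% figure above:

```python
# Fixed-cost-dominated generation: cost per kWh scales inversely with
# capacity factor. The 12 c/kWh at a 90% capacity factor is the thread's
# figure, not an official number.
base_cost, base_cf = 0.12, 0.90
costs = {cf: base_cost * base_cf / cf for cf in (0.90, 0.45)}
for cf, cost in costs.items():
    print(f"capacity factor {cf:.0%}: {cost * 100:.0f} c/kWh")
```

Halving the capacity factor doubles the cost per kWh, which is why load-following nuclear is so much more expensive than base load nuclear.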

Also, pretending we suddenly need to replace all sources of energy with PV power that's then stored in batteries is crackpot territory. Part of those 101 quads is fuel used to generate electricity at sub-40% thermal efficiency. We don't need to generate electricity to replace the waste heat from fuel used to generate electricity.

Further, batteries only store electricity, and it's silly to pretend they are part of the loop for jet fuel, which is also part of that 101-quad estimate in these calculations. Hell, even using electricity rather than fuel oil to heat your house uses less energy via a heat pump.

As to prices, I think we are comparing apples to oranges here. If you want to design a system that takes AC power from the grid, converts it to DC for storage in a battery, and then back to AC, you're adding far more than just batteries. But if you're producing PV power on site, that's DC, so it would need its own DC-to-AC converter as well as equipment to connect to the electric grid and regulate production. This all adds up to significantly reduced prices when packaged together. So when I say battery prices are already under $200/MWh, I mean battery prices.

“In March, an analysis of more than 7000 global storage projects by Bloomberg New Energy Finance reported that the cost of utility-scale lithium-ion batteries had fallen by 76% since 2012, and by 35% in just the past 18 months, to $187 per MWh” So even today we are beating your 2030 estimate. https://www.sciencemag.org/news/2019/07/giant-batteries-and-...

PS: The article makes its own estimate of 900GWh of batteries being sufficient for 100% renewable electricity generation, using some other set of assumptions.


> $23 billion for 2.2 GW is just the building; you still need workers, fuel, repairs, decommissioning, etc.

Workers, fuel, repairs, decommissioning etc. are very cheap, much cheaper than for an equivalent coal-fired plant, for example. Levelised cost of electricity, which includes all of these plus construction costs, is still the lowest for nuclear energy, much lower than wind and solar even if you don't build any batteries at all.

> Further, nuclear only has a ~90% capacity factor; plants can't operate 24/7/365 for 50 years without turning off.

??? How is this relevant for the discussion at all? If you have 2000 plants, then with 90% capacity factor, you'll have 10% of plants down at any given time, thus you simply need to have 10% more plants than what you need. The crucial thing is that with nuclear, down time can be scheduled, while with wind and solar, overcast sky and windless days are completely beyond your control.

> Nuclear also has the opposite problem: it needs to scale up production during the day to cover peak demand, and even more to reach peak seasonal demand.

You simply need to build enough for peak seasonal demand. The nuclear plants can scale up and down as needed, albeit slower than gas plants, so you just scale up to cover peaks, and have aluminium plants enjoy cheap electricity when you overshoot.

> Run the numbers assuming exactly the right amount of power based on an annual average and you're shifting energy across months.

My 2000 plants for $200B/year was already 12% above average use, but sure, I could double my numbers to 4000 plants and still be below 10% of US federal government spending. Imagine how big of a boon would such plentiful and cheap off-peak energy be for US heavy industry or data centers.

> Also, pretending we suddenly need to replace all sources of energy with PV power that's then stored in batteries is crackpot territory. Part of those 101 quads is fuel used to generate electricity at sub-40% thermal efficiency. We don't need to generate electricity to replace the waste heat from fuel used to generate electricity.

Sure, but note that the nuclear numbers also scale just as well, so even if the increased efficiency might bring the whole batteries business a bit closer to the realm of the feasible, it will make the nuclear approach even cheaper too. The inefficiencies you mention probably amount to something like 30% of current energy use, so they don't change the calculations substantially: utility-scale batteries are still infeasible.

> PS: The article makes its own estimate of $2.25 trillion for 900GWh of batteries being sufficient for 100% renewable electricity generation, using some other set of assumptions.

Yes, their estimates are 10 times higher than mine. That should make you think really hard about the feasibility of large-scale batteries. For $2.25 trillion, the South Koreans can build 900 GW worth of nuclear generating capacity. Which one do you think is the more sensible choice, one hour of 900 GW from batteries, or continuous 900 GW from nuclear plants, if your goal is "decarbonizing the grid"?


> cheap

Far from it.

“Areva, the French nuclear plant operator, offers that 70% of the cost of a kWh of nuclear electricity is accounted for by the fixed costs from the construction process.” https://en.m.wikipedia.org/wiki/Economics_of_nuclear_power_p...

However, that's including loan financing, which inflates the construction costs further. Similarly, decommissioning might cost $1 billion, but as it takes place so long after construction, it's easy to set money aside which can then compound for decades. https://www.reuters.com/article/idUS178883596820110613

Looking at pure construction costs in a steady-state environment without loans, construction is a significantly smaller fraction, and the operating share continues to increase with age. This is why several viable nuclear power plants have been decommissioned early: their operating costs are too expensive even after construction has been paid off.

> ??? How is this relevant for the discussion at all?

Rather than a 2.2 GW power plant producing 2.2 * 24 * 365 GWh, it gives you 90% as much energy. So if you want 2.2 TW 24/7 you need to build ~1,112 of them, not 1,000. Further, this also multiplies every other cost.
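The ~1,112 figure is just the 90% capacity factor applied to the plant count (1,000 / 0.9 ≈ 1,111.1, rounded up to whole plants):

```python
import math

# 2.2 TW of continuous output from 2.2 GW plants at a 90% capacity factor.
plants_needed = 2.2e12 / (2.2e9 * 0.9)   # ≈ 1111.1
print(math.ceil(plants_needed))          # 1112
```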

> down time can be scheduled

Up to a point; you're talking days of downtime, not hours, so you still need peaking power. The cheapest approach is a steady stream of workers moving smoothly from one project to the next. That's hard to pull off, and you end up needing even more generation to cover scheduling issues and unexpected problems.


> “Areva, the French nuclear plant operator, offers that 70% of the cost of a kWh of nuclear electricity is accounted for by the fixed costs from the construction process.”

Yes, this exactly means that operating costs are very cheap: if 70% of the cost per kWh is amortization of construction costs and only 30% is operating costs, and the kWh from nuclear plants are still the cheapest, that exactly means that the operating costs of running a nuclear power plant are very, very low, much lower than coal plants.

> Looking at pure construction costs in a steady-state environment without loans, it's a significantly smaller fraction, which continues to increase with age. This is why several viable nuclear power plants have been decommissioned early: their operating costs are too expensive even after construction has been paid off.

Which ones? What were their operating costs?

> So if you want 2.2 TW 24/7 you need to build ~1,112 of them not 1,000 of them.

Sure, but as I noted, we can build twice the amount we need rather easily, while building enough utility-scale batteries to ride out the variations in solar/wind production is just infeasible. Of course nuclear isn't perfect, but I thought your alternative was solar/wind, which has all the same problems, but worse: instead of a predictable and controllable 90%, you get 40-50% of installed capacity on average, and it can all be down at times beyond your control.

> Up to a point; you're talking days of downtime, not hours, so you still need peaking power. The cheapest approach is a steady stream of workers moving smoothly from one project to the next. That's hard to pull off, and you end up needing even more generation to cover scheduling issues and unexpected problems.

That’s okay, since we can build twice our peaks and still be firmly in the realm of feasible.

I note, however, that you gave up on the idea of utility-scale batteries being a feasible approach to work around the inherent unpredictability of solar and wind. Good.


> Which ones? What were their operating costs?

“Each nuclear power plant employs 500 to 1,000 workers. Nuclear worker salaries are 20 percent higher on average than those of other electricity generation sources.” $40M/year x 50 years = $2 billion just for wages. https://www.nei.org/advantages/jobs

As these are giant, complex mechanical systems, over time moving components like pumps and turbines need to be replaced. Due to contamination, some of these costs are crazy high.

Insurance is a big one. They need normal insurance for stuff like fires and workplace accidents, plus government subsidized insurance for the rare major accidents that could be horrifically expensive.

While cheap relative to coal, they still need fuel, which adds up to 14% of operating costs. While U3O8 is $68/kg, by the time it's ready to be used the cost increases to $1,390 per kg (https://www.world-nuclear.org/information-library/economic-a...), and a 1GW reactor goes through ~25,000 kg of enriched uranium per year (https://www.nuclear-power.net/nuclear-power-plant/nuclear-fu...). That's $35 million per year and $1.75 billion over 50 years. (As a sanity check: if that's 14% of operating costs, then operating costs are $12.5 billion for a 1GW nuclear power plant over 50 years. If $12.5 billion in operating costs = 30% of total costs, then a 1GW nuclear reactor costs $41.5 billion over 50 years, or $830 million per year; at a 90% capacity factor that's 10.6c/kWh, which is close to the 12.5c/kWh I have seen quoted frequently.) With reprocessing it's only about 3 kg of fuel for 1GW of electrical power per day at 33% thermal efficiency, but without reprocessing they need significantly more fuel. Unfortunately, reprocessing is not currently cost effective.
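The parenthetical sanity check can be reproduced step by step (every figure is quoted from this comment and its links; treat them all as the thread's assumptions, not vetted data):

```python
# Following the fuel-cost chain: fuel -> operating costs -> total costs.
kg_per_year = 25_000       # enriched uranium for a 1 GW reactor, kg/year
cost_per_kg = 1_390        # $/kg for fuel ready to load

fuel_per_year = kg_per_year * cost_per_kg    # ~$35M/year
fuel_50y = fuel_per_year * 50                # ~$1.75B over 50 years
opex_50y = fuel_50y / 0.14                   # fuel assumed 14% of operating cost
total_50y = opex_50y / 0.30                  # opex assumed 30% of total cost
per_year = total_50y / 50
kwh_per_year = 1e6 * 8760 * 0.9              # 1 GW (1e6 kW) at a 90% CF

print(f"fuel: ${fuel_per_year / 1e6:.1f}M/yr, "
      f"total: ${total_50y / 1e9:.1f}B over 50 y")
print(f"≈ {per_year / kwh_per_year * 100:.1f} c/kWh")
```

The script lands at about 10.5 c/kWh, close to the 10.6 figure above (the small gap comes from rounding $34.75M up to $35M).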

They also need the basics like office equipment like computers and supplies, as well as the normal costs associated with buildings like lightbulbs, roof repair, water and sewage etc. Plus the normal taxes etc.

They also need to set aside money for decommissioning.

PS: Digging into those fuel costs was really interesting; it's interesting to see how the $68/kg and 3 kg per day, which I have seen quoted before, gets transformed into a much larger expense.


> “Each nuclear power plant employs 500 to 1,000 workers. Nuclear worker salaries are 20 percent higher on average than those of other electricity generation sources.” $40M/year x 50 years = $2 billion just for wages. https://www.nei.org/advantages/jobs

Yes, if you multiply things by 50, you get large numbers. Note, however, that $40M per year for a plant generating 2200 MW day in, day out (well, let's say 90% of the time) results in $40M / (2200 MW * 0.9 * 1 year) ≈ $0.002 per kWh, that is, 1/5th of one cent per kWh in employee costs. I say that's really cheap.
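That division, written out (assumed: $40M/yr payroll, 2,200 MW at a 90% capacity factor):

```python
# Employee cost per kWh for one large plant, on the thread's assumptions.
payroll = 40e6                           # $/year
kwh_per_year = 2_200_000 * 0.9 * 8760    # 2,200 MW in kW, 90% CF, 8760 h/yr
cost = payroll / kwh_per_year
print(f"{cost * 100:.2f} c/kWh")         # ~0.23 c, about 1/5th of a cent
```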

> As giant complex mechanical systems over time moving components like pumps and turbines need to be replaced.

The same is true for all kinds of power plants. You need to compare the costs across all of them for a sensible comparison. It's fun to make all these calculations by yourself, but fortunately, there's already a metric for that, called the levelised cost of electricity (LCOE). It's basically the total lifetime cost of a facility (construction, operation, and decommissioning) divided by the total amount of energy produced. As you can see in [1], the LCOE of nuclear easily beats solar, and matches wind plants, which suffer from the disadvantage of only producing electricity when the wind blows. If you didn't have the option of base load fossil or nuclear plants, you'd have to build storage for wind and solar, which would make their LCOE jump through the roof.

Look, it's clear that nuclear isn't free, so yeah, you really need to spend a billion on fuel and another on wages over 50 years. More importantly, though, over those 50 years you'll produce $50 billion worth of electricity. Just look at the published LCOEs, and consider how much they'd have to go up for solar & wind if we couldn't fall back on fossils and nuclear.

In the future, if we ever manage to get off fossil fuels, the way we do it will be nuclear for base load along with some amount of wind and solar (though not much, due to environmental concerns that will make it very hard to build significant amounts of wind and solar). As I hope is clear to you, it most definitely will not be wind & solar + batteries, and no nuclear.

[1] - https://www.energy.gov/sites/prod/files/2015/08/f25/LCOE.pdf



