Excuse my ignorance (I am a layperson and simply interested in this subject), but are you suggesting that the utilities themselves would roll out distributed solar to their customers, and that the utilities' engineers are therefore skeptical of the technology and how it would be rolled out?
It seems to me that the whole point is that consumers will begin the rollout themselves, generating and storing power on their own and the only visible effect to the utility would be a dramatic decrease in power use by that customer.
While having a grid where you could sell your unused energy is a great idea, I didn't see that as the focus of this article. I thought this article was simply about the distributed generation and storage.
In the context of distributed generation and storage on an individual level -- why do utility engineers matter? They don't get a say in the rollout, they don't manage it, and they certainly don't get to prevent it. All they can do is maintain their infrastructure in the face of a change they cannot stop.
I was not talking about the completely off-grid case at all. Maybe I misunderstood the article. I was talking in terms of grid-connected solar installs, which I feel just make more sense. What happens if you get a cloudy week? You have no power until the sun shines again... or you have to fire up an inefficient generator, whose price and inefficiency at small scale may in fact negate any savings anyway. It could be my bias showing too... I'll admit that.
However, there are interesting regulatory consequences here. Various levels of government have regulations in place that are designed to protect the consumer, but also put a strain on utilities.

In a very large number of places, the power company is only allowed to charge a certain rate or less. This rate is partly based on building out the infrastructure for all houses, businesses, etc. in the region, and on assuming certain usage values. Utilities are also required to make power available to anyone in those areas, and to provide power under certain conditions regardless of whether the customer pays (e.g. in many places they can't shut off power in the winter, because that could kill someone).

A utility constrained by these rules must figure out how to operate profitably (being a publicly traded company in many cases) within those constraints. So if suddenly everyone starts going off-grid, the utility is still required to sell at a certain rate or lower, still required to be able to provide power to everyone, and being told it can't necessarily charge enough for that service, because a lot of people will just go off-grid.
It is a sticky situation, and until operating models and regulations can be worked out to account for widespread off-grid customers, it is kind of a catch-22 for everyone. I don't want my power to go out because the utility can't afford maintenance because it can't charge me what it needs to for reliable service, because half of my neighbors are off-grid now.
Basically, the question is: if you were under regulation to act a certain way, and those regulations were based on assumptions, wouldn't you be against allowing behavior that breaks those assumptions without a change in your regulatory responsibility?
That sounds like the 60-car train restrictions, the governmental restrictions on closing branch lines, and the rate restrictions in Atlas Shrugged (not looking for an Ayn Rand discussion). So if you happen to have a system with high inertia (stacks of paper weighing it down), any change will break it, whether it's more demand (brownouts/blackouts) or less demand (the financing model collapses).
I think the article is talking about customers who buy their own solar installations. The thing is, no system is available (afaik) that provides 24/7 solar power (batteries add a massive cost to consumer solar installations), so the only way to have power at all times is to sell the surplus solar back to the utility during peak solar hours (which is where the engineering problems come in) and use the grid off-peak.
>batteries add a massive cost to user solar installations
SolarCity will install a bank of Tesla Motors Li-ion batteries for grid backup and load shifting. This would be financed over the course of the loan (typically 20 years), not all at once. http://www.solarcity.com/residential/energy-storage.aspx
I was recently in rural South Sudan, at a place where there are no power lines within 80 miles. So the systems they use must be off the grid by necessity.
There were several PV and battery setups there which worked pretty well. Obviously you need to be careful with the amount of power that you use, but it can run computers, TV and lighting with a reasonable setup.
There was a computer lab at one installation which had one desktop machine acting as a server and about 15 thin clients to keep the power usage low.
That's why I think CSP and thermal storage (via molten salts, KNO3+NaNO3) might work better in the long run (at least in an environment with an atmosphere).
Yes! But in that paradigm, where you generate 80%+ of your needed energy and rely on the grid for the rest, how does that affect the utility?
The only difference to them is that you draw less power than you used to. Sure, it would benefit them to learn your habits so they can manage steady power delivery at scale, but they don't get a say in your solar rollout. They don't get to prevent it. They don't get to be skeptical (or if they are, they can't act on it).
The person I'm replying to seemed to say that the engineers at utilities are skeptical of this technology and honestly I don't understand how they're relevant, since utilities aren't rolling out the technology, maintaining the technology, etc.
> The only difference to them is that you draw less power than you used to.
That is not the only difference; the pattern of your load changes too.
With 100% grid-supplied power, changes in load are driven by slow-moving, predictable systems like sunrise/sunset, weather, and seasons.
If you are running your own solar array, though, you will vary your grid load on much shorter time scales unless you invest in a big battery pack to smooth out the variations in insolation from clouds and storms. And even the normal daily variation will be stronger, since when it gets dark you'll not only be increasing your load (turning on lights, TV, cooking, etc), you'll also be losing your local generation.
The current electric grid is not built to handle such large changes in load on such short timeframes.
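A toy sketch of that effect (made-up illustrative numbers, nothing from a real utility): compare the biggest hour-over-hour swing the grid sees when household demand stands alone versus when rooftop PV is netted out of it.

```python
# Toy model: 24 hourly values of household demand and rooftop PV output (kW).
# Illustrative numbers only, chosen to show the sunset effect described above.

demand = [0.6, 0.5, 0.5, 0.5, 0.6, 0.8, 1.0, 1.2,   # 00:00-07:00
          1.0, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 1.0,   # 08:00-15:00
          1.3, 1.8, 2.0, 1.9, 1.6, 1.2, 0.9, 0.7]   # 16:00-23:00

solar = [0.0] * 6 + [0.2, 0.8, 1.4, 1.8, 2.0, 2.1,  # ramps up through morning
                     2.0, 1.8, 1.4, 0.8, 0.2] + [0.0] * 7  # gone by ~17:00

net = [d - s for d, s in zip(demand, solar)]  # load the grid actually sees

# Biggest hour-over-hour swing, with and without local solar:
ramp_no_pv = max(abs(demand[h] - demand[h - 1]) for h in range(1, 24))
ramp_pv = max(abs(net[h] - net[h - 1]) for h in range(1, 24))

print(f"max hourly ramp without PV: {ramp_no_pv:.1f} kW")
print(f"max hourly ramp with PV:    {ramp_pv:.1f} kW")
```

Even in this crude model, the evening transition (losing generation while gaining load) produces a steeper ramp than demand alone ever does.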
> With 100% grid-supplied power, changes in load are driven by slow-moving, predictable systems like sunrise/sunset, weather, and seasons.
> If you are running your own solar array, though, you will vary your grid load on much shorter time scales
I'm not so sure that these latter variations are significantly less predictable than what you mention in the former paragraph. Isn't the output of a solar array dependent on the insolation? Isn't the insolation dependent on the cloud cover? Can't you predict the cloud cover in any single place simply by taking advantage of real-time meteo satellite data? A similar feedback could be established for wind power. Given enough data, I'm reasonably certain that models could be established that would allow you to predict how the solar and wind power generation distribution is going to change in the next hour(s) so that you could prepare for it.
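As a sketch of what such a model could look like (purely illustrative: the half-sine clear-sky curve and the cloud-cover discount factor are assumptions I made up, not a fitted model):

```python
import math

def clear_sky_output(hour, peak_kw=5.0):
    """Idealized clear-sky PV output: a half-sine between 06:00 and 18:00."""
    if hour < 6 or hour > 18:
        return 0.0
    return peak_kw * math.sin(math.pi * (hour - 6) / 12)

def forecast_pv(hour, cloud_cover):
    """Very rough forecast: scale clear-sky output down by cloud cover (0..1).
    A real model would fit this relationship to historical satellite data."""
    return clear_sky_output(hour) * (1.0 - 0.75 * cloud_cover)

# Noon, clear sky vs. heavy overcast:
print(forecast_pv(12, 0.0))  # full clear-sky output
print(forecast_pv(12, 0.9))  # sharply reduced under cloud
```

The real work, of course, is in the data pipeline and the fitted coefficients, not in this skeleton.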
The problem isn't the prediction; aggregated over a large area, full solar is probably only slightly more volatile than the current grid. Maybe not hour by hour, but the companies would still be able to work out the supply side.
The issue is that the supply side and supporting infrastructure is built for an entirely different system where power leaves the power plants and goes through the grid to consumers. Now people are adding solar panels which generate a ton of electricity during the day (when everyone is largely at work/school) that must then be fed back into the grid, opposite the direction of normal flow.
But what if you generate 110% of your electricity needs? You would push out more power than is coming in, sometimes literally running the meter backwards.
In the US it depends on where you are and who your power company is. Sometimes they do it the way you describe in NZ, but in many areas it is more the way the OP describes: there is one meter, and sometimes it literally runs backwards; only if there is an excess at the end of the month is it credited at the lower rate.
Are we saying that there is no way to prevent power from going backwards back to the utility?
I figured what you generated went directly to your storage system, and you drew from your storage system OR the grid, or some third party intelligently drew from the grid or your system as necessary.
Is it functionally impossible to have solar panels that don't "run the meter backwards?" Because I can see how that would threaten the infrastructure that wasn't designed for it.
I mean, I see places like Apple generating all their electricity on location for their new planned office and using the grid as backup. Is Apple sending their excess power back to the utility? Totally different scale, I'm well aware, but just curious.
Yes, you can configure inverters to not dump excess current back to the utility. You either charge batteries or you dump the power into a load like an electric water heater.
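The routing decision described above can be sketched as a simple priority rule (toy example: a real inverter does this continuously in firmware, and `route_surplus` is a name I made up for illustration):

```python
def route_surplus(generation_kw, load_kw, battery_soc, soc_max=1.0):
    """Decide where PV power goes when backfeeding to the grid is disabled.
    battery_soc is state of charge as a fraction (0..1). Toy logic only."""
    surplus = generation_kw - load_kw
    if surplus <= 0:
        return "grid_import"     # PV can't cover the load; draw the rest from the grid
    if battery_soc < soc_max:
        return "charge_battery"  # soak up the surplus in the battery bank
    return "dump_load"           # battery full: divert to e.g. an electric water heater

print(route_surplus(2.0, 3.5, 0.4))  # cloudy moment, heavy load
print(route_surplus(5.0, 1.0, 0.4))  # sunny, light load, battery has room
print(route_surplus(5.0, 1.0, 1.0))  # sunny, light load, battery full
```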
In almost all grid-tied scenarios, though, you want to sell the power back to the utility. I'm not familiar with commercial operations, as they fall under power purchase agreements (PPAs), but with residential installations your meter quite literally does spin backwards, or a separate meter is used to determine how much power you've sent back to the grid. Which setup you get depends on how the utility compensates you for the power you generate: the meter spins backwards if the credit equals the retail rate you purchase at, and a separate meter is used if the price paid to you differs from the retail rate.
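The two compensation schemes come out differently on the bill. A minimal sketch (hypothetical rates; `monthly_bill` is a made-up helper for illustration):

```python
def monthly_bill(import_kwh, export_kwh, retail_rate, export_rate=None):
    """Toy net-metering bill. If export_rate is None, exports offset imports
    one-for-one (the 'meter spins backwards' case); otherwise exports are
    credited separately at their own, usually lower, rate."""
    if export_rate is None:
        return (import_kwh - export_kwh) * retail_rate
    return import_kwh * retail_rate - export_kwh * export_rate

# 600 kWh imported, 400 kWh exported, $0.12/kWh retail:
print(monthly_bill(600, 400, 0.12))        # meter-spins-backwards scheme
print(monthly_bill(600, 400, 0.12, 0.05))  # separate meter, lower export rate
```

Same physical flows, noticeably different bills, which is why the metering arrangement matters so much to the homeowner's payback math.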
As the article states, we don't have the issues in the US that Germany has yet, because Germany produces so much more solar energy than the US. At some point, though, questions will need to be answered about who is going to pay for the spinning capacity (likely natural gas generators) that isn't used except for those rare times when the wind isn't blowing, the sun isn't shining, and you can't drag enough power in from another geographic region over HVDC transmission lines.
> who is going to pay for the spinning capacity (likely natural gas generators) that isn't used
That doesn't sound like a bad problem to have.
Fukushima showed us how hard it is to turn a nuke on and off quickly, but I would think most hydrocarbon-burning and other generation systems could be throttled according to demand.
If you really have a problem with too much energy, well, smelt some aluminum or electrolyze water or something.
"Spinning Reserve is the on-line reserve capacity that is synchronized to the grid system and ready to meet electric demand within 10 minutes of a dispatch instruction by the ISO. Spinning Reserve is needed to maintain system frequency stability during emergency operating conditions and unforeseen load swings."
Correct. You're not paying for that field of gas turbines to run. You're paying for them to sit in the field warm until issued to run, because if they're not there, hello blackouts.
Spinning reserve ought to be replaced with batteries. I bet some of it could be replaced with batteries today with the utilities' customers saving money, aside from the sunk costs then failing to provide the proper return on capital to their investors.
LOL talk to an EE about the orders of magnitude involved, you're not going to like the answers.
The cheapest storage battery is simple lead-acid: figure a quarter per watt-hour installed, i.e. $250 per kWh. But at 100% charge/discharge depth the battery will be dead and need replacing in about 10 or so cycles. To get up to 1000 or so cycles (which is only ~3 years of daily cycling) means you only get to use about 10% of the capacity. Let's round up, because rectifier/inverter gear, buildings, and operators and their stuff are not free. So you can guess about $3/watt of storage as an absolute best case, but more realistically a turnkey battery storage facility would cost more like $4 and would depreciate fully in about half a decade.
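A quick back-of-envelope check of those numbers (taking the figures above as given; these are the comment's assumptions, not vetted prices):

```python
# Lead-acid cost per *usable* watt-hour, given the depth-of-discharge limit above.
installed_cost_per_wh = 0.25   # $/Wh installed ("a quarter per watt-hour")
usable_fraction = 0.10         # ~10% depth of discharge to reach ~1000 cycles

cost_per_usable_wh = installed_cost_per_wh / usable_fraction
print(f"${cost_per_usable_wh:.2f} per usable Wh")  # before inverters, buildings, staff
```

Rounding that $2.50 up for balance-of-system costs is where the $3-4 "best case" figure comes from.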
Hydro turnkey is about $1/watt, plus or minus massive corruption (what is the dollar value of the Hetch Hetchy Valley in Yosemite National Park, etc.?). Coal plants sell turnkey for a bit over $2/watt; natgas is arguably the most expensive, around $6/watt. All of those last around 50 years, so divide those costs by about 10 to compare with a battery bank that lasts at best 5 years.
Spinning capacity (well, sorta, in the case of natgas) is around 10 times cheaper per kWh than batteries. You have to remember that power companies don't really care about pushing an agenda, more or less. There is no conspiracy; they just want to sell kWh. If they could install a battery bank instead of a natgas peaking plant and keep huge profits, they most certainly would.
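The "divide by about 10" comparison can be made explicit by amortizing each turnkey cost over its service life (all figures taken from the comments above, so treat them as the commenter's assumptions):

```python
# (turnkey $/W, service life in years) per the figures quoted above
plants = {
    "hydro":   (1.0, 50),
    "coal":    (2.0, 50),
    "natgas":  (6.0, 50),
    "battery": (3.5, 5),   # midpoint of the $3-4/W estimate, ~5 year life
}

amortized = {name: cost / life for name, (cost, life) in plants.items()}
for name, dollars_per_watt_year in sorted(amortized.items(), key=lambda kv: kv[1]):
    print(f"{name:8s} ${dollars_per_watt_year:.3f} per watt-year")
```

On these inputs the battery bank comes out several times more expensive per watt-year than even the priciest conventional plant, which is the crux of the argument.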
There are some interesting math problems too. If each installed utility-grade battery costs $2500/kWh and the total worldwide battery industry is about $50B, that means if we abandon all other forms of battery use in the world and get rid of all laptops, cellphones, etc., we could build nothing but lead-acid batteries at a pitiful rate of ... drum roll ... 20 megawatt-hours of utility-grade storage per year. And since the batteries are scrap in 5 years, even a Manhattan-style worldwide project that focused the entire industry on utility-grade storage could never store more than 100 megawatt-hours worldwide. Assuming a daily charge/discharge cycle, that is about 10 MW continuous, or about the capacity of ONE small gas turbine system. So it's not as simple as going down to Batteries Plus and picking up a battery large enough to UPS a nuke plant.
It's about 5 megawatt-hours (varies somewhat based on draw; it could stretch to 6.5 MWh for a 15-minute runtime, less at higher draws).
Cost $35 million, so ~$7,000 per kWh of capacity.
Planning authorized in 1993, online in 2003.
Expected battery life of 20 years (maybe 30).
So a project that didn't seem to take up the entire manufacturing capacity of the battery industry managed to bring up 0.5 megawatt-hours per year, with a lifetime of 20 years: half of your predicted production capacity, though at a much higher cost than your starting figure.
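For what it's worth, the quoted figures are internally consistent (numbers taken straight from the list above):

```python
# Sanity-checking the battery installation's figures as quoted.
cost_dollars = 35e6
capacity_mwh = 5.0
stretch_mwh = 6.5
runtime_hours = 0.25            # 15 minutes

cost_per_kwh = cost_dollars / (capacity_mwh * 1000)
peak_draw_mw = stretch_mwh / runtime_hours
build_years = 2003 - 1993       # authorized 1993, online 2003
mwh_per_year = capacity_mwh / build_years

print(cost_per_kwh)   # 7000.0  -> matches the ~$7,000/kWh quoted
print(peak_draw_mw)   # 26.0    -> implied draw at the 15-minute rating
print(mwh_per_year)   # 0.5     -> capacity brought online per project-year
```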
"An economically and ecologically more viable alternative to ‘spinning reserve’ – gas turbines kept running in case of an emergency – is battery back-up."
Do you have any idea how big and environmentally unfriendly those batteries would have to be? Ready reserve is generally provided by peaking generators (the aforementioned natural gas turbines) and by hydro dams.
There aren't many batteries that can provide hundreds of megawatts for hours at a time.