Friday, August 28, 2009

Making a Case for Rationality in [Renewable] Energy Deployment

It has become increasingly evident that there is a serious need for some clear thinking in the energy debate. As I have often said, the sum total of all of the vested interests (utilities, environmentalists, renewable energy advocates, etc.) does not equate to the public interest. In an earlier blog on the Western Wind & Solar Integration Study, I noted that we are becoming increasingly adept at operating the electric grid with greater amounts of intermittent renewable resources. But many proponents are becoming positively irrational about this. It seems that some of the policy makers who are closest to the problem are the farthest from reality. To some I would grant the benefit of the doubt and suggest that they simply know no better. Other proponents would clearly subjugate the public interest to their own economic and/or political self-interest. Unfortunately, the result is the same.

As anyone familiar with this field knows, California's renewable standard will likely soon require major utilities to provide one-third of their energy from renewable resources by 2020. It seems to matter little whether or not this is an attainable goal. I used to think that California was on the right track in requiring that these renewables actually deliver power into the state in order to qualify. But as meeting this ambitious goal becomes increasingly unlikely, the policy makers and stakeholders seem willing to make tradeoffs that will dilute the effectiveness of the goal.

Let's start with an article in the Sacramento Bee describing how the various stakeholders are debating the sourcing requirements for renewable energy to comply with the standard (see Utilities, groups at odds over sources for renewable energy). Utilities are now claiming -- perhaps rightfully so -- that they cannot meet this aggressive standard unless they are allowed to procure resources from out of state, and a California PUC analysis supports this contention. So, if you were in California, which would you find to be the more rational approach: a standard that cannot be met and that encourages utilities to procure out-of-state resources, often at exorbitant cost, or a more reasonable ramp-up in renewables that fosters in-state development at reasonable cost? Would California ratepayers benefit more from sending their utility dollars out of state to essentially purchase RECs, or from keeping that money in state to develop local projects? They may as well propose to get solar energy from a satellite! Oh, sorry, they're proposing that too. And then everyone laments the high contract failure rate in renewable energy.

The lesson seems to be: set a standard that you cannot achieve, then simply change the way you measure compliance to make it appear that you did. This situation has become so bizarre that Colorado's major utility has now hatched a scheme to combine brown power from unspecified sources with its excess RECs and sell the bundle to California utilities as green power to comply with the standard that they cannot otherwise meet. Of course, the Colorado utility in question hopes to profit handsomely from the sale of these RECs, which, incidentally, were already paid for by Colorado ratepayers. What makes this possible from the Colorado utility's perspective is that it has acquired too many RECs, too soon, at too high a cost. This is beginning to look like a zero-sum game. The same amount of renewables will ultimately be developed; the only question is where. Call me old-fashioned, but how about a scenario in which each utility develops its own renewable resources to serve its own local needs?

I mentioned that the Colorado utility had acquired too many RECs, too soon, at too high a cost. I made this comment back in 2007, when I showed that the utility proposed to purchase more solar RECs than it needed for compliance in an environment where solar costs were projected to diminish rapidly, thereby saddling ratepayers with higher costs than they would otherwise have to pay. This prevented the utility from taking advantage of the inevitable cost reductions that were sure to result from technological advances and deployment experience. It is this myopic belief that if some is good, more must be better that is responsible for the irrational behavior afflicting policy makers with regard to renewable energy. As stated in a recent op-ed piece on cap-and-trade by Paul Gerlach in the South Florida Sun-Sentinel, this "preoccupation with setting unrealistic targets for renewable sources has blinded policy makers to the almost unlimited opportunities for technological breakthroughs in the production and use of conventional fuels..." and, I might add, in the deployment of renewable energy technologies.

Tuesday, August 11, 2009

Geoengineering redux - aka Climate Engineering

Scroll down a bit in this blogspace and you'll find a blog I wrote on Geoengineering as a response to climate change (or, just click the link). It is nice to know that this potential fix is drawing increasing attention in legitimate research circles. Today's NY Times contains an article entitled The Earth is Warming? Adjust the Thermostat, which provides an excellent, concise discussion of this issue from both sides. The proponents of such solutions rightfully cite the decreasing likelihood of getting worldwide agreement on emissions reduction strategies and note that it may require affirmative action on the part of a few to avert the crisis. This is the assertion I made in my earlier blog post on this topic. Skeptics point to the problem of unintended consequences that may result from such activities, and they too are correct. Spraying aerosols into the atmosphere to simulate a volcanic winter is likely to have many other adverse effects. Pick your poison.

It is also true that simply adjusting the thermostat, so to speak, won't reduce the amount of CO2 in the atmosphere and thus problems with ocean acidification still remain. That is why I still believe that the best approaches to geoengineering (now referred to in some circles as climate engineering) are those that essentially scrub the atmosphere of CO2 rather than those that simply mask the problem by putting more junk into the atmosphere.

Interested readers should also consult work being done at the Copenhagen Consensus Center, which is publishing a series of perspective and analysis papers on climate change and approaches to adaptation and mitigation. It is an excellent resource for those who wish to be more fully informed on the range of solutions that may exist. In my earlier post, I noted that we should definitely take prudent measures now to avert a calamity in the future. But I also noted that we must begin a serious investigation of the more active approaches to mitigating climate change and its first cousin, ocean acidification, now, so that those technologies may be suitably developed if and when we need them.

Sunday, August 02, 2009

Corporate Rite of Passage -- A Note to Techies

Today, I'd like to offer just a short mention and a recommendation to read an interview in the Sunday NY Times with Cisco CEO John Chambers (see In a Near-Death Event, a Corporate Rite of Passage). In this interview, Mr. Chambers discusses how Cisco's near brush with failure made it a better company. He also discusses how capitalizing on modern modes of communication -- video blogs, distributed communication, and relationship building -- is increasingly important to success in today's world.

Mr. Chambers also mentions something else that I found interesting. In this interview, he notes that "when there’s an accident happening, that’s when you’ve got to be the calmest. And yet that’s when most people are not." This made me think back to a blog I posted on July 15 (see Geoengineering) in which I questioned Eugene Kleiner's assertion that "There is a time when panic is the appropriate response." In contrast, Mr. Chambers goes on to say "So I’ve learned when something with tremendous stress happens, I get very calm, very analytical." Nice to know that I'm not alone in that regard.

There is one other important lesson in the Chambers interview, and that is the ability to admit mistakes and failures. Unfortunately, such candor is anathema in the utility and regulatory environment that I am presently part of, especially with regard to renewable energy and public policy. Apparently it borders on sacrilege to question the all-knowing legislature, Commissioners, and executive branch policy makers. But we're learning as we go, and their credibility would go much farther if they could only admit, "Well, we screwed that one up. Let's fix it and move on." Perhaps I'm hoping for too much.

Saturday, August 01, 2009

Western Wind & Solar Integration Study

On Thursday, July 30, NREL hosted the Western Wind & Solar Integration Study (WWIS) Stakeholder Meeting in Denver. Great attendance (probably 80 to 100 or so) and some really interesting updates on this work that began in 2007. If you're not familiar with this study, you can find more information on the NREL WWIS website. Generally, the goal is to assess the feasibility of adding as much as 35 percent wind and solar generation within the WestConnect footprint (WY, CO, NM, AZ, NV, and northern CA). Relying on some gi-normous datasets (e.g. 30,000 30MW wind sites in the footprint with data every 10 minutes for 3 years), this has been an extraordinarily computationally intensive exercise. Unfortunately the solar data has been considerably more sparse.
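To get a feel for the scale of that wind dataset, here is a rough back-of-the-envelope calculation. The site count and 10-minute interval come from the study description above; the three 365-day years (ignoring leap days) and everything else is just my arithmetic, not a figure from the study.

```python
# Rough size of the WWIS wind dataset described above (illustrative arithmetic only).
sites = 30_000                  # modeled 30 MW wind sites in the footprint
intervals_per_day = 24 * 6      # one sample every 10 minutes
days = 3 * 365                  # three years of data, ignoring leap days

records = sites * intervals_per_day * days
print(f"{records:,} records")   # on the order of 4.7 billion data points
```

Nearly five billion time-series records, before any solar data -- small wonder the exercise has been so computationally intensive.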

We'll talk about a few of the observations from this study -- some predictable and others not so much. The bottom line, however, is that 35 percent renewable energy penetration (30% wind and 5% solar) appears technically feasible, although operation of the electric grid would have to be dramatically different from what it is now. Essentially, instead of the large number of small control areas that presently exist, it would require a larger, more geographically dispersed control area that could balance the intermittent contributions of wind and solar generators. And, as anyone familiar with this field is aware, transmission (or the lack thereof) is an issue. But the extent to which it is an issue appears to be a function of whether we rely on a few megaprojects located in the best wind and solar resource areas or a larger number of smaller dispersed projects (albeit with lower capacity factors) sited throughout the footprint. Hence, there is a trade-off between transmission costs and the higher cost of renewable energy from lower quality resources.
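One toy way to frame that trade-off: compare the annualized cost per delivered MWh of a remote, high-capacity-factor site that needs new transmission against a local, lower-capacity-factor site that does not. Every number below (costs, capacity factors) is hypothetical and chosen purely for illustration; none comes from the study.

```python
HOURS_PER_YEAR = 8760

def delivered_cost_per_mwh(project_cost_per_mw_yr, transmission_per_mw_yr, capacity_factor):
    """Annualized cost per MWh delivered from 1 MW of capacity (hypothetical inputs)."""
    annual_mwh = capacity_factor * HOURS_PER_YEAR
    return (project_cost_per_mw_yr + transmission_per_mw_yr) / annual_mwh

# Remote mega-project: better wind (40% CF) but needs new transmission.
remote = delivered_cost_per_mwh(200_000, 60_000, 0.40)
# Local dispersed project: weaker wind (30% CF) but uses existing wires.
local = delivered_cost_per_mwh(200_000, 0, 0.30)
print(f"remote: ${remote:.0f}/MWh, local: ${local:.0f}/MWh")
```

With these made-up inputs the two options land within a few dollars per MWh of each other, which is exactly why the siting question has no obvious answer: the transmission premium and the resource-quality penalty can largely offset.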

One of the very interesting outcomes from the analyses conducted thus far is the relationship between total load and renewable generation. As the General Electric folks conducting the modeling noted, "The bad actor is the wind." There are times in the spring when the wind is high and there may be more total wind and solar on the system than load. This presents great operational difficulties for system operators. Alternatively, there are times in the early morning when load is ramping up sharply just when the wind is ramping down.

These types of issues highlight the importance of forecasting to system operators. From a market perspective, it was found that with a perfect forecast, increasing renewable penetration drives spot prices lower (since there is no fuel cost). But, with an imperfect forecast, the forecast error drives spot prices back up because operators would commit insufficient capacity and have to turn expensive peaking units back on. Furthermore, it was found that at renewable penetrations exceeding 20 percent, coal units would begin to be impacted. However, the greatest impact was found to be on combined cycle gas plants being backed down. Overall, the value of wind energy rose with a perfect forecast and dropped with an imperfect forecast. Not much of a surprise there, I suppose.
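The forecast-error effect can be sketched with a toy commitment model. The idea -- commit cheap capacity against forecast net load, then cover any shortfall from wind under-delivery with expensive peakers -- follows the logic described above, but all prices and quantities here are hypothetical and the model is my own simplification, not the GE production-cost model.

```python
def dispatch_cost(load, wind_forecast, wind_actual,
                  base_price=30.0, peaker_price=120.0):
    """Toy cost model: commit cheap capacity for (load - forecast wind);
    cover any shortfall from wind under-delivery with peakers.
    All prices in $/MWh are assumed, not from the study."""
    committed = load - wind_forecast                    # MW of conventional capacity committed
    net_load = load - wind_actual                       # MW the conventional fleet must serve
    shortfall = max(0.0, net_load - committed)          # MW that must come from peakers
    served_cheap = min(net_load, committed)
    return served_cheap * base_price + shortfall * peaker_price

perfect  = dispatch_cost(1000, wind_forecast=300, wind_actual=300)
overcast = dispatch_cost(1000, wind_forecast=350, wind_actual=300)  # wind over-forecast
print(perfect, overcast)
```

Even this crude sketch reproduces the qualitative result: with a perfect forecast the hour is served entirely by cheap committed capacity, while over-forecasting the wind leaves a gap that the peakers fill at several times the cost.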

The study also found that total generator revenues fell with increasing renewable penetration (as noted above, spot prices decreased). But the total revenues for nonrenewables fell at a steeper rate for two reasons: they generated less energy and spot prices fell. There were two other particularly interesting outcomes presented. The first concerned the cost of unserved energy and the important role that demand response could play in mitigating this problem. The second was that hydro, and especially pumped storage hydro (as well as other large-scale storage), may play a smaller role than commonly believed.

Researchers found that higher wind penetration also resulted in greater amounts of unserved energy, due largely to over-forecasting of the wind. Discounting the wind forecast reduces the unserved energy and drives spot prices down, because you're carrying more gas -- but at a very high cost. It was found to be far more cost effective to get the load to be responsive rather than have the system make up the shortfall; indeed, a couple of thousand MW of interruptible load was found to be cost effective.
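One way to see why demand response won out is to compare the annual cost of the two remedies for a rare shortfall. The numbers below (shortfall size, event hours, commitment cost, curtailment payment) are all hypothetical assumptions of mine, chosen only to illustrate the structure of the trade-off the study identified.

```python
HOURS_PER_YEAR = 8760

# Cover a 2,000 MW worst-case shortfall that materializes only ~20 hours per year.
shortfall_mw, event_hours = 2000, 20

# Option A: discount the wind forecast, i.e. keep 2,000 MW of extra gas committed
# all year at an assumed $5/MW-h commitment cost.
gas_cost = shortfall_mw * HOURS_PER_YEAR * 5.0

# Option B: contract interruptible load at an assumed $500/MWh curtailment
# payment, paid only for the hours it is actually called.
dr_cost = shortfall_mw * event_hours * 500.0

print(f"carry gas: ${gas_cost/1e6:.0f}M/yr vs interruptible load: ${dr_cost/1e6:.0f}M/yr")
```

The asymmetry is structural: the gas option pays for 8,760 hours of readiness to cover a 20-hour problem, while interruptible load pays a steep price only when the problem actually occurs.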

Lastly, we discussed the operational impacts of increasing renewable penetration on hydro operation. Hydro needed to be scheduled in response to the wind forecast, and increasing wind penetration increased the variation in hydro scheduling. There was found to be a large operating cost increase if you did not shift hydro commitments in response to the wind forecast -- obviously, hydro has the flexibility to move while wind does not.

With regard to pumped hydro, it was found that if you have it, the system will use it, but there was no incentive to add more. This seemed counter to the conventional wisdom so much of the ensuing discussion focused on this topic. The researchers reported that increasing pumped storage increased overall costs. As you increased renewable penetration, the storage ran more. As you forced the storage to run more, it drove costs up. It was found that, even with 30 percent wind penetration, the WestConnect footprint has sufficient pumped storage and no more is needed. Exploring this further, the group concluded that the pumped hydro may be more useful in a smaller footprint. In a larger area, it was preferable to use the system as storage. As part of this discussion, one participant from Ireland noted that studies on their system demonstrated a similar result and that pumped storage was not needed until renewable penetration reached as high as 50 percent. There, it was found that the capacity cost of pumped hydro displaced other capacity costs. But, it only provides capacity if you fill it. Thus, you need to reach the higher wind penetration levels before the pumped storage pays out.

So ended a valuable update to this important research initiative. The day concluded with some discussion of next steps and areas of focus as the project moves forward. One shortcoming, as noted earlier, is the dearth of solar data. To model increasing photovoltaic penetration, the project needs more one-minute PV data. Presently, the only one-minute PV data available to the project is from the 4 MW Springerville project in Arizona. Though there are two larger PV projects currently in operation -- notably Nellis AFB and Alamosa, CO -- this data is apparently not being made available to this study. Why?