Thursday, January 12, 2012
Consumers go 2-0 in recent decisions at Colorado PUC
Back after a hiatus: A Chat About RPS Costs and an Unconventional Solar Array
Wednesday, July 21, 2010
Why Feed-In Tariffs are Not FiT for Colorado
In contrast, the renewable portfolio standard (RPS) approach adopted by many US states offers an alternative mechanism for supporting renewable energy capacity expansion that is better suited to our regulatory structure. Through market-based REC prices, it also does a better job of recognizing the economic burden that high-cost renewables place on ratepayers. An in-depth comparison of the two incentive structures would take more space than we have available here, but suffice it to say that some of the important differences concern ratepayer impact, the difficulty of developing a tariff structure that incentivizes generation without overcompensating developers, market responsiveness (or lack thereof), consideration of cost reductions due to technological advance, and legal restrictions.
In the summer and fall of 2009, the Colorado PUC conducted a comprehensive survey and analysis of existing and proposed feed-in tariffs around the world. The study concluded that FiTs were problematic for the reasons expressed above. It also questioned whether a FiT would prove superior to the very successful renewable and solar programs that Colorado has implemented under its RPS. That study, entitled The Application of Feed-In Tariffs and Other Incentives to Promote Renewable Energy in Colorado, can be downloaded from the PUC website. Our investigation into FiTs continued this summer with our own internal analysis of the legality of statewide FiTs based on FERC preemption concerns and PURPA restrictions. That analysis concluded that there are very limited circumstances under which a statewide FiT may be implemented by a state regulatory body or legislature. And some of those circumstances are even more problematic in Colorado due to the Colorado Taxpayer Bill of Rights, commonly known as the TABOR amendment. I should point out that individual utilities, as well as coops and muni utilities, are free to implement a FiT as they desire. What is at issue here is whether a state body may order a FiT for any class of utilities. Our assessment of a state's limited authority to implement a FiT was, coincidentally, affirmed by a recent FERC decision arising from a California PUC case. A fact sheet describing the essential elements of that case is available here.
Perhaps more important than the legality of FiTs is whether or not they are the most effective and efficient way of fostering renewable energy generation. It is clear to me that they are not. Aside from the concerns with ratepayer impact and market responsiveness already discussed, markets in which they have been implemented have been subjected to boom-and-bust cycles (the highly touted Gainesville, FL program is a prime example). FiTs are simply not the way to develop a sustainable renewable industry. From a policy perspective, one must determine whether the goal of an incentive program is to guarantee a rate of return to a generator (as with FiT programs) or simply to compensate renewable generators for the above-market costs of doing the right thing. I would argue that the latter approach is more considerate of the additional costs being shouldered by ratepayers.
It is also clear that the guaranteed rate of return provided by a FiT fails to motivate the technological advance needed to bring renewables to grid parity and economic sustainability. And I am not alone in this belief. Tech entrepreneur and Sun Microsystems cofounder Vinod Khosla writes:
"Every time there is a carve-out for some technology or deployment method, a market is being warped, and suddenly the chosen technology doesn’t need to compete and minimize cost in order to ‘win’ (case in point, solar feed-in tariffs in Europe, and more recently, Oregon). Consumers lose and excluded technology development slows down.” (Greentech Media, 16Jul2010)
Dr. Petri Konttinen also writes that FiTs are “successful in the short term but risk creating false and unsustainable markets vulnerable to speculators.” (07Jun2010)
The bottom line of this message is that you cannot build a sustainable energy infrastructure on a foundation of unsustainable economics.
Unfortunately, proponents of FiTs seem totally unconcerned with any of the difficulties I've discussed. I was recently invited to participate in a panel discussion of feed-in tariffs to be held in Boulder, Colorado on July 22, organized by a California-based advocacy group called the FIT Coalition. Though initially reluctant, I was convinced to participate by the panel moderator because he felt that I could bring some balance to the discussion, particularly with regard to the regulatory and legal considerations. It turns out that my initial reluctance was well founded, as the workshop organizers ordered the moderator to "uninvite" me because they did not want an opposing viewpoint. The FiT intelligentsia, it seems, can be every bit as despotic as the antirenewable orthodoxy they condemn. Whether they simply fear alternative viewpoints or are defending a pecuniary interest, I'm not sure. Probably both. So, given that I won't have an opportunity to give the brief presentation I developed, I am posting the slide presentation here for you to view. Of course, you won't have the benefit of the comments that expand on the bullet points, but for the most part you can probably read between the lines.
Colorado's Tiered Electric Rates
But, truthfully, this is a serious topic, and those who complain about the tiered rate schedule are not all wrong, even though some may not artfully express their dissatisfaction. I agree with the sentiment that those who impose the greatest load during peak times should perhaps shoulder the greater incremental cost. But if the average use is, as Xcel noted, 657 kWh per month, then the 500 kWh threshold at which the higher rate goes into effect seems exceedingly low. Does that imply that the average user is wasteful? What about the unfortunate person who is on an oxygen generator that by itself consumes over 300 kWh per month? Or the person who installed a ground-source heat pump in the name of efficiency and is now being penalized because the electric pumps for those systems run much more often? And those respondents who noted that Denver water users were so successful in heeding the Water Board's call for conservation that rates had to increase because revenues fell so drastically expressed a very valid concern. So much for the argument about efficiency saving consumers money. The same could easily happen here.
From a societal standpoint, I am not convinced that average rates (what we have traditionally had) are not in the best interests of the community as a whole. After all, we're not talking about a luxury here but a necessity. And that is why the monopolist is regulated. If you want to place a luxury tax on the commodity, then increase the threshold for the more expensive tier to something that is truly reflective of luxury use rather than necessary use. Moreover, Xcel's two-tier rates for June-September don't seem particularly targeted at peak loads, just aggregate demand by individual users. If we want to fiddle with rates to promote energy conservation, then time-of-use (TOU) rates would be better than a naive two-tier rate that does not provide adequate discrimination between uses. TOU rates would at least allow many of the consumers who have a valid use to shift it to lower-cost times. We need a tool that can be applied with more precision than a sledgehammer.
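To see just how blunt an instrument a two-tier rate is, here is a quick sketch of the arithmetic. The 500 kWh threshold and the 657 kWh average usage come from the discussion above; the cents-per-kWh figures are hypothetical placeholders chosen for illustration, not Xcel's actual tariff.

```python
# Sketch: a two-tier summer energy charge vs. a traditional flat rate.
# Threshold and average usage are from the discussion above; all $/kWh
# rates are hypothetical, NOT Xcel's actual tariff.

TIER_THRESHOLD_KWH = 500
TIER1_RATE = 0.046   # $/kWh below the threshold (hypothetical)
TIER2_RATE = 0.090   # $/kWh above the threshold (hypothetical)
FLAT_RATE = 0.055    # $/kWh traditional flat average rate (hypothetical)

def tiered_bill(kwh: float) -> float:
    """Energy charge under the two-tier schedule."""
    tier1 = min(kwh, TIER_THRESHOLD_KWH)
    tier2 = max(kwh - TIER_THRESHOLD_KWH, 0.0)
    return tier1 * TIER1_RATE + tier2 * TIER2_RATE

def flat_bill(kwh: float) -> float:
    """Energy charge under a flat average rate."""
    return kwh * FLAT_RATE

# With a 500 kWh threshold, the "average" 657 kWh customer is already
# paying the penalty rate on 157 kWh each month -- the tier hits
# ordinary use, not just luxury use.
avg_kwh = 657
print(f"tiered: ${tiered_bill(avg_kwh):.2f}  flat: ${flat_bill(avg_kwh):.2f}")
```

Note that the tier penalizes total monthly volume regardless of when the energy is used, which is exactly the objection above: a TOU rate would instead price the hour of use, letting a customer with a heat pump or medical equipment shift or at least not be punished for off-peak consumption.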
Of course, TOU rates would require new, so-called "smart meters," which Xcel would be only too happy to foist on ratepayers. But the problem there is that the PUC and Colorado legislature seem only too happy to jump into palliative, partial solutions without considering all of their ramifications. While the PUC does have a couple of "investigatory dockets" open into matters such as data privacy and cybersecurity, one would think that it would address those issues before entertaining utility applications for cost recovery of its Boulder Smart Grid City experiment.
It seems a particularly perverse notion of technological innovation in which the goal is to increase the end-user cost of a necessity. If the computer industry had adopted this model of innovation, you'd still be reading your morning paper on newsprint, and blogs would be handed out on street corners as they were in colonial days. Rather than coercing users to use less of what has been called the "master resource," shouldn't the focus be on making its production more efficient, with less environmental impact, AND at lower cost?
Thursday, December 10, 2009
It's Just Good Engineering....
One of the most cogent arguments that I have seen in favor of dealing with climate change was recently published in a New York Times op-ed column by Thomas Friedman entitled Going Cheney on Climate. Frankly, he could have made the point about the need to plan for low probability, high impact events quite well sans the Cheney analogy, but I suppose it made for a good headline. Aside from the fact that it is simply prudent to plan for such events, he notes that there are several benefits that society would realize from transitioning to clean energy, not the least of which would be greater innovation and energy independence.
But, beyond that, I look at it another way. My engineering training tells me that it is simply good engineering to make the most efficient use of the inputs to production. Why exhaust depletable resources for energy production when there are nondepletable alternatives available? What I'm suggesting here is that there may be a higher economic use of petroleum than setting fire to it (such as creating plastics and other high tech materials). This all comes back to the multiple reasons for promoting renewables and energy efficiency that seem to have been lost in the global warming debate: technological advance, economic development, conservation of scarce resources (including water), etc. This is not just about CO2, or shouldn't be.
In my last column, I noted that climate change was not the only serious issue facing society, and that is still true. But it doesn't mean that we don't begin to work toward dealing with it, and whether or not there will be a catastrophe in 2050 or whenever is beside the point. We should be good stewards of the only planet we have to live on. That is just good engineering. But, in terms of the contradictory evidence, it is far from clear to me that there are not other natural events beyond our control that may suddenly raise or lower the temperature of the earth more than our actions. For instance, it has been theorized since the mid-1960s that sudden small changes in the orbit and tilt of the earth were the principal cause of successive periods of glacial advance and retreat. If true, all of our machinations to strictly maintain the temperature of the earth at the levels known in the short time span of recorded history may be for naught. Perhaps adaptation would be a better strategy. Nonetheless, logic and good engineering should tell us that we cannot significantly alter the composition of the atmosphere without some impact. And, while it seems clear that human activities have resulted in a dramatic increase in CO2 within a relatively short time span, the presumed correlation between CO2 and temperature and the notion that it is irreversible is less clear. The fact that one of Kevin Trenberth's emails in the climate controversy noted that it is a "travesty that we can't account for the lack of warming at the moment" illustrates that point.
I have written previously about the religious fervor that permeates the debate on global warming, renewable energy development, and related issues. What I expect from scientists is that they will pursue new knowledge regardless of the direction it leads them. When scientific skepticism and inquiry is replaced by advocacy in pursuit of a largely political agenda, that is when scientists lose credibility. And, we cannot afford that because there are more than enough advocates of one agenda or another espousing their beliefs and publishing studies in support of their position.
One final thought about the continuing discussion of cap and trade and the supposed incentive that it provides as a mechanism to reduce greenhouse gases. This approach merely promises to enrich those that trade in allowances and offsets and the inevitable derivatives that some smart guys will devise to take advantage of the opportunities. In another op-ed piece in the New York Times entitled Cap and Fade, James Hansen identifies the problem well:
"Because cap and trade is enforced through the selling and trading of permits, it actually perpetuates the pollution it is supposed to eliminate. If every polluter's emissions fell below the incrementally lowered cap, then the price of pollution credits would collapse and the economic rationale to keep reducing pollution would disappear."
Let's come back to the basics of regulation. If greenhouse gases are to be considered a criteria pollutant, simply tax or restrict their emissions. Period. And, if you think that offsets are a helpful component of cap-and-trade schemes, you may be interested in a somewhat tongue-in-cheek website on this topic called Cheat Neutral.
So, in summary, the science is not settled and the proposed market-based solutions are a wrong-headed approach to dealing with the issue. It comes down to just doing the right thing, and that is just good engineering.
Friday, December 04, 2009
Some additional thoughts on renewable energy and rationality
Transmission is another area that has succumbed to a dogmatic theology. The argument that transmission is a limiting factor on renewable generation is simply a red herring. Virtually all states have some renewable energy potential, whether it be wind or solar or biomass, etc. -- some in more than sufficient quantities. But do we really need to power the entire U.S. from a 75-mile-square portion of the Mojave Desert or from mega wind farms in North Dakota? We need to analyze the trade-off between the cost of high-capacity-factor resources that depend on transmission to move the energy and lower-capacity-factor local resources that don't require it. Moreover, we are now beginning to see resistance from some environmental groups and landowners who do not believe that high-voltage transmission lines are necessarily the highest use of pristine land.
But, while distributed generation (DG) has its place, it too is not the solution to all of our energy needs. There are tremendous opportunities for the creative and more efficient use of space to deploy renewables -- unused rooftops and awnings over parking lots, to name just two. And these don't need to be unattractive if designed well. But the notion that we must deploy black panels along every highway right-of-way or wherever there are a few square feet of vacant land is also nonsense. The line of thought that if some is good, more is better has unfortunately become the canon of the solar theocracy. I am looking forward to more creative uses of thin films and new technologies in building-integrated PV (BIPV) designs.
If nothing else, consolidation in the solar industry, the turmoil over Climategate, and debates over transmission lines and land use promise to return some balance and rationality to discussions of energy and climate change. That will be a welcome change.
Friday, August 28, 2009
Making a Case for Rationality in [Renewable] Energy Deployment
It has become increasingly evident that there is a serious need for some clear thinking in the energy debate. As I have often said, the sum total of all of the vested interests (utilities, environmentalists, renewable energy advocates, etc., etc.) does not equate to the public interest. In an earlier blog on the Western Wind & Solar Integration Study, I noted that we are becoming increasingly adept at operating the electric grid with greater amounts of intermittent renewable resources. But there are many proponents who are becoming positively irrational about this. It seems that some of the policy makers who are closest to the problem are the farthest from reality. To some I would grant the benefit of the doubt and suggest that they simply know no better. Other proponents would clearly subjugate the public interest to their own economic and/or political self interest. Unfortunately, the result is the same.
As anyone familiar with this field knows, California's renewable standard will likely soon require major utilities to provide one-third of their energy from renewable resources by 2020. It seems to matter little whether or not this is an attainable goal. And, I used to think that California was on the right track in requiring that these renewables actually deliver renewable power into the state in order to qualify. But as the likelihood of meeting this ambitious goal becomes increasingly problematic, the policy makers and stakeholders seem willing to make tradeoffs that will dilute the effectiveness of the goal.
Let's start with an article in the Sacramento Bee describing how the various stakeholders are debating the sourcing requirements for renewable energy to comply with the standard (see Utilities, groups at odds over sources for renewable energy). Utilities are now claiming -- perhaps rightfully so -- that they cannot meet this aggressive standard unless they are allowed to procure resources from out of state, and a California PUC analysis supports this contention. So, if you were in California, which would you find to be the more rational approach: a standard that cannot be met and which encourages utilities to procure out-of-state resources, often at exorbitant costs, or a more reasonable ramp-up in renewables that fosters in-state development at reasonable cost? Would California ratepayers benefit more from sending their utility dollars out of state to essentially purchase RECs or keep that money in state to develop local projects? They may as well propose to get solar energy from a satellite! Oh, sorry, they're proposing that too. And then everyone laments the high contract failure rate in renewable energy.
The lesson seems to be: Set a standard that you cannot achieve and then simply change the way you measure compliance to make it appear that you did. This situation has become so bizarre that Colorado's major utility has now hatched a scheme to combine brown power from unspecified sources with its excess RECs and sell the bundle to California utilities as green power to comply with the standard that they cannot otherwise meet. Of course, the Colorado utility in question hopes to profit handsomely from the sale of these RECs which, incidentally, were already paid for by Colorado ratepayers. What makes this possible from the Colorado utility's perspective is that it has acquired too many RECs, too soon, at too high a cost. This is beginning to look like a zero-sum game. The same amount of renewables will ultimately be developed, the only question is where? Call me old fashioned but how about a scenario in which each utility develops its own renewable resources to serve its own local needs?
I mentioned that the Colorado utility had acquired too many RECs, too soon, at too high a cost. I made this comment back in 2007 when I showed that the utility proposed to purchase more solar RECs than it needed for compliance in an environment where solar costs were projected to diminish rapidly, thereby saddling ratepayers with higher costs than they would otherwise have to pay. This prevented the utility from taking advantage of the inevitable cost reductions that were sure to result from technological advance and deployment experience. But it is this myopic belief that if some is good, more must be better that is responsible for the irrational behavior afflicting policy makers with regard to renewable energy. As stated in a recent op-ed piece on cap-and-trade by Paul Gerlach in the South Florida Sun-Sentinel, this "preoccupation with setting unrealistic targets for renewable sources has blinded policy makers to the almost unlimited opportunities for technological breakthroughs in the production and use of conventional fuels..." and, I might add, in the deployment of renewable energy technologies.
Tuesday, August 11, 2009
Geoengineering redux - aka Climate Engineering
It is also true that simply adjusting the thermostat, so to speak, won't reduce the amount of CO2 in the atmosphere and thus problems with ocean acidification still remain. That is why I still believe that the best approaches to geoengineering (now referred to in some circles as climate engineering) are those that essentially scrub the atmosphere of CO2 rather than those that simply mask the problem by putting more junk into the atmosphere.
Interested readers should also consult work being done at the Copenhagen Consensus Center which is publishing a series of perspective and analysis papers on climate change and approaches to adaptation and mitigation. It is an excellent resource for those who wish to be more fully informed on the range of solutions that may exist. In my earlier post, I noted that we should definitely take prudent measures now to avert a calamity in the future. But, I also noted that we must begin a serious investigation of the more active approaches to mitigating climate change and its first cousin, ocean acidification, now so that those technologies may be suitably developed if and when we need them.
Sunday, August 02, 2009
Corporate Rite of Passage -- A Note to Techies
Mr. Chambers also mentions something else that I found interesting. In this interview, he notes that "when there’s an accident happening, that’s when you’ve got to be the calmest. And yet that’s when most people are not." This made me think back to a blog I posted on July 15 (see Geoengineering) in which I questioned Eugene Kleiner's assertion that "There is a time when panic is the appropriate response." In contrast, Mr. Chambers goes on to say "So I’ve learned when something with tremendous stress happens, I get very calm, very analytical." Nice to know that I'm not alone in that regard.
There is one other important learning point in the Chambers interview, and that is the ability to admit mistakes and failures. Unfortunately, such candor is anathema in the utility and regulatory environment that I am presently part of, especially with regard to renewable energy and public policy. Apparently it borders on sacrilege to question the all-knowing legislature, Commissioners, and executive branch policy makers. But we're learning as we go, and their credibility would go much farther if they could only admit "Well, we screwed that one up. Let's fix it and move on." I'm hoping for too much.
Saturday, August 01, 2009
Western Wind & Solar Integration Study
We'll talk about a few of the observations from this study -- some predictable and others not so much. The bottom line, however, is that 35 percent renewable energy penetration (30% wind and 5% solar) appears technically feasible, although operation of the electric grid would have to be dramatically different from what it is now. Essentially, instead of the large number of small control areas that presently exist, it would require a larger, more geographically dispersed control area that could balance the intermittent contributions of wind and solar generators. And, as anyone familiar with this field is aware, transmission (or the lack thereof) is an issue. But the extent to which it is an issue appears to be a function of whether we rely on a few megaprojects located in the best wind and solar resource areas or a larger number of smaller dispersed projects (albeit with lower capacity factors) sited throughout the footprint. Hence, there is a trade-off between transmission costs and the higher cost of renewable energy from lower-quality resources.
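That trade-off can be made concrete with a back-of-the-envelope delivered-cost comparison: a remote, high-capacity-factor site that needs a new transmission line versus a local, lower-capacity-factor site that does not. Every number below is a hypothetical placeholder for illustration, not a figure from the study itself.

```python
# Back-of-the-envelope sketch: delivered cost of a remote wind resource
# (high capacity factor, needs transmission) vs. a local one (lower
# capacity factor, no new transmission). All inputs are hypothetical
# placeholders, NOT figures from the Western Wind & Solar study.

HOURS_PER_YEAR = 8760

def delivered_cost_per_mwh(capex_per_mw: float,
                           fixed_charge_rate: float,
                           capacity_factor: float,
                           transmission_adder: float = 0.0) -> float:
    """Simplified levelized cost: annualized capital spread over the
    MWh actually produced, plus any per-MWh transmission adder."""
    annual_mwh_per_mw = capacity_factor * HOURS_PER_YEAR
    generation = capex_per_mw * fixed_charge_rate / annual_mwh_per_mw
    return generation + transmission_adder

# Remote site: 45% capacity factor but a $12/MWh transmission adder.
remote = delivered_cost_per_mwh(2_000_000, 0.10, 0.45, transmission_adder=12.0)

# Local site: only 32% capacity factor, but no new transmission needed.
local = delivered_cost_per_mwh(2_000_000, 0.10, 0.32)

print(f"remote: ${remote:.2f}/MWh  local: ${local:.2f}/MWh")
```

With these particular placeholders the remote resource still wins, but the ranking flips as the transmission adder grows or the capacity-factor gap narrows -- which is exactly why the trade-off has to be analyzed rather than assumed.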
One of the very interesting outcomes from the analyses conducted thus far is the relationship between total load and renewable generation. As the General Electric folks conducting the modeling noted, "The bad actor is the wind." There are times in the spring when the wind is high and there may be more total wind and solar on the system than load. This presents great operational difficulties for system operators. Alternatively, there are times in the early morning when load is ramping up sharply just when the wind is ramping down.
These types of issues highlight the importance of forecasting to system operators. From a market perspective, it was found that with a perfect forecast, increasing renewable penetration drives spot prices lower (since there is no fuel cost). But, with an imperfect forecast, the forecast error drives spot prices back up because operators would commit insufficient capacity and have to turn expensive peaking units back on. Furthermore, it was found that at renewable penetrations exceeding 20 percent, coal units would begin to be impacted. However, the greatest impact was found to be on combined cycle gas plants being backed down. Overall, the value of wind energy rose with a perfect forecast and dropped with an imperfect forecast. Not much of a surprise there, I suppose.
The study also found that generator total revenues fell with increasing renewable penetration (as noted above, spot prices decreased). But, the total revenues for nonrenewables fell at a steeper rate for two reasons -- they generated less energy and spot prices fell. There were two other particularly interesting outcomes presented. The first concerned the cost of unserved energy and the important role that demand response could play in mitigating this problem. The second was that the role that hydro, and especially pumped storage hydro (as well as other large scale storage), could play is less than commonly believed.
Researchers found that higher wind penetration also resulted in greater amounts of unserved energy, due largely to over-forecasting of the wind. Discounting the wind forecast has the effect of driving spot prices down, because you're carrying more gas, and it results in less unserved energy. But there was a very high cost to reducing unserved energy by discounting the wind forecast. It was found to be far more cost-effective to get the load to be responsive rather than have the system make up the shortfall, to the extent that a couple of thousand MW of interruptible load was found to be cost-effective.
Lastly, we discussed the operational impacts of increasing renewable penetration on hydro operation. Hydro needed to be scheduled in response to the wind forecast while increasing wind penetration also increased the variation in hydro scheduling. There was found to be a large operating cost increase if you did not shift hydro commitments in response to the wind forecast -- obviously hydro has the flexibility to move while wind does not.
With regard to pumped hydro, it was found that if you have it, the system will use it, but there was no incentive to add more. This seemed counter to the conventional wisdom, so much of the ensuing discussion focused on this topic. The researchers reported that increasing pumped storage increased overall costs. As you increased renewable penetration, the storage ran more. As you forced the storage to run more, it drove costs up. It was found that, even with 30 percent wind penetration, the WestConnect footprint has sufficient pumped storage and no more is needed. Exploring this further, the group concluded that pumped hydro may be more useful in a smaller footprint. In a larger area, it was preferable to use the system as storage. As part of this discussion, one participant from Ireland noted that studies on their system demonstrated a similar result and that pumped storage was not needed until renewable penetration reached as high as 50 percent. There, it was found that the capacity cost of pumped hydro displaced other capacity costs. But it only provides capacity if you fill it. Thus, you need to reach the higher wind penetration levels before the pumped storage pays out.
So ended a valuable update to this important research initiative. The day concluded with some discussion of next steps and areas of focus as the project moves forward. One shortcoming, as noted earlier, is the dearth of solar data. To model increasing photovoltaic penetration, the project needs more one-minute PV data. Presently, the only one-minute PV data available to the project is from the 4 MW Springerville project in Arizona. Though there are two larger PV projects currently in operation -- notably Nellis AFB and Alamosa, CO -- this data is apparently not being made available to this study. Why?
Tuesday, July 28, 2009
Can Machines Outsmart Man? Human Intelligence vs. Artificial Intelligence
Ever since the dawn of the field of artificial intelligence, there has been speculation about machine intelligence surpassing human intelligence and somehow reversing the master-servant relationship. It is not that I have some overwhelming faith in humanity's ability to prevail (though for the most part I do). I believe that the problem is not one of technological capability but rather one of the inappropriate human exploitation of that capability. This makes it not a technological problem but a societal one.
The computer scientists mentioned in the article expressed the concern that "technological progress would transform the work force by destroying a widening range of jobs, as well as force humans to learn to live with machines that increasingly copy human behaviors." While I'm not certain about the copying of human behaviors, which if true could turn out to be AI's Achilles' heel, I fail to see how such human adaptation is any different from that which has occurred for hundreds if not thousands of years. Moreover, this concern sounds very similar to that once expressed about another emerging technology:
"_________________, if they succeed, will give an unnatural impetus to society, destroy all the relations that exist between man and man, overthrow all mercantile regulations, and create, at the peril of life, all sorts of confusion and distress."
Another recent NY Times article, In Battle, Hunches Prove to Be Valuable, describes how the most high-tech gear "remains a mere supplement to the most sensitive detection system of all -- the human brain." It describes how a soldier's experiential knowledge, depth perception, and focus create an almost uncanny ability to perceive and respond to dangerous situations. In spite of the many advances in machine knowledge, AI still does not have the ability to mimic this sensory perception and decision-making capability, partly because we cannot fully explain it either.
With that said, at the end of the day, the danger seems not so much that machines may outsmart man. Rather, the danger is that man may deploy technology inappropriately, thereby outsmarting himself.
Thursday, July 23, 2009
The Green Roadway Project IP Auction
Perhaps you've already heard of this venture. It's been featured in a number of newspapers across the country including the New York Times. I first heard about it when I was contacted by representatives of the project who suggested that the state should submit a bid (reserve price of $500K for Colorado, $1.5 million for California). So, tomorrow, July 24, 2009, is auction day -- the day on which these entrepreneurs will attempt to auction off a license to their IP on a state by state basis for six figures plus... each.
I have to wonder if these folks seriously believe that state governments will bid on this IP. That simply isn't the way that states would promote such development. Moreover, the schemes that they are promoting (e.g., micro wind turbines turning in the breeze generated by passing automobiles) are years from being practical, if ever, save possibly for projects such as the Oregon solar highway, which is really just conventional PV in an open area near a highway intersection (and which apparently does not infringe on these entrepreneurs' IP). Also unmentioned is the small issue that these auctions do not include rights to the public rights-of-way, which are generally controlled by the various state highway departments.
So, while some of these technologies are interesting, no one has yet shown that they will work either at scale or in the harsh, real-world environment (think winter snowstorms, snow plows, etc.). We'll see what the auction brings tomorrow, but I would guess that they would have greater success offering a nonexclusive license to all takers -- that is, if they really have IP that will be difficult to work around. But it could be that they simply hope to strike while the iron is hot and capitalize on the Zeitgeist.
Friday, July 17, 2009
Largest Green-Power Program Stumbles
They wonder why the program is so under-subscribed. Someone must be kidding. Has anyone looked at what they're offering? From the Austin Energy website:
“An average residential customer consuming about 1,000 kWh per month will pay about $43.50 per month more if opting for a 5-year subscription or $58.50 per month more if opting for a 10-year subscription.”
First, this is a huge surcharge for renewable energy. We're talking a 63% increase if you sign up for the 10-year plan. Second, the pricing is upside down. A higher unit rate for a longer subscription? The NY Times should try selling its newspaper with that type of pricing strategy and tell me how well they do. Moreover, with renewable portfolio standards expanding throughout the country, more ratepayers are beginning to pay the costs of adding renewables to the electric grid. But at least they share this cost equally... and so far the percentage increase is only in the single digits. Done well, the cost of adding renewables to the electric system does not have to short-circuit customers' wallets.
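The surcharge arithmetic is easy to check. The $43.50 and $58.50 monthly premiums come from the Austin Energy quote above; the roughly $93 baseline bill for 1,000 kWh is an assumption inferred from the 63% figure, not a published Austin Energy rate.

```python
# Sanity-checking the GreenChoice surcharge percentages. The premiums
# are from the Austin Energy quote; the baseline bill is an ASSUMPTION
# backed out of the 63% figure, not a published rate.

BASELINE_BILL = 92.86   # assumed monthly bill at ~1,000 kWh (hypothetical)
PREMIUM_5YR = 43.50     # quoted monthly premium, 5-year subscription
PREMIUM_10YR = 58.50    # quoted monthly premium, 10-year subscription

def pct_increase(premium: float, baseline: float = BASELINE_BILL) -> float:
    """Surcharge expressed as a percentage of the baseline bill."""
    return 100.0 * premium / baseline

print(f"5-year plan:  +{pct_increase(PREMIUM_5YR):.0f}%")
print(f"10-year plan: +{pct_increase(PREMIUM_10YR):.0f}%")
```

Even the shorter subscription works out to a premium approaching half the assumed baseline bill, which makes the under-subscription unsurprising.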