Wednesday, March 18, 2015

Technology Forecasting & Strategic Technology Planning

For this post, I am going to build on our recent series of articles on technology forecasting and technology scouting elsewhere on this blog (see here, here, and here) and discuss how to incorporate the results of your forecasting and scouting efforts into the organization’s strategic technology planning.

Recognizing that technological change is a principal driver of competition, an important concern of corporate CEOs and Chief Technology Officers (CTOs) is managing the firm's technology development or acquisition effort to support overall company objectives.  Given the accelerating pace of technological progress, managing this effort is becoming increasingly difficult but also increasingly important.  Not too long ago, much of this discussion centered on the question of whether a firm's technology strategy should be predominantly market-pull or technology-push.  But it is more than that.  It is as basic as deciding what business(es) the firm is in now and determining those in which it should participate in the future.

The most radical such decision I know of occurred several years ago when Boulder, Colorado-based Cell Technology exited the biotech field and reinvented itself as an air ambulance company.  A more interesting example today might be Big Oil.  Are they oil companies?  Energy companies?  Or, recognizing that over 70% of petroleum is destined for use as fuel – and over 90% of transportation fuel is derived from petroleum – are they really transportation companies?  How you see yourself now and in the future shapes the current and future technology portfolio and the skill set that your business must have.  If I were an oil company concerned about my long-term business, it isn’t just oil markets that I would be watching.  Nor would staking out a position in electric vehicles or electrical power generation necessarily be where I would focus.

But, before we get there, let’s start with the basics.  Webster's New Collegiate Dictionary defines technology as "the totality of the means employed to provide objects necessary for human sustenance and comfort."  Fair enough, but in this context we must define "objects" to include both services and goods.  Going further, Martino notes that technology may also include "know-how" and software [1].  I like that better, but would extend it even further to include systemic technologies such as management processes and systems.  Thus, when referring to a given technology we really mean an entire family of technical approaches that have some major characteristic in common or that perform the same function.  For example, internal combustion engine vehicles represent a class of transportation technologies as distinguished from electric vehicles or fuel cell vehicles.  Each can be further aggregated or disaggregated according to our needs.

Essential to making strategic decisions concerning technology is an understanding of the dynamics of technological change.  Historical data from many fields demonstrate that progress is not random and discontinuous but follows a fairly regular pattern when some performance attribute is tracked over time.  Just as products and processes follow a life cycle, so do technologies.  The resulting Technology S-curve (figure 1) is similar in form to product or process life cycle curves and is at the heart of our earlier discussions on technology trend analysis.  As we noted then, utilizing various forecasting techniques, a firm can do more than simply monitor technology – it can estimate where the Technology S-curve will lead it and what the likely impact will be on its future lines of business.



Figure 1 - Technology S-curve.


As shown in figure 1, a technology begins with an invention or discovery and initially grows rather slowly, as shown by the flat initial portion of the S-curve.  As the diffusion of the technology proceeds and the potential for its use becomes known, continued work leads to increasing levels of performance, shown by the steep part of the curve.  Beyond the inflection point, increases in the technology's performance come harder.  Of particular importance is the recognition that no technology can be advanced without limit.  There is always some natural upper limit beyond which a technology cannot progress – though we may not know at first what that limit is.  Increases in performance beyond this point require shifting to a new S-curve associated with a new technology, or a breakthrough associated with the old one.  More on that in a moment.
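
If you have even a handful of historical data points for a performance attribute, you can make this concrete by fitting a logistic (Pearl) curve and estimating where the upper limit and inflection point lie.  The Python sketch below is a minimal illustration using invented numbers; the data values, starting guesses, and parameter names are hypothetical, not drawn from any real technology.

```python
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, L, k, t0):
    """Logistic (Pearl) curve: performance approaches an upper limit L,
    with growth rate k and an inflection point at time t0."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical performance observations for an illustrative technology
years = np.array([2000, 2002, 2004, 2006, 2008, 2010, 2012, 2014], dtype=float)
perf  = np.array([ 1.0,  1.8,  3.5,  7.0, 13.0, 20.0, 25.0, 27.5])

# Fit the curve; p0 supplies rough starting guesses for L, k, and t0
(L, k, t0), _ = curve_fit(s_curve, years, perf, p0=[30.0, 0.5, 2007.0])

print(f"Estimated natural upper limit: {L:.1f}")
print(f"Inflection point (progress fastest): {t0:.0f}")
print(f"Projected performance in 2020: {s_curve(2020, L, k, t0):.1f}")
```

The statistical fit is only as trustworthy as the assumed functional form and the data behind it, which is why judgment about the physical limits of the technology remains part of the exercise.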

Eschenbach and Geistauts defined strategy as a "... fundamental approach for gaining long-term advantage over both competitors and [the] environment ..."  They note that "strategy explicitly considers and tries to control the impact of uncertainty."[3]  In this context, technology can be viewed as either an opportunity or a threat.  The aggressive, technology-oriented firm, they conclude, will wield its technology as a competitive weapon to offer unique or superior products or services, significantly lower production costs, or make substantive improvements in management processes.

A firm's technological skills, although difficult to inventory, are some of its most important assets, even though they don't appear on the bean counter’s balance sheet.  Given that technology is a corporate resource, the idea of a technological audit has been proposed to assess a firm's ability to compete on the strength of its technological assets [2].  To aid in this endeavor, table 1 shows the various stages in a technology's life cycle and the importance of each stage to a firm's competitive advantage.

Table 1 - Technology life cycle stages and their importance to competitive advantage [2].

I. Emerging Technologies: Have not yet demonstrated potential for changing the basis of competition.
II. Pacing Technologies: Have demonstrated their potential for changing the basis of competition.
III. Key Technologies: Are embedded in and enable the product or process.  Have a major impact on the value-added stream (cost, performance, quality).  Allow proprietary/patented positions.
IV. Base Technologies: Minor impact on the value-added stream.  Common to all competitors – a commodity.


By determining the stage of the technologies that it relies upon, and doing the same for its competition, a firm can assess its competitive standing vis-à-vis its competitors.  Such is the foundation of science & technology intelligence.  A firm that wishes to adopt an offensive technological strategy should have strong positions in pacing and key technologies while staying at the forefront of emerging technologies.  One that finds itself dependent primarily on base technologies is, by default, going to be a follower in the marketplace.
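
One simple way to operationalize such an audit is to tally where the technologies each firm relies on fall among the four stages in Table 1.  The sketch below is purely illustrative: the technology names, stage assignments, and the crude scoring rule are hypothetical placeholders for the judgment a real audit requires.

```python
from collections import Counter

# Illustrative technology audit following the stages in Table 1.
# All technology names and stage assignments here are hypothetical.
our_portfolio = {
    "solid-state batteries":  "emerging",
    "power electronics":      "key",
    "battery management s/w": "pacing",
    "lead-acid cells":        "base",
}
competitor_portfolio = {
    "power electronics":   "key",
    "lead-acid cells":     "base",
    "sheet-metal forming": "base",
}

def posture(portfolio):
    """Summarize how a firm's technologies spread across the life cycle stages."""
    counts = Counter(portfolio.values())
    if counts["pacing"] + counts["key"] >= counts["base"]:
        return counts, "positioned for an offensive technology strategy"
    return counts, "dependent on base technologies - likely a market follower"

for label, p in (("Us", our_portfolio), ("Competitor", competitor_portfolio)):
    counts, summary = posture(p)
    print(f"{label}: {dict(counts)} -> {summary}")
```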

A critical determinant of technology strategy is selecting the right technology and the right time to pursue it [4].  Returning to figure 1, since the slope of the S-curve represents technological progress per given level of input, it can also be considered a measure of R&D productivity.  Richard Foster, formerly of McKinsey, notes that "you cannot improve the performance of one laboratory over another by a factor of twenty through better organization and project management.  You can only do it by picking the right technologies."[4]

The position of a technology on the S-curve determines the potential that remains to be developed in that technology.  If performance improvement – again, a measure of R&D productivity – has been stagnating after reaching a previous high, the only way to improve it is to get onto a new S-curve where the rate of performance improvement will be higher (see figure 2).  Faced with the technological discontinuity shown here, management must decide whether to exploit the potential remaining in the present technology or shift to a new one and, if so, when to make that move [4].


Figure 2 - Transitioning from an old technology (declining growth rate) to a new technology (increasing growth rate).


In deciding which technology to exploit, firms must exercise some foresight: after shifting to the new technology, it takes time to travel down a new learning curve to the point where that technology becomes profitable.  Considering development lead times, this often means thinking about switching to a new technology just as the current technology is maturing and business is going well.  Foster notes that:

         "The time to begin exploring technological alternatives is when roughly half of the full potential of the present technology has yet to be exploited. Yet this is pre­cisely the time when it is most difficult to get manage­ment to think about new technologies. ....conventional management systems, with their emphasis on short term measurements and rewards, work against the correct diagnosis of the technologi­cal situation."[4]

Here is where capability on both sides of the same coin becomes important.  Forecasting both technological trends and market trends is a skill that is – or should be – instrumental to strategic technology planning.  How well do you do this?  In spite of our best efforts, no firm can be right 100% of the time.  This is where flexibility and the ability to respond rapidly to a changing environment can help create and sustain a competitive advantage, especially for the entrepreneurial firm that must compete against better endowed competitors.

Finally, a well-thought-out strategic technology planning effort will include both near- and far-term components.  Strategic plans, although forward looking, must be grounded in the present.  In this regard, a quote from Peter Drucker concerning long-range planning seems especially apropos to strategic technology planning:
         "Decisions exist only in the present.  The question that faces the long-range [technology] planner is not what we should do tomorrow, it is: What do we have to do today to be ready for an uncertain tomorrow?"


References

1. Martino, J.P., 1983, Technological Forecasting for Decision Making, 2nd ed.: Elsevier, 385p.
2. Burgelman, R.A., and Maidique, M.A., 1988, Strategic Management of Technology and Innovation: Irwin, 604p.
3. Eschenbach, T.G., and Geistauts, G.A., 1987, Role of technology in strategic management: Engineering Mgmt Int'l, v.4, p.307-318.
4. Wolff, M.F., 1981, Picking the right technology should be first priority: Research Mgmt., July 1981, p.7-8.



**** Dear Readers, I hope you enjoyed reading this article. On April 23-24, 2015, I will be teaching a workshop on Technological Forecasting for Science & Technology Intelligence in Golden, Colorado.  We’ll discuss both trend analysis and the proper application of expert opinion in formulating strategic technology plans.  We would love it if you would join us for this unique and valuable course.  Details and registration can be found on the TEMI website here – RM

Wednesday, February 04, 2015

Technology Scouting for Fun & Profit

The rapid pace of technological change has technology managers in all industries concerned with how they can keep abreast of developments in related – and unrelated – fields that may create new opportunities or pose threats to their business.  No longer can a manager simply look to the firm’s internal R&D effort or its business partners to provide access to all of the technologies and skills the company needs.  For one thing, over the last 30 years much scientific research and radical innovation have been pushed upstream into academia and government research centers as corporate R&D centers focus their efforts on supporting current and next-generation products.  Furthermore, innovation is increasingly the product of the fusion of two technologies – think bioinformatics, for example – or the application of a known concept to a new field (such as laser printers, which were developed from existing copier technology).  No internal effort could possibly hope to master it all – not even in the largest of companies.  So how do these businesses maintain an awareness of emerging technologies and the opportunities (or threats) they may portend?

One approach is to maintain a high enough profile and hope that tech transfer managers, entrepreneurs, and other developers of new technology come to you with their developments.  This may work in some cases, but no business can afford to sit back and hope that important components of its technology portfolio will just walk in the door.  Maintaining an awareness of new technological developments has traditionally been the purview of scientists and engineers in the lab who, it was believed, lived on the front line of S&T development and thus would be familiar with new technologies.  This is a bad assumption.  The shortcomings of this approach are the same as those noted above that afflict the business more generally – no one can know it all.  Besides, that is not their job; they have their own projects to look after.  Today, leading technology companies rely on a systematic process of Technology Scouting to proactively search out and acquire new technologies.

Technology scouting doesn’t require a massive effort, but it does call for thoughtful organization.  And it is not competitive intelligence per se, though there is some overlap in research skills.  (Competitive intelligence seeks to maintain an early warning system for competitors’ actions, while scouting seeks out new technological developments for incorporation into the firm’s technology portfolio.)  Today, technology-intensive firms from virtually all industrial sectors – IT, pharmaceuticals, automotive, electronics, chemicals, oil & gas, and more – rely on a variety of sources to keep abreast of new developments of interest to the firm.  Even businesses that we seldom think of as technology companies have climbed on board.  For example, low-tech products from toys to food increasingly depend on a variety of technology-intensive manufacturing processes that affect their competitive position.

So now you’ve decided that your firm needs to scout for new technologies – in self-defense if nothing else.  Where do you start?  The figure below shows the basic elements of a technology scouting program.


First, you must determine precisely what you want this scouting effort to achieve.  And, as we’ve said, this often comes under the general heading of finding opportunities and identifying threats.  In this effort, do not fail to consider the possible societal responses to new technology (Monsanto and Unilever, to name just two, now wish that they had more carefully considered the potential societal resistance in Europe to genetically modified crops). 

Next comes the scouting plan which details the various roles and responsibilities of those in the group.  Who will be responsible for which sets of technologies?  While the focus is clearly on emerging technologies, be careful to ensure a proper alignment between scouting activities and corporate goals and objectives.  To what extent will you manage this effort with in-house resources and will there be a need to go outside the organization for additional expertise? And, as shown, it is often just as important to be aware of what is not happening in the environment as what is happening.

Third, a complete listing of data sources – both secondary and primary – to support the effort must then be developed.  Obviously it is less costly to comb through previously published materials, but figure on obtaining the real nuggets from discussions with other people – internal and external.  There is a skill to doing this.  And don’t fall into the trap of assuming that patent searches will completely reveal the technological landscape.  

Fourth, consider the methods of observation that you expect to employ.  Although the three categories of surveillance are often used interchangeably, I regard scanning as a broad look at the technological landscape to identify areas for more detailed study, monitoring as a continuous look at a specific field or technology to identify new developments of interest, and tracking as a detailed look at advancements in a specific technology or technological approach.  Often, tracking provides the time series data that you will use to conduct a trend analysis (Moore’s Law, for example).
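
As a brief aside before the final element, here is what turning tracking data into a trend can look like in its simplest form: a short Python sketch that fits an exponential (Moore's-Law-style) trend to a time series and extrapolates it.  The data points are made-up placeholders, not actual industry figures, and real trend work would also examine the residuals and watch for the underlying S-curve flattening out.

```python
import numpy as np

# Hypothetical tracking data: a Moore's-Law-style metric observed over time.
years  = np.array([2006, 2008, 2010, 2012, 2014], dtype=float)
metric = np.array([2.9e8, 7.9e8, 2.3e9, 5.0e9, 1.4e10])

# Fit an exponential trend by regressing log(metric) on year
slope, intercept = np.polyfit(years, np.log(metric), 1)
doubling_time = np.log(2) / slope
print(f"Estimated doubling time: {doubling_time:.1f} years")

# Extrapolate the fitted trend a few years ahead
for y in (2016, 2018, 2020):
    print(y, f"{np.exp(intercept + slope * y):.2e}")
```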

Finally, not to be overlooked are the mechanisms that you will use to convey the results of your scouting activities to senior management and other decision makers.  There are roles for both reports and alerts that the group may issue as well as monitoring databases that may be accessed by employees when they need information on a topic.  But then, the really hard work begins – due diligence on the target technologies of interest and assessing the strategic and operational fit will be critical to moving forward and must not be taken lightly.  But, if done well, the benefits can be enormous.

Ready to get started?  Emerging technologies are everywhere and only a systematic effort will help ensure that you keep abreast of those of most interest to you.  If you need help, contact me at Rich@TEMI.us.

Friday, January 30, 2015

New Renewable Energy Proposals in Colorado's 2015 Legislative Session

No sooner did the new legislative session begin than we saw some proposed changes to -- some would say attacks on -- Colorado's renewable energy standards.  Three bills were recently introduced in the legislature -- all by Republicans -- and they could truthfully be considered an attack on the RPS.  First was a bill introduced in the state Senate (SB15-044) that would reduce the RPS obligation for IOUs from 30% to 15%.  I don't see this bill having a snowball's chance in hell of passing.  Even if it did survive the Republican-controlled Senate, it would surely die in the Democrat-controlled House and/or be vetoed by the governor.  Moreover, it seems unlikely that Xcel Energy, the state's dominant utility, which supported the increase to 30% and has already met the RPS for 10 years out, would support it.  Only Black Hills Colorado Electric, the smaller IOU serving the Pueblo area, would benefit.  That said, SB15-044 was just passed out of committee without amendment, predictably on a 5-4 party-line vote, and sent on to the full Senate for consideration there.

Another bill introduced in the Senate (SB15-046) would classify solar gardens in coop territory as retail distributed generation and grant a 3x multiplier for coop compliance with the RES (see my 2013 posts on this topic here and here).  My intel says that the 3x multiplier will probably not survive but, given the controversy over the increase in coop RES obligation back in 2013, classifying solar gardens as "retail DG" as it is defined in the Colorado RES could be OK.

In the House, HB15-1118 would expand hydro eligibility for compliance with the RES from new hydro of less than 30 MW, as presently defined, to all hydro regardless of size or vintage.  It would also add pumped hydro to the list of resources eligible for compliance with the RES.  Back when I worked for the PUC, we seemed to come across this bit of nonsense every couple of years and had to go through the drill of explaining why pumped hydro is not considered renewable energy.  But it keeps returning.  Simply put, if the water is pumped uphill at night using fossil-generated electricity, releasing it in the afternoon doesn't make it renewable energy.  If the water is pumped uphill using wind energy, then it is the wind generator that gets the RECs; awarding the RECs to both would amount to double counting, which is generally considered verboten.  In practice, the water is pumped uphill using undifferentiated grid power anyway.  This bill was assigned to what is typically regarded as a "kill committee" in the House, so I expect it will die an early death.

More recently, SB15-120 was introduced in the Senate by Sen. Matt Jones (D); it would require each provider of retail electric service in Colorado to develop an electric grid modernization plan.  This is an interesting bill, the stated goals of which are to 1) optimize demand-side management, 2) optimize supply-side management, and 3) achieve advanced metering infrastructure (AMI) functionality within 5 years.  Certainly it is difficult to argue with optimizing supply-side and demand-side management (which here refers to enabling energy efficiency and renewable energy integration), but the goal of advancing AMI is likely to be controversial -- and potentially costly to ratepayers.  With AMI, more commonly referred to as smart meters, utilities could implement time-of-use (TOU) rates.  AMI is also the foundation for building out the "smart grid."  This bill has been assigned to the Senate Agriculture, Natural Resources and Energy Committee, and it will be worth watching to see how it fares.

UPDATE 01FEB2015: I see that the Denver Post has now waded in on this with an editorial that you can find here.

UPDATE 11FEB2015: As of this morning, neither SB15-046 nor HB15-1118 had been taken up in their respective committees.  On 05Feb, SB15-044 sailed through the full Senate, passing on an 18-17 vote (presumably a party-line vote, though I didn't check the party affiliation of each vote), and was sent on to the House.  On 10Feb it was assigned to the House State, Veterans, & Military Affairs Committee (a kill committee), where it will presumably languish until the end of the session.

Wednesday, December 17, 2014

Technology Substitution Requires Forward Thinking

Several weeks ago I wrote about some of the basic considerations in evaluating technology trends (see here and here).  In those posts I discussed some of the dynamics of technological advance and offered some initial guidance to get you started in technology trend analysis.

One of the points I emphasized in those columns is that it is imperative to assess technologies not at their present level of performance but at the level they will reach in the future.  Similarly, when evaluating the substitution of an incumbent technology by an emerging one, I made the case that it is less important to compare the present performance of the two technologies than their future potential.

To illustrate this dynamic, the figure below shows two technology S-curves – one for an incumbent technology and the other for an emerging technology destined to eventually replace it.  On the y-axis we plot some relevant performance measure of the technologies in question.  Note that this is not sales or market share but rather some performance parameter of interest.  For example, in computing it could be the transistor count of a chip (the basis of the original Moore’s Law) or the storage density of disk drives.  In the energy arena, we may track the energy conversion efficiency of solar photovoltaic cells, the heat rate of thermal generators (either fossil or biomass), or the efficiency of wind turbines.



Note that, in the figure, the new technology begins life at a lower level of performance than the incumbent technology.  This phenomenon, which is all too common, often leads one to mistakenly dismiss the emerging technology as inferior to the incumbent.  But those who do are asking the wrong question.  Rather, it is important to look at the potential of the new technology to surpass the performance of the incumbent, which occurs at the crossover point shown in the schematic.  Equally important is the difference in the performance limits of the two technologies, for if the difference is sufficient, the new technology will sustain its inexorable march to overtake the incumbent.
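
To make the crossover idea concrete, the sketch below generates two hypothetical logistic S-curves – one for the incumbent and one for the emerging technology – and finds the year in which the newcomer overtakes the incumbent.  All parameter values are invented for illustration; the point is the comparison of limits and the location of the crossover, not the specific numbers.

```python
import numpy as np

def s_curve(t, L, k, t0):
    """Logistic performance curve with upper limit L, growth rate k,
    and inflection point t0."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical parameters: the emerging technology starts from a lower
# performance level but has a much higher natural limit than the incumbent.
L_inc, L_new = 100.0, 400.0
years = np.arange(2000, 2041)
incumbent = s_curve(years, L=L_inc, k=0.30, t0=2005.0)
emerging  = s_curve(years, L=L_new, k=0.35, t0=2022.0)

# First year in which the emerging technology surpasses the incumbent
crossover_year = years[np.argmax(emerging > incumbent)]
print(f"Projected crossover year: {crossover_year}")
print(f"Difference in performance limits: {L_new - L_inc:.0f}")
```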

Even if we do not have sufficient data to plot precisely where we are on the S-curve, it is crucial to know whether the technology is just beginning its upward trajectory or is nearing its performance limit.  This is well demonstrated in a recent Washington Post article describing a prototype US Navy laser weapon system, which quotes one naval analyst: “’Naval guns are near the theoretical limit of their performance envelope now,’ [he] said. ‘We can only expect very minor improvements in the future, whereas with lasers we can expect significant improvements in range, lethality, and accuracy.’"


Laser weapon on the USS Ponce

Were we plotting the s-curves for naval combat systems, as the quote implies, we might employ range, lethality, and accuracy as the performance metrics of merit.  In this case, even without quantitative data on these measures, it is apparent that this new weapons technology holds great promise to overtake existing systems and substitute for conventional weapons technology.

Monday, December 08, 2014

Surprising PUC Decision in Colorado Solar Case

Back in May, I wrote in Solar Industry Magazine about Xcel Energy’s application to the Colorado PUC to implement a solar-based green pricing program called Solar*Connect.  This proposal was of great concern to both the solar industry and consumers for a number of reasons – the industry because it represented a direct competitor to net metering and community solar offerings, and ratepayers because they were being asked to subsidize a program that would primarily benefit the utility.

This proceeding drew many intervenors including several from the solar industry.  Not one of them – not the industry, environmental groups, consumer groups, or the PUC staff – supported the proposal.  A few of them suggested modifications to the proposal, more I think to avoid being labeled as anti-solar than because they thought the proposal had any merit.

 Solar Industry Magazine

This proceeding was as contentious as any that I have seen recently at the PUC.  In November there was a four-day hearing presided over by the full Commission which gives some indication of the importance of the policy issues raised by this proposal.  This afternoon, the Commission deliberated on the proposal and issued its decision.  Certain that this would be a long deliberation because of the myriad policy issues at play, I settled in, notepad in hand.  Much to my surprise, it was over before I even got comfortable.

Months of testimony and competing motions followed by four days of hearings settled in a 10-minute deliberation!  As is typical, a member of the Commission’s advisory staff set the stage and then gave his simple recommendation – deny the application.  In providing that recommendation, the advisory staff cited:
  1. no need by the utility for the solar RECs (which, incidentally, the utility planned to keep even though subscribers would be paying more for them) for compliance with the renewable standard,
  2. no need for the capacity provided by the proposed 50 MW facility,
  3. no need for the energy that would be produced from the system,
  4. no consumer demand shown for the program, and
  5. concerns with the program being subsidized by the general body of ratepayers.

In agreeing with the recommendation, the three commissioners each expressed somewhat different rationales for denying the application (in addition to the above) including:
  1. unspecified profit by the utility,
  2. revised testimony during the hearing which left it unclear just what the utility was proposing,
  3. such proposals should be included in the 2015 ERP filing rather than filed separately (an issue that the Chairman was most adamant about), and
  4. the application was premature given that the Commission has yet to rule on its net metering policy in a separate docket.

With the denial, the Commission never discussed any of the proposed modifications or the concerns about whether such a program was legal under Colorado statutes leaving unanswered many underlying issues.  As usual, we'll have to wait for the written decision but I'll be surprised if there is much more in it than what we heard during the deliberation.


UPDATE 17 DEC 2014: The written decision from the PUC came out today and is available for download here.  Notably, in addition to denying the utility's program, the Commission also placed 100% of the risk for the so-called "start-up energy," which the utility contracted for after its request for an early RFP was denied, solely on Xcel.

Monday, November 10, 2014

Electric Vehicle Car Sharing Comes to Colorado

Last Saturday, a new entry in the Colorado car sharing market opened its doors, right here in Golden no less.  eThos Electric Car Share bills itself as the nation’s first all-electric vehicle car sharing service.  I'm not sure it's the first in the nation, but first in Colorado will do.  eThos, which operates out of a converted service station at the corner of 19th Street and Jackson Street, runs much like most other car sharing services, except for its 100% EV fleet.

For those who are not familiar with car sharing, think auto rental but on an hourly basis.  There are two basic types of car sharing services: business-to-consumer (B2C), which provides cars on an hourly or daily basis to business or individual consumers, and peer-to-peer, in which an individual rents out his or her personal vehicle, much like you might rent a vacation home from an individual owner.  There is one other important distinction to be made between types of car sharing services: whether the vehicle must be returned to the point of rental (round-trip) or may be dropped off at a designated parking location (point-to-point, free-floating, or one-way), whereupon it may be rented by another customer.  eThos is a B2C, round-trip operation.




The Denver market has three B2C round-trip car sharing services – eGo CarShare (a nonprofit), Zipcar, and Enterprise CarShare – and one point-to-point service, Car2Go.  Generally, their rates are similar, starting at about $5 per hour at the low end and increasing from there depending on the specific vehicle make and model.  Hourly rates typically include gas, maintenance, and insurance, though you can pay more for a waiver to cover the company’s deductible, just like with any car rental.  Depending on the rate plan the customer signs up for, there may also be mileage charges and a membership fee.  The three round-trip services have a number of access points throughout Denver and a variety of different vehicles.  Car2Go is unique in that its vehicles may be found parked at meters or other public parking spaces throughout Denver – wherever the prior renter leaves them.  Car2Go apparently is intended to provide transportation only within Denver and offers only Smart cars – those little two-seaters sold by Mercedes-Benz.  The location of its 372-vehicle fleet can be found by checking its website or smartphone app.

So, how is eThos different?  As noted, eThos requires that the vehicle be returned to its home base which, for now, is its sole location in Golden.  None of the other car sharing services come out this far from central Denver or Boulder.  But eThos’ main difference is that its fleet consists of only electric vehicles which, at the present time, includes 8 Codas (more on that in a minute) and one Tesla.  Pricing is competitive with the other services at $7 per hour for up to 250 rental hours (for a Coda) down to $5 per hour for over 500 rental hours.  The Tesla rents for three times the hourly rate of the Coda. Would I pay $21 an hour to drive a Tesla?  No. Let me know when you've got an i8 and we can talk about it.

OK, so what’s a Coda?  Coda Automotive was a California-based EV manufacturer that had a short, inauspicious life.  The company produced 5-passenger, 4-door EV sedans in 2012 and 2013 before succumbing to bankruptcy in May 2013.  Built on a frame imported from China, the Coda includes a 31 kWh battery pack and a drive train supplied by Colorado’s own UQM Technologies (which coincidentally also started out in Golden, as Unique Mobility, Inc., before moving to Longmont).  At the time of its bankruptcy, the company had reportedly delivered only 117 vehicles.  The remaining stock of 50 vehicles and 100 gliders (no powertrain) was purchased by a couple of remarketers and sold at deep discounts from the $38,000 MSRP (you can read more about them on Green Car Reports).  Coda’s restructuring plan calls for it to morph into a provider of grid storage solutions.



eThos apparently acquired a dozen Codas (8 available and 4 awaiting delivery) and the one Tesla, which comprise its current fleet.  At the Grand Opening, I went down and took a short test drive in a Coda (I’m not yet cleared to drive the Tesla).  It is a quiet, reasonable vehicle for getting around town, though with a range of 100 miles or less and a 6-hour recharging time (Level II), you’re not going too far in it.  So the market appears to be people who need a vehicle to tool around in for a half day or so, which is pretty much the market for any other car sharing service.  And, since my aging VehiCROSS seems to be giving me increasing trouble lately, I may need access to a car share, so I signed up for an account ($50 membership fee that was waived on opening day, plus a $25 DMV license check fee).

I had a chance to speak briefly with the firm’s two principals, founder Tim Prior and Assistant Manager Kathryn Saphire, and wish them the best of luck with their new business.  I think that Golden is going to be a challenging market for them, one that will be easier to access if they offer to pick up customers and bring them home after the rental (hint).  On the other hand, Golden is a pretty techy community, so hopefully it works as a launch point.  Alas, it isn’t clear how they’re going to expand or replenish their fleet… unless there are more discounted Codas sitting around out there to be had.  If so, they need to find a red one.

Friday, November 07, 2014

A Message to California About Collecting Solar System Data

California, you will soon be deciding an interesting debate about who, if anyone, should be collecting data about the state’s distributed generation installations.  Under the California Solar Initiative (CSI), information about all of the systems that took advantage of the CSI incentives has been collected and published on the California Solar Statistics website.  One would think that such transparency, which allows an agency and other interested parties to track the success of a program funded with public money, would be a no-brainer.  Apparently, it isn’t.

California Public Utilities Commission, you will soon be issuing a final ruling on minimum reporting requirements that, with the phase-out of the CSI, would now fall to the utilities.  This was reported recently in Solar Industry Magazine, available here.  According to the article, unsurprisingly, certain large developers and the utilities oppose these eminently reasonable reporting requirements, citing such specious arguments as the cost of providing this data ($7 to $22 on systems costing tens or hundreds of thousands of dollars).

For years, California and CSI, you got this right.  As noted in Solar Industry Magazine, “The information is supposed to let manufacturers, contractors and investors know which equipment is being installed and where, provide academics and journalists with industry information, and help utilities understand more about their distributed generation fleet.”  This debate brings to mind a similar one that took place in Colorado back in 2006 concerning the collection of information on that state’s solar incentive program.
 
At that time, a Colorado PUC staffer was assigned to assess a utility application to implement a forward-looking tariff that would fund the nascent solar incentive program under Colorado’s new Renewable Energy Standard (RES).  That new tariff came to be known as the Renewable Energy Standard Adjustment (RESA) and it now collects a bit over $50 million a year in customer money from that one utility and a lesser amount from a second, smaller utility (the fact that the utilities have been allowed to be the administrators of those accounts raises other concerns but that is a different discussion).  At the time, the requested tariff of 1% of customer billings was projected to raise approximately $20 million per year.  The PUC staffer was charged with recommending to the Commission whether the tariff should be allowed to go into effect unopposed or should be suspended and set for hearing.

Ever the dutiful public servant, and having not inconsiderable experience in major industrial project management in the private sector, this staff member requested from the utility a pro forma budget indicating how the funds were to be spent.  He was told there was none.  The rest of the conversation was brief:

PUC Staff: Then why are you asking for $20 million?

Utility: Because we can.

The tariff was suspended and set for hearing initiating Colorado PUC docket 06S-016E.

Through subsequent negotiations, the initial RESA tariff would be set at 0.6% of customer bills on the condition that the utility provide the PUC with a monthly report from its database that included much the same data that CSI has been collecting.  That too took a strange twist.  That conversation went something like this:

PUC Staff: We’re interested in tracking the growth of the program and how the incentives will contribute to the development of the solar industry in Colorado.  So, we would like a monthly report from the solar registration database.

Utility: What database?  We’re not going to have any database.

PUC Staff:  Well, how are customers and installers going to apply for the rebates?

Utility: They’ll submit an application with all of the interconnection data and rebate information on it.  But, there won’t be a database.

PUC Staff:  Really?

Utility: If you want that information, we’ll send you paper copies of all of the applications each month and you can create your own database.

PUC Staff:  Oh, well.  OK.

And so for the next 6 months the utility dutifully submitted seven copies (as required by PUC rules for all paper submittals) of all of the solar rebate applications, which an intern working for the staffer gleefully compiled into a database of system-level data using Microsoft Excel.  That is, until the utility representative’s successor came back to the PUC staffer and said:

Utility: Look, we’re really tired of copying all of these applications and hauling them over every month.  How about we just send you a copy of the database on CD each month?

PUC Staff:  You mean the database that you don’t have?

Utility: Yeah, that one.

And so, the PUC staffer ultimately used this data to report to the public on the success of the solar rebate program, where the incentive payments were going county by county, how system costs were falling, the level of economic activity generated by the program, and many other statistics that the public and policy makers would (or should) want to know about how hundreds of millions of dollars in support for the renewable program was being spent.

The first publicly issued report, covering four years’ worth of data on solar installations in the state, was welcomed by many and was especially of interest to the installer community.  Curiously, it was not welcomed by certain PUC bureaucrats and the utilities, who together sought to quash the report because they felt that it reflected negatively on their administration of the solar program.  It didn’t, but the report did contain some policy recommendations for managing the program more effectively in the public interest.  The agency even went so far as to deny an open records act request by the industry trade association seeking a copy of the report.

Ultimately, the report became the focus of a whistleblower complaint and was released by a state personnel department administrative law judge, who soundly rejected agency and utility arguments for a protective order that would bar disclosure of the information contained in the report.  Sadly, by the time all of this had occurred, much of the information in the report had become stale.  Today, the arguments in many states are no longer over solar subsidies per se, but over whether net metering and distributed generation itself provides a subsidy that disadvantages the utility and the general body of ratepayers.

The end of this story is that docket 06S-016E is still open, and utilities still submit monthly reports of their collections and expenditures of RESA funds.  However, neither the utilities nor the agency appear interested in a comprehensive analysis of the systems installed using those funds and the policy implications thereof.

The moral of this story for you, California, is that you’ve been doing the right thing all along in collecting and publishing this system-level data, and you should continue to do so to foster transparency in the administration of programs undertaken in the public interest.