Christa Heavey, MAP Sustainable Energy Fellow, San Francisco - TopicsExpress

California's method of evaluating energy efficiency programs needs fixing: it hinders efforts to help Californians save more energy and slows the state's progress on its climate goals. With some concerted effort, this broken system can be repaired.

When utilities and their partners offer programs that help customers optimize energy use in their homes and businesses, those programs are evaluated to determine their impact. Unfortunately, the current evaluation process fails to provide planners and regulators with useful and timely information.

Why does this matter? Future efficiency program investments and designs are based on the evaluation results of previous programs. When those evaluations are delayed, inaccurate, or unreliable, future programs' benefits may be underestimated, and information that could help improve programs arrives too late. This threatens confidence in using efficiency to avoid building power plants, puts program funding at risk, and jeopardizes the continuation of good programs.

Illustrating the problem: the Whole House Retrofit evaluation

Proof of the flawed evaluation process came recently in the California Public Utilities Commission's final evaluation report on the Whole House Retrofit program's performance between 2010 and 2012. Also known as Home Upgrade under Energy Upgrade California, the program aims to provide a wide range of energy-saving improvements to single-family houses. Nearly 5,000 California households participated in the Advanced Home Upgrade to cut electricity and natural gas waste between 2010 and 2012. While the program has its challenges, the upgrades help homeowners reduce energy use by about 30 percent.

Assessment of who would have "done it anyway"

One focus of the evaluation was to determine what is called "free-ridership": who would have made the efficiency investments and improvements without the help of the program.
Understanding which customers would have made efficiency upgrades on their own is important: it informs design approaches that most effectively reach homeowners who would not otherwise make improvements. However, estimating free-ridership is challenging. The evaluation surveyed participants to determine whether they would have implemented the same efficiency measures in their homes even without the program's incentives. The survey results determined the degree of free-ridership, which was used to estimate the net savings credited to the program.

Survey shortcomings

Unfortunately, the methodology used to evaluate the program's net savings was flawed for a number of reasons:

* The survey was conducted in January 2014, up to three years after some of the upgrades. Realistically, a homeowner would not be able to recall with any accuracy whether they would have implemented an efficiency measure in the absence of the program, or on what time frame, as the evaluation survey asks. Their answers might also be influenced by a number of factors, including the improvements' performance, a desire to appear environmentally conscious, or a subconscious desire to justify past actions.

* The survey gets even more complicated when asking about measures' specifications and timing. For example, if a participant added insulation to their walls, the survey asked whether the customer, on his or her own, would have installed "more, less, or the same" amount of insulation, whether it would have been "more, less, or the same" efficiency level, and whether it would have been done "sooner, later, or at the same time." It is difficult to imagine a customer having the specific data on costs, product quality, or timing required to answer these questions about something that happened years prior.

* The survey's response rate was below 15 percent, making it difficult to know whether the respondents are representative of the entire population of participants.
* The survey results had a huge range of uncertainty. The sensitivity analysis of a single key parameter, the net-to-gross ratio, showed that the estimate could range from less than 40 percent to over 70 percent, making it difficult to draw any clear conclusions.

The main takeaway: the estimates of the program's net savings, which directly impact its fate, are based on survey responses in which customers guess, years later, what they may or may not have done in some parallel universe without the upgrade program.

Even if the results were credible, evaluations should be conducted throughout program implementation to help correct problems as they arise, and should be completed as soon after the program ends as possible so that lessons learned are integrated into the next iteration. In this case, however, by the time the evaluation was released last month, the program had already been significantly revised. The two-year delay means the lessons learned from the 2010-2012 program are likely no longer relevant, and the nearly $650,000 cost would have been better spent elsewhere.

What now?

NRDC has a couple of recommendations going forward. First, the evaluation's estimates of net savings from the Whole House Retrofit program should not be used for future program planning. Inaccurate savings estimates cannot effectively inform the program's future or predict its savings. Second, future evaluations should use an alternative approach to calculating net savings. The retrospective approach used in the Whole House Retrofit evaluation (and others) is neither helpful nor accurate. Instead, California should move toward evaluating efficiency programs against a "dynamic baseline," as is done in the Pacific Northwest: programs are judged relative to the conditions in place when the program is implemented, rather than against a retrospective estimate, developed years after the program ends, of what would have happened.
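To make the stakes of that net-to-gross (NTG) uncertainty concrete, here is a minimal sketch of the standard arithmetic: net savings credited to a program are simply gross savings scaled by the NTG ratio, which discounts free-ridership. The gross savings figure below is hypothetical, chosen only for illustration; only the 40-70 percent NTG range comes from the evaluation's sensitivity analysis.

```python
def net_savings(gross_kwh: float, ntg_ratio: float) -> float:
    """Net savings credited to a program: gross savings scaled by the
    net-to-gross (NTG) ratio, which discounts savings from free-riders."""
    if not 0.0 <= ntg_ratio <= 1.0:
        raise ValueError("NTG ratio must be between 0 and 1")
    return gross_kwh * ntg_ratio

# Hypothetical gross savings figure, for illustration only.
gross = 1_000_000  # kWh

# The sensitivity analysis put the NTG ratio anywhere from below 40%
# to above 70% -- nearly a 2x swing in the savings credited.
low = net_savings(gross, 0.40)
high = net_savings(gross, 0.70)
print(f"Credited savings range: {low:,.0f} to {high:,.0f} kWh "
      f"({high / low:.2f}x spread)")
```

With such a wide spread in a single multiplier, any downstream decision that hinges on the net savings figure (program funding, procurement planning) inherits the same uncertainty.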
A better way

The Whole House Retrofit evaluation report is just one example of the problems with the current evaluation system, which relies too heavily on subpar methodologies, is continually delayed, and does not produce information reliable enough for procurement or program planning. By looking to other regions, California can streamline the process, ensure the most important studies are done in a robust and timely manner, and continually improve its programs.

Evaluation approaches elsewhere offer better models. In the Pacific Northwest, the Regional Technical Forum (RTF), a panel of independent experts, develops savings estimates for different efficiency measures along with recommendations for evaluation needs and processes. In the Northeast, the Regional Evaluation, Measurement, and Verification Forum works to develop consistent energy efficiency evaluations and data by bringing together public utility commissioners and representatives from state energy and air agencies. These forums play an important role in making evaluations more consistent and transparent, with more reliable results.

California has an opportunity to adopt these methods through the new California Technical Forum (CalTF), which is composed of independent technical experts broadly representative of California's energy efficiency community. The panel develops accurate and reliable savings estimates for California's efficiency programs through peer-reviewed evaluations. NRDC recommends that the CalTF review the Whole House Retrofit evaluation to shed light on its shortcomings and produce more accurate estimates of net savings. For future evaluations, the CalTF could be leveraged to assess the validity of the methodology before the savings estimates are integrated into program planning. Through CalTF's work, California could achieve a more accurate understanding of the savings and rely more confidently on efficiency to replace conventional power.
Photo Credit: California Evaluation System/shutterstock
Posted on: Sun, 28 Dec 2014 14:01:13 +0000
