Watch Out Utilities: Here Come The Technology Giants


One of the more outspoken and forward-thinking executives in today’s energy industry is NRG Energy’s CEO David Crane.  When recently asked who NRG’s competitors will be in the future, he responded:

“If you ask me who I worry about beating us, I give very little thought to the traditional power companies, the utilities.”[1]

He then went on to discuss technology companies such as Facebook, Google, Apple, and Amazon as well as cable companies such as Comcast. This is because Crane believes the future will belong to distributed generation including rooftop solar photovoltaic and small-scale natural gas generation such as fuel cells, Stirling engines, and cogeneration units.[2]

Crane thinks the technology companies hold the key because the distributed resource future comes with an operational problem. Right now, loads and supply are balanced at the bulk grid level by large utility or merchant-owned power plants, some of which are continuously ramped up and down by the system operator’s Energy Management System (EMS) in response to fluctuations in loads. If we move to a distributed model, someone or something will have to provide a similar function at the local level. As Crane says:

“One of the changes we’re talking about when you’re talking about balancing everyone’s power systems in their house is that it basically becomes an information technology-based industry.”[3]
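The balancing software Crane envisions can be pictured as a simple dispatch loop: measure the local net load (demand minus rooftop solar output), then run flexible resources to cover it. A toy sketch follows; all resource names, capacities, and numbers are hypothetical, purely for illustration:

```python
# Toy dispatch loop for balancing a local network. All resource
# names and capacities are hypothetical.

def dispatch(net_load_kw, resources):
    """Cover net load with flexible resources in priority order,
    each limited to its capacity. Returns (schedule, shortfall)."""
    schedule = {}
    remaining = net_load_kw
    for name, capacity_kw in resources:
        output = min(max(remaining, 0.0), capacity_kw)
        schedule[name] = output
        remaining -= output
    return schedule, max(remaining, 0.0)

# Neighborhood load at 6 p.m., net of rooftop solar output:
net_load = 180.0  # kW
flexible = [("fuel_cell", 100.0), ("battery", 50.0), ("cogen_unit", 60.0)]

schedule, shortfall = dispatch(net_load, flexible)
# fuel_cell runs at 100 kW, battery at 50 kW, cogen_unit at 30 kW
```

A real system would also handle two-way power flows, forecast errors, and sub-second frequency regulation, which is precisely why this looks more like an information technology business than a traditional utility one.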

Japan is one country to look at when pondering the possibilities. Since the Fukushima Daiichi nuclear disaster, Japan has had to figure out how to run a modern economy with a sudden loss of 30% of the country’s generation capacity while also creating a new grid that is more resilient to possible future shocks. One answer is to move to a model of local microgrids that can operate interconnected with the larger bulk grid or can be isolated and continue operating on their own using distributed generation.

One small area of the devastated port city of Sendai maintained power after the tsunami – the microgrid at Tohoku Fukushi University.[4] Since then Japan has become deeply interested in developing microgrids and is working worldwide to create the right mix of energy assets and information technology.[5]

As I write this, Enerdynamics’ instructor Dan Bihn is flying to Jamaica to join the Japanese government in discussing microgrid applications with the Jamaica Public Service Company. Contributing to Japan’s push in microgrids are various Japanese technology companies such as Fuji Electric, Hitachi, Panasonic, and Toshiba.

Which leads us back to the question: If a distributed energy future is coming, who will own and operate the microgrids and in-home networks that will enable this future? Utilities will struggle with significant regulatory and institutional barriers. Technology companies will struggle with figuring out how to turn cool technologies into services that work for consumers while meshing with the complex energy marketplace.

Perhaps it is the energy services companies such as NRG, Direct Energy, and Constellation that the utilities should fear most. Or maybe it is natural gas companies copying the activities of Tokyo Gas’ Roppongi Energy Services Company, which now provides electricity and heat services to multiple customers connected to a microgrid in a redeveloped neighborhood in central Tokyo.[6] Only time will tell.

Original Article on Enerdynamics


Review: The 2013 Energy Industry


Happy New Year!

In last week’s Part I of this post, we looked at a few energy industry trends that emerged in 2013: low electric load growth, the closing of nuclear plants for economic reasons, and debates on retail choice. But there’s still much to discuss as we look back on 2013 and ahead to 2014…

Will solar power reduce the utility to just a distribution system operator?
A new trend in 2013 was concern over the impacts of distributed generation (DG), net metering, and solar. Several states decided that utilities must pay the full retail rate when buying back power from distributed generation — far higher than its wholesale value.

With solar, some charged that it was being subsidized by non-participants and receiving grid services at no charge. An EEI Foundation Study, “Value of the Grid to Distributed Generation Customers,” found that solar was not paying its fair share of costs given it uses the grid 24 hours a day[1].

Another concern is the cost shifting of solar subsidies and solar back-up/grid costs to non-participating customers. In November, the Arizona Corporation Commission allowed Arizona Public Service (APS) to impose a charge of about $5 per month on new rooftop solar customers.

Solar advocates claim that the underlying rates are the real issue (i.e. the PG&E and SCE residential rates are steeply inverted thereby inflating the solar savings). Advocates also point to DG and a “dynamic microgrid” as the utility business model of the future, which can help manage extreme weather events like Superstorm Sandy (see November Public Utilities Fortnightly[2]).

RTOs continue to expand
There’s never a dull moment in the Regional Transmission Organization (RTO) footprint world. Always remember that RTO membership is voluntary.

On Dec. 19, Entergy and several neighboring utilities such as Cleco joined Midcontinent Independent System Operator (MISO). This adds 25,000 MW or so of generation to the grid and a distinctly Southern flavor to the grid operator once known as the Midwest ISO.

Meanwhile, the Western Area Power Administration, or WAPA (Great Plains Unit/Basin/Heartland), has announced it may join the Southwest Power Pool (SPP). This could also force Montana-Dakota Utilities (MDU) to switch to SPP from MISO, given much of MDU’s service territory is surrounded by the WAPA control area.

WAPA is a federal power marketing agency, and its decision to join an RTO is quite a catch for SPP. Power Marketing Administrations (PMAs) are not subject to FERC jurisdiction and historically have been skeptical of joining RTOs.

In any event, we now will have two mega north-south RTOs (MISO and SPP) as well as the granddaddy of big RTOs, PJM. Only the Southeast, the desert Southwest, and the Northwest remain as RTO holdouts.

Challenges of wind integration increase with growth in wind, other renewables
MISO and ERCOT both now have more than 10,000 MW of wind, and the amount of wind coming on line is growing. ERCOT reports the Competitive Renewable Energy Zone (CREZ) transmission lines are about to come on line at a cost of $7 billion to transport Panhandle wind to the load centers. MISO’s multi-value projects (around $6 billion) also continue to be planned or built to, among other things, allow more wind to get to load centers.

In 2013, legislative attempts to repeal state wind mandates went nowhere. So look for more wind to come on line, which means integrating this variable resource will be more important than ever. States like Illinois and Minnesota have 25% mandates.

Cost recovery trackers and regulatory streamlining
Why wait for the PSC to process your case in a multi-year proceeding? Several legislatures have enacted measures to shorten or streamline the regulatory process. These include bills to fast track rate increase requests for mandated programs, transmission infrastructure, and the like.

Other states are limiting the time PSCs have to process cases:

  • In Illinois, a formula rates bill was passed that fast tracks smart meter program roll out.
  • Indiana’s SB 560 gives the Commission just a year to process cases and creates a new transmission, distribution, and storage system improvement charge.
  • Michigan allows utilities to put filed rates in effect subject to refunds.

Look for similar bills to pop up in other states in 2014.

Gas, gas, and more gas
Low-cost natural gas — buoyed by the dramatic increase in supply thanks to fracking — continues to be the fuel of choice for nearly all new power plants. Natural gas for vehicles and fleets also is on the rise. My favorite is the new Laclede-Siemens partnership to promote NGV fleets and fueling stations.

That’s all we have space for in this recap, but when it comes to the energy industry, there’s always more to discuss. I will save Order 1000, mergers, demand response, and the benefits of RTO capacity markets (or not), for a future article.

In short, 2013 has been an interesting year with many of the trends likely to continue into the New Year.

About the author: Bill Malcolm is an energy economist based in Indianapolis. He has worked for PG&E, MISO, and ANR Pipeline. He can be reached at

[1] Value of the Grid to Distributed Generation Customers, Edison Electric Foundation Study, updated October 13 at

[2] November 2013 issue of Public Utilities Fortnightly (articles on microgrid, game changers, and more).

Original Article on Enerdynamics

Incentive Regulation: Can It Save Utilities?


Utilities in the U.S. are caught in a quandary.  The traditional model for creating earnings growth for investors is built around increasing sales and growing capital investment.  Yet as we move forward into the 21st century, market forces and government/regulatory policies are constraining demand and encouraging alternatives to utility generation. Utilities fear a death spiral, where fewer and fewer customers are asked to pay more and more to support utility fixed costs.

Meanwhile, many in the industry believe that consumers are soon to awake from their century-long acceptance of monopoly utility services to demand options not dissimilar to phone and internet services[1].  Utilities in the United Kingdom (U.K.) are facing similar concerns, and over the last two years the U.K. regulator Ofgem (Office of Gas and Electricity Markets) has begun implementing a new incentive-based monopoly pricing mechanism designed to move their utilities into the new world[2].  Ofgem’s mechanism is worth studying as a possible future for U.S. utilities.

When the U.K. electric and gas utilities were restructured in the 1990s, the business was separated into four sectors – production (generation in the case of the electric business), transmission, distribution, and retail supply.  Transmission and distribution were defined as monopoly network companies subject to regulation.

But since the network companies no longer owned generation resources or earned a margin selling supply to customers, the U.K. had to develop new methods of creating a profitable but fair revenue stream. The U.K. decided on performance-based regulation (PBR) and has been implementing and refining this model ever since.

The latest methodology is called RIIO. The name is taken from the equation Revenue = Incentives + Innovation + Outputs. RIIO will set allowed prices for network companies for an 8-year period. According to Ofgem, the regulation will encourage network companies to:

  • put stakeholders at the heart of decision making
  • invest efficiently to ensure continued safe and reliable service
  • innovate to reduce network costs for customers
  • play a full role in delivering a low-carbon economy and wider environmental objectives

The mechanism will allow the network companies to attain a specified rate of return based on performance as measured against criteria such as customer satisfaction, reliability and availability, safe network services, connection terms for new customers, environmental impact, and social obligations.  Note that return is not based on the amount of capital invested to achieve these criteria; that decision is left up to utility management.
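As a rough illustration of how output-based returns differ from cost-of-service returns, consider a stylized incentive formula. The criteria, weights, and scores below are invented for illustration and are not Ofgem’s actual methodology:

```python
# Stylized performance-based revenue adjustment, loosely in the
# spirit of RIIO. All criteria, weights, and scores are invented.

def allowed_revenue(base_revenue, performance, weights):
    """Scale base revenue by a weighted incentive score.
    performance: criterion -> score in [0, 1], where 0.5 is the target.
    weights: criterion -> maximum revenue adjustment (as a fraction)."""
    adjustment = sum(weights[c] * (performance[c] - 0.5) * 2.0
                     for c in weights)
    return base_revenue * (1.0 + adjustment)

performance = {"customer_satisfaction": 0.7,   # above target: bonus
               "reliability": 0.5,             # on target: neutral
               "environment": 0.4}             # below target: penalty
weights = {"customer_satisfaction": 0.02,
           "reliability": 0.03,
           "environment": 0.01}

revenue = allowed_revenue(100.0, performance, weights)  # about 100.6
```

The key point the sketch captures: revenue moves with measured outputs, not with the amount of capital spent to achieve them.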

So will we soon move to radically different regulatory models in the U.S. similar to those in the U.K.?  As most readers know, nothing moves fast in the utility industry. And the U.K. has been refining its model for over 20 years. But the U.K. experience provides some interesting ideas for states to consider as regulators and utility management try to figure out how to keep utilities viable as we move into a new energy world.


[1] See for instance the Utility Customer of 2020 from PwC:

[2] See Ofgem’s site describing the regulatory mechanism here:

Original Article on Enerdynamics

Net Metering: Benefit or Cost?


The historical utility business model is for utilities to build large power plants at remote locations, build transmission lines to transport the power to load centers, and build distribution lines to distribute it to end-use customers. Under this model, utilities create profits by earning a “fair” rate of return on capital investments and often can boost their earnings a bit by selling more power than forecast between rate cases.

This model worked pretty well for most of the 20th century and didn’t get much attention until the rate increases associated with expensive new power plants and fuel cost increases in the 1970s. At that point, interest in exploring alternatives led Congress in 1978 to pass the Public Utility Regulatory Policies Act (most commonly called PURPA).

PURPA required utilities to buy power from certain non-utility generators at the utility’s avoided cost of power. Over time the law led to the emergence of customer-owned renewables, cogeneration, and independent power producers.  Now the rapidly dropping cost of customer-installed photovoltaic solar (PV) systems is raising the possibility that some utilities may see their traditional business model fall apart as they grapple with issues such as net metering.

PURPA and Net Metering
While PURPA was a federal law, it left implementation up to the state legislatures and regulatory commissions. So each state defined avoided costs and other details differently.  One key issue is how to pay customers who install their own generation such as photovoltaic solar (PV) and then wish to sell excess power (meaning power not consumed internally in the facility) back to the utility.

For small renewable installations, many states implemented net metering.  Net metering is an electric billing methodology that allows eligible customers with distributed generation to sell power back to the grid when generation exceeds internal usage. These customers receive credit for that power to offset purchases of utility power used during times when the customer’s generation is insufficient to cover all loads in the facility. In essence, customers are paid the retail rate to sell power to the grid.
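A simple monthly bill calculation shows why net metering amounts to paying the retail rate for exported power. The rates and usage figures below are hypothetical:

```python
# Simple net-metering bill: the customer is billed on net usage,
# so exported power is implicitly credited at the full retail rate.
# All rates and usage figures are hypothetical.

RETAIL_RATE = 0.15      # $/kWh
WHOLESALE_RATE = 0.05   # $/kWh

def net_metered_bill(consumed_kwh, generated_kwh):
    """Bill on net consumption at the retail rate."""
    return (consumed_kwh - generated_kwh) * RETAIL_RATE

# A home uses 900 kWh in a month while its PV system produces 600 kWh,
# of which 250 kWh flows back to the grid (the rest is used on site).
bill = net_metered_bill(900, 600)   # 300 kWh net * $0.15 = $45.00

# If exports were instead credited at the wholesale rate:
exported = 250
used_on_site = 600 - exported
alt_bill = (900 - used_on_site) * RETAIL_RATE - exported * WHOLESALE_RATE
# (900 - 350) * 0.15 - 250 * 0.05 = $82.50 - $12.50 = $70.00
```

The $25 difference between the two bills is the value at stake in the net metering debate: advocates argue it reflects real benefits solar provides to the grid, while utilities argue it shifts costs onto non-solar customers.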


Is Net Metering Fair?
In essence, net metering sets the buyback rate at the customer’s retail rate.  In the early years of solar PV, state regulators often justified this by saying that solar was new and needed help getting started, that there were benefits to solar but it was too soon to be able to quantify them, and that the amount of solar is so small it doesn’t matter much.  But now that solar PV is growing and has the potential to grow significantly more, utilities and regulators are struggling to determine what is or is not a fair way to handle net metering.

(Chart source: EIA Today in Energy, May 15, 2012)

Advocates for solar power argue that changing the rules will deter further growth and accuse utilities of wanting to discourage solar power to protect utility profits. Utilities and some customer and traditional generation groups counter that net metering customers are using the grid without paying their fair cost. It will be up to the regulatory commissions around the country to quantify benefits and costs of customers with PV systems connecting to the grid and, based on this, determine what is fair.

Benefits to the grid may include:

  • avoided costs of generation that would be needed in the absence of solar power
  • avoided costs for generation capacity (to the extent that solar power offsets system peak demand)
  • reduced costs for ancillary services
  • lower line losses on the transmission and distribution (T&D) system
  • reduced investments in T&D facilities (to the extent that solar power reduces the need to invest in facilities)

Costs to the grid may include:

  • integration costs to provide reserve power when it is cloudy
  • equipment upgrades on the distribution system
  • higher costs of purchased power, since the utility pays the retail rate for net-metered power instead of the wholesale rate it otherwise would have paid

Debates over attempts to quantify benefits and costs are currently ongoing in numerous states including Arizona, California, Colorado, and Ohio[1] while regulatory commissions in Idaho[2] and Louisiana[3] have recently ruled against utilities in attempts to change net metering. It will be worth watching how remaining proceedings around the country play out as it will directly affect the business future of both utilities and PV solar companies as well as future opportunities for consumers to economically generate their own power.


Original Article on Enerdynamics

Natural Gas Fracking: Where Are We?


In August 2011, Enerdynamics wrote an Energy Insider article on the issue of natural gas fracking[1].  There we described what fracking is, what it had done for gas reserves as of 2011, and some of the concerns relative to fracking.

Tower for drilling horizontally into the Marcellus Shale Formation for natural gas, from Pennsylvania Route 118 in eastern Moreland Township, Lycoming County, Pennsylvania, USA (Photo credit: Wikipedia)

Since then, the rhetoric over fracking has only gotten stronger. The famous Gasland movie is becoming a franchise with Gasland 2 now available[2]. According to some observers, fracking is unsafe and cannot be made safe, and any information to the contrary is a conspiracy led by the big oil companies. Many citizens in our communities hear this and believe it.

Meanwhile, supporters of the oil and gas industry argue that fracking is safe when done properly and has been going on for decades. While a detailed analysis of the data concerning fracking is too lengthy for a blog post, following are highlights of the issues surrounding fracking and some links where readers can get more information.

What is new since we wrote our last Insider article?

  • U.S. gas supply projections continue to grow. Based on production increases from shale gas and other unconventional sources, much of which will be produced through use of fracking, U.S. gas supply is projected by the U.S. Energy Information Administration (EIA) to increase by 44% from 2011 to 2040.

Source: EIA Annual Energy Outlook 2013

  • The U.S. is projected to shift from being a net importer of natural gas to a net exporter. For many decades the U.S. has been a net importer of natural gas. The EIA now projects that by 2020 the U.S. will become a net exporter. This represents a fundamental shift in U.S. gas supply and will have significant benefits from an economic and environmental standpoint.


  • Air quality around drilling and gas processing needs to be studied (but issues may or may not be related to fracking). Localized air quality issues associated with drilling activity have become apparent and need to be addressed. Studies have begun finding chemicals in the air, some used for fracking and some not[3].
  • Earthquakes should be added to the list of potential issues (but again, this may be related to gas production but not to fracking). A soon-to-be-released study suggests earthquakes in South Texas are due to removal of fossil fuels and water from underground, not to fracking[4]. Others have contended earthquakes are due to disposal of fracking fluids underground near faults[5].
  • Drilling (whether fracking is used or not) is a significant land-use issue. Here in my home state of Colorado, more and more drilling rigs are moving into suburban neighborhoods that didn’t exist 40 years ago during the last boom. Residents in these neighborhoods are understandably concerned about rigs showing up within 500 feet of their homes (the current allowed setback under Colorado regulations).
  • Fracking’s impact on water supplies remains under dispute. In some limited cases, data may suggest that fracking harmed water supplies.  But it is hard to know since data doesn’t exist on the quality of the water prior to fracking[6].  If you read the fine print on these studies, most suggest the water contamination occurred due to faulty drilling practices or spills. Other studies appear not to show any contamination[7].
  • Regulation continues to evolve. Rules at the local, state and federal level continue to be discussed and, in some cases, implemented.  Many states are looking at regulations developed by Colorado as a starting point[8].  But even so, fracking is very contentious here.

So what can we conclude? Only that there are varying viewpoints. Some will read this blog and contend that we are apologists for the gas and oil industry. Others will read it and think we are favoring anti-fracking activists. We are neither. Our company is in the business of teaching others about the energy industry, and fracking is simply one of the most contentious topics in the industry today. Thus, it deserves a close watch and thorough discussion.

The battle will go on, but in the meantime, more and more natural gas will be developed. This will result in less and less coal and nuclear generation. Given the benefits of plentiful low-cost natural gas supply, fracking isn’t going to stop.  But what is highly likely is that at both the state and federal level we will see a number of new regulations attempting to address the potential negative effects.


[1] See “Fracking: What Is It and Why Does It Matter?” available at

Original Article on Enerdynamics

Fracking and Solar: It’s Complicated


Hydraulic fracturing or “fracking” has been widely criticized by environmentalists who cite concerns with water pollution and methane leakage from this high-volume method of extracting natural gas. However, the burning of natural gas is undoubtedly better for the planet than the burning of coal. Some experts are now saying that—toxic chemicals and contaminated water aside—fracking will facilitate our transition to a clean energy future.

It’s an interesting debate that has been gaining momentum in recent months.

A report by Citigroup fueled the fire when analysts indicated that shale gas (natural gas trapped in shale formations) is not only “complementary” to renewable energy, but is “in many ways essential” to the widespread adoption of wind and solar power.

The logic behind this theory, the report explains, is that wind and solar are still intermittent energy sources and require a secondary power source that is able to go online and offline quickly (providing “peaking power”). Natural gas fills this role well. As renewables become more prevalent, they steal demand from the baseload power that provides continuous energy — a role traditionally filled by coal-fired power plants. But baseload plants don’t have the ability to go on and offline quickly, so as demand decreases more coal plants will be shut down. Therefore, natural gas will be called on to provide both baseload and peaking power as renewables gain momentum toward mainstream energy.

Some experts believe that in the long run, natural gas will become more expensive and unable to compete with solar energy, which is approaching cost parity with other energy sources. As the grid becomes more robust and energy storage increases, natural gas will be phased out—by the year 2050, predicts Ernest Moniz, President Obama’s choice to lead the Energy Department. Until then, Moniz says natural gas will act as a ‘bridge’ from fossil fuels to clean energy. “In broad terms we find that, given the large amounts of natural gas available in the U.S. at moderate cost … natural gas can indeed play an important role over the next couple of decades in economically advancing a clean energy system,” Moniz said in testimony on the Future of Natural Gas.

Citi analysts report that it will take a significant amount of natural gas to make wind and solar power the primary energy sources. This high-volume extraction depends on fracking, which fractures rock layers deep underground to release natural gas from shale deposits that were inaccessible by conventional drilling.

Fracking supporters praise its economic benefits and its aid in moving the U.S. toward energy independence. Critics highlight the drinking water contamination, air pollution and inherent danger—a house exploded in Ohio after methane gas seeped into its water well.

Aside from being highly flammable, methane gas is the second most prevalent greenhouse gas and 21 times more powerful at warming the atmosphere than carbon dioxide. However, the Environmental Protection Agency (EPA) estimates that the methane levels released by fracking are lower than originally believed due to stringent pollution controls.

After studying the impact of fracking on solar, researchers at the Massachusetts Institute of Technology (MIT) come down on the side of foe. They predict that the natural gas boom will limit the expansion of renewable energy and that gas will comprise a third of the electricity market by 2050. That’s a far cry from being phased out completely. Because fracking has led to an abundance of shale gas (which comprises a quarter of all natural gas production in the U.S.), the price of natural gas will remain low for decades, making these plants more attractive than wind and solar, according to the report.

“While treating gas as a ‘bridge’ to a low-carbon future, it is crucial not to allow the greater ease of the near-term task erode efforts to prepare a landing at the other end of the bridge,” states the MIT study.

by Emily Hois,

Original Article on Enerdynamics

The Distributed Generation Revolution: How Much Longer?


My first job in the utility industry was as a cogeneration engineer with Pacific Gas and Electric Company (PG&E). PG&E was years behind schedule in getting its nuclear power plant Diablo Canyon online and needed new generation capacity quickly. My team’s job was to find good candidates for installing cogeneration[1] and demonstrate to the customer the economics associated with making the investment.

A number of larger customers installed cogeneration units, and, by the late 1980s, PG&E was buying power from more than 2,500 MW of cogenerators — more capacity than Diablo Canyon! Many old timers were cynical, but the younger members on our team were optimistic that a distributed generation (DG) revolution was underway and would rapidly change the industry paradigm of centralized generation.

It didn’t happen. Low variable-cost nuclear power soon became available as Diablo Canyon came online, and PG&E’s interest in encouraging cogeneration waned. I moved on to a number of other jobs, and when I later heard a utility executive say that “distributed generation is always the next big thing, decade after decade” I chuckled and thought maybe he’s right.

The problem was that utilities built their systems with centralized generation for a reason – the principle of economies of scale. With the technologies available, it didn’t make sense for every customer to have its own power plant except when the customer’s heat loads were very closely matched to generation needs. Stand-alone DG without cogeneration typically had heat rates of 12,000 Btu/kWh or higher while centralized gas combined-cycle generators could get heat rates down to 8,000 Btu/kWh or lower.

A higher heat rate means more fuel must be burned to produce the same amount of power, which placed DG at a big disadvantage to the larger utility power plants. Fuel costs for DG were higher too: DG owners must pay gas distribution charges, while utility power plants typically take gas off a high-pressure transmission line or, even worse for DG, burn coal at an even lower fuel cost.
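The arithmetic behind this disadvantage is straightforward. Using the heat rates above, and an assumed, purely illustrative set of gas prices:

```python
# Fuel burned (in MMBtu) to produce a given output at a given heat rate.
def fuel_mmbtu(kwh, heat_rate_btu_per_kwh):
    return kwh * heat_rate_btu_per_kwh / 1_000_000

kwh = 1000  # one MWh of output
dg_fuel = fuel_mmbtu(kwh, 12_000)    # stand-alone DG: 12.0 MMBtu
ccgt_fuel = fuel_mmbtu(kwh, 8_000)   # combined cycle: 8.0 MMBtu

extra_burn = dg_fuel / ccgt_fuel - 1  # DG burns 50% more gas

# Assume an illustrative $4/MMBtu commodity price, with DG also paying
# a hypothetical $1/MMBtu distribution charge:
dg_fuel_cost = dg_fuel * (4.0 + 1.0)   # $60 per MWh of output
ccgt_fuel_cost = ccgt_fuel * 4.0       # $32 per MWh of output
```

Under these assumptions the DG unit pays nearly twice as much for fuel per MWh, which is why DG historically made sense only where cogeneration could put the waste heat to work.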

Only recently has the dynamic potentially changed thanks to the significant reduction in costs for distributed solar photovoltaic (PV) generation[2]. Now in regions with good sunlight, high utility rates, and government incentives, PV installations can be funded through customer savings relative to the cost of utility power. And in some regions PV manufacturers are promising to be economic even without incentives[3]. In California, overall distributed PV generation is about 2,000 MW[4] or the equivalent of another Diablo Canyon.

Current distributed generation in the U.S. is approximately 5% of total generation. In a recent survey by Black and Veatch[5], utilities were asked the amount of distributed generation they expected to see by year 2020. About one-third of respondents expected 5% or less; over 40% expected 5 to 10%; just over 15% expected more than 10%; and 8% responded they didn’t know. So clearly utilities aren’t sold on the idea that change is coming quickly. But, had utilities been asked 10 years ago whether gas generation would surpass coal generation, most would have said no. And, as we’ve discussed elsewhere, that transformation is occurring.

So, besides the price tag, what are the barriers to more DG? The largest barriers involve regulatory and technical issues. The current electric distribution system is not designed for much DG, so grids will need to be redesigned and upgraded if DG is going to grow. And until that occurs, utilities may restrict the amount of DG connected to their systems[6].  Other key issues include the price that customers are paid for power that flows back onto the grid and the price for stand-by service that customers pay so that the utility is able to serve the customer’s load when the DG system is down.

So whether we will see significant growth in DG in the near future is debatable. In addition to the PV “wildcard”, numerous entrepreneurs are trying to create cost-effective DG using other technologies. Two interesting examples include the fuel-cell-based Bloom Box[7] and design guru Dean Kamen’s Stirling engine generator[8].

One thing we can say is that those closest to the industry often fail to see big changes until they are obvious, so while there is a lot of installed infrastructure that suggests our electric grid will stay mostly centralized, it is not implausible that a rapid DG revolution could occur.


1. Cogeneration is the production of electricity using steam created from waste heat from an industrial process, or the use of steam from electric power generation as a source of heat. It is also often called combined heat and power (CHP).

2. For a discussion of PV cost reduction, see Enerdynamics’ Insider “Is the Photovoltaic Industry Living Up to Its Hype?”  available at

3. See for instance:

4. See Black and Veatch 2013 Strategic Directions in the U.S. Electric Utility Industry, p. 44 available at:

5. See ibid, p. 44

6. See:

7. See:

8. See:

by Bob Shively, Enerdynamics President and Lead Instructor

Original Article on Enerdynamics

How China Will Transform The Energy Industry


When I started in the energy business back in the early 1980s, the utility paradigm was to continually build more power plants to serve growing customer loads. This led to environmental and economic difficulties, especially as the cost of completing planned nuclear power plants skyrocketed.

At the utility where I worked, Pacific Gas and Electric, an outside scientist named Amory Lovins gained prominence by forcefully suggesting an alternative path that included energy efficiency and renewable power. As the story went around the company (and I don’t know if it is actually true), PG&E executives would hide when Lovins came in the building so they wouldn’t have to listen to him telling them over and over that they needed to change their business model. Ultimately the California regulators did listen, and both California and PG&E have transformed their energy systems into one of the more efficient and renewables-based markets in the world. And in most U.S. states, at least energy efficiency has become a key focus of electric utilities[1].

But in the U.S. the debate still rages on concerning the right energy future.  Should other states follow California’s lead with use of renewables, or is it better to focus on more traditional energy sources?  Is climate change real, and should we change our energy mix to reduce carbon emissions?  It seems that in Washington, we just aren’t capable of consensus for action at this point.  Instead it’s possible the impetus for change will come from outside.

As we suggested in a recent blog post, behind-the-scenes efforts to work with China may result in that country leading significant movement toward a cleaner energy mix. [2]  One such effort is being led by Amory Lovins' Rocky Mountain Institute (RMI).  Lovins' recent book, Reinventing Fire, laid out RMI's views on how the U.S. could transform its energy mix.  RMI is now working with high-level officials and organizations in China to develop a blueprint for how China might lead the clean energy future.

According to Lovins, it could result in change that is “one of the most transformative that’s ever happened in global energy.” [3] When a lot of folks talk that way, it just sounds like hype, especially given that China is the world’s largest coal generator and electric output in China is expected to more than double by 2035.  But then again, 30 years ago who ever thought that electric utilities throughout the U.S. would be trying to convince consumers to buy less of their product?

Figure: Forecasted Electricity Generation. Source: U.S. Energy Information Administration International Energy Outlook 2011

[1] See for instance, the U.S. Energy Information Administration's Today in Energy report U.S. Energy Intensity Expected to Continue Its Steady Decline Through 2040

[3] From the Google Hangout talk Reinventing Fire China:

Related articles

Original Article on Enerdynamics

How Could the Lights Go Out at the Super Bowl?


When I worked for Pacific Gas and Electric Company (PG&E) in the late 1980s, the San Francisco area hosted a classic World Series in which the San Francisco Giants played against their cross-bay rivals the Oakland A’s.  Going into the Series, PG&E spent significant time and energy ensuring reliable power to both stadiums.  My friend and co-worker Bill (whose last name shall remain anonymous), was the account representative for the Giants’ stadium, Candlestick Park. Just before the series began, he was quoted in the company newspaper saying it was virtually impossible for the stadium to lose power.

Sadly, the Loma Prieta Earthquake struck just prior to Game 3, and power at the stadium immediately went out due to loss of transmission into the area.  In the following days as PG&E struggled to restore power throughout the area, we were all reminded that power supply is never 100% reliable no matter how much time and money is spent on engineering and infrastructure.

Flash forward 23 years, and half of New Orleans’ Superdome lost power just after halftime of the Super Bowl. Players were forced to wait on the sidelines for 35 minutes before power could be restored. As the root-cause investigations go forward, we are beginning to learn what caused the outage. According to the local utility Entergy’s press release:

Entergy New Orleans, Inc. announced today that it has traced the cause of Sunday’s outage to an electrical relay device.

The device was specifically installed to protect the Mercedes-Benz Superdome equipment in the event of a cable failure between the switchgear and the stadium.

While the relay functioned without issue during a number of high-profile events – including the New Orleans Bowl, the New Orleans Saints-Carolina Panthers game, and the Sugar Bowl – during Sunday’s game, the relay device triggered, signaling a switch to open when it should not have, causing the partial outage.

This device has since been removed from service and new replacement equipment is being evaluated.[1]

So what is a relay, and how could it cause an outage?  A relay is a device that automatically signals a breaker to open when the relay senses an unsafe power condition such as excessive current, excessive voltage, reverse power flow, or too high or too low a frequency.  Breakers interrupt the flow of power to prevent potentially harmful problems from occurring in electrical systems. Relays are used to protect equipment within a premises and/or to protect the utility lines serving a premises.  Modern protective relays such as the one at the Superdome are microprocessor-based digital relays, meaning that software within the relay's processor determines when to open the breaker.
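The decision logic inside such a digital relay can be sketched in a few lines of code. This is a minimal illustration only; the setting names and threshold values below are hypothetical assumptions, not the actual configuration of the Superdome relay:

```python
# Illustrative sketch of digital protective relay logic. Setting names
# and values are invented for illustration, not an actual configuration.

TRIP_SETTINGS = {
    "max_current_amps": 1200.0,    # overcurrent threshold
    "max_voltage_volts": 15500.0,  # overvoltage threshold
    "min_freq_hz": 59.5,           # underfrequency threshold
    "max_freq_hz": 60.5,           # overfrequency threshold
}

def should_trip(current, voltage, freq, reverse_flow):
    """Compare measurements to settings; return (trip?, reason)."""
    if current > TRIP_SETTINGS["max_current_amps"]:
        return True, "overcurrent"
    if voltage > TRIP_SETTINGS["max_voltage_volts"]:
        return True, "overvoltage"
    if freq < TRIP_SETTINGS["min_freq_hz"]:
        return True, "underfrequency"
    if freq > TRIP_SETTINGS["max_freq_hz"]:
        return True, "overfrequency"
    if reverse_flow:
        return True, "reverse power flow"
    return False, "normal"

# Normal load condition: no trip
print(should_trip(900.0, 13800.0, 60.0, False))   # (False, 'normal')
# Load current above the (possibly too-low) trip setting: breaker opens
print(should_trip(1300.0, 13800.0, 60.0, False))  # (True, 'overcurrent')
```

The key point, relevant to the S&C statement quoted below, is that the relay can behave exactly as designed and still cause an outage if the operator-programmed settings are lower than the actual load requires.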

This raises the question of whether the protective relay opened when it shouldn’t have (due to a manufacturing defect, incorrectly programmed software, or an external hack of the software) or whether there was indeed a power condition that caused the relay to properly open the breaker.

The answer to this question has not yet been definitively announced. But the manufacturer of the relay, S&C Electric Company, stated on Friday that the relay opened the breaker as “a result of the electric load current exceeding the trip setting for the switchgear relay as set by the system operators. Based on the onsite testing, we have determined that if higher settings had been applied, the equipment would not have disconnected the power.” [2]

Assuming the relay functioned properly, what could have caused the outage? One answer is simply that the relay was programmed with a tolerance that was set too low. Loads in the stadium were apparently well below the capacity of the circuits, since the circuits are designed to carry heavy air conditioning loads in the summer.

A second explanation could be power quality issues occurring inside the stadium. The Beyonce halftime show was apparently powered primarily by standby generators within the stadium, but at least one unofficial comment has suggested that some such outside power sources may have been connected into the stadium electrical system. We don’t know if this is true, but if so, this certainly could have caused a power condition that propagated back to the relay, in which case the relay would have properly opened based on what appeared to be a fault or other problem with the internal system.

And although there has been nothing to suggest that this occurred, microprocessors do have the potential of being hacked if they are connected to the internet. This means a third explanation could be that someone accessed the software on the relay remotely and maliciously changed the settings. This seems unlikely, but not impossible.

We’ll have to wait for more analysis to find out why the relay acted.  In the meantime, we are reminded again that no power supply can ever be expected to have 100% reliability.

Original Article on Enerdynamics

Electricity: The Key to U.S. Energy’s Future

pylon with blue sky

As we enter 2013, it is clear that electricity is becoming the dominant form of energy driving society’s future. Exxon forecasts that between now and 2040, electricity will account for more than half of the growth in global energy demand [1].  And in the U.S., the Energy Information Administration (EIA) forecasts that electricity use will grow by 24% over that same period. Natural gas also is forecast to grow significantly, but much of this is due to the growth of natural gas as a fuel for electric generation.

                  Source:  EIA Annual Energy Outlook 2013 Early Release Reference Case

So with electricity destined to increase in importance, what can we expect to see in the electricity industry in 2013? We’ll explore various issues including the generation mix, energy efficiency and demand side management, infrastructure, the environment, and the slow but seemingly inexorable movement toward more competitive markets.

The pace of change in the U.S. generation mix in 2012 was stunning. Coal generation dropped significantly to just 37% while natural gas generation increased to 30%. Renewables also continued to grow, driven by state renewable portfolio standards and decent economics for wind power.

              Source:  EIA Short-term Energy Outlook December 2012

Given this information, what should we expect in 2013? Much of the natural gas generation increase was due to low natural gas prices, which pushed the variable cost of gas units below that of coal, so whether this trend continues depends largely on the price of natural gas. As discussed in our companion natural gas Energy Insider article, current futures prices for gas indicate a market expectation of continued low prices. Another factor will be the permanent retirement of numerous coal units, a trend that is expected to continue in 2013 [2]. Despite the current uncertainty over whether production tax credits will be extended for wind power, and the possible resulting reduction in new projects, actual output from renewables will continue to increase as projects completed in 2012 come online.

And although newer technologies won’t have much effect on the overall generation mix, 2013 will be a good time to learn more about possible future electric supply options. Two IGCC (integrated gasification combined-cycle) units are slated to come online in 2013 [3], construction will continue on five new nuclear units, and numerous demonstration projects will test storage technologies.

Energy efficiency and demand side management
Energy efficiency (reducing demand across all usage) and load management (reducing demand during peak times) are increasingly important in the U.S. These demand-side efforts reduce the amount of generation that must be built.  EIA data shows that by 2010, over 33,000 MW of generation construction had been avoided due to energy efficiency and load management.

Source: U.S. Energy Information Administration, Form EIA-861, “Annual Electric Power Industry Report.”

Data from the North American Electric Reliability Corporation (NERC) indicates that this trend will continue in 2013 and beyond, with load management alone growing to almost 50,000 MW by 2017 (meaning at least 250 peaking units do not have to be built thanks to this resource). We expect that more and more utilities and ISOs around the U.S. will decide to rely on this low-cost resource, especially as new technologies make it easier to implement programs and access controllable loads [4].
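The parenthetical claim about 250 peaking units follows from simple arithmetic if one assumes a typical peaker size of roughly 200 MW (our illustrative assumption, not a figure from NERC):

```python
# Back-of-envelope check of the avoided-peaker claim above.
# The 200 MW typical peaker size is an assumption for illustration.
load_management_mw = 50_000  # NERC-projected load management by 2017
typical_peaker_mw = 200      # assumed size of one gas peaking unit

avoided_units = load_management_mw / typical_peaker_mw
print(avoided_units)  # 250.0 -- roughly 250 peakers that need not be built
```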

  Source: NERC 2011 Long-term Assessment

Infrastructure spending will continue at a high pace in 2013, with a focus on transmission expansion and distribution upgrades. Hopefully 2013 will be the year in which the term smart grid can be retired and we simply begin talking about equipment upgrades (just as everyone talks about upgrading their cell phones!). Smart meter deployments will continue, with the number of smart meters likely rising beyond 30% of all consumers.

Source: U.S. Energy Information Administration, Annual  Electric Power Industry Report

But perhaps more important in the shorter term is the deployment of modern technologies in transmission, substation, and distribution facilities. While little noticed by the public, deployment of technologies such as Phasor Measurement Units (PMUs), Flexible AC Transmission Systems (FACTS), various transmission and distribution automation devices, and systems providing volt/VAR optimization, feeder load balancing, and dynamic outage response will fundamentally improve the efficiency and reliability of the electric grid.

With the re-election of President Obama and other “green” candidates in state races, we can expect protection of the environment to continue as a key issue in 2013.  We have already explored many of the key points in recent blog posts on Energy Currents [5], so we won’t repeat the discussion here. But our expectation is that virtually all decisions in the electricity industry will be made in the context of their environmental impacts and environmental regulation obligations.

Competitive markets
While certain areas of the country seem content to continue with the monopoly utility model, the role of competitive markets in electricity quietly grows. The amount of power traded in wholesale markets under an ISO — currently about two-thirds of U.S. power — will swell at the end of 2013 as 35,000 MW of Entergy load joins the Midwest ISO [6]. And although not an implementation of an ISO, the Western Electricity Coordinating Council (WECC), which coordinates the western grid, is moving forward with implementation of a competitive real-time balancing market [7].

Meanwhile on the retail side, markets in specific states continue to grow with significant activity in Texas, Maine, Pennsylvania, Illinois, Maryland, Connecticut, Massachusetts, New Jersey, Delaware, New York, and Ohio.  And as reported by the Distributed Energy Financial Group (DEFG) [8], retail services have shifted focus from just providing the lowest possible rate to providing the most innovative services. As stated by DEFG in a recent e-mail, these include “fixed-price contracts, month-to-month pricing, time-of-use pricing with no-cost hours (or days), prepaid energy with daily notifications about usage, green power for electric vehicle charging, mobile applications to control thermostat settings, and advanced analysis of personal usage with customized suggestions about reducing electric bills.”

As we enter 2013, we can conclude that the electricity industry will continue to grow and evolve in interesting ways. And as we commonly remind our audiences, much of the future change will be driven by a new workforce as industry veterans hit retirement age. This means that for those of you on the younger side of the industry, there will be plenty of opportunity to help drive the retooling of a critical societal resource.


1. The Outlook for Energy, a View to 2040

2. For more information, see:

3. See: and

4. See our earlier Insider, The Impact of Demand Side Management on Wholesale  Electricity Markets

5. See Climate Change and Greenhouse Gas Emissions Back on Agenda in the U.S.

and Uncertainty in Clear Air Rules Continues to Impede  Planning

6. See:

7. See:

8. See:

By Bob Shively, Enerdynamics President and Lead Instructor

Original Article on Enerdynamics

In Focus: Renewables Development in the Emerging World


Do a Google search for renewable power and you will find thousands of links to sites covering renewable power development in the U.S. and in Europe. But outside of China and maybe India, you won’t find much about renewables development in the rest of the world. Interestingly, recent years have seen significant renewables development in other parts of the world, and there is no reason to believe the trend won’t continue in 2013.

According to CleanTechnica, more new wind and solar projects were installed in the developing world in 2011 than in the developed economies [1]. And over time, the developing world may become the leader in driving renewables forward. As we enter 2013, let’s turn our attention away from the U.S., Europe, China, and India to take a look at how renewables are developing in some other key countries.

According to Ernst and Young’s Renewable Energy Attractiveness Indices [2], emerging economies in the top 40 include Brazil, Romania, Poland, South Africa, Mexico, Morocco, Turkey, Ukraine, Egypt, Tunisia, Israel, Argentina, and Chile. Let’s take a look at a few developments in emerging countries and see how new ideas may come from seemingly unlikely places.

Morocco
According to a recent report on the CleanTechnica blog, Morocco has more stringent renewables goals than the State of California. Morocco plans to build 2,000 MW of wind and 2,000 MW of solar by 2020. Based on current demand and the current 6% annual growth rate of demand, this will represent about 50% of demand if completed as planned. And if you are wondering whether this is just talk, Morocco put out an 850 MW wind power RFP earlier this year, and successful bidders are already signing contracts [3].
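The "about 50% of demand" figure can be sanity-checked with a compound-growth calculation. The 2012 base demand below is an illustrative assumption chosen to make the arithmetic concrete, not an official figure:

```python
# Rough check of the "about 50% of demand" claim, assuming a 2012 peak
# demand of roughly 5,000 MW (an illustrative figure, not an official one).
base_demand_mw = 5_000
growth_rate = 0.06           # 6% annual demand growth cited in the article
years = 8                    # 2012 -> 2020

demand_2020 = base_demand_mw * (1 + growth_rate) ** years
renewables_mw = 2_000 + 2_000  # planned wind + solar capacity

print(round(demand_2020))                     # 7969 -- projected 2020 demand in MW
print(round(renewables_mw / demand_2020, 2))  # 0.5 -- share matching the article
```

Note this compares nameplate capacity to demand, as the article does; actual energy share would depend on capacity factors.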

United Arab Emirates
Many Middle Eastern countries have begun to realize that using their precious petroleum reserves for electricity generation may not be the best strategy. An example is the United Arab Emirates, where the emirate of Abu Dhabi has set a goal to produce 7% of its electric supply from renewables by 2020. But more interesting are Abu Dhabi’s plans to develop Masdar City as the world’s first zero-carbon community. Plans also call for making the city a global hub for renewable energy and clean technologies [4].

The Mediterranean Solar Plan and DESERTEC
In addition to renewables development focused on serving domestic electric needs, emerging economies may also find renewables development for export a fruitful area. The European Union in 2008 launched the Mediterranean Solar Plan [5] with the goal of developing 20 GW of solar power in countries such as Algeria, Egypt, Libya, Morocco, and Tunisia. Even longer-running have been the efforts of the DESERTEC Foundation dedicated to finding the best locations for developing renewables [6]. DESERTEC preaches that the world’s deserts can create solar power greater than 160 times the current world demand. Coupled with a transmission grid built to move the power to load centers, they believe the world’s energy sources could be transformed.

The future will be interesting
These are just a few examples of interesting efforts going on around the world. Will all of them come to fruition? Certainly not. But it is hard to believe that somewhere there isn’t the germ of a whole new way of thinking that will completely change the way the world looks at energy. And implementation of those ideas may not come from the U.S., Europe, or the rapidly developing Asian countries.


1. See:

2. See:—All-Renewables-Index

3. For a video on Morocco’s Renewable Energy, see:

4. See: or the video at

5. See:

6. See: or the video at:

By Bob Shively, Enerdynamics President and Lead Instructor


Original Article on Enerdynamics


FERC Order 1000: A Big Deal

Many of our readers may have noticed at their companies a considerable effort expended on FERC Order 1000 compliance in the last year. The Federal Energy Regulatory Commission (FERC) introduced Order 1000 in July 2011 and re-affirmed it in May and October 2012 with Orders 1000-A and 1000-B. Order 1000 is seen by many as a landmark regulation addressing both the transmission-planning process and cost-allocation procedures for new electric transmission lines.

Its goal is to ensure collaboration among market participants so as to encourage a more coordinated build out of the US transmission grid in the future while encouraging equitable and economical cost allocations for any new regional transmission lines. The order focuses on three key areas:

  • requirements for regional and inter-regional transmission planning
  • cost allocation
  • the removal of the federal right of first refusal (ROFR) for incumbent utilities

The planning process
Historically, most transmission planning has been a bottom-up process, although the emergence of RTOs has facilitated more regional and inter-regional transmission planning where they exist.  The incumbent utility periodically evaluated its need for new transmission based primarily on reliability criteria and economic access to power for its native load.  The incumbent generally had little incentive to build new transmission lines to renewable projects owned by others or located far away unless it needed that power to comply with renewable portfolio standards.

Order 1000 now requires that all transmission providers participate in regional and inter-regional transmission planning processes, a more “top-down” approach that is required to recognize public policy mandates. Some have described this as the difference between utilities participating in a single-state planning process and participating in multi-state or multi-RTO planning processes.

Removal of federal right of first refusal
One of the contentious requirements of Order 1000 for many public utilities was the removal of the long-standing federal ROFR for transmission projects identified in a regional plan for the purposes of cost allocation.  The incumbent utilities no longer maintain the ROFR to build, own, and operate large-scale transmission projects located within their service territory.

Traditionally, outside transmission developers in many parts of the country were hesitant to commit resources to evaluation of new transmission lines when they knew the incumbent utility could decide at the last minute to take control of the project away from the developer and build it themselves.  As a result, many transmission projects that have traditionally been built by incumbent utilities based on geographic location and service territory may now be open to competition.  The ROFR removal has the potential to significantly change the way large transmission projects are identified and awarded in many parts of the country.  While FERC has indicated there is no requirement for competitive bidding for every transmission line, it appears that transmission lines meeting certain criteria would most likely be awarded and built under a competitive bid process.

Cost allocation — who pays?
Part of Order 1000 mandates public utilities to participate in a regional transmission planning entity that has an established cost-allocation methodology for new transmission lines as well as inter-regional lines.  This may ultimately be the most challenging and controversial part of the order.  Before Order 1000, a public utility that owned and operated a transmission line rolled its power and reliability costs into its ratebase, and the native load customers paid the costs.

Order 1000 recognizes that large regional and inter-regional transmission lines (particularly those connecting to renewables) may benefit customers over a much wider geographic area than one or two incumbent utilities in terms of both price and reliability.  The order mandates some form of cost allocation to those who benefit from the project.  As you would expect, finding equitable and economical cost-allocation solutions in a complex physical system like the grid will be a contentious matter. FERC has provided a list of six regional cost-allocation principles that must be met by any project.
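As a toy illustration of the beneficiary-pays idea, here is a simple pro-rata allocation sketch. The utilities, benefit estimates, and method are all hypothetical; actual regional methodologies vary widely and must satisfy FERC's six cost-allocation principles:

```python
# Toy "beneficiary pays" allocation for a hypothetical regional line.
# Utilities and benefit estimates are invented for illustration only.

line_cost = 900.0  # $ millions for the new regional transmission line

# Estimated benefits (price + reliability) by utility, in $ millions
benefits = {"Utility A": 60.0, "Utility B": 30.0, "Utility C": 10.0}

# Allocate the line's cost in proportion to each utility's benefit
total_benefit = sum(benefits.values())
allocation = {u: line_cost * b / total_benefit for u, b in benefits.items()}

for utility, share in allocation.items():
    print(f"{utility}: ${share:.0f}M")
# Utility A: $540M, Utility B: $270M, Utility C: $90M
```

The contention in practice comes from the step this sketch assumes away: estimating who benefits, and by how much, in a meshed physical grid.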

FERC’s hope is that the top-down approach will result in more efficiencies, more clarity, and less risk to participants that want to build large regional and multi-regional transmission lines.  As you would expect, certain sectors of the industry, including renewables and transmission construction companies, hail Order 1000 as long overdue and a step in the right direction, while many incumbent utilities see it as FERC over-reaching its regulatory authority and trying not only to tell states what transmission lines to build but also how to allocate the costs.

Like most new FERC rules, people inside and outside the industry have all types of opinions on what they think the new rules mean and what the impacts will be.  As is often the case, the devil is in the details, and we won’t really know the rule details and interpretations until the utilities and RTOs have made their initial compliance filings, FERC has analyzed the filings, and FERC has communicated its evaluations of compliance or lack thereof.

When will it all be clear?
The initial compliance filings related to Order 1000 from regional transmission groups were due at FERC in October 2012.  The bulk of the filings from the RTOs so far have indicated they believe their current cost-allocation methodologies meet Order 1000 compliance without needing substantive changes.  PJM and ISO-NE are proposing changes to their existing cost-allocation methodologies related to cost allocation of new transmission driven by public policy requirements at the individual state level.

While the PJM and ISO-NE proposals vary, the general idea is to allocate costs of new transmission lines necessitated by individual state policy mandates to the electrical customers in those states.  Compliance filings from inter-regional transmission groups are due at FERC in April 2013.  Once these initial compliance filings have gone through FERC scrutiny and there has been a chance for public comment, the details of complying with the order will be much clearer.

Original Article on Enerdynamics

Why Natural Gas Prices Matter

U.S. natural gas prices have fallen to lows not seen in a decade, and they have stayed there.

As I write this in late October 2012, the Henry Hub price is $3.43/MMBtu, roughly half of what was considered “normal” fall pricing over the last 10 years.  Expectations are that prices will stay well below recent levels for a number of years due to the huge amount of supply that has been made available through exploitation of shale gas and other non-traditional sources of supply.

We have discussed many of these impacts in earlier writing on our blog, Energy Currents.

In May we wrote about how U.S. chemical companies are being buoyed by low prices. And in April we wrote about how the U.S. electric generation mix has dramatically changed and also how the much lower prices in the U.S. relative to the rest of the world may lead to LNG exports.

But, of course, we aren’t the only ones noticing this.  On Oct. 25, the Wall Street Journal published a front-page article titled “Cheap Natural Gas Gives New Hope to the Rust Belt.” The article discusses how low natural gas prices in the U.S. compared to other key manufacturing areas of the world (prices in Europe are at least twice as high as in the U.S., and in Japan about three times as high) are resulting in significant U.S. expansion of natural gas-intensive industries.  These include the chemical, fertilizer, aluminum, steel, and glass industries.  In some regions that had assumed manufacturing jobs were gone forever, new facilities are being constructed to take advantage of low natural gas prices.

In case you missed the article, it’s worth passing on a few key quotes from industry personnel:

  • “I never would have expected that as a region we’d have a second chance to be a real leader in American manufacturing.  Suddenly we’re back in the game.” – Bill Flanagan of the Allegheny Conference on Community Development
  •  “The U.S. is now going to be the low-cost industrialized country for energy.” – Philip Verleger, energy economist[1]
  • “It has been a complete 180-degree change in our thought process.” – Steve Wilson, CEO of CF Industries
  • “We convinced ourselves that this is not a temporary thing.  This is a real, durable phenomenon, a potential competitive advantage for the United States.” – Peter Cella, CEO of Chevron Phillips Chemical Company

No doubt we’ve seen low natural gas prices before, followed by price rises.  But many in the energy industry now believe that prices may stay low for a number of years, allowing investments in natural-gas consuming facilities to really pay off.

[1] To hear more of Verleger’s views on how he believes the U.S. can become energy independent, see the YouTube video of a recent presentation at:

by Bob Shively, Enerdynamics President and Lead Instructor

Original Article on Enerdynamics

Costa Rica: Committed to Renewable Energy

With more than 90% of its electricity generated from renewable energy sources and goals to reach 95% by 2014, Costa Rica is certainly one of the greenest countries on the planet. It also is on track to become the world’s first carbon-free economy.

I recently returned from a 12-day tour sponsored by the Global Renewable Energy Education Network (GREEN) showcasing renewable and sustainable energy in Costa Rica. With this experience fresh in my mind, I thought I’d take this opportunity to share some of the educational highlights with Energy Currents readers.

Costa Rica: A renewables paradise
Mother Nature has greatly influenced Costa Rica’s commitment to renewable energy. The country is blessed by copious amounts of rainfall – most of the country receives more than 100 inches of rain per year. Thus, it’s no surprise that over 80% of Costa Rica’s electricity is generated by hydro facilities. The country also boasts considerable geothermal power as well as growing wind, solar, and biomass facilities.

ICE’s role in renewables
The Instituto Costarricense de Electricidad or ICE (pronounced ee-say) of Costa Rica is the state-owned electric monopoly that provides power to over 98% of Costa Rican homes. While many of the facilities that produce this power are ICE-owned, a small percentage is owned privately under rather non-traditional contracts. In many cases, these facilities are privately owned for a period of 15 years and are then handed over to ICE, which then owns and operates them. After a decade-long break from allowing such projects, ICE recently announced a plan to again accept bids for privately owned renewable projects (100 MW of hydro and 40 MW of wind). The plan intentionally aligns with Costa Rica’s goal of becoming a carbon-free economy.

ICE also implemented a net metering program in 2010 whose goals were, again, to increase renewable energy production and thus the country’s energy independence. The pilot program also allows ICE to study the effects of distributed generation on its grid as well as to promote new renewable technologies.

A countrywide commitment
Costa Ricans are very proud of their renewable and sustainable efforts, which come at a premium price. Average residential rates are over 30 cents per kWh, and this may soon increase. Yet oddly, citizens are not likely to complain. The dedication to a renewable/sustainable society seems to be a shared goal, and the monetary cost of this commitment is widely accepted as are the variables that can affect it.

For example, with such a large portion of electricity needs met by hydroelectric power, the country is hugely dependent upon rain. And in dryer years, as 2012 has so far been, ICE is concerned that it cannot generate enough supply to match demand. Less water means less hydro power is available. This means costs increase (since power must be purchased from other sources), as does the amount of power generated from fossil fuels.

The GREEN tour afforded unprecedented access to renewable facilities in Costa Rica. My group and I enjoyed guided tours of hydro facilities, a biomass plant/sugar cane refinery, a geothermal plant, and a wind farm. Not only were we inches from the equipment housed in these facilities (imagine access like this in the U.S.!), but we also heard first-hand accounts of how such equipment is run and ICE’s unique perspectives on electricity production.

While GREEN is currently focused on providing this experience to college-level audiences, Enerdynamics and GREEN are discussing a partnership where this unique opportunity could be available to business professionals. For more information, please contact me at

by John Ferrare, Enerdynamics CEO

Original Article on Enerdynamics