2007 Policy #05 Energy Forecast


POLICY:  Energy
The energy slogan of the year is about curing “America’s addiction to oil”. In the next breath there is always mention of what new form the addiction will take – akin to curing heroin addiction with opium. Just as China’s economic “miracle” has yet to factor in the apocalyptic environmental devastation wrought by its industrial boom, so too much of the US’s legendary productivity has rested on the salad days of wastefully used, under-priced energy. The “energy” issue is really two-fold – energy security, and the environmental degradation associated with the pursuit of that security. There is no single answer to energy security or to the associated environmental issues; the real world offers a mix of compromises. Getting this mix right, with a wise balance of short-term need and long-term results, is the challenge for policy-makers.
The US consumes about one-quarter of all world energy – about 100 quads (quadrillion BTU) of a roughly 450-quad world total. Almost two-thirds of world energy consumption is by six industrial powers: US 23%, China 13%, Russia 7%, Japan 5%, Germany 3%, India 3%. Until recent years the US used about one-third of world energy, but with the breakneck industrialization of China the US relative share has dropped. From all views, the US is a profligate energy user. Its energy consumption per capita is over twice that of industrialized nations such as Japan. Certainly the US is a power-house economy, but its per capita energy consumption is out of proportion to its GDP per capita compared with other productive nations. The US is simply wasteful or inefficient in its energy use, and to date any attempt to make a significant impact on this profligacy has been seen as tantamount to being unpatriotic. However, the aggregate usage figures over time show that although the US is hooked on high energy use, the addiction is not getting significantly worse (or better).
Greenhouse Gases (GHG)
A key GHG goal of the Administration’s 2002 Global Climate Change Initiative is to reduce US GHG intensity by 18% over the decade 2002 to 2012. GHG intensity is the ratio of GHG emissions to GDP. Thus, if the goal is met but GDP grows by 18% or more during the decade (which it will), the absolute quantity of GHG emitted will still increase, but at a rate less than GDP. DoE forecasts that, on its present assumptions, the reduction in GHG intensity will be around 17% – slightly under target, and well under the actual GDP growth of 30% to 40% expected over the decade.
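A quick back-of-envelope check makes the intensity arithmetic concrete. The sketch below assumes the 18% intensity reduction is achieved and takes 35% GDP growth as an illustrative figure within the 30%–40% range quoted above; all numbers are normalized, not official projections.

```python
# Illustrative check of the GHG-intensity arithmetic (assumed, normalized numbers).
# GHG intensity = emissions / GDP, so emissions = intensity * GDP.
base_intensity = 1.00          # normalized 2002 intensity
base_gdp = 1.00                # normalized 2002 GDP

intensity_2012 = base_intensity * (1 - 0.18)   # 18% intensity-reduction target met
gdp_2012 = base_gdp * 1.35                     # assumed 35% GDP growth over the decade

emissions_2012 = intensity_2012 * gdp_2012
print(f"Relative absolute emissions in 2012: {emissions_2012:.2f}")
# ~1.11 - meeting the intensity target still allows roughly 11% MORE absolute GHG.
```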
US Energy Use
The sources of US energy are oil 40%, gas 23%, coal 23%, nuclear 8%, and sustainable sources 6%. There are two distinct sub-plots in this. Almost 70% of oil is used (as gasoline and diesel) for transportation, and over 60% of all oil is imported. Over 90% of coal – more than 1 billion tonnes each year – is used for generation of electricity, and at present all of it is mined in the US. These two – coal and oil, both fossil fuels – are the superstars. If oil stops, the US stops moving; if coal stops, electricity stops and with it industry, commerce, PlayStations, Starbucks and chunks of the internet.
Electricity (generated from coal 50%, gas 19%, nuclear 19%, hydro and other sustainable 9%) is used almost equally across sectors: residential (36%), commercial (33%), industrial (27%). A breakdown of residential consumption is instructive in figuring future trends: air-conditioning 16%, lighting 9%, water heating 9%, and a multitude of “other appliances” – must-haves such as clothes dryers 6%, color TVs 3%, personal computers 1.5% – together using 42% of the total. When some say a lump of coal is burned each time an order is placed on Amazon they are exaggerating, but they are making an important point: a 10% reduction in electricity use means 10% less generating fuel used – more than the entire present contribution from “green” electricity. Projections for total US electricity sales in 2030 range from 4,828 to 5,854 billion kWh. Slowing growth in electricity demand relative to GDP growth – due to greater efficiency in devices, better building insulation standards, and market saturation in white-goods – is expected to keep electricity prices at around 7.1 to 7.6 cents per kWh. All forecasts show an increase in coal consumption over future decades and that the US will for the first time become an importer of coal.
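A rough sketch of that frugality-versus-green comparison, using only the generation shares and the 10% figure quoted in the paragraph above (illustrative only; generation is assumed to track demand roughly one-for-one):

```python
# Compare a 10% across-the-board cut in electricity demand with the present
# "green" share of generation (shares taken from the paragraph above).
shares = {"coal": 0.50, "gas": 0.19, "nuclear": 0.19, "hydro_and_other_sustainable": 0.09}

demand_cut = 0.10                                 # 10% reduction in total electricity use
generation_avoided = demand_cut * sum(shares.values())   # assume generation tracks demand

print(f"Generation avoided by a 10% demand cut: {generation_avoided:.0%} of today's output")
print(f"Current green (hydro + other sustainable) share: {shares['hydro_and_other_sustainable']:.0%}")
# ~10% avoided versus ~9% green share: frugality alone exceeds today's entire green contribution.
```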
Gasoline and diesel are the lifeblood of the American road and the most politically sensitive of all energy issues. About 40% of all US energy use is in the form of gasoline and diesel from largely imported oil. A naïve analysis of the sources of world oil would conclude that the US would have (or would foster) good relations with Canada, Iran, Iraq, UAE, Kuwait, Venezuela, Russia, Libya, and Nigeria. With the exception perhaps of Canada, US relations with all of these countries are either strained or problematic. In all forecasts, US domestic crude oil production declines in coming decades and import dependence increases. Prices are volatile, affected by world events and by the policies of petroleum exporting countries. Retail prices can change rapidly by as much as 20%, unlike any other commodity. Fuel prices do not have a strong correlation with vehicle use, but increases are felt in other parts of the economy – and by low-wage earners – as cash is diverted from discretionary expenditure to gasoline. As most oil is imported, higher oil prices also add unfavorably to the international terms of trade. Despite gasoline prices being a raw electoral nerve, analysts claim that the US is no longer as vulnerable to “oil shocks” as it was in the 1970s, because the element of surprise is now lost and because much of the economic damage of that period was due to monetary policy, not oil prices alone. Forecasts for crude oil prices in 2030 range between $28 per barrel and $96 per barrel (in 2004 dollars), showing that all that is certain is uncertainty. This is one motivation for the current attention to “America’s addiction to oil”.

Irrespective of the resource used to produce energy, energy solutions are generally concerned with one of two national energy systems: reticulated energy (grid electricity and reticulated gas) for industrial and domestic use, and portable energy (gasoline, diesel, tanked natural gas). In each case, the energy issue is concerned with reducing GHG, and/or achieving greater output per unit input (efficiency), and/or blunting demand through frugality or better buildings and cars. Both of these energy systems are supported by infrastructures which need modification to accept new energy sources or new energy regimes.

Alternatives

Greener Electricity
Electricity generation by coal-powered steam turbine is still the dominant method of electricity generation, as it was 100 years ago. The technology is mature and coal is relatively plentiful; however, the US will start importing coal for the first time in coming years, and CO2 emissions from coal-fired power stations are the core of the GHG issue. Several broad-brush solutions to the electricity–CO2 nexus have been on the table for some time: CO2 sequestration (removal of CO2 from coal-fired emissions); increasing use of “clean, green” alternative power generation (wind power, solar power, …); and increased use of nuclear generation, which has no CO2 emissions.
CO2 Sequestration

If CO2 could be removed from coal-fired emissions, the energy–GHG nexus would be broken in one step. Current ideas for this CO2 capture – carbon sequestration – involve capturing the 3% to 12% CO2 content of smokestack gases, compressing the gas to liquid for transport, and then sequestering it in some place where it will remain indefinitely. This seems like sweeping detritus under the rug because it is. Where to “hide” it is the issue – the main suggestions are deep cold oceans, where it will remain solid because of the high pressure, or exhausted aquifers, coal mines and oil wells. Norway has for some years been pumping one million tonnes of liquefied CO2 each year into depleted natural gas domes under the North Sea. An advantage of this and other emission capture processes is that SO2 (sulphur dioxide, the cause of acid rain), mercury and other undesirable emissions apart from GHG CO2 can also be removed. Sequestration, whatever the details and possible ecological hazards, will add cost to electricity generation. One way costs might be ameliorated is to sequester the CO2 into aging oil reserves to increase the pressure and hence the yield. One elegant advance on this method is to capture the CO2 before it gets to the smokestack. Following successful demonstrations in Algeria and Norway, BP and partners are building a 350MW power station in Scotland using this “decarbonised fuel”: natural gas is first “split” into hydrogen and CO2; the hydrogen provides clean fuel for power generation and the CO2 is piped directly for sequestration in North Sea oil reservoirs.

Sustainable (“Alternative”) Electricity

A range of technologies that 30 years ago were known as “alternative” are now referred to as renewable or sustainable. DoE adds a realistic note to this loose terminology by referring to resources that are nondepletable on a time scale of interest to society and that tend to have low and stable operating costs. Significantly, the qualification of negligible environmental impact is absent from that definition and must be added. This is why the term sustainable captures the idea better than renewable. Felling trees for wood-burning energy production is use of a “renewable” resource (trees grow) but it is not a sustainable resource. Strictly speaking, even the “greenest” of energy sources has some impact, and intensive use can have intensive impacts. Damming of rivers for hydropower, huge wind farms, and large tidal power schemes all affect the environment to varying degrees – changing and killing rivers in extreme cases, changing coastline sand deposition, killing or disturbing wildlife and natural processes. The low-impact quality of truly sustainable resource use should be kept in mind in a critical examination of current proposals.
Wind-power electricity generation has grown almost twenty-fold worldwide, from 3.5GW in 1994 to 59GW by 2006, and is now an accepted mainstream, albeit small, contributor to electricity generation in suitable areas – often offshore, where the wind is not impeded by landscape. It is attractive for future applications in areas such as the Great Lakes, close to large energy markets. Recent work by the National Renewable Energy Laboratory suggests that many early wind energy models using single turbines are automatically a “worst-case scenario” and give unduly gloomy results; multiple units better model the real-world “lumpy” nature of wind in an area. Mature technologies will include blade adjustment such as feathering for dangerously high winds, and automatic mounting and demounting.
Hydro-electric power is a mature base-load power generation technology; in countries with the appropriate natural resources it provides the majority of electricity – in Brazil, 75% of electricity is hydro. Research continues on other hydro technologies such as in-stream tethered turbines, which exploit the force of fast-flowing rivers without large capital works such as dams. Similarly, in suitable places fixed turbines can exploit the force of tidal flow. In future, immense installations larger than the biggest modern oil rigs may be installed offshore in permanent ocean currents. In all of these cases, the natural force of wind or water is harnessed in the uncontroversial tradition of the centuries-old use of windmills and water-mills. A high-technology variant of these methods, ocean-thermal generation, exploits the permanent temperature difference between two ocean depths. Although presently obscure, technologies such as this may in the long term warrant immense capital investment in large mid-ocean works that will provide “permanent” and “free” electricity on a continental scale.
Geothermal energy exploits the heat of the Earth’s molten core. The energy flow available at the surface is estimated at over 40 terawatts (40,000 GW), around ten times the total electricity generation everywhere, but only about 9GW is used for electricity generation and 16GW for direct heating (hydrothermal), mainly in Iceland, New Zealand and about 20 other countries. Geothermal could be of use in Alaska, where it is not presently exploited. Although the Earth’s prodigious heat source is for practical purposes inexhaustible, it is easily accessible only in places which, by definition, are geologically unstable and which for that reason are often distant from high-population areas. In future it may be possible to “package” geothermal energy into some portable energy-intensive form such as liquefied hydrogen, but no demonstrations of this have yet been completed.
Solar-Power — the cost of photo-voltaic (PV) cells, which convert sunlight directly to electricity, has dropped over the last 30 years from about $30 per Watt in 1970 to under $3 per Watt now, and PV prices are expected to continue dropping with research and economies of scale in manufacture. There is already over 2.5GW generated by PVs worldwide, led by Japan where PV partly serves 160,000 homes. Germany, Israel, Spain, Portugal – and the US and Australia – are also investing in commercial-scale PV; Spain will bring a further 354MW online within a year. Solar PV technology is solid-state (durable, with no intrinsic moving parts) and, once installed, produces “free” electricity while the sun shines. It is ideal, for instance, when integrated into a larger grid, as it performs at peak at precisely the time air-conditioning demand in hot sunny weather puts grid-crashing loads on the system. PV is likely to develop both as commercial-scale generation and as an energy augmentation measure at the household level. Almost certainly the “zero-energy house” idea will gather increasing interest and will prompt developments such as optional roof cladding sections integrating PV cells; rather than an additional structure covering an existing roof, the PV panel will be the roof itself, which will reduce the net cost. Isolated domestic PV is problematic because storage in some type of accumulator, such as lead-acid battery banks, adds appreciably to cost, complexity, footprint, and maintenance. With the appropriate grid operating framework, any domestic PV installation with the right black boxes can sell energy to the grid when it is not needed and draw from the grid when needed – the grid works like a virtual accumulator. Solar-thermal is a related clean, green approach which uses focused solar energy to drive conventional steam turbine power generation. Climate, capital cost, demand patterns and other issues determine whether solar-PV or solar-thermal is indicated in any particular case. A primitive yet highly effective use of solar heating is solar water heating, which has been in use for several decades and continues to improve in efficiency. It is one of the most elegant of energy subsystems: water is heated directly by the sun with no intermediate processes, avoiding the very wasteful use of electricity to heat water.
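A minimal sketch of the “virtual accumulator” idea is below. The hourly PV and load figures are invented purely for illustration; real output and tariff rules vary by region, season and regulatory framework.

```python
# Minimal net-metering sketch: the grid acts as a "virtual accumulator" for a PV household.
# All figures are hypothetical sample hours, not measured data.
hourly_pv_kwh   = [0.0, 0.0, 0.5, 2.0, 3.5, 3.0, 1.0, 0.0]   # PV generation through the day
hourly_load_kwh = [1.0, 0.8, 0.8, 1.2, 1.5, 2.5, 3.0, 2.0]   # household consumption

exported = sum(max(pv - load, 0) for pv, load in zip(hourly_pv_kwh, hourly_load_kwh))
imported = sum(max(load - pv, 0) for pv, load in zip(hourly_pv_kwh, hourly_load_kwh))

print(f"kWh sold to grid (midday surplus): {exported:.1f}")
print(f"kWh drawn from grid (evening shortfall): {imported:.1f}")
# With the right metering and tariff framework, the household banks its surplus
# against its shortfall instead of paying for a local battery bank.
```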
All of these technologies have an optimum context. They are not a list from which one technology will eventually emerge triumphant – all of them are part of the answer. The fact that the answer is complex – rather than simplistic – is one of the clearest indicators of the changes that will be needed in energy infrastructures. DoE calculates current consumption from renewable energy sources in the US at around 6.1 quads, around 6% of total consumption, but along with the contributions of hydropower (45%), waste (9%), and wind and “other” (15%) is listed a 31% contribution from wood. In most contexts, wood is not regarded as a “renewable” fuel source. There will be many debates over what “renewable” should mean in coming years, which is why “sustainable” is the better term – it predisposes a perspective that takes account of all inputs and all outputs.
Nuclear

Since the commissioning of the first commercial nuclear power station at Calder Hall (Sellafield, UK) in August 1956, nuclear (fission) power has had mixed fortunes and a mixed press. The “nuclear debate” has been given new life because nuclear electricity generation offers savings in GHG emissions over coal-burning and other combustion power stations. Some argue that nuclear is a valid clean, green alternative – it emits nothing (with luck) and waste can be safely handled using synroc storage technology (created in 1978 in Australia). Others argue it is the least green alternative conceivable because of radiation risks before, during and after use, and its vulnerability to terrorist attack. The unique property that some nuclear power technologies can produce weapons-grade materials puts nuclear in a separate category. Current dependence on nuclear varies significantly among countries – France (79%), Germany (28%), Japan (28%), UK (20%), US (20%). Some countries such as Sweden are actively downscaling their dependence on nuclear generation, but during 2006 several countries announced their intention to implement or expand nuclear power – Indonesia, Egypt, Belarus, Argentina, Nigeria, Iran. There is now an array of “fourth generation” nuclear fission reactor designs addressing safety and cost issues, including the Pebble Bed Modular Reactor (PBMR – a South African initiative), the Gas-Turbine Modular Helium Reactor (GT-MHR – a Russian initiative partly aimed at consuming decommissioned weapons plutonium) and the International Reactor Innovative and Secure project (IRIS – a multi-nation consortium formed by Westinghouse). Also, in September 2006, DoE granted $8M for research into engineering “pre-conceptual design”, a full rethink of future nuclear plant design. Nuclear has a continued attraction as it offers small, self-contained power generation units that can be brought online and offline relatively quickly, ideal for cycling to meet daily peaks. Although safe nuclear-generated electricity may be more expensive than coal-fired generation, this ability to support peak demand – obviating the need to build more coal-fired capacity – is still attractive.
Nuclear Fusion
The ITER project (International Thermonuclear Experimental Reactor) – comprising the EU’s EURATOM, Japan, China, India, South Korea, Russia, and the USA – aims to demonstrate the scientific and technical feasibility of fusion power at the ITER device in Cadarache [France]. Fusion, if feasible in this application (a controlled hydrogen bomb), will provide benign and abundant electricity on a transcontinental scale. Nuclear fusion must not be confused with the fission reactors which have been in use since 1956.
Greener Portable Fuels
Oils Ain’t Oils
“Oil” comes in many different forms, and this accounts for the frequent contradictions in forecasts of reserves. “Light sweet crude” – low-sulphur, readily refined oil – has been the benchmark and most desired resource since the inception of the petroleum industry. This was the first class of oil to be exploited and it is now in shorter supply or has been exhausted in some places. The aggregate quality of crude oil is dropping. This coincides with the gradual tightening over decades of emission standards – for lead, sulphur, mercury – and both factors have put increasing obligations (and costs) on refiners. Average sulphur content (the “sweet–sour” parameter) has increased from 0.9% to 1.4% over the last 20 years. Imported crudes are becoming heavier and more corrosive. The anecdotal shortage of refining capacity in the US is due to this convergence of tighter output specifications (often varying by state and season) and declining feedstock quality. Adaptation often requires capital-intensive upgrade, replacement or addition to refinery plant.
Synthetic Crude Oil (Syncrude)
Generous estimates of petroleum reserves still available generally include resources such as oil sands (bitumens), shale oil, and extra heavy crude. Much of Canada’s reserves are in the form of oil sands; Venezuela’s estimated 1.36 trillion barrels of petroleum deposits are largely Orinoco extra heavy crude, a high-sulphur oil. As crude oil prices rise, known technologies will be applied to produce syncrude (synthetic crude oil) and traditional petroleum end-products from these resources. If crude oil prices remain above around $30 per barrel, bitumens (oil sands) and extra-heavy crude will be economic to refine. The viability of shale oil is less certain. Shale oil is largely kerogen, a “young” form of crude oil. It can be burned directly as a solid fuel in place of coal or can be processed into syncrude at the rate of about 25 gallons (0.6 barrel) of syncrude per tonne of oil shale. There is an estimated 2.9 trillion barrels of syncrude in known shale oil deposits, about 750 billion barrels of it in the US – the equivalent of about 100 years of current demand. Where’s the catch? The processing cost of shale oil requires a crude oil price of around $70 to $95 per barrel to be competitive. Also, a shale oil industry has apocalyptic environmental impacts: for each 1 million barrels per day of syncrude production, mining and remediation of 500 million tons of rock is needed each year, and 3 million barrels of water are required each day.
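A rough check of those shale figures, using only the per-tonne yield quoted above plus an assumed mid-2000s US oil demand of about 20.7 million barrels per day (the demand figure is an assumption, not from the text):

```python
# Illustrative arithmetic behind the shale-oil claims above.
yield_bbl_per_tonne = 0.6              # ~25 gallons (0.6 barrel) of syncrude per tonne of shale
target_bbl_per_day = 1_000_000         # 1 million barrels of syncrude per day

shale_tonnes_per_year = (target_bbl_per_day / yield_bbl_per_tonne) * 365
print(f"Oil shale mined per year: {shale_tonnes_per_year/1e6:.0f} million tonnes")
# ~600 million tonnes of rock to mine and remediate each year - the order of
# magnitude behind the ~500 million tons figure (yields vary by deposit).

us_reserve_bbl = 750e9                 # ~750 billion barrels of syncrude in US shale
us_demand_bbl_per_year = 20.7e6 * 365  # assumed ~20.7 million bbl/day US oil demand
print(f"Years of current US demand: {us_reserve_bbl / us_demand_bbl_per_year:.0f}")
# ~100 years, matching the "about 100 years of current demand" claim.
```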
Non-crude Portable Fuels
Coal-to-Liquids (CTL) is a proven technology largely used in the US, but it is competitive only when crude oil is above about $40 per barrel and the price of suitable quality coal is modest (about $1 to $2 per MBTU). The process is extremely dirty – there are challenges of waste disposal, water supply, and waste water disposal or recycling. US CTL activity is sited in coal regions of the mid-West, and DoE forecasts the process will continue to be used, producing 1M to 2M barrels per day in 2030. Gas-to-Liquids (GTL) technology is more complex than oil refining; it converts natural gas into a range of petroleum fuels. The process is viable when the crude oil price is over about $25 per barrel and natural gas is in the range of $0.50 to $1.00 per MBTU.
Biomass-to-Liquids (BTL) originates from renewable sources, including wood waste, straw and agricultural waste, garbage, and sewage sludge. BTL fuels are several times more expensive to produce than gasoline or diesel, with wholesale costs of around $3.35 per gallon now (a crude oil equivalent price of $80–$90 per barrel), but this is expected to drop to around $2.40 per gallon by 2020. There is no commercial BTL in the US, though DOE commissioned some investigation from Bechtel in 1998. The world’s first commercial BTL plant, with a capacity of 4,000 barrels per day, is scheduled to come on line in Germany around 2008, with others to follow. BTL front-end technology is new and evolving and has parallels with the cellulosic ethanol process in its use of sophisticated enzymatic technologies. In the short term, BTL is limited to use as a fuel extender rather than a primary fuel. In the long term, in the absence of a major energy breakthrough such as fusion power, BTL may become a mainstream source of portable fuels.
Renewable Portable Fuels (Biofuels)
Biofuels are seen as a bright hope on the energy horizon, but partisan positions often put an overly optimistic or overly pessimistic view. In all cases, projected costs should take account of the energy used in the fuel-making process, as well as any catalysts, other chemicals, and labor. Making fuels of the future from straw and similar materials has a labor-intensive component absent from the petroleum industry and, with agriculture as the main source, only solid planning will ensure a year-round supply of raw materials. Processes restricted to using “waste” are noble causes but will have a tough time ensuring continuity of materials supply in a local area, while those which use agricultural crops directly (sugar cane, corn) compete with food supply for land use and may produce unintended social consequences.
Ethanol — Ethanol (ethyl alcohol, an intoxicant) is currently the most widely used biofuel. It is produced from plant sugars – sugar beets in Europe, sugar cane in Brazil, corn [wheat] in the US, and cassava experimentally in China. Production costs are generally low – around $0.75 per gallon in Brazil – but supply can be disrupted by drought or any adverse effect on the source crops. The US Department of Agriculture expects corn-based ethanol production to soon exceed 7 billion gallons per year (= 167M 42-gal barrels) and forecasts 60 billion gallons per year by 2030 – almost 4M barrels per day. Ethanol production uses only a small part of the plant; the residue which cannot be recycled as a soil conditioner, an animal feed or a building material is an addition to the world’s pile of agricultural “waste”. With difficulty, ethanol can also be produced from this cellulosic plant waste. This is a more complex (and expensive) process, but one which does not detract from food supply in the way the common process does, and it uses materials presently regarded as waste. DuPont has invested in a commercial-scale cellulosic ethanol plant, suggesting there is some long-term business return in that industry. The Departments of Agriculture and Energy have recently awarded a further $17.5 million (a tiny amount) in grants for biomass research, and for development and demonstration of commercially viable processes for converting agricultural waste into ethanol. Given the biochemistry involved, breakthroughs (and windfall profits) are certainly out there to be discovered, probably in the form of the right enzyme and heat-treatment processes. Ethanol can already be readily blended into gasoline at up to 10%, and there is pressure on manufacturers to produce engines that can use blends of up to 85% ethanol.
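The unit conversions behind those ethanol projections are easy to verify (42 US gallons per barrel; the gallon figures are the ones quoted above):

```python
# Unit check on the ethanol projections quoted above.
GALLONS_PER_BARREL = 42

near_term_gal_per_year = 7e9                      # ~7 billion gallons per year
print(f"{near_term_gal_per_year / GALLONS_PER_BARREL / 1e6:.0f} million barrels per year")  # ~167M

forecast_2030_gal_per_year = 60e9                 # ~60 billion gallons per year by 2030
bbl_per_day = forecast_2030_gal_per_year / GALLONS_PER_BARREL / 365
print(f"{bbl_per_day/1e6:.1f} million barrels per day in 2030")                             # ~3.9M
```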
Biodiesel — “Biodiesel” is one of the most-used terms in the current rethinking of energy sources. Biodiesel can be produced from a wide range of “sustainable sources” – vegetable oils and animal fats: rapeseed and sunflower in Europe, soy oil in the US, and soy or palm oil in Asia. There have also been studies of the use of coconut oil. The oil feedstock is put through a well-established catalytic process of esterification with an alcohol (methanol or ethanol) to produce methyl or ethyl esters, with glycerin and fatty acids as by-products. The by-products have some value, but less than the esterification cost of around $200 per tonne. Biodiesel has been in reliable use for over a century – mainly in stationary large-plant applications and during times of diesel shortage – but it is a stronger solvent than conventional diesel and can destroy fuel lines and other components not designed for it. The oils used all have a long-standing value as human or animal food. This and the cost of processing make biodiesel an expensive alternative to petroleum-based diesel oil. Although methyl esters have long been used as a component of soaps and detergents, biodiesel is only price-competitive with diesel oil when oil prices are high. But the “renewable” nature of biodiesel, rather than price, has been the reason for interest. Governments may legislate use of some biodiesel blend for import-replacement reasons, or green-motivated consumers may create increasing demand. Vehicle engines are now designed with the solvent properties of biodiesel in mind and to handle the problems of quality variation and clogging that can damage older engines. Typically a 20% biodiesel blend is the maximum recommended, but some manufacturers now allow up to 100% biodiesel. The popular anecdote that any vegetable oil will work in a diesel engine – such as filtered oil from deep fryers – is true to the extent that a wide range of substances will “burn” in the compression-ignition diesel, but unprocessed oil will eventually damage the engine.
Gas Fuels
Fossil gas fuels yield lower emissions of GHG (mainly CO2), hydrocarbons, nitrogen oxides, particulates, and sulphur. Liquefied Petroleum Gas (LPG; “autogas”, “bottled gas”) is best suited to gasoline (spark ignition) engines, with about 75% of the energy density of gasoline. It is generally propane (C3H8) and/or butane (C4H10), plus other hydrocarbons (depending on source), and can be liquefied at normal temperatures. In contrast, Liquefied Natural Gas (LNG) is mainly methane (CH4), which is best suited as a diesel substitute but has an energy density of only about 60% of diesel. Where LPG can be compressed at normal temperature, LNG must be compressed (and stored) at around minus 160°C and 8 bar pressure – a costly requirement and one limiting its use to heavy applications. The liquefaction process removes almost all impurities, producing almost 100% methane. The ratio of hydrogen to carbon in a hydrocarbon is a measure of how much CO2 will be produced; the higher the H:C ratio the better – hence methane (CH4) is a cleaner fuel than propane (C3H8) when fully combusted.
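The H:C argument can be illustrated with simple combustion stoichiometry (molar masses only; a per-unit-energy comparison would additionally need each fuel’s heating value, which is not assumed here):

```python
# Illustration of the H:C argument: CO2 produced per kg of fuel when fully combusted.
# Molar masses used: C = 12, H = 1 (g/mol); CO2 = 44 g/mol.
def co2_per_kg_fuel(carbons: int, hydrogens: int) -> float:
    fuel_molar_mass = 12 * carbons + 1 * hydrogens
    co2_mass = 44 * carbons              # one CO2 molecule per carbon atom
    return co2_mass / fuel_molar_mass    # kg of CO2 per kg of fuel

for name, c, h in [("methane CH4", 1, 4), ("propane C3H8", 3, 8)]:
    print(f"{name}: H:C = {h/c:.2f}, {co2_per_kg_fuel(c, h):.2f} kg CO2 per kg fuel")
# methane: H:C = 4.00, ~2.75 kg CO2/kg; propane: H:C = 2.67, ~3.00 kg CO2/kg.
# The higher H:C ratio of methane means less CO2 per kilogram of fuel burned.
```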
Some energy initiatives such as the use of landfill waste as an energy source have double benefits. Methane (“marsh gas”) is emitted by all rotting organic waste. It is a major GHG and landfills are the largest source of US methane emissions. Capturing this gas displaces the use of other fuels and prevents the methane joining the GHG load. The City of Memphis has operated a landfill gas project since September of 2004, displacing the use of more than 67M gallons of gasoline in its first two years of operation, according to the EPA.
Natural gas hydrates (NGH) are a yet-untapped source of gaseous hydrocarbons. NGH is generally methane trapped in sponge-like structures of watery ice. The US Geological Survey estimates there is 320,000 trillion ft3 of NGH in deep water off the US coast and around 600 trillion ft3 in Alaska’s North Slope. This NGH potential dwarfs US natural gas annual production of around 20 trillion ft3, and reserves of around 200 trillion ft3. Commercial exploitation of NGH has not been attempted, and there is no pressure at present on the Alaska reserves while large natural gas deposits remain at Prudhoe Bay and elsewhere. Interest is likely to continue in the offshore deposits, but all recovery methods presently envisaged (heating and/or pumping water into the deposit) are energy-intensive or problematic.
There has been a move to run vehicles on LPG in many countries because it is cheaper, or has been made cheaper by government incentives toward a slightly greener fuel. In the pattern of world energy usage it must be noted that LPG plays a crucial part in the lives of billions of people in the developing world as the ultimate in portable energy, used for cooking, heating, refrigeration, and lighting. Coupled with modern small turbines as generators and water pumps, LPG is a strategic commodity on a global scale.
Greater Efficiency
Greener Buildings
With 20% of electricity used for heating and cooling and 16% for lighting, significant improvements in efficiency in just these two areas would have an impact of many GWh across the country. Work in solid-state lighting (SSL) already offers possible 50% savings in energy costs for lighting, and the impact of insulation and glazing on heating-cooling costs is already well known, but DOE’s Zero Energy Homes (ZEH) program is attempting a “whole house” approach to the systematic achievement of energy savings. Importantly, here as across the whole spectrum of energy issues, there is no “magic bullet”, no single technology that will alone save the day. Already, ZEH prototypes have shown that an energy-efficient building shell, efficient appliances, and the appropriate mix of solar water heating and photo-voltaics (PV) can produce a dwelling with near zero net energy purchases.
Lighting (domestic and commercial) consumes around 16% of US electricity and is a good target for improved efficiencies that will have widespread impact with noticeable effect on the energy bottom line. Solid-state lighting (SSL) – using light-emitting diode (LED), organic LED (OLED), polymer (PLED) technologies – offers promise of over 50% reduction in power consumption for equivalent light output. SSL promises efficacies of 150 to 200 lumens per watt, twice the efficacy of fluorescent lighting and 10 times the efficacy of incandescent lighting (the common Edison “globe”). Also, SSL units have typical lifetimes of 100,000 hours, the light is produced without heat and units can be incorporated architecturally in ways not possible with conventional lighting.
The AC/DC Problem
Electricity is electron flow – direct current (DC) – and all early work in electricity, such as Edison’s work with electric illumination, was concerned with these simple flows. But direct current does not travel well over significant distances, and it was soon discovered that alternating current (AC), particularly at high voltages, does travel without equivalent losses. So electricity is routinely distributed from power stations as high-voltage AC and converted using transformers down to domestic voltages, between 110V and 250V in various parts of the world. But scores of electronic devices in every office and home use low-voltage DC, which is why millions of little black transformers now litter the power outlets of the world. Significant energy could be saved if buildings had a reticulated DC circuit. A standard may arise which pipes DC (perhaps 18V or 24V) from a single efficient large transformer in each building.
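The physics behind the high-voltage preference is simply that line loss goes as the square of the current. A small worked example, with a hypothetical line resistance and delivered power chosen only for illustration:

```python
# Why high-voltage transmission wins: line loss = I^2 * R, and for a fixed power
# P = V * I, raising the voltage lowers the current. Illustrative numbers only.
def line_loss_fraction(power_w: float, volts: float, line_resistance_ohm: float) -> float:
    current = power_w / volts
    return (current ** 2) * line_resistance_ohm / power_w

P = 10e6    # 10 MW delivered over the same hypothetical line
R = 5.0     # assumed line resistance in ohms

for v in (10_000, 100_000, 500_000):
    print(f"{v/1000:>5.0f} kV: {line_loss_fraction(P, v, R):.2%} of power lost in the line")
# 10 kV: 50.00%   100 kV: 0.50%   500 kV: 0.02%
# Transformers make this voltage step-up trivial for AC, which is why the grid is AC;
# the waste comes at the end, in the plug-pack conversions back to low-voltage DC.
```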
Greener Transport
As vehicles account for 40% of US energy use, mostly drawn from imported crude oil, any improvement in vehicle efficiency has a direct effect on the US bottom line. Much has been done by mandate to clean up vehicle emissions in the last few decades, but little impact has been made on absolute fuel use. Vehicle innovation takes two forms – incremental improvement of present designs, and radical rethinking of power-plant (motor) and power-train (transmission) design. Over the last 20 years, motor vehicles have increased slightly in weight, have increased about 80% in power, and have improved fuel consumption by about 20% to a typical 29 miles per gallon. There will be continuing incremental innovation in lightweight materials, aerodynamics, friction reduction, and low rolling-resistance tires. These improvements are marginal, but they are independent of the type of power-plant. Some innovations in conventional power-plant and power-train design are relatively low-tech and have been satisfactorily demonstrated in buses and heavy vehicles. These include dynamic energy transfer to flywheels or to hydraulic pressure reservoirs – braking energy is transferred to the storage system and taken back again to assist with starting off, resulting in fuel-economy improvements of up to 50%.
The greatest single radical change in vehicle thinking is the range of electric-powered designs. The electric motor is a highly desirable power plant – it has a low parts count, is intrinsically efficient, compact and low-maintenance, and gives the highest torque at greatest load (when starting). Compared to the properties of the electric motor, the internal combustion engine is one of the greatest mistakes in history. The efficiencies of electric rail and light rail (streetcars, trams) are legendary and have never been bettered. The first generation of electric automobiles had almost 100 years of innovation, from the 1830s into the 1920s, when Henry Ford’s mass production, and the availability of West Texas crude, killed them. Electric vehicles were still hand-made and were marketed to the well-to-do for town use; Ford’s vehicles were one-third the price and could travel the highways then appearing all over the US. Then as now, the energy efficiency of the electric motor was no match for the challenge of providing a portable, high-endurance source of electricity. The challenge is two-fold – the cost, weight, and design-life of the electric source, and the vehicle range before recharge. Three answers have emerged – the all-electric vehicle, with available cell technology ideally suited to urban travel; the hybrid gasoline-electric vehicle (first implemented in 1916), which charges its battery while under gasoline power; and the fuel-cell vehicle, which provides continuous electricity fuelled typically from a tank of hydrogen. The fuel-cell is not a battery of electric cells; it is a catalytic electrochemical device that produces electricity while fuel is provided. All else being equal, the hydrogen fuel-cell electric vehicle – often called the hydrogen car – has all the qualities of the vehicle of the future, and market penetration will certainly grow as fuel-cell costs drop and hydrogen filling stations spread. The first hydrogen filling station in the US was opened by BP in Michigan in October 2006; it manufactures hydrogen on the premises using mains energy. DoE forecasts that hybrids will grow from 0.5% of new vehicle sales now to around 9% by 2030, but that sales of all-electric vehicles will continue to be almost non-existent at around 0.1% [sic] by 2030. Substantial decreases in the cost of electric vehicles may change those figures dramatically.
Futures
Infrastructure
The strategic plan of the US Climate Change Technology Program (CCTP) recognizes that transition away from GHG-emitting (and uncertain) fossil fuels to renewables will require “continued improvements in cost and performance of renewable technologies”, which is obvious, but, most significantly for real progress, the plan calls for “shifts in the energy infrastructure to allow a more diverse mix of technologies to be delivered efficiently to consumers in forms they can readily use.” This means two things – energy infrastructures must link to “a portfolio of renewable energy technologies” in situ, and – most radically – the grid must fully accommodate two-way energy flows to and from local areas and individual consumers. Energy corporations are no longer simply in the business of selling electricity (or gas); they must now sell connectedness to a dynamic and smart grid that is evolving every year (or every day) into a greener network of resources. Reticulation itself is the new industry.
It is the mix of a “portfolio” of new energy technologies that brings a new order of energy security. The distributed nature of DARPA’s internet is the key to its ruggedness, and distributed generation is the only possible answer for the electricity grids of the future. The distributed generation concept also opens an entirely new possibility in world development. For instance, micro-turbines fed on biomass gases (or reciprocating engines fed on cow dung) could bring electricity to clusters of Indian villages now, rather than waiting for massive investment in huge central coal-fired power stations and high-voltage distributors.
In its simplest forms, local generation of energy is not new – elegantly simple solar hot-water technology has saved millions of GWh during its decades of history, and solar-powered remote telephone exchanges and satellite uplinks have also for decades brought communication to remote regions throughout the world.
Distributed Generation
One raft of clean, green solutions for distributed energy comes from unexpected sources – the jet engine, and an engine design dating from 1816. The laws of physics allow the turbine (“jet”) engine to scale very well from the very large to the very small, which is not possible with the internal combustion (gasoline or diesel) engine; hence micro-turbines run from a portable or fixed gas supply can generate electricity with high efficiency anywhere, anytime. The Stirling external-combustion reciprocating engine is the Beta tape of the engine world: the whims of investment rather than technological supremacy allowed the gasoline engine to kill off not only the electric vehicle in the 1920s but also proven, mature technologies such as the Stirling engine, which is highly efficient, has few moving parts, and is open to a wide range of fuels.
Large-scale gas turbines in commercial power generation waste around two-thirds of energy input as heat dissipated into the atmosphere. Cogeneration – which harnesses and delivers this heat as a valued service – is an approach that more than doubles energy efficiency and halves GHG emissions. Some large institutions now use cogeneration to provide off-grid electricity and heating (or refrigeration). Large commercial power plants are generally distant from populations; cogeneration plants based on small turbines can be installed in a matter of days in the middle of population centers where the electricity and heating/cooling services are used. Micro-turbines – with or without cogeneration – are quiet, have a higher power density (power to weight) than piston engines, extremely low emissions and very few moving parts (sometimes just one). Some are designed to be air-cooled and can operate without lubricants or coolants. They can use a range of fuels – propane, diesel, kerosene, methane, or other biogases from landfills and sewage treatment plants. A transportable turbine generator can be brought to the source of the biogas and latched into the grid, rather than the biogas needing to be somehow moved to a power plant. This cavalier fashion in which generators can be attached to (and detached from) the grid depends first on a regulatory framework that provides for it; there are no technical obstacles – modern power-switching “black box” technology is highly sophisticated and inexpensive.
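The cogeneration arithmetic is straightforward. The sketch below assumes roughly one-third electrical efficiency (as the text notes) and an assumed 45% of fuel input recovered as useful heat; both figures are illustrative, not plant data.

```python
# Rough cogeneration arithmetic (illustrative percentages only).
fuel_in = 100.0                       # arbitrary units of fuel energy
electrical_efficiency = 0.33          # ~one-third becomes electricity
recoverable_heat_fraction = 0.45      # assumed share of input captured as useful heat

electricity = fuel_in * electrical_efficiency
useful_heat = fuel_in * recoverable_heat_fraction

simple_cycle_useful = electricity               # waste heat dumped to atmosphere
cogen_useful = electricity + useful_heat        # waste heat delivered as a service

print(f"Simple cycle: {simple_cycle_useful/fuel_in:.0%} of fuel energy made useful")
print(f"Cogeneration: {cogen_useful/fuel_in:.0%} of fuel energy made useful")
# ~33% versus ~78%: more than double, consistent with the claim above.
```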
Dumb Grids and Smart Networks
In just 13 minutes the power grid of the 80,000-square-mile Canada-US Eastern Interconnection area was toast.  – Steve Silberman, “The Energy Web”
The electricity distribution network – heavy-duty copper cable terminating in every home and office in the developed world – offers a high-technology convergence that is yet to get the attention it warrants. Broadband over Power Lines (BPL) has been demonstrated for some years in places such as Australia, and in April 2006 a regulatory framework was approved in California. BPL exploits very basic physics. For many decades, consumers have been able to opt to have electric water heating turned off during peak demand in exchange for a lower tariff; the signal switching those heating circuits on and off is the crudest and earliest application of signaling over power lines. BPL at speeds of 12Mbps has already been commissioned, and the theoretical maximum bandwidth for each consumer is many times higher. BPL will allow remote customers beyond ADSL telephone service or cable to have broadband services. In areas already serviced, BPL will provide a “third pipe” ensuring even keener competition for broadband services.
More importantly, BPL enables smart grid technologies that have been the subject of very detailed theoretical development. A smart grid would allow domestic devices to talk back to the power grid; the consumer could set simple equipment, probably from their existing PC, to negotiate prices and qualities of electricity supply. Doubtless a deluge of BPL interfaces and software will appear once penetration reaches a critical point, and will give the term “smart building” a real meaning. A new activity, hobby, obsession of “energy tuning” will emerge that will not only switch electrical devices but will decide which should be adjusted based on the prevailing tariff. If tariffs soar during peak load, BPL-based controllers may switch a device such as an air-conditioner off, or adjust its temperature setting. Conversely, if a grid supplier sees demand slipping, it could tout for extra demand from controllers that are allowed to switch suppliers. When critical mass is reached, the presently dumb electricity grid starts to become a smart, web-like (or web-based) network. BPL offers obvious price advantages to consumers, but it also gives electricity suppliers the opportunity to smooth peaks and troughs in demand through real-time price “negotiation”. This would lead to the second-generation use of the “smart grid”. Obviously, the consumer does not get just those electrons their chosen supplier sends to them, but in aggregate the system does work like that. If enough consumers specify that 60% of their requirement must come from green sources, the system will warn household controllers (or simply start switching things off) when the network aggregate demand reaches the level of supply. The market would become “perfect”.
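A hypothetical sketch of such an “energy tuning” controller is below. The device names, tariff thresholds, and price signals are invented for illustration; no particular BPL product or protocol is implied.

```python
# Hypothetical sketch of a household controller reacting to a real-time tariff
# received over a smart grid / BPL link. Devices, thresholds and tariffs are invented.
def plan_loads(tariff_cents_per_kwh: float, devices: dict) -> dict:
    """Return an on/off/adjust decision for each device at the current tariff."""
    decisions = {}
    for name, spec in devices.items():
        if tariff_cents_per_kwh <= spec["max_tariff"]:
            decisions[name] = "run"
        elif spec.get("adjustable"):
            decisions[name] = "run at reduced setting"   # e.g. raise the A/C set-point
        else:
            decisions[name] = "defer until tariff drops"
    return decisions

devices = {
    "air_conditioner": {"max_tariff": 12.0, "adjustable": True},
    "clothes_dryer":   {"max_tariff": 9.0},
    "water_heater":    {"max_tariff": 15.0},
}

for tariff in (7.5, 14.0, 25.0):   # sample price signals from the grid, cents per kWh
    print(tariff, plan_loads(tariff, devices))
```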
Another issue concerns quality. For crude electric uses such as heating and lighting, the quality of the supply is secondary to continuity – dim lights are better than no lights – but there is an increasing range of manufacturing processes and home-office requirements that are greatly inconvenienced by as few as one or two blackouts or brownouts each year. A growing demand for power conditioning is certain. Whether this will be manifested in sophisticated (and expensive) home-office systems or in local / neighborhood systems remains to be seen. What is certain is that vast electric grids stepping high-voltage AC down to districts and then down again within each local area are simply unable to guarantee the level of quality that high-tech equipment needs. Lightning events, storms, or a road accident bringing down lines all threaten the Goliath hub-and-spokes model of the electric grid. Traditionally, electricity suppliers have had a supply goal of 99.9% (“three-9s”), representing an outage of about nine hours in a year. In India, Iraq, and all of the developing world that goal is a distant dream; in the middle of Manhattan, or Tokyo, nine hours over two or three incidents a year is no longer tolerable. More importantly, the quality of the supply falls below specification far more often than the outage figure suggests. New goals in the electricity industry – and the high-tech equipment lobby – speak of “nine-9s” (99.9999999%) as the new reliability and quality standard. For practical purposes this is impossible to meet without a systematic decentralization of the grid. To achieve this next generation, electricity sub-stations and transformer points throughout the grid must be able to disentangle themselves from a grid crisis and continue to serve local areas with acceptable quality for a practical period of time. All of these possibilities need just one more “black box” in each building’s meter box, where the grid meets the consumer. This would be the intelligent junction for any or all of the following:
•  tail-end from any local photovoltaics or wind power (DC, probably 12V);
•  tail-end from any local mains voltage generation;
•  head end of building mains power circuits;
•  head end for building DC circuit/s;
•  head end for vehicle charging circuit;
•  tail-end from the electricity grid (the “supply”).
But, above all, policy-makers must create a regulatory framework that permits the nation’s grids to join the digital age and mandates standards and installation safety.
Near-Term:
Just as “climate change” has in a few years moved from an assertion of the lunatic fringe to scientific fact, so the realization that there is something very wrong with public energy policy will soon enter common discourse. Much data about the relative virtues of various energy initiatives is misleading (by accident or intent) because it does not take into account all the costs (cradle-to-grave) of each technology. Now that water security is more precarious than even energy security, technologies that consume water or render it unusable should be seen as high-cost. Grain ethanol technology – the shining star of the moment – is certain to come to tears at some stage as it competes directly with the food supply for land. Broadband over Power Lines (BPL) is within reach now and awaits only sound policy frameworks. The opportunity to provide basic broadband services over an existing link will attract third parties to invest in the line and head-end infrastructure, which will achieve most of what is required for an intelligent, two-way, decentralized (internet-like) energy network.
Mid-Term:
Allah has been most merciful with the distribution of oil reserves, to Arab nations and to non-Arab Muslim nations such as Iran. It seems it will ultimately be a political imperative rather than green consciousness that puts a brake on crude oil usage. Crude oil, the source of a vast array of unique plastics, is a resource too valuable to burn while there are alternatives. Fifty years ago, rail supporters said policy-makers would rue the day they neglected rail in favor of road transport – that day may be here now. Energy prices for portable fuels are certain to cause even more pain to road transport as rail lines sit growing weeds. In the mid-term, critical mass may arrive in the fuel-cell market, and lead-acid batteries and many of their modern counterparts can be consigned to recycling centers along with the internal combustion engine. With the right regulatory framework, and a regime for safe, authorized connection to the grid, the smart grid can get started. Architects in droves will join their cutting-edge colleagues who now design buildings for efficiency (or even self-sufficiency). Cities in windy areas will have tasteful wind generators; sunlit cities will have integrated photovoltaic roofs. Home energy enthusiasts in their millions will drive rapid innovation in gadgets and gizmos all aimed at “energy tuning”.
Long-Term:
Hydrogen is probably the terminal point of all energy endeavors in search of a clean, green, portable energy source, but the take-up rate will depend on the number of early adopters willing to pay a premium while the price is still high. Within a few years nuclear fusion may appear, deus ex machina, to solve the world’s energy problems forever. Or not. In all events, if China, India, and the US continue as now, the next major wars may be not over ideology, or water, but energy.
After some experimentation, the “energyplex” or “eco-industry park” concept will mature – suites of co-located industries will use the “waste” energy (often heat) of one operation, and the “waste” output material of one operation will be an input for another; process water will be recycled. This will be motivated not by clean, green sentiments but by cost savings.
Energy reticulation itself is the new industry.