Marlo Lewis

In their latest report on climate change, officials at the U.N. Intergovernmental Panel on Climate Change (IPCC) once again fail to address important developments in climate science that conflict with their narrative of fear. (See: Threat from global warming heightened in latest U.N. report)

Specifically, the IPCC press release ignores: (1) the growing divergence between observed global temperatures and the computer model projections on which scary climate impact assessments depend, (2) 20 recent studies indicating that climate sensitivity (an estimate of how much warming results from a given increase in atmospheric greenhouse gas concentrations) is about 40 percent less than the mean estimate of IPCC models, and (3) studies indicating that the three main climate doomsday scenarios — ocean circulation shutdown, rapid ice sheet disintegration, runaway warming from melting frozen methane deposits — are scientifically implausible (for references, see pp. 23-26 of CEI’s comment letter on the social cost of carbon).

Worse, as usual, IPCC officials say nary a word about risks of carbon mitigation policies. Those include:

  • The public health and welfare risks of carbon rationing schemes or taxes that raise business and energy costs.
  • The economic, fiscal, and energy security risks of anti-fracking climate policies that endanger the shale revolution.
  • The economic development risks of coal power plant bans and other policies that limit poor countries’ access to affordable energy.
  • The risks to international peace and stability of impeding developing country economic growth through carbon caps or taxes and carbon-tariff protectionism.
  • The risks to scientific integrity when government is both chief funder of climate research and chief beneficiary of a “consensus” supporting more regulation and higher taxes.
  • The risk to the democratic process when governments promote “consensus” climatology to justify bypassing legislatures and marginalizing opponents as “anti-science.”
Why and How I’m Celebrating Human Achievement Hour

“Better to light one incandescent bulb than curse the darkness”

Tonight is Human Achievement Hour, a time to celebrate human progress and the market institutions that facilitate and protect it. It’s also a time to laugh at the regressive ideology that implores us to turn out the lights to honor the Earth. Hence the wonderful acronym for our cheerful occasion: HAH!

Our friends at CFACT nail the contrast between our event and the other team’s when they proclaim: “It’s always Earth Hour in North Korea.”


HAH is an alternative and antidote to Earth Hour, the premise of which is that carbon-based energy is bad for people and the planet. That’s about as wrong-headed about the big picture as one can get.

Carbon energy supports all the technological advances that sustain and improve a world of seven billion people who, on average, live longer, healthier lives, with greater access to information, than the privileged elites of former ages.

Fossil fuels have been and remain the chief energy source of what Cato Institute scholar Indur Goklany calls a “cycle of progress” in which economic growth, technological change, human capital formation, and freer trade co-evolve and mutually reinforce each other. Progressive civilization is the very context of modern life. It is the most valuable of all public goods. Without carbon energy, humankind would be dramatically smaller, poorer, and sicker.

The fundamental contribution of carbon energy to social progress is reflected in the strong correlation between carbon dioxide (CO2) emissions, per capita GDP, and population.

Global Progress, 1760-2009

A survey by the National Academy of Engineering identifies 20 engineering achievements that made the greatest improvements in the quality of human life during the 20th century. Number One is electrification. All the others presuppose electrification for their manufacture, operation, or mass production. Here’s the list as presented on About.com:

  1. Electrification – the vast networks of electricity that power the developed world.
  2. Automobile – revolutionary manufacturing practices made the automobile the world’s major mode of transportation by making cars more reliable and affordable to the masses.
  3. Airplane – flying made the world accessible, spurring globalization on a grand scale.
  4. Safe and Abundant Water – preventing the spread of disease, increasing life expectancy.
  5. Electronics – vacuum tubes and, later, transistors that underlie nearly all modern life.
  6. Radio and Television – dramatically changed the way the world received information and entertainment.
  7. Agricultural Mechanization – leading to a vastly larger, safer, less costly food supply.
  8. Computers – the heart of the numerous operations and systems that impact our lives.
  9. Telephone – changing the way the world communicates personally and in business.
  10. Air Conditioning and Refrigeration – beyond convenience, it extends the shelf life of food and medicines, protects electronics, and plays an important role in health care delivery.
  11. Interstate Highways – 44,000 miles of U.S. highway allowing goods distribution and personal access.
  12. Space Exploration – going to outer space vastly expanded humanity’s horizons and introduced 60,000 new products on Earth.
  13. Internet – a global communications and information system of unparalleled access.
  14. Imaging Technologies – revolutionized medical diagnostics.
  15. Household Appliances – eliminated strenuous, laborious tasks, especially for women.
  16. Health Technologies – mass production of antibiotics and artificial implants led to vast health improvements.
  17. Petroleum and Gas Technologies – the fuels that energized the 20th century.
  18. Laser and Fiber Optics – applications are wide and varied, including almost simultaneous worldwide communications, non-invasive surgery, and point-of-sale scanners.
  19. Nuclear Technologies – from splitting the atom, we gained a new source of electric power.
  20. High Performance Materials – higher quality, lighter, stronger, and more adaptable.

Note too that those technologies are highly developed and deployed at scale only in societies with access to plentiful, reliable, affordable energy, most of which comes from fossil fuels.

Ah, but our greener friends will say, HAH, as the very name suggests, is “anthropocentric.” What about the biosphere? Shouldn’t we turn off the lights to show respect for non-human nature?

Nope. As Goklany also explains, by improving the productivity and efficiency of food production, distribution, and storage, fossil fuels not only rescued mankind from a penurious Nature but also rescued Nature from an ever-growing humanity.

Answering Michael Lind’s Question: Why Is No Country Libertarian?

Last week at Salon, Michael Lind raised a question he thinks “libertarians can’t answer,” namely, “If your approach is so great, why hasn’t any country anywhere in the world ever tried it?” He elaborates:

Why are there no libertarian countries? If libertarians are correct in claiming that they understand how best to organize a modern society, how is it that not a single country in the world in the early twenty-first century is organized along libertarian lines?

Lind regards the non-existence of a libertarian country as a slam-dunk refutation of libertarianism. “If socialism is discredited by the failure of communist regimes in the real world,” he asks, “why isn’t libertarianism discredited by the absence of any libertarian regimes in the real world?”

Maybe because when political communities adopt libertarian institutions, principles, and policies such as property rights, freedom of speech and association, freedom of contract, free trade, and legislative checks and balances, the results are generally good, and when communities adopt antithetical institutions and policies the results are generally bad.

Reflecting on those big-picture realities, one is led to the ideal of a society of free and responsible individuals. The ideal is a polestar that helps direct our aim. But that is all. I don’t know a single libertarian who thinks that, if we just keep pushing, some day we will all live in Libertaria.

Why are there no full-blown libertarian regimes in the real world? The answer to this question is so obvious it’s a wonder Lind hasn’t thought of it. Libertarians have actually explicated it in detail. It’s called public choice theory.


Court’s Obamacare Decision — What Would John Locke Say?

Richard Epstein of the Hoover Institution and the University of Chicago Law School gives the Chief Justice some tough love in “What Was Roberts Thinking? The Chief Justice was neither an umpire nor a statesman. Only a lawyer.”

There are many wise words in Prof. Epstein’s column, which I heartily encourage anyone visiting this site to read.

My only quibble is that the professor could have been harsher on the Honorable John Roberts. Really, Roberts held that the Obamacare individual mandate is a penalty, not a tax, so the Court could take jurisdiction, but that the mandate is a tax, not a penalty, so the Court could uphold its constitutionality. Why do Congress’s words (“penalty”), not the provision’s alleged function (“tax”), count for determining standing, while the alleged function, not the words, counts for determining constitutionality? This is “too clever by half,” as Epstein observes. The only “logic” operating here is political: pick and choose whichever meaning is convenient to get the outcome you want.

Even this ruse fails, as Epstein argues, because the mandate is in fact a penalty, not a tax. In the dissent, Justice Antonin Scalia notes that the word “penalty” occurs 18 times in the portion of the statute dealing with the individual mandate, whereas “tax” occurs in other provisions, demonstrating that Congress chose “penalty” deliberately, because, after all, the thing so labeled is not a tax. As Scalia argues, Roberts “saved” the Affordable Care Act (a.k.a. Obamacare) by “rewriting” it. Thus, Roberts’s “judicial modesty” was actually a case of “judicial overreach.” Roberts joined the liberals to legislate from the bench.

What Roberts the “statesman” doesn’t get is that when judges engage in policy-driven, results-oriented jurisprudence, they forfeit their claim to impartiality. Each time they do this, they reinforce the conclusion that the system is rigged and that justice is to be found only in the strength of one’s own party or faction — or one’s own arms. In other words, when justices are no better than politicians in black robes, they undermine the social compact and bring back the state of war.

Seventeenth-century English philosopher John Locke, with his usual clarity, said it all in the Second Treatise (Chapter III, Of the State of War):

Sec. 20. But when the actual force is over, the state of war ceases between those that are in society, and are equally on both sides subjected to the fair determination of the law; because then there lies open the remedy of appeal for the past injury, and to prevent future harm: but where no such appeal is, as in the state of nature, for want of positive laws, and judges with authority to appeal to, the state of war once begun, continues, with a right to the innocent party to destroy the other whenever he can, until the aggressor offers peace, and desires reconciliation on such terms as may repair any wrongs he has already done, and secure the innocent for the future; nay, where an appeal to the law, and constituted judges, lies open, but the remedy is denied by a manifest perverting of justice, and a barefaced wresting of the laws to protect or indemnify the violence or injuries of some men, or party of men, there it is hard to imagine any thing but a state of war: for wherever violence is used, and injury done, though by hands appointed to administer justice, it is still violence and injury, however colored with the name, pretences, or forms of law, the end whereof being to protect and redress the innocent, by an unbiased application of it, to all who are under it; wherever that is not bona fide done, war is made upon the sufferers, who having no appeal on earth to right them, they are left to the only remedy in such cases, an appeal to heaven. [emphasis added]

The ALEC Controversy — Much Ado About Nothing

I listened to the NPR segment about The Nation magazine and Center for Media and Democracy’s (CMD’s) alleged exposé on the American Legislative Exchange Council (ALEC), a national association of conservative state legislators. Calling their project “ALEC Exposed,” The Nation and CMD try to make hay out of the well-known fact that ALEC’s task forces, which include both public- and private-sector members, draft model legislation “behind closed doors.” As if any lawmaker ever places in the public record his deliberative discussions with staff and lobbyists on legislation he is drafting!

The Nation and CMD also neglect to mention that although private-sector members have a “voice and a vote” in the task forces that develop model legislation, only the public-sector (state-legislator) members of ALEC’s board of directors decide which proposals become ALEC model bills.

More importantly, ALEC model bills become law only if they go through the same process of hearings and debate as any other bill introduced in a state legislature.

I checked out The Nation’s landing page, then “Business Domination, Inc.” (the first of five articles posted on the site), then the CMD site and its archive of 800-plus ALEC model bills. There’s no there there. What The Nation and CMD are waging is just another lefty campaign to drive the marketplace out of the marketplace of ideas.

In “Business Domination, Inc.,” the authors claim that ALEC believes that “Any force in civil society, especially labor, that contests the right of business to grab all social surplus for itself, and to treat people like road kill and the earth like a sewer, should be crushed.” Didn’t Karl Marx say stuff like that about all capitalists?


A fascinating article in the New Yorker (September 13, 2010) by Jonah Lehrer describes how publication bias, selective reporting, and sheer randomness mistaken for causal connections can mislead even the most disciplined researcher. The article is a sobering reminder of “how difficult it is to prove anything” even using carefully constructed, replicable experiments such as double-blind clinical trials.

A few excerpts should whet the appetite to read the essay in full:

But the data presented at the Brussels meeting [on second-generation antipsychotic drugs] made it clear that something strange was happening: the therapeutic power of the drugs appeared to be steadily waning. A recent study showed an effect that was less than half of that documented in the first trials, in the early nineteen-nineties. Many researchers began to argue that the expensive pharmaceuticals weren’t any better than first-generation antipsychotics, which have been in use since the fifties. “In fact, sometimes they now look even worse,” John Davis, a professor of psychiatry at the University of Illinois at Chicago, told me.

But now all sorts of well-established, multiply confirmed findings have started to look increasingly uncertain. It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable. This phenomenon doesn’t yet have an official name, but it’s occurring across a wide range of fields, from psychology to ecology.

If replication is what separates the rigor of science from the squishiness of pseudoscience, where do we put all these rigorously validated findings that can no longer be proved? Which results should we believe?

[Biologist Michael] Jennions . . . argues that the decline effect is largely a product of publication bias, or the tendency of scientists and scientific journals to prefer positive data over null results, which is what happens when no effect is found.

[Publication] bias was first identified by the statistician Theodore Sterling, in 1959, after he noticed that ninety-seven per cent of all published psychological studies with statistically significant data found the effect they were looking for. A “significant” result is defined as any data point that would be produced by chance less than five per cent of the time. This ubiquitous test was invented in 1922 by the English mathematician Ronald Fisher, who picked five per cent as the boundary line, somewhat arbitrarily, because it made pencil and slide-rule calculations easier. Sterling saw that if ninety-seven per cent of psychology studies were proving their hypotheses, either psychologists were extraordinarily lucky or they published only the outcomes of successful experiments.

While publication bias almost certainly plays a role in the decline effect, it remains an incomplete explanation. For one thing, it fails to account for the initial prevalence of positive results among studies that never even get submitted to journals. It also fails to explain the experience of people like [Jonathan] Schooler, who have been unable to replicate their initial data despite their best efforts.

Richard Palmer, a biologist at the University of Alberta, who has studied the problems surrounding fluctuating asymmetry, suspects that an equally significant issue is the selective reporting of results—the data that scientists choose to document in the first place.

The problem of selective reporting is rooted in a fundamental cognitive flaw, which is that we like proving ourselves right and hate being wrong. “It feels good to validate a hypothesis,” Ioannidis said. “It feels even better when you’ve got a financial interest in the idea or your career depends upon it. And that’s why, even after a claim has been systematically disproven”—he cites, for instance, the early work on hormone replacement therapy, or claims involving various vitamins—“you still see some stubborn researchers citing the first few studies that show a strong effect. They really want to believe that it’s true.”

Although such reforms would mitigate the dangers of publication bias and selective reporting, they still wouldn’t erase the decline effect. This is largely because scientific research will always be shadowed by a force that can’t be curbed, only contained: sheer randomness. Although little research has been done on the experimental dangers of chance and happenstance, the research that exists isn’t encouraging.
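The publication-bias mechanism Sterling identified is easy to see in a toy Monte Carlo simulation (a sketch for illustration only, not a reconstruction of any study the article discusses): when every experiment tests a hypothesis that is in fact false, roughly five per cent still clear the significance bar, and those are precisely the studies with inflated effect estimates.

```python
import random
import statistics

random.seed(0)

def run_study(true_effect=0.0, n=30):
    """One study: sample n observations (sd = 1), return the estimated effect
    and whether it clears the conventional 5% significance threshold (|z| > 1.96)."""
    data = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.fmean(data)
    z = mean / (1.0 / n ** 0.5)      # standard error with known sd = 1
    return mean, abs(z) > 1.96

results = [run_study() for _ in range(10_000)]
published = [mean for mean, significant in results if significant]

# With no real effect, only ~5% of studies cross the threshold...
false_positive_rate = len(published) / len(results)
# ...but every "published" study reports an effect of magnitude > 0.35, even
# though the true effect is zero -- exactly the inflation that makes later
# replications look like a "decline effect."
avg_published_effect = statistics.fmean(abs(m) for m in published)
```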

A recent study by the Manufacturers Alliance/MAPI finds that EPA’s proposed revision of the “primary” (health-based) national ambient air quality standard (NAAQS) for ozone (O3) would have devastating economic impacts.

NAAQS Basics

NAAQS are ambient concentration standards expressing EPA’s judgment of how low air pollution levels must fall to “protect public health” with an “adequate margin of safety” and to “protect public welfare” from harmful effects on agriculture, animal life, and buildings. The Clean Air Act obligates States to come into attainment with NAAQS via EPA-approved emission control measures known as State Implementation Plans (SIPs). The Act requires States to attain primary NAAQS within five or, at most, 10 years. There is no statutory deadline for attaining “secondary” (welfare) NAAQS. Failure to attain NAAQS results in sanctions, such as loss of federal highway grants.

Staggering Job and GDP Losses

In January, EPA proposed lowering the primary ozone NAAQS from 75 parts per billion (ppb) to between 60 and 70 ppb. MAPI estimates that a primary ozone NAAQS set at 60 ppb would:

  • Impose annual compliance costs of $1.013 trillion between 2020 and 2030 (equivalent to 5.4% of projected GDP in 2020).
  • Reduce GDP by $687 billion in 2020 (3.5% below the baseline projection).
  • Reduce employment by 7.3 million in 2020, a figure equal to 4.3% of the projected 2020 labor force.
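As a back-of-the-envelope consistency check (illustrative arithmetic only; MAPI’s own baseline projections are not quoted here), the stated percentages imply a baseline 2020 GDP of roughly $19.6 trillion and a projected labor force of roughly 170 million:

```python
# Baselines implied by MAPI's 2020 estimates -- illustrative arithmetic only.
gdp_loss = 687e9          # $687 billion GDP reduction in 2020
gdp_loss_share = 0.035    # stated as 3.5% below the baseline projection
job_loss = 7_300_000      # 7.3 million jobs lost in 2020
job_loss_share = 0.043    # stated as 4.3% of the projected labor force

implied_gdp_baseline = gdp_loss / gdp_loss_share   # ~$19.6 trillion
implied_labor_force = job_loss / job_loss_share    # ~170 million workers
```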

In a companion report, the Senate Republican Policy Committee (SRPC) shows the MAPI-estimated job losses and “energy tax” burden (compliance cost + GDP reduction) each State would incur if EPA implements a 60 ppb ozone standard. The biggest losers are California, Pennsylvania, and Texas, although nearly all States face multi-billion dollar energy taxes and thousands to tens of thousands of lost jobs:

  • California, with a 12.4% unemployment rate and 2.2 million unemployed job seekers, would incur a total State energy tax of $210 billion and lose 846,000 jobs during 2020-2030.
  • Texas, with 8.3% unemployment and one million unemployed job seekers, would pay a $452 billion energy tax and lose 1.6 million jobs.
  • Pennsylvania, with 9.2% unemployment and almost 585,000 unemployed job seekers, would pay an $85 billion energy tax and lose 351,000 jobs.

Costs Increase as Intensity and Scale of Effort Increase

How can the impacts be so punitive? One reason, says MAPI, is that “the marginal cost of incremental reductions increases very rapidly as the standard is tightened.” As is often said, picking the low-hanging fruit is easier and cheaper than harvesting from the top of the tree. As MAPI puts it:

Initial reductions in ozone are relatively less expensive because the reductions can be achieved by using existing technologies (“known controls”) to reduce ozone precursors. As standards are tightened, more expensive technologies are required and at some point new technologies (“unknown,” yet-to-be-developed controls) are presumed [by EPA] to emerge and then be implemented.

Another reason is that ever-larger reductions in ozone-precursor emissions are required to achieve the same incremental decline in O3 concentrations. On this point, MAPI cites EPA’s July 2007 Regulatory Impact Analysis (p. 4-12):

  • Reducing O3 from 84 ppb to 79 ppb requires 102,000 tons of additional nitrogen oxide (NOx) reductions.
  • Reducing O3 from 79 ppb to 75 ppb requires 321,000 tons of additional NOx reductions.
  • Reducing O3 from 75 ppb to 70 ppb requires 1,004,000 tons of additional NOx reductions.
  • Reducing O3 from 70 ppb to 65 ppb requires 2,239,000 tons of additional NOx reductions.

The implication of those numbers is startling. To reduce O3 from 84 ppb to 79 ppb, States must reduce NOx emissions by 20,400 tons for each 1 ppb decline. However, to reduce O3 from 75 ppb to 70 ppb, States must reduce NOx emissions by 136,600 tons for each 1 ppb decline. To reduce O3 from 70 ppb to 65 ppb, States must reduce NOx emissions by 247,000 tons for each 1 ppb decline. In other words, achieving a 5 ppb decline in O3 from 70 ppb to 65 ppb takes 12 times the NOx reductions required to achieve a 5 ppb decline from 84 ppb to 79 ppb. The effort is greater by more than an order of magnitude. Presumably, an even greater effort would be required to reduce O3 from 65 ppb to 60 ppb.
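Those per-ppb figures follow if the RIA tonnage numbers in the bullets above are read as cumulative totals relative to the 84 ppb baseline (an interpretive assumption, but the one under which the arithmetic works out):

```python
# EPA's 2007 RIA NOx-reduction figures (tons), read as cumulative totals
# relative to the 84 ppb baseline -- an interpretive assumption, but the one
# that reproduces the per-ppb numbers cited in the text.
cumulative = {79: 102_000, 75: 321_000, 70: 1_004_000, 65: 2_239_000}

def per_ppb_increments(baseline_ppb, cumulative):
    """Tons of NOx reduction required per 1 ppb of ozone decline, by step."""
    steps = sorted(cumulative, reverse=True)   # [79, 75, 70, 65]
    prev_ppb, prev_tons = baseline_ppb, 0
    out = {}
    for ppb in steps:
        out[(prev_ppb, ppb)] = (cumulative[ppb] - prev_tons) / (prev_ppb - ppb)
        prev_ppb, prev_tons = ppb, cumulative[ppb]
    return out

incr = per_ppb_increments(84, cumulative)
# incr[(84, 79)] -> 20,400 tons/ppb; incr[(75, 70)] -> 136,600; incr[(70, 65)] -> 247,000
# The 70-to-65 step costs about 12x the 84-to-79 step (247,000 / 20,400 = 12.1).
```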

The dramatic increase in the scale of effort is evident from the sharp increase in the number of counties that fall out of attainment as the standard is tightened from 84 ppb down to 60 ppb.

85 Counties with Monitors Violate the 1997 (84 ppb) Ozone Standard


322 Counties with Monitors Violate the 2008 (75 ppb) Ozone Standard


Up to 650 Counties with Monitors Violate Proposed (60-70 ppb) Ozone Standards


Source: EPA; Congressional Research Service.

Of the 675 counties nationwide that have ozone monitoring stations, 85 violate the 84 ppb (1997) ozone standard, 322 violate the 75 ppb (2008) standard, and 515 to 650 violate proposed standards ranging from 70 to 60 ppb. More than 96% of all counties with monitoring stations violate the most stringent standard EPA is considering. Most of the nation’s 3,140 counties do not have monitoring stations; if all were monitored, many more than 650 counties would likely have to deploy both known controls and “unknown,” yet-to-be-developed technologies to come into attainment with a 60 ppb standard.

How Dangerous Are Current Ozone Levels?

A predictable response to the MAPI and SRPC reports is that ozone kills and we should do everything possible to protect “the children.”

Joel Schwartz and Steven Hayward of the American Enterprise Institute analyze the literature on ozone and health in their book, Air Quality in America: A Dose of Reality on Air Pollution Levels, Trends, and Health Risks. They present substantial evidence that ozone at current levels is a relatively minor health risk:

  • In about one third of the cities examined in a Johns Hopkins air pollution study, “higher levels of particulate matter and ozone were associated with lower risks of premature death.”
  • After adjusting for “publication bias” (the tendency of researchers to submit for publication only those studies that confirm their initial hypothesis), a World Health Organization (WHO) analysis “concluded that higher ozone was associated with lower respiratory mortality.”
  • When properly analyzed, a much-touted California Air Resources Board (CARB) study on ozone and childhood asthma actually shows that no areas in California have ozone levels high enough to affect childhood asthma risk.
  • The same CARB children’s health study found no association between ozone standard violations and growth in children’s lung function.
  • Large increases in asthma prevalence have coincided with large declines in air pollution, indicating that “asthma incidence and air pollution are unrelated.”
  • EPA’s proposal to revise the standard down to between 60 and 70 ppb is based on a study that found a small (1-1.5%) average reduction in lung function in 30 healthy young adults who breathed laboratory air averaging 60 ppb for 6.6 hours. To get this result, the subjects alternately exercised on stationary bicycles and treadmills for six 50-minute periods. This is equivalent to several gym workouts in a row, well beyond the exertions that people in “sensitive populations” (infants, people with respiratory disease, the elderly) typically undertake.
  • Moreover, the ozone concentrations measured by outdoor monitors may exceed the actual levels people breathe by as much as 65%, because surfaces near the ground (streets, buildings, even clothing) destroy ozone. A laboratory study of the effect of 60 ppb ozone is thus more likely capturing the effects of outdoor ozone of at least 100 ppb – well above the current standard.
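The last bullet’s arithmetic can be made explicit (a sketch using only the figures stated above):

```python
# If outdoor monitors read up to 65% higher than the ozone people actually
# breathe (because near-ground surfaces destroy ozone), then a laboratory
# exposure of 60 ppb corresponds to an outdoor monitor reading of roughly:
LAB_EXPOSURE_PPB = 60
MONITOR_OVERSTATEMENT = 0.65

implied_monitor_ppb = LAB_EXPOSURE_PPB * (1 + MONITOR_OVERSTATEMENT)
# ~99 ppb, i.e. about 100 ppb -- well above the 75 ppb standard.
```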

EPA and CARB characterize ozone as a deadly peril, which is hardly surprising. Regulatory agencies exist to regulate. The scarier the assessment, the greater the apparent rationale for expanding the scale and scope of regulation. On the flip side, as my colleague Ben Lieberman observes, the “non-attainment industry” would take a huge hit if the Nation finally did come into attainment with all applicable air quality standards. To stay in business, the regulatory establishment must continually campaign for tougher standards as U.S. air quality improves.

Schwartz and Hayward ask: If current ozone levels are so deadly, then how come EPA and CARB project such tiny health benefits from reductions in those levels? For example, EPA estimated that switching from the pre-1997 ozone standard of 120 ppb averaged over 1 hour to the tougher standard of 84 ppb averaged over 8 hours would reduce hospitalizations for asthma attacks by only 0.6%. CARB estimated that adopting its even tougher 70 ppb standard would reduce emergency room visits for asthma by 0.35%. Even these small benefits are likely to be overestimates since the projections are “based on a selective reading of the health effects literature that ignores contrary evidence,” Schwartz and Hayward argue. And I’ve got to wonder, given the multitude of factors that influence hospitalization rates, how would EPA and CARB ever know whether a tiny reduction in hospitalization rates were due to their regulations rather than to a host of other unrelated causes?

Wealthier Is Healthier, Poorer Is Sicker

The irony is that adopting costly new air quality standards may actually impede improvements in public health. The resources available to protect public health, safety, and the environment are finite. Consequently, policymakers should set priorities to target limited resources on the most serious risks. Forcing the private sector to spend trillions of dollars to achieve minuscule or non-existent health benefits hinders rather than advances public welfare. Moreover, because people use income to enhance their health and safety, regulations that destroy jobs, lower wages, and increase the cost of consumer products can literally be lethal. Spare-no-expense, health-at-any-cost regulation ignores the obvious connection between livelihoods, living standards, and life expectancy.

A prosperous economy supports the development of improvements in health care and makes those improvements more widely available. In contrast, a faltering economy diminishes investment in R&D and curbs spending on life- and health-enhancing goods and services. Unemployment is stressful and is associated with unhealthy habits such as smoking and excessive drinking. Several studies (here, here, here, here, and here) confirm what common sense tells us — that poverty and unemployment increase the risk of sickness and death. As the late Aaron Wildavsky observed long ago, wealthier is healthier. An ozone NAAQS that imposes trillion-dollar energy taxes on our struggling economy and destroys over 7 million jobs is likely to do much more harm than good.

In a message titled “EPA WILL REGULATE GLOBAL WARMING IN STATES WITH OR WITHOUT AUTHORITY,” the ever-vigilant Maryam Brown of the Senate Republican Policy Committee reports:

As you likely saw, Senator Baucus [D-MT] said yesterday that he would strip U.S. EPA’s authority to regulate greenhouse gas emissions under the Clean Air Act: “That would put too much power into few hands.” (Source: E&E News) Senator Baucus’s apprehension about EPA’s power over all activity is well placed.

On October 5th, EPA officials said that those states not cooperating come January 2nd would face a gap in permitting authority that could prevent sources from receiving the necessary permits. [In plain English: If states don’t come along, the Obama EPA will hold up projects (and the jobs that go with them) in your state.] (Source: BNA Daily)

Because these statements echoed the fears of states such as Texas that EPA has a “plan for centralized control of industrial development through the issuance of permits for greenhouse gases,” EPA issued a clarifying statement on October 6th: “EPA has a mechanism in place to ensure permitting can occur without disruption in any states that currently do not have authority to regulate GHG.” [In plain English: Whether there is authority or not, the Obama EPA will regulate the states.] (Source: BNA Daily)

Baucus’s opposition to EPA regulation of greenhouse gases is noteworthy for three reasons.

First, as E&E News observes, Sen. Baucus “is considered a key vote to obtain in order to pass any climate bill and a bellwether for many other moderate Democrats on the issue.” Second, Baucus voted against Sen. Lisa Murkowski’s resolution (S.J.Res. 26) to overturn EPA’s Endangerment Rule — the trigger for a cascade of greenhouse gas regulation under the Clean Air Act. If he is a “bellwether,” then other opponents of S.J.Res. 26 may also have come to their senses and realized that Congress should not let EPA legislate climate policy.

Third, although Baucus may not acknowledge it, his “too much power into few hands” argument is tacit criticism of the Supreme Court’s ruling in Massachusetts v. EPA, which both authorized and pushed EPA to regulate greenhouse gases via the Clean Air Act. The Court authorized EPA to regulate greenhouse gases when it declared that “greenhouse gases fit well within the Clean Air Act’s capacious definition of ‘air pollutant’” (they don’t, as I explain here).

In addition, the Court pushed EPA to regulate greenhouse gases by pre-judging EPA’s endangerment proceeding. The Court held that EPA must make a positive finding of endangerment if it decides that “greenhouse gases cause or contribute to climate change” — as if climate change per se constituted endangerment. Since greenhouse gases by definition have a greenhouse effect, the Court left EPA only one alternative — declare that “the scientific uncertainty is so profound that it precludes EPA from making a reasoned judgment as to whether greenhouse gases contribute to global warming.” That was an impossible alternative for an agency that had been a certified member of the alleged “scientific consensus” for many years.

The key point regarding Mass. v. EPA, though, is that Sen. Baucus is almost uniquely qualified to rebut the claim that the Clean Air Act authorizes EPA to regulate greenhouse gases from new motor vehicles. During congressional deliberation on the Clean Air Act Amendments of 1990, Baucus introduced legislation requiring EPA to do just that. As originally introduced on September 14, 1989, S. 1630, the Senate version of the 1990 Clean Air Act Amendments, contained a Section 216 on “Carbon Dioxide Emissions from Passenger Cars.” The provision would have required the Administrator to establish tailpipe emission standards for CO2:

SEC. 216. (a) PROMULGATION OF REGULATIONS- The Administrator shall promulgate regulations providing for standards applicable to emissions of carbon dioxide from passenger automobiles (as defined in 15 U.S.C. 2001(2)). Such standards shall require that for model years 1995 to 2002, the average of such emissions from passenger automobiles manufactured by any manufacturer shall not exceed two hundred and forty two grams per mile, and for model year 2003 and thereafter, such average shall not exceed one hundred and seventy grams per mile.

However, the Senate declined to adopt that provision. Another part of Baucus’s draft legislation, Title VII of S. 1630, would have made “global warming potential” a basis for regulating “substances manufactured for commercial purposes,” such as chlorofluorocarbons and halogens. Although Title VII declared reductions in CO2 and methane emissions a national goal, it did not explicitly provide authority to regulate those gases, which are byproducts of combustion and agricultural activity rather than “substances manufactured for commercial purposes.”

In any event, the House-Senate conference committee ultimately rejected even that limited basis for global warming regulation while also dropping Title VII’s goal of reducing CO2 and methane emissions. The only trace of Title VII’s climate language that survived is Section 602(e) of Title VI, which directs the Administrator to “publish” the “global warming potential” of ozone-depleting substances. To ensure that trigger-happy regulators would not go off half-cocked, the phrase “global warming potential” is immediately followed by this admonition: “The preceding sentence shall not be construed to be the basis of any additional regulation under [the CAA].”

So with the possible exception of Rep. John Dingell (see pp. 65-66 of this committee print), who chaired the House-Senate conference committee on the 1990 Clean Air Act Amendments, probably nobody on Capitol Hill knows better than Sen. Baucus that Congress never authorized EPA to regulate greenhouse gases for climate change purposes. Baucus tried to persuade the Senate to approve greenhouse gas emission standards for new motor vehicles — and failed. House and Senate conferees also rejected the other greenhouse gas regulatory provisions he had proposed. A lawmaker doesn’t forget stuff like that!

And now, 20 years later, Baucus is willing to break ranks with his own party leadership and incur the wrath of the green establishment because EPA is amassing powers that, in the last major re-write of the Clean Air Act, he tried and failed to confer on the agency via legislation. Sen. Baucus, I salute you! OK, I will salute you if you match your brave words with action and do something to stop EPA!

The Court in Mass. v. EPA ignored its own better judgment: “Few principles of statutory construction are more compelling than the proposition that Congress does not intend sub silentio [by its silence] to enact statutory language that it has earlier discarded in favor of other language.” INS v. Cardoza-Fonseca, 480 U.S. 421, 442-43 (1987). It is not too late to correct the Court’s error. If Sen. Baucus is indeed a bellwether, that correction may not be long in coming.

Energy Secy. Steven Chu kicked off a three-day federal “sustainability” symposium today by announcing that the Department of Energy will install solar rooftop water-heating panels on . . . the White House.

“Around the world, the White House is a symbol of freedom and democracy,” Chu told an audience of federal employees, according to Greenwire, the online energy & environment news service. “It should also be a symbol of America’s commitment to a clean energy future.”

Apparently, nobody interviewed in connection with the article sees anything goofy about the mighty DOE and the White House trying to save the planet one rooftop at a time. Nor anything comedic in talking about the future of presidential bath and shower water.

Chu’s announcement came one month after eco-activist Bill McKibben led a demonstration demanding that President Obama install rooftop solar panels. To show that if you will it, it is not a dream (okay, I’m editorializing here), McKibben presented White House officials with a solar panel from Jimmy Carter’s White House. Initially, they rebuffed him. But now, they’ve taken one small symbolic step back to the Carter future! Of course, McKibben hails Chu’s pledge as a giant step for mankind.

“The White House did the right thing, and for the right reasons: They listened to the Americans who asked for solar on their roof, and they listened to the scientists and engineers who told them this is the path to the future,” said McKibben, the co-founder of the nonprofit “If it has anything like the effect of the White House garden, it could be a trigger for a wave of solar installations across the country and around the world.”

Yup, hardly anybody “across the country and around the world” would be planting flowers or “installing” flower gardens if the White House had not shown the way via those Rose Garden tours!

Apparently, nobody interviewed by Greenwire wanted to mention the elephant in the room, namely, that McKibben’s symbolic victory is a far cry from the political victory Team Obama and eco-campaigners boasted they would win by enacting cap-and-trade.

I’ve got nothing against solar technology, which has come a long way since the Carter days. Nonetheless, outside of certain niche markets and applications, solar is not competitive with fossil energy or even with other so-called non-hydroelectric renewable energies. See slide #21 of the Energy Information Administration’s PowerPoint presentation on its 2010 Annual Energy Outlook report.

Yes, solar power has enjoyed a rapid growth spurt in Germany, but that is due to market-rigging subsidies known as feed-in tariffs. If an industry cannot sustain itself without special policy privileges, does it really deserve to be called “sustainable”?

If approved by the California electorate this November, Proposition 23 will suspend the implementation of AB 32, the California Global Warming Solutions Act, until the State’s unemployment rate declines to 5.5% or less for four consecutive quarters. AB 32 requires a reduction in the State’s greenhouse gas emissions to 1990 levels by 2020 — about 25-30% below the baseline projection.

In a just-published study for the Pacific Research Institute, Dr. Benjamin Zycher estimates that adoption of Proposition 23 will increase aggregate employment in the State by a bit less than 150,000 in 2011, about half a million in 2012, and 1.3 million in 2020, relative to the case in which AB 32 goes into effect.

The California Air Resources Board projects that AB 32 will decrease State-wide energy consumption by 4.5% in 2012 and 9.4% in 2020. Energy, of course, is used to support economic activity: “workers use energy to accomplish their tasks.”

Zycher derives AB 32’s employment impacts from CARB’s energy-consumption projections during 2010-2020 and data on the historical relationship between energy consumption and employment in California.
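The logic of such a derivation can be sketched as a back-of-envelope calculation: multiply a baseline employment level by the projected percentage decline in energy use and by an elasticity linking employment to energy consumption. To be clear, the elasticity and baseline-employment figures below are illustrative assumptions, not numbers from Zycher’s study; only the 4.5% and 9.4% energy-consumption declines come from CARB’s projections cited above.

```python
# Toy illustration only: ELASTICITY and BASELINE_JOBS are assumed
# placeholder values, not figures from Zycher's study or CARB.

def jobs_impact(baseline_jobs, energy_decline_pct, elasticity):
    """Rough employment impact of a projected percentage drop in energy
    use, assuming a constant employment-to-energy elasticity."""
    return baseline_jobs * (energy_decline_pct / 100.0) * elasticity

BASELINE_JOBS = 16_000_000  # assumed rough scale of California employment
ELASTICITY = 0.8            # assumed: % employment change per % energy change

# CARB's projected energy-consumption declines under AB 32 (from the text)
for year, decline_pct in [(2012, 4.5), (2020, 9.4)]:
    print(year, round(jobs_impact(BASELINE_JOBS, decline_pct, ELASTICITY)))
```

With these assumed inputs the toy numbers land in the same neighborhood as the study’s estimates (about half a million jobs in 2012 and roughly 1.2-1.3 million in 2020), which is the kind of rough consistency check this style of derivation permits; the study itself uses historical California data rather than an assumed elasticity.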