Red Team EPA’s Endangerment Finding

The foundation for existing EPA regulations controlling greenhouse gases, primarily carbon dioxide (CO2), is the agency’s Endangerment Finding, issued after the 2007 Supreme Court decision holding that greenhouse gases could be considered pollutants under the Clean Air Act (CAA) and that EPA could regulate them if it found that they endangered human health and the environment. In 2009, EPA did exactly that. A subsequent challenge was rejected by the D.C. Circuit Court of Appeals, which found the “Endangerment Finding … (is) neither arbitrary nor capricious … and EPA’s interpretation of the governing CAA provisions is unambiguously correct.”

Although the appeals court decision appears to be an insurmountable hurdle, since the Supreme Court has shown no interest in revisiting its 2007 decision, the Finding is still being challenged, even though Administrator Pruitt seems unwilling to initiate the rulemaking process to withdraw it. And, based on commentary by environmental lawyers on the prospects for challenging the Finding and on the procedures of the Administrative Procedure Act, success is not certain. However, the significance of the Endangerment Finding makes the effort worthwhile. Otherwise, a new Administration can undo the regulatory rollback efforts that are now underway.

Nonetheless, the Texas Public Policy Foundation (TPPF) has filed a petition with EPA challenging the Endangerment Finding, claiming that “in its rush to regulate greenhouse gases in 2009, the Obama Administration missed an important step. It failed to submit the greenhouse gas endangerment finding to the Science Advisory Board for peer review, as required by statute, and that violation is fatal to the endangerment finding.”

In addition to challenging the Finding on the basis of a procedural flaw, it would be prudent to attack its very foundation. In issuing the Finding, EPA concluded, based on peer-reviewed research from the Intergovernmental Panel on Climate Change, the U.S. Global Change Research Program, and the National Research Council, that there was compelling evidence that carbon dioxide emissions endanger public health and welfare by warming the planet, leading to extreme weather events and increased mortality rates.

Last month, Steven Koonin, a well-respected physicist and former Under Secretary of Energy in the Obama Administration, proposed in the Wall Street Journal that climate science be subjected to a “Red Team” exercise. As Dr. Koonin wrote, “The national-security community pioneered the ‘Red Team’ methodology to test assumptions and analyses, identify risks, and reduce—or at least understand—uncertainties. The process is now considered a best practice in high-consequence situations.”

Taking on the entire field of climate science would be a daunting undertaking, and it is doubtful that Congress would set up a commission to do so. Instead, petitioners should focus on just the foundation of the Finding, which would be much more manageable and potentially more successful. First, the assertion that carbon dioxide is a pollutant is vulnerable to a large body of science and to the greening of the planet that has been taking place for decades. Second, the contention that increasing levels of CO2 lead to temperature increases that cause extreme weather events like hurricanes and premature deaths is vulnerable to scientific facts and accumulating empirical evidence. Hurricanes have not been increasing, and claims of increased premature deaths are products of computer models that cannot withstand careful scientific scrutiny.

In 1993, the Supreme Court in Daubert v. Merrell Dow Pharmaceuticals established standards for the admissibility of scientific evidence in judicial proceedings. The Court’s decision provides guidance that can be used to rebut EPA’s findings about CO2. In particular, the Court stated, “in order to qualify as scientific knowledge, an inference or assertion must be derived by the scientific method.” It went on to state, “Scientific methodology today is based on generating hypotheses and testing them to see if they can be falsified.” The hypotheses that are the foundation for the Finding can be falsified, as a Red Team exercise would show.

Professional Fakery

A major source of fake news and alternative facts is professionals who engage in political reality-spinning while using their credentials as a source of legitimacy. A recent example is the opinion piece in the Wall Street Journal by Alan Blinder, Princeton economist, former vice chairman of the Federal Reserve, and former member of the President’s Council of Economic Advisers.

Blinder’s snarky assessment of the Trump tax plan—which is not really a plan but a set of principles—is the work of a political hack, not that of a well-respected academic. Since the President released only a one-page summary of reform principles, Blinder must think he is clairvoyant to divine the form legislation will take and the breakdown between winners and losers. Since he is not clairvoyant, he relies on what he calls the “Republican Tax Cut Formula.”

Like all progressives, Blinder calls the proposal, which he translates into a plan, “remarkably regressive” because it lowers the top tax rate from 39.6% to 35%, cuts the corporate rate from the highest in the developed world to 15%, and makes that rate apply to Subchapter S corporations.

According to a report from Yardeni Research, the top 10% of taxpayers, those with incomes over $100,000—hardly rich—pay 77% of federal income taxes. The top 50% of taxpayers pay a whopping 97% of federal income taxes, leaving the bottom 50% paying 3%. How can reducing the top rate while also lowering the lowest rates be regressive? Blinder must have an econometric model that makes it so.

The average corporate tax rate in developed countries is 22.5%, while ours is 39% when state taxes are included. This rate makes US corporations less competitive and creates an incentive to move investment offshore and keep foreign earnings overseas. It is estimated that US companies hold about $2.5 trillion overseas, untaxed. We live in a global economy, so making our corporations more competitive increases domestic capital investment, the route to increased productivity and job creation. Blinder dismisses the change for Subchapter S corporations as a sop to hedge funds, real estate developers, and law firms—organizations that don’t have great public images. The reality is far different from Blinder’s illusion. According to the Tax Foundation, there are 24.7 million US corporations; 23 million are Subchapter S corporations, which are subject to personal tax rates. They are far more numerous and far more diverse than the hedge funds and others that the public scorns.

In his opinion piece, Blinder says, “The system would remain complicated, unfair, and inefficient. But the richest would pay much less.” Since legislation has not been written, how can he know this? He doesn’t; it’s just alternative facts and fake news.

Republicans have a once-in-a-generation opportunity to begin the process of putting our fiscal house in order since they control both Congress and the White House. They realize this, and they also realize that the picture could change next year.

That reality should give a sense of urgency to comprehensive tax reform. The current tax code has 4 million words, which according to the Washington Times is seven times the length of War and Peace. The Times also points out that 75 years ago the IRS Form 1040 had two pages; now it has 206. Major reform that involves simplification is not an impossible objective. Part of achieving it involves abandoning the use of the code to push industrial policy or other social objectives, and it requires both political parties to work together.

Since entitlements represent two-thirds of federal spending, fixing Social Security, which Blinder didn’t mention, is an imperative. The 1983 bipartisan National Commission on Social Security Reform represented a good start, but it is clear that without another effort, both the national debt and the deficit will continue to grow until our economic wellbeing is undermined.

People like Blinder and his colleague Paul Krugman should be using their professional talents and analytical rigor to address real fiscal problems instead of marketing partisan hobgoblins.

The Misguided March for Science

Although the Earth Day March for Science was billed as emphasizing that “science upholds the common good” and as a call for “evidence-based policy in the public’s best interest,” it was nothing more than a reaction to the Trump Administration’s agenda and to comments by the President that have been interpreted as hostility to science. The only difference between the President’s remarks and the actions of the Obama Administration is that he has been blunt, even if misguided and ill-informed, where the Obama Administration wrapped its abuse of science in politically correct language.

Obama’s war on climate change was cloaked in science but was nothing more than a war against fossil fuels, especially coal. EPA used science much as a drunk uses a lamp post—for support, not illumination. Its Clean Power Plan regulation claimed benefits that were preposterous on their face. The claim that reductions in air pollutants would reduce the incidence of asthma and premature deaths was accepted without challenge by many of those involved in the March for Science. If they really wanted science-based public policy, they would have challenged the basis for those claims by pointing out that the incidence of asthma has been increasing even though air quality kept improving, and that the estimates of premature deaths avoided implied epidemiological precision of greater than 99%. There is no scientific basis, beyond political science, that can justify those claims.

EPA’s abuse of science has been going on for a long time and has been accepted by many in the scientific community because it advanced the agenda of environmentalist elites, who use science as a tool to increase the political power of government to promote their policy preferences. How else can you explain black-box modeling and the one-hit, linear dose-response approach to toxic impacts?

Here are a few examples.

In evaluating the health effects of air pollutants and carcinogens, EPA takes the most conservative approach possible by assuming that there is no safe exposure and that dose response is linear. Professor Judith Curry of Georgia Tech described one example this way: “… EPA decided that there is no safe level of ambient PM 2.5—however near to zero—at which risk of ‘early’ death ceases. Statisticians call this analytic approach a ‘no threshold linear regression to zero analytic model.’ … This methodology … contradicts a foundational principle of toxicology, that it’s the dose that makes the poison. …”
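To make the contrast concrete, here is a minimal sketch of the two competing dose-response models; the slope and threshold values are illustrative assumptions, not EPA or toxicology parameters.

```python
# Contrast the no-threshold linear (LNT) model Curry describes with a
# threshold model ("the dose makes the poison"). The slope and
# threshold values below are illustrative assumptions, not EPA numbers.

def risk_no_threshold(dose: float, slope: float = 0.01) -> float:
    """LNT: any exposure above zero carries proportional risk."""
    return slope * dose

def risk_threshold(dose: float, slope: float = 0.01,
                   threshold: float = 10.0) -> float:
    """Threshold model: no risk until the dose exceeds the threshold."""
    return slope * max(0.0, dose - threshold)

for dose in (0.0, 5.0, 10.0, 20.0):
    print(f"dose={dose:5.1f}  LNT risk={risk_no_threshold(dose):.3f}  "
          f"threshold risk={risk_threshold(dose):.3f}")
```

Under the no-threshold model the only dose with zero estimated risk is zero itself, which is why the methodology always produces some number of projected deaths at any ambient level.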

In 2014, a former EPA scientist wrote Confessions of a Computer Modeler. The example he used was his development of a large model to assess sewer treatment and drinking water quality. As the regulations became more stringent, his analysis concluded that the agency “had hit the point of diminishing returns.” EPA didn’t want to hear that. He was told to rework the analysis and “sharpen my pencil.” That kept happening until he asked his supervisor what result he was looking for. In the end, he concluded, “my work for the EPA wasn’t that of a scientist, at least in the popular imagination of what a scientist does. It was more like that of a lawyer. My job, as a modeler, was to build the best case for my client’s position.”

The agency draws on scientists to conduct research and to serve on its advisory panels. But many of these panels end up reviewing the work of their own members, a clear conflict of interest. Groupthink virtually guarantees that these groups will pursue enlightened self-interest by being collegial and agreeing with each other’s research.

Past attempts to mandate transparency and data quality have not brought an end to EPA mischief. And it is the prospect that Congress will be more vigorous and vigilant going forward that rattled some of the organizers of the recent march. If the marchers are serious about promoting “evidence-based policy,” they will coalesce around accepted principles of science and complete transparency. A good place to start would be support for the Reference Manual on Scientific Evidence and the rules set out by the Supreme Court in its 1993 opinion in Daubert v. Merrell Dow.

Is Tax Reform Like the Horizon?

Since the inauguration there has been a steady drumbeat that tax reform was going to happen quickly, but like the horizon it recedes as we approach it. The Treasury Secretary promised that tax reform would take place by August but just two days ago admitted that date was unlikely. The longer action is delayed, the more it looks as if the Administration and Congress face stiff opposition in accomplishing one of the most important actions that would give a positive jolt to the economy.

The 1986 Reagan tax reform was not easy, even for a popular president. It took two years to achieve, and during that period the process moved in fits and starts. The President, his advisors, and members of Congress would do well to read the history of the Reagan accomplishment: Showdown at Gucci Gulch by Jeffrey Birnbaum and Alan Murray.

The longer a tax reform initiative is delayed, the harder it will be to accomplish, since Democrats will see delay as a means to regain control of the House next year. What could be better than running against a do-nothing Congress and an unpopular president?

Comprehensive tax reform will bring out special interests, who will like reform as long as they can protect their favored provisions. And that may be enough to stall the process this year. So what is the alternative?

How about doing tax reform in stages, with each stage able to garner bipartisan support while minimizing special interest opposition?

Today’s New York Times contained an opinion piece by Steve Forbes, Arthur Laffer, Larry Kudlow, and Stephen Moore that made a very sensible suggestion that could get the reform ball rolling. It proposed three actions: reduce the corporate tax rate from the world’s highest at 35% to 15%; allow immediate expensing of capital purchases; and set a low rate on repatriating the $2 trillion that is held offshore. To help win Democratic support, the writers suggest using the money from the repatriation tax for infrastructure investment—roads, bridges, the electric grid, broadband access, and airports. That means union jobs, a favorite of Democrats.

To really generate strong support, there is one additional action that would please most taxpayers while alienating only accountants and accounting firms: follow the precedent of Japan and the Netherlands and let the IRS prepare “pre-filled” returns. Since the IRS already collects our financial information through wage statements and 1099s, it could send taxpayers a pre-prepared form that they could check, sign, and submit. In Japan, the tax form is a postcard.

If the architects of comprehensive tax reform are serious about getting something done this year, they would do well to adopt the philosophy of the coaching icon Vince Lombardi—do a few things but do them very well. Streamlining the tax code invites a battle of interests and, like the 1986 effort, will not happen quickly.

Hand Wringing or Serious Threat?

New Jersey Senator Robert Menendez is expressing concern that the potential purchase of Citgo assets from Venezuela by Russia’s Rosneft could represent “threats posed to our national security, economy and energy independence.” Currently Rosneft, a Russian-owned oil company, holds a 49.9% stake in Citgo as security for a loan made last year to Venezuela. Menendez and several other members of Congress have written to the Treasury Secretary expressing concern about the prospect of Rosneft gaining control of Citgo.

Given Russia’s meddling in last year’s election and the possibility of broader use of cyber warfare to be politically disruptive, a higher level of concern is justifiable. But instead of jumping into the competition for news coverage, it would be more reassuring if those members of Congress first gathered facts and analyzed them.

Citgo has had a substantial presence in the US for decades. Its gasoline is sold through about 6,500 independent outlets in 28 states, and Citgo Petroleum owns oil refineries in Illinois, Louisiana, and Texas. While that would represent a substantial Russian investment and footprint in the US, it would not be the only one. Since 2000, Lukoil, one of the 20 largest oil companies in the world, has been selling petroleum products in 11 East Coast states and the District of Columbia.

While adding Citgo to its US portfolio would allow Russia to engage in serious disruptive mischief, the question is whether it would. Probably not. Putin and his oligarch cronies, who have enriched themselves by plundering Russia’s resources, have moved their wealth overseas to protect it and see it grow. Putting a large investment such as Citgo at risk would not be in their self-interest. Any action to disrupt our energy market could be countered in a number of ways, not least by freezing their assets.

Oil is fungible, so any effort to manipulate the flow to Citgo refineries could be offset after the initial disruption. When refineries go down, except in California, others move quickly to make up the shortfall. Any action that constrained product sales would just expand competitors’ market opportunities without a long-term effect on price. That is how competitive markets work.

Foreign investment can contribute to a more productive relationship between the US and Russia, which is sorely needed.

Congressional due diligence is appropriate; ill-informed fear mongering isn’t.

Irrational Exuberance

That is the phrase made famous by Alan Greenspan in describing the dot-com bubble of the late 1990s. Today, it could be applied to the investment bubble in Tesla. The Wall Street Journal reported that Tesla has surpassed Ford Motor Company in investor value even though its sales are 1% of Ford’s, and Tesla’s earnings per share and net income have been negative since it came into being. Since there are no signs of it earning a profit anytime soon, especially if government subsidies are taken away, investors, except the shrewd ones, are displaying irrational exuberance.

The LA Times ran an article on Elon Musk two years ago and made the point that Tesla wouldn’t be around without $4.9 billion in government subsidies. Musk has raised crony capitalism to an art form and has perfected the Bootleggers and Baptists scheme for getting very rich by becoming an environmental icon.

Without continued taxpayer funding, Tesla would have gone bankrupt as Solyndra and A123 Systems did. As the LA Times pointed out, the Model S, which is the wealthy’s symbol of environmental political correctness, “sells for more than $100,000, but that is literally tens of thousands of dollars less than it costs to manufacture and sell.”

The Times went on to say, “Every time a Tesla is sold, we witness a transfer of wealth to a rich hobbyist (most Teslas are their owners’ third or fourth car), while average Americans are on the hook for at least $30,000 in federal and state subsidies. Tesla is more a regulatory arbitrageur than an auto manufacturer.”

In addition to the $7,500 federal tax credit, a number of states provide credits or rebates to Tesla buyers. Not surprisingly, the most outrageous is California, which created a zero-emission mandate requiring an arbitrary number of “zero-emission” vehicles to be sold each year. Tesla’s Model S earns four emission credits per unit sold, which Tesla then sells to other manufacturers for $20,000, with the cost borne by California taxpayers. A few years ago, Tesla received over $129 million from these credits but still lost $61 million on its manufacturing and sales. Clearly you can’t go broke as long as you are spending someone else’s money.

The image of a zero-emission, high-mileage vehicle that is affordable for most is almost every driver’s dream. And Tesla is promising a new lower-cost Model 3 in the near future, at $35,000. Investors must be betting on economies of scale lowering costs enough, including battery costs, to make the Model 3 really affordable with a range greater than 200 miles between charges. Absent a battery technology breakthrough, which doesn’t appear likely any time soon, lithium-ion batteries and their costs will limit Tesla’s range, and the potential loss of subsidies limits Tesla’s future. Ford has the brighter future.

Early investors who sell before reality strikes will make a killing, leaving cult investors who bet on a dream holding the proverbial bag. Tesla stock is likely to be our generation’s version of Holland’s Tulip Mania of the 1630s.

EPA Budget Cuts: Challenging Inertia

President Trump’s proposed budget reduction has set off howls of protest from Democrats and a wide range of environmentalists. The instantaneous reactions tell their own story, and it is not one of thoughtfulness. Smaller budget proposals are a mechanism for organizations—public and private—to reassess, prioritize, and rethink missions.

EPA was created in 1970, and its mission and approach have not changed since then. Over the past 47 years, true to the economic theory of public choice, the growth of its scope and bureaucratic control has been a study in mission creep.

No successful organization remains the same today as it was in 1970. Most organizations resist change, as do most human beings. In the private sector, competition, reduced demand, a new CEO, or potential obsolescence are motivators for change. With the exception of a new President, those motivating forces do not exist in the public sector. So, rather than begin with a defense of the status quo, environmentalists and Democrats should let the process play out during the budget hearing phase to determine which proposed changes to support or oppose. It is almost a given that the proposed 31% reduction will not get enacted. EPA’s budget is $8.1 billion, and the proposed reduction would take it to $5.5 billion, the level it was at in 1990.

When EPA was created, the nation faced serious environmental problems with air and water quality, waste disposal, and exposure to toxic substances. Since then, tremendous progress has been made, as can be seen in EPA’s Report on the Environment. As one example, air quality has shown tremendous improvement since measurements were started in the late 1970s. Ambient levels of the pollutants specified in the Clean Air Act have been reduced by amounts ranging from 27% in the case of ozone to 89% for lead, reflecting significant reductions in emissions of covered pollutants. Water quality has also improved but, unlike air quality, is very difficult to monitor on a national basis. Hazardous waste and toxic substances are addressed through the Resource Conservation and Recovery Act and the Toxic Substances Control Act. Solid progress should be the major reason for rethinking EPA’s mission.

While a case could be made that in the early days of environmental management, command and control was a means to make sure that all states developed programs to implement environmental improvement and compliance programs, the only justifications for command and control today are that this is how it has always been done and that environmental advocates don’t want to lose influence with bureaucrats. Each state has set up an environmental department, and those departments issue regular reports on compliance and progress.

A strong national commitment to environmental protection and in-place compliance mechanisms make it possible to delegate implementation, compliance, and enforcement to the states, with EPA focusing on research, technical assistance, oversight, incentives to continue making progress, and, most important, identifying the points of diminishing returns. Instead of micromanaging, the agency should focus on results. How a state achieves specific environmental objectives is less important than their timely achievement.

Over the past eight years, EPA was a regulatory machine on steroids. The number of major regulations—those with an impact of $100 million or more—increased from 76 during the Bush Administration to 229 during the Obama Administration. The economic impact rose from $38 billion to over $100 billion annually.

Many of the Obama regulations were justified by questionable benefits flowing from equally questionable research and analysis. For example, EPA asserted that further tightening of the ozone standard was justified because it would reduce the incidence of asthma attacks. However, the incidence of asthma attacks has been increasing even as air quality was continually improving. In justifying its Clean Power Plan rule, the agency claimed it would avoid 2,700 to 6,000 premature deaths. According to CDC, there are 900,000 premature deaths annually. EPA would have us believe that its epidemiological methodologies are sufficiently precise to measure changes of between 0.3 and 0.7 percent. EPA manufactured absurd results through modeling and research designed to support its beliefs, not to illuminate environmental conditions and impacts. That approach needs to change, and its research should fill gaps in knowledge while meeting objective scientific standards.
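As a quick check of that arithmetic, using only the figures quoted above:

```python
# Premature deaths the Clean Power Plan rule claimed to avoid, set
# against the roughly 900,000 annual premature deaths cited from CDC.
claimed_low, claimed_high = 2_700, 6_000
annual_premature_deaths = 900_000

print(f"low end:  {claimed_low / annual_premature_deaths:.2%}")   # 0.30%
print(f"high end: {claimed_high / annual_premature_deaths:.2%}")  # 0.67%
```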

A smaller budget changes incentives and helps reveal real environmental priorities.

EPA’s Shaky Foundation for Climate Regulations

During the Obama Administration, the climate agenda was driven by EPA, which used a bad Supreme Court decision to justify regulations that had no sound basis in science or economics.

In making its decision, the Court reached a number of conclusions of questionable validity. It concluded, “Because greenhouse gases fit well within the Act’s capacious definition of ‘air pollutant,’ EPA has statutory authority to regulate emission of such gases from new motor vehicles,” and, “A well-documented rise in global temperatures has coincided with a significant increase in the concentration of carbon dioxide in the atmosphere. Respected scientists believe the two trends are related.” The Court also accepted the IPCC’s conclusion that humans have a “discernible influence” on climate.

For years, courts have given deference to agencies based on the Supreme Court’s Chevron decision, which held that when there is ambiguity in a law, deference should be given to the agency’s interpretation.

The Bush Administration could have strongly rebutted the arguments that led to the Court’s conclusions but instead opted to challenge the petitioners’ standing to bring suit and some technicalities.

Beginning with the conclusion that EPA could determine that CO2 was a pollutant, the Court then easily moved to the decision that the agency could regulate it if it found that CO2 endangered human health and welfare. What the Court ignored was that Congress, in passing the 1990 Clean Air Act amendments, explicitly decided against granting EPA authority to regulate CO2. Clearly, in this instance, deference should have been given to Congressional intent. The Court also ignored the established fact that CO2 is a nutrient, not a pollutant. And in citing the rise in global temperatures and the increase in CO2 concentrations, the Court made the serious blunder of equating correlation—which really doesn’t exist—with causality.

By allowing EPA to stretch the definition of pollutant, the Court had to accept that increased concentrations of CO2 in the atmosphere would cause harm and that regulating emissions would avoid that harm. Even if you concede the Court’s assumption, there is no way that EPA regulations would produce any beneficial effect. To make significant reductions in atmospheric concentrations, it would be necessary for all nations to reduce emissions. At the time of the Court’s decision, the only global agreement was the Kyoto Protocol, which exempted developing countries, the major source of emissions. So it should have been abundantly clear that granting EPA the authority to regulate would impose large costs while producing no warming reduction or climate benefits.

Beyond the errors cited, the Supreme Court also ignored its own 1993 decision, Daubert v. Merrell Dow Pharmaceuticals, which set forth a standard for scientific evidence. The Court stated, “… in order to qualify as ‘scientific knowledge,’ an inference or assertion must be derived by the scientific method.” It went on to state, “… in determining whether a theory … is scientific knowledge … (the question) will be whether it can be (and has been) tested. Scientific methodology today is based on generating hypotheses and testing them to see if they can be falsified.” The conclusions of the IPCC, and the work it draws on in concluding that human activities are the primary cause of global temperature increases and associated climate events, are based on climate model outputs, not the results of experiments that can be falsified. The models have not been validated, nor have most of the assumptions incorporated in them. It is that simple.

Scott Pruitt’s Heresy: Telling the Truth and Not Being Politically Correct

The chattering climate apocalyptics are in a high state of agitation as a result of EPA Administrator Pruitt’s comment that “I think that measuring with precision human activity on the climate is something very challenging to do, and there’s tremendous disagreement about the degree of impact, so no, I would not agree that it’s a primary contributor to the global warming that we see.”

The mainstream media and some charter members of the climate club reacted predictably by trotting out the infamous 97%, along with other elements of the climate orthodoxy, and hanging the label of “denialist” around his neck. It would have been refreshing if at least one major newspaper had asked for an explanation of why he held a view that was at odds with many scientists. Reporting is supposed to be about digging for facts, not pre-emptive dismissal.

Mr. Pruitt did not say that the earth had not warmed, he did not deny that the climate changes, and he did not say that there was no human influence on the climate system. His sin was to challenge the certitude of the adherents to the climate orthodoxy. There is an abundance of evidence that the case that humans are mostly responsible for warming since the middle of the last century is a construct that rests on a shaky scientific foundation.

No one disputes that CO2 is a greenhouse gas that warms the earth, and no one challenges the conclusion that a doubling of CO2, absent positive feedbacks, would increase global temperatures by about 1 degree C. The IPCC and other climate advocates assume enhanced feedbacks that would make future temperatures—note that it is always decades in the future—significantly higher. While these assertions are made as if there were no uncertainty, the IPCC itself estimates that climate sensitivity is somewhere between 1.5 degrees C and 4.5 degrees C. A factor-of-three difference undermines the certainty with which advocates make their announcements and predictions.
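A back-of-the-envelope illustration of that spread, using only the numbers above:

```python
# No-feedback warming from a CO2 doubling (~1 degree C) versus the
# IPCC's 1.5-4.5 degree C sensitivity range quoted above.
no_feedback = 1.0                 # degrees C per doubling, no feedbacks
ipcc_low, ipcc_high = 1.5, 4.5    # degrees C, IPCC estimated range

print(f"implied feedback amplification: "
      f"{ipcc_low / no_feedback:.1f}x to {ipcc_high / no_feedback:.1f}x")
print(f"spread across the IPCC range:   {ipcc_high / ipcc_low:.0f}x")
```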

In addition, if the case for human causality were so compelling, climate advocates would not have to resort to statistical tricks and manipulation to mislead the public. Each year, NASA announces with great fanfare that the current year is one of the warmest on record or, in the case of 2016, the warmest on record. What NASA doesn’t say is that for most of the years in question, the difference from a prior year is measured in tenths of a degree and is often within the margin of measurement error. 2016 was warmer than recent years because of El Nino, but it was not the warmest ever recorded; according to NASA data, 1934 was the same as 2016.

Even the extent of warming is open to debate. Since the end of the Little Ice Age, the US temperature record shows a cyclical pattern but also a warming trend. Over the course of more than 120 years, surface measuring stations and devices have changed. The location of measuring stations, the development of urban areas, and the switch from mercury thermometers to electric thermistors all affect temperature recordings. Urban development creates heat islands, and adjustments are necessary to correct for the heat they retain. Those adjustments are based on beliefs and assumptions. An audit of the more than 1,200 measuring stations by Anthony Watts found that “89 percent of the stations – nearly 9 of every 10 – fail to meet the National Weather Service’s own siting requirements that stations must be 30 meters (about 100 feet) or more away from an artificial heating or radiating/reflecting heat source.” (https://wattsupwiththat.files.wordpress.com/2009/05/surfacestationsreport_spring09.pdf)

In 1954, Darrell Huff wrote the classic How to Lie with Statistics.  A sequel could be written just relying on the analytical tricks and misused statistics used by climate advocates.

The case for skepticism about the climate orthodoxy provides a strong platform for Mr. Pruitt to stand on.  If he holds the EPA staff to accepted standards of scientific evidence, the case built during the Obama Administration by Gina McCarthy will wilt away.  That would be a real public service.

Starve a Feeding Bureaucrat

Few things in life are predictable with certainty, but the reaction of the climateers to the proposed cut in government climate research certainly was.

The Global Change Research Program dates back to the 1990s and the Clinton Administration.  During that time, the views of Vice President Gore and the strength of his office dominated how federal research dollars were spent. Anyone who questioned the orthodoxy need not apply.

With the election of George W. Bush, the program was reoriented and placed under the Department of Commerce. The new Climate Change Research Initiative (CCRI) was intended “to reduce significant uncertainties in climate science, improve global observing systems, develop science-based information resources to support policymaking and resource management, and communicate findings broadly among the international scientific and user communities.”

Not surprisingly, the program got captured by a bureaucracy that reflected the beliefs of Al Gore and NASA’s James Hansen.  The goal quickly shifted from gaining new scientific knowledge to “research” that would confirm the indictment of CO2 and an impending catastrophe caused by human activities such as driving, heating and lighting homes and commercial facilities, and carrying on a life that was enriched by the use of fossil fuels.

Between 2002 and 2009, the federal climate research program grew from about $5 billion to $7.5 billion annually. According to the Congressional Budget Office, there was then a dramatic increase, to almost $38 billion, as part of the American Recovery and Reinvestment Act.

What did we get for those dollars? Based on the most recent report of the Intergovernmental Panel on Climate Change, not much. We don’t know much more about natural variability, climate sensitivity (although its lower bound has been reduced), cloud formation, water vapor, or solar impacts. In the early 2000s, NOAA did initiate an enhanced ocean observation system, which has improved our knowledge of the oceans.

The history of the federal government’s climate research program makes clear that unbiased research is not likely to come from the federal bureaucracy. A Cato Institute paper by Patrick Michaels—Is the Government Buying Science or Support? A Framework Analysis of Federal Funding-induced Biases—provides an insightful explanation for the biases that have been widely documented. Michaels references a critique of the USGCRP by Professor Judith Curry, recently retired from Georgia Tech, who describes the program as “off the rails” because it is designed to promote policies rather than advance climate science.

None of this should be surprising. The beginning of the Cato report includes a reference to Thomas Kuhn: “fundamental beliefs can take over a scientific field.” He called these entrenched beliefs “paradigms” and noted that they tend to direct scientific thinking in specific directions. Once these beliefs become entrenched they are difficult to dislodge, despite growing evidence that they may be incorrect. “Science, like any human endeavor, is subject to fads and fashions.”

James Buchanan, the Nobel laureate who developed the public choice theory of economics, showed that bureaucrats are like the rest of us: they pursue their own self-interest. Holding government positions doesn’t convert them into angels who selflessly guard the public interest. That doesn’t make them venal, only human.

There are a lot of questions about the climate system that need answering, but the approach taken over the past several decades won’t provide robust answers. And just cutting budgets isn’t the answer either. Our nation has a long history of supporting basic research that has produced new knowledge in every field of science. Funding basic research in physics, meteorology, and oceanography at well-known centers of excellence through competitive bidding is one route, though not the only one, to producing improved understanding of our climate system as well as of human impact on it.

So yes, starve a feeding bureaucrat, but not our thirst for knowledge.