The Overlooked Burden on Economic Growth

A number of economists, such as Harvard professor and former Treasury Secretary Larry Summers, are urging a major increase in infrastructure spending to get the economy out of the 1% growth ditch. But the overlooked drag on the economy is not lack of government spending, but the expansion of the regulatory state.

Since 1980 the code of federal regulations has grown from 100,000 pages to almost 180,000, according to the Mercatus Center. A number of organizations have estimated that those regulations carry a cost of about $1.9 trillion annually.

A 2013 study, "Federal Regulation and Economic Growth," by John Seater of North Carolina State University and John Dawson of Appalachian State University, concluded that federal regulation has "statistically and economically significant effects on aggregate output and factors that produce it." They conclude, "Federal regulations have reduced real output growth by about two percentage points on average over the period 1949-2005." According to their calculations, that reduction in the growth rate has led to an accumulated loss in GDP of about $38.8 trillion as of the end of 2011. That is, "GDP at the end of 2011 would have been $53.9 trillion instead of $15.1 trillion if regulation had remained at its 1949 level." Critics can point out that 1949 is the wrong base year and that in recent decades national issues, like the environment, have created new regulatory needs. Although this is obviously true, there is no doubt that the effect of regulation has been negative and larger than it needs to be. The literature is rich with work on the growth of federal regulations and their negative impact on the economy.
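The arithmetic behind those figures is easy to verify. A minimal back-of-envelope check, using only the numbers quoted above, shows that the gap between the counterfactual and actual 2011 GDP does imply an annual growth difference of roughly two percentage points:

```python
# Sanity check of the Dawson-Seater figures quoted above: what annual
# growth-rate gap, compounded from 1949 to 2011, turns actual 2011 GDP
# ($15.1 trillion) into the counterfactual $53.9 trillion?
actual_gdp = 15.1          # trillions of dollars, end of 2011
counterfactual_gdp = 53.9  # trillions, had regulation stayed at 1949 levels
years = 2011 - 1949        # 62 years of compounding

lost_output = counterfactual_gdp - actual_gdp  # the $38.8 trillion figure
implied_gap = (counterfactual_gdp / actual_gdp) ** (1 / years) - 1

print(f"accumulated loss: ${lost_output:.1f} trillion")
print(f"implied annual growth gap: {implied_gap:.2%}")  # roughly 2 points
```

The implied gap works out to about 2.1% per year, consistent with the study's "about two percentage points."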

The regulatory burden just keeps growing. Large companies have to allocate increasing resources to compliance and lobbying that could otherwise be invested in improving their businesses. Small companies don't have that option. Regulations are a tax that deters their job creation and growth. An OECD study on small business stated that "regulatory burdens remain a major obstacle" because these firms are not well equipped to deal with them. An article by Scott Shane of Case Western Reserve University found that regulation had gone from being small business's fourth-biggest problem in 2008 to its biggest in 2013. This burden is not trivial: according to SBA data, small businesses account for over 50% of all domestic sales and, since 1970, 66% of all net new jobs.

Although Congress has passed legislation to deal with regulatory burdens, the Paperwork Reduction Act, the Congressional Review Act, and the Data Quality Act for example, these laws have not been effective enough to reverse the trend toward an overgoverned society. The growth of the regulatory state needs to be halted.

The next Administration and Congress have an opportunity to translate reform rhetoric into effective action. Here is a list of actions that could help rein in excessive regulation.

  • Congress should set an annual regulatory budget with a percentage of the savings from reductions in obsolete and ineffective regulations added to the following year’s budget.
  • Require that new regulations include a sunset provision that can be avoided only by a rulemaking process to extend them.
  • Wayne Crews of the Competitive Enterprise Institute has recommended a Regulation Reduction Commission that would prepare "packages of rules for an up or down vote." This would be modeled on the Base Realignment and Closure Commission. Such a commission could use a triage process, starting with rules that were promulgated before an agreed-upon date and are considered major.
  • Currently, major rules (those costing more than $100 million annually) are submitted to Congress for a 60-day review under the Congressional Review Act, allowing Congress to pass a resolution of disapproval. Instead of a resolution of disapproval, Congress should be required to take affirmative action certifying that regulations meet the underlying legislative requirements and are cost-effective.
  • OMB's Office of Information and Regulatory Affairs (OIRA) has review authority over all proposed regulations from Executive Branch agencies. Its performance has been uneven because it is part of the Executive Branch and reflects the philosophy of the current Administration. The Government Accountability Office should conduct biennial performance reviews assessing OIRA's effectiveness in carrying out its legislative mandate.

The Paperwork Reduction Act (PRA), passed in 1980 and amended by the Data Quality Act of 2000, is the vehicle that guides the regulatory review process. The purpose of the PRA was to reduce the burden of information and data requirements imposed by Executive Branch agencies, to improve the quality of information coming from federal agencies, and to improve the efficiency of government programs, including regulations.

While the next Congress and Administration consider cooperation on regulatory reform, Congress could make a major contribution by tasking the Government Accountability Office or the Congressional Budget Office with reviewing the PRA to determine its effectiveness and the changes needed to improve data and information management and to increase the transparency and objectivity of analyses and research conducted in support of rulemaking. The system of checks and balances that is supposed to make government function effectively is not working in the regulatory process, and Congress needs to rein in unelected regulators.








Declare Victory and Stop Helping Us

After the 1973 oil embargo, the government got involved in a big way in responding to energy scarcity and the price increases that go with it. Relying on market forces was not an option because the public demanded that government do something. It did, with mandates on fuel efficiency (the CAFE standards) as well as standards for HVAC systems, appliances, and housing. It also provided consumers, states, and industries with information on energy-saving opportunities, Green Lights for example.

New businesses sprang up to assist consumers and businesses in making cost-effective decisions on energy trade-offs. There were also increased investments in energy technologies, many in response to environmental standards. The overall effect was that consumers became more educated and conservation became part of our society's value system. The evidence confirming that is the reduction in energy intensity: according to the American Council for an Energy-Efficient Economy (ACEEE), between 1980 and 2014 GDP grew almost 150% while the energy required to produce that growth increased only 26%.
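Those two growth figures alone imply a steep drop in energy intensity, the energy used per dollar of GDP. A quick sketch of the arithmetic:

```python
# Implied change in energy intensity (energy per dollar of GDP) from the
# ACEEE figures quoted above: GDP up ~150%, energy use up ~26%, 1980-2014.
gdp_growth = 1.50     # +150%
energy_growth = 0.26  # +26%

# 2014 energy intensity relative to 1980 energy intensity
intensity_ratio = (1 + energy_growth) / (1 + gdp_growth)

print(f"energy intensity fell about {1 - intensity_ratio:.0%}")
```

In other words, the economy now uses roughly half the energy per dollar of output that it did in 1980.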

An August ACEEE report stated, "…a diverse group of scientists, analysts, and policymakers began to develop strategies to reduce energy waste and use less energy to deliver the same or better services to consumers and businesses." What they produced was an array of energy-efficient technologies, policies, and innovations that consumers take for granted, "an unqualified success story, both economically and environmentally, although one often unseen by the public." The increase in energy efficiency runs the gamut from home appliances, HVAC systems, industrial use, commercial buildings, and highway vehicles to the energy transmission and distribution system.

With energy efficiency embedded in our economy and investments in energy technologies being made to meet future needs, the time has come for the government to declare victory and let the market continue to make progress. That would enable the federal government to do something as rare as a dog walking on its hind legs: cancel energy standard-setting programs and eliminate their costs from its budget.

CAFE: The Triumph of Dogma

Congressman Scott Peters wrote an impassioned defense of CAFE in the Washington Post's Power Post. His article reveals an argument built on dogma and assumption. He is dismissive of any criticism of CAFE and accuses Republicans of wanting to "gut fuel efficiency standards." In defending CAFE as "one of the most successful energy conservation programs," he embraces conservation without limit or an understanding of how CAFE works.

CAFE was a creation of the post-oil-embargo panic. Its goal then was reducing oil imports by reducing gasoline consumption. The initial goal could be defended, as there was a concern that unchecked imports would transfer too much wealth and power to the Persian Gulf. Also, at that time there was a widely held belief that the world was running out of oil; the end of oil was predicted to come by the end of the 20th century. While the goal could be defended, the mechanism, CAFE, cannot. It shows Congress's lust for power and lack of faith in market forces. A gasoline tax would have been far more effective and much easier to implement; analyses have concluded that a gasoline tax would be 6 to 14 times more effective.

Since the 1970s, CAFE has morphed from fuel conservation, which it did not achieve because drivers increased miles driven as the cost of driving declined, into a means for emissions reductions, specifically of CO2. EPA has a long record of setting fuel and tailpipe standards as a mechanism for improving air quality, so CAFE was not needed for that. While a case can be made that the standard-setting process has been based on worst-case assumptions, which it has, there is no question that air quality has improved tremendously. Using CAFE to achieve CO2 objectives is nothing less than a case of orthodoxy driving regulation.

CAFE is a two-sided coin. One side reflects the air quality improvements, including reductions in CO2 emissions. The other, darker side represents the consequences of government fuel efficiency mandates. Until the most recent standards, manufacturers had to build lighter, smaller cars to meet fleet mileage requirements. Over the decades since the 1970s, evidence has accumulated that lighter, smaller vehicles have led to greater highway fatalities.

Since most buyers prefer larger vehicles, smaller ones had to be sold at a loss. Cross subsidization allowed for those lower prices by making larger vehicles more expensive. Cross subsidization is compelling evidence that CAFÉ is in conflict with consumer preferences.

The current CAFE standards have eliminated that flaw by using vehicle-class standards based on vehicle footprint to calculate applicable targets. According to a January 2016 paper from the Energy Institute at the University of California, Berkeley, "New CAFE Standards: The Good, the Bad, and the Ugly," the current approach is counterproductive: "…the footprint-based targets may actually incentivize manufacturers to increase the average footprint of their fleet. This may make sense from a political point-of-view because domestic manufacturers produce large numbers of SUVs and pickups, but it doesn't make sense from the perspective of reducing GHGs." This explains why manufacturers were willing to accept more aggressive standards: they can sell more of the larger, more profitable vehicles while appearing to be good corporate citizens.

As Milton Friedman observed, there is no such thing as a free lunch. Analyses of the new standards have concluded that they will increase manufacturing costs by $1,000 initially and by $3,000 by the time they are fully implemented, because manufacturers have to substitute lighter materials for metal and develop new drivetrain technologies. While advocates hail new technologies as a benefit of CAFE, they are assuming that those technologies would not have come about from meeting consumer preferences and competition among manufacturers. There can be no dispute that technology forcing leads to higher costs than competition does.


The CAFE program is now dogma and a costly tool to reduce greenhouse gas emissions. It will do nothing to affect climate change but has allowed manufacturers to game the system and increase profitability. This is just another example of the Bootleggers and Baptists theory! It also is strong evidence of the need for regulatory reform that includes "look back" provisions and effective agency oversight by Congress. No major rule such as CAFE should go into effect until approved by Congress.


Samuel Johnson on OPEC

Samuel Johnson and then Oscar Wilde observed that second marriages are a triumph of hope over experience. That observation can also be applied to OPEC agreements to cut production.

The history of OPEC agreements on production quotas is that they haven’t lasted much longer than it took for the ink on them to dry. The reason is quite simple. As long as member nations have spare production capacity, they have a strong financial incentive to cheat. Saudi Arabia in the past has been the swing producer and its reductions in production have been offset by other members cheating and increasing theirs. As long as demand exceeds supply and some OPEC member can increase its production, that will continue to be the case.

Earlier this week, oil markets firmed up for a brief time when it was reported that OPEC had reached an agreement to reduce production from around 33.2 million barrels a day. Then reality set in, and markets slipped back down.

Agreeing that production should be reduced is different from agreeing on the size of a reduction and how it will be allocated. It is highly unlikely that Saudi Arabia will agree to a cut as long as Iran is pushing to increase its production to at least 4 million barrels a day. The Wall Street Journal recently reported that Iran's "ability to reach pre-sanction levels above four million barrels a day is now in question. Iran has been pumping 3.85 million barrels a day," but the IEA has reported that Iranian production had slipped back to 3.6 million barrels a day. What Iran can do in the near term is not at all clear.

There also have been reports that with Libya and Nigeria production recovering, actual OPEC production could be 1 million barrels above the suggested range of 32.5 to 33 million barrels.

So, where will cuts come from, and how will they be enforced? Those two questions have bedeviled past agreements, and in the end OPEC agreements have turned out to be triumphs of hope. In the current situation, it appears that most members are producing at or near capacity, so the ability to cheat this time may be constrained, unless Libya and Nigeria can substantially increase theirs. In addition, the proposed reduction is not large, 2% at most, so it is unclear how much impact it would have on price. In reality, the impact is likely to be small at best.

What is different today is US production, enhanced by “fracking” and horizontal drilling. Shale oil producers have been able to withstand the price reduction better than had been estimated, so any reduction by OPEC will be offset by increased domestic production.

This time, experience is again likely to triumph over hope, and the US economy will benefit if it does.

Letting Markets Work

On September 12, Colonial Pipeline, which supplies gasoline and other petroleum products from Texas to New York, was shut down because of a leak. The leaking line supplies about 1.4 million barrels/day to states from Tennessee to New Jersey.

The shutdown had a noticeable impact on gasoline supplies from Georgia to Virginia, with spot outages reported in some states. The governors of those states were quick to declare states of emergency, whose primary effect was to relax constraints on tank truck drivers' hours of work and restrictions on tanker truck size. The governors also made obligatory political statements about not tolerating price gouging, an ill-defined concept.

The media, as expected, focused on price increases and stations that ran short of gasoline without giving much coverage to the alternative sources of supply that minimized the disruption caused by the shutdown.

Suppliers of gasoline reacted by moving more gasoline by waterborne tankers to ports along the east coast and then using tanker trucks to resupply affected areas. Colonial also began to utilize a second pipeline that typically moved distillate fuel. Gasoline suppliers had a strong incentive to reallocate supplies from other areas to minimize shortages because they were losing sales revenue from the pipeline shutdown.

Anytime there is a shortage of any product, its price increases. That is how supply and demand works. Price increases do two things: first, they bring demand in line with available supplies, and second, they create an incentive to increase supply. For those two reasons, railing against prospective "price gouging" is not in consumers' best interest.

The history of price and allocation controls in the 1970s demonstrates that when government attempts to manage markets, it makes problems worse and longer lasting. Then, drivers had to search for stations that had supplies, often relying on word of mouth or media reports. Today, information technology and a 24-hour news cycle allow motorists to quickly find stations and areas with available supplies.

Governors apparently had learned the lesson of the 1970s and avoided the "do something" overreaction while letting markets work. The lesson for the future is that the best course of action for officials and the media is to inform the public with complete and accurate facts, explain the actions being taken to mitigate adverse impacts, avoid political rhetoric, and give market forces a chance to work.






Yogi Berra’s Carbon Tax Advice

Yogi Berra once observed, "In theory there is no difference between theory and practice. In practice there is." Academic analyses often ignore this wisdom because of oversimplified assumptions and a failure to adequately address real-world complexity. Such is the case with an article in the September 14 Wall Street Journal, "The Coming Price on Carbon," by Amy Myers Jaffe (University of California, Davis).

The crux of her argument is that the complexity of increasing regulations aimed at carbon is leading companies to move toward a carbon tax, because business places a high value on transparency and predictability. If the choice, ignoring for a moment the scientific issue, were a carbon tax or a series of regulations, the answer would be straightforward. But anyone who thinks that embracing a carbon tax will save them from more regulation is related to Rip Van Winkle.

The Obama Administration and its environmental allies have been pursuing a strategy of increasingly complex carbon-related regulations as a way to get companies to beg for a carbon tax. Ms. Jaffe might be right that we are at a tipping point on this issue. If so, business might lobby aggressively for a carbon tax, get it, and still be saddled with more regulations. The Obama Administration and environmental zealots have been waging a war on fossil fuels, and business's conceding a carbon tax will just embolden them further. They will be satisfied only when fossil energy is, like whale oil, a thing of the past.

Contrary to the assertion in Ms. Jaffe's article that "some sort of carbon price is needed because emissions have costs," there has been a de facto price on carbon since the first Clean Air Act in 1970. Constraints on pollution-related emissions have imposed costs on fossil fuels. As regulations became more stringent, environmentalists hoped or believed that coal, oil, and gas would be priced out of the market. They seem to have won on coal here, though not in other countries. Petroleum-based fuels have withstood the regulatory onslaught because oil is abundant, versatile, has high energy density, and is cost-effective relative to alternatives, in spite of their generous subsidies. A carbon tax would add to the costs currently borne by carbon fuels, not reduce or replace any of them. And that is the Achilles' heel in carbon tax logic: there is no trade-off to be realized.

The case for putting a price on carbon is also based on the assertion that "technological advances have made it cheaper to move away from carbon emitting technologies." While it is true that the cost of producing solar cells and lithium-ion batteries has been declining, it is not true that wind, solar, or electric vehicles are cost-competitive with fossil energy. The apparent competitiveness is based on tax credits and subsidies; take away the subsidies and government-bestowed incentives, and the demand for alternatives would dry up. It is also misleading to imply that these alternatives are replacements for carbon. What happens when the wind doesn't blow and the sun doesn't shine? Coal- or natural gas-powered electricity. Germany, a leader in promoting alternatives to fossil fuels, has an electricity price about 3 times the US average and has been turning back to coal to compensate for the lack of storage technology and the periods when the sun doesn't shine.

As documented in the George C. Marshall Institute report A Skeptical Look at the Carbon Tax, a carbon tax, like Clinton's BTU tax, faces a host of practical problems. The support that Ms. Jaffe sees comes from a "Bootleggers and Baptists" coalition of environmentalists, corporate rent seekers, and government dependents "who will be actively involved in designing its provisions in ways that benefit special interests, undercut any beneficial effects and accentuate its harmful side effects." And, judging from past experience, a carbon tax will not substitute for other taxes or improve their efficiency, nor will it be implemented without political favoritism. It will be complex and will become the equivalent of an ATM for Congress to increase spending.

Ms. Jaffe's article is based on the assumption that human activities are the dominant cause of climate change. The theory of human-induced climate change has not been demonstrated by scientific validation; it is a creation of climate models that have been constructed to produce a self-fulfilling prophecy. The pattern of climate change over human history and the last 100 years is inconsistent with the assertion that increasing CO2 has led to increasing temperatures since the end of the Little Ice Age. The recent pause, or hiatus, in warming is also inconsistent with climate orthodoxy. While advocates who claim that the science is settled use the IPCC science assessments as their foundation, they conveniently ignore the list of major uncertainties cited by the IPCC and its gradual reduction in estimated climate sensitivity.

Business and environmentalists may eventually get their desired carbon tax and if they do they will discover that Yogi Berra was right—in the real world there is indeed a difference between theory and practice. They will have been willingly snookered.






The Vanishing Conspiracy That Never Was

New York's Attorney General has announced that he is abandoning his investigation claiming that ExxonMobil and other oil companies engaged in public deception about the risks of climate change, just as tobacco companies did in misleading the public about the dangers of smoking. The Attorney General's investigation, which was joined by a number of his fellow AGs, Senator Whitehouse, and Al Gore, has, so to speak, gone up in smoke. There simply was no there there.

Given the extent of uncertainty documented by the Intergovernmental Panel on Climate Change (IPCC), the advocates' gold standard, and the National Academy of Sciences (NAS), it would be virtually impossible to make a case of deliberate deception. The IPCC continues to list major uncertainties in knowledge about the forces affecting warming, and it has gradually reduced its estimate of climate sensitivity. The NAS was quite explicit, in a report on key questions about climate change, that estimates of future warming are subject to revision, up or down, based on new knowledge.

The combined efforts of this group of attorneys general, some members of Congress, and environmental zealots were never about what oil companies knew, when they knew it, or the funding of think tanks and others to create public confusion. From start to finish, their goal was intimidation and silencing dissent. For them, turning the First Amendment on its head was a small price to pay to chill debate that would expose the climate orthodoxy as a Trojan Horse. Now, in a face-saving move, the New York Attorney General is trying a new tactic: he wants to determine whether ExxonMobil's forecasts of the value of its reserves are misleading because they do not include the "full impact of climate change."

So, it is no longer a question of what ExxonMobil and others knew but of how clear their crystal balls are. ExxonMobil has made disclosures to investors and publicly stated that it assumes a price on carbon in evaluating the viability of capital investment projects. The world's appetite for energy is growing, and every independent analysis concludes that fossil fuels will continue to provide 80% of that energy for decades to come. Since these estimates take into account policies to reduce greenhouse gas emissions, they acknowledge that there are no cost-competitive alternatives to oil and gas on the horizon. Assumptions about value reflect this reality.

Could there be a black swan in energy technology that renders oil and gas less valuable? Perhaps, but it is not likely in the foreseeable future.

The current investigatory scheme will prove to be just as much of a boondoggle as the one that preceded it. Eventually, the truth about CO2 and the complexity of the climate system will make it evident that man's impact on the climate is not creating an apocalypse. Nonetheless, environmental history over the past 40-plus years demonstrates that zealots will not abandon their pursuit of power and environmental purity or their efforts to constrain the market forces that promote greater economic well-being.





Voodoo Energy and Fuel Delusions

At the Democratic National Convention, an energy plank was adopted that is totally detached from reality and inconsistent with the goal of helping lower-income citizens.

In part, the energy plank states, “We are committed to getting 50 percent of our electricity from clean energy sources within a decade, with half a billion solar panels installed within four years and enough renewable energy to power every home in the country. We will cut energy waste … through energy efficient improvements; modernize our electric grid; and make American manufacturing the cleanest and most efficient in the world. … We will transform American transportation by reducing oil consumption through cleaner fuels, vehicle electrification increasing the fuel efficiency … .”

Today, according to the Energy Information Administration, we get 13% of our electricity from renewables and will get 23% in 2025. What kind of magic will take the projected 23% to 50%? Even with an aggressive program of still more wasteful subsidies and still more stringent regulations, there is no realistic way to increase the amount of electricity generated by renewables by a factor of 3.8 in a decade. The economic impact of shuttering coal- and gas-fired power plants on such an accelerated schedule would easily exceed $366 billion between 2017 and 2031, the estimated cost of the Clean Power Plan rule according to the nationally recognized firm NERA.
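The size of the gap is easy to see from the shares themselves. A quick sketch using the EIA figures quoted above:

```python
# How far the platform's 50%-renewables goal is from the EIA figures
# quoted above: 13% of electricity from renewables today, 23% projected
# for 2025.
current_share = 0.13
projected_2025_share = 0.23
goal_share = 0.50

print(f"needed vs. today: {goal_share / current_share:.1f}x")               # ~3.8x
print(f"needed vs. 2025 projection: {goal_share / projected_2025_share:.1f}x")  # ~2.2x
```

Even against the EIA's own 2025 projection, the renewable share would still have to more than double in the remaining years of the decade.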

There are over 7600 operational power plants in the US, according to EIA. It is beyond wishful thinking to assume that a large fraction could be replaced in the next nine years. Permitting is often extended because of citizen and environmentalist opposition, and once permits are granted, construction can take 1-2 years. In addition, capital constraints further limit how many plants a utility can finance simultaneously.

Renewables serve a niche market: areas with lots of sun and wind. There is currently no viable storage technology for the excess power generated by renewables. Germany's experience in forcing renewables into the market should be a cautionary lesson for Democratic dreamers. The cost of electricity in Germany is about 3 times the average cost in the US, and Germany has to rely on coal-fired power for nighttime generation, undercutting its drive to reduce CO2 emissions.

A report by the Institute for Energy Research concluded, “The high use of renewable energy in eastern Germany driven by government green energy policies is causing instability to its own electric grid as well as to neighboring countries, … Electricity bills are also expected to go up by 10 percent this year. With residential electricity prices in Germany already about 3 times higher than prices in the United States and increasing…”

The bottom line on renewables and electricity in the Democratic platform, in the words of Vice President Biden, is malarkey.

The platform also promises, "We will transform American transportation by reducing oil consumption through cleaner fuels, vehicle electrification increasing the fuel efficiency of cars …" The Obama Administration has already almost doubled the CAFE standard for light-duty vehicles, raising it from 27.5 mpg to 54.5 mpg by 2025. The incremental cost has been estimated at $3,000 to $6,000 per vehicle, which excludes the cost of subsidies for hybrids and electric vehicles. Unless the price of gasoline rises steeply, by some estimates to the $4 range, the fleet average will fall short of 54.5 because consumers prefer larger vehicles to the smaller, higher-mileage ones the government favors. The increased purchase price keeps older vehicles on the road longer and penalizes lower-income purchasers.

The story is the same with fuels. The government continues to push ethanol even though it is unnecessary, more expensive, and capable of damaging fuel systems. Despite the government's excessive optimism, the technology to produce cellulosic ethanol is not commercially viable. As a result, corn remains the dominant feedstock for ethanol, diverting corn from food to fuel and raising the price of a range of food products.

As a nation, we need more factual statements about energy and less rhetoric, since energy is and will remain a critical input to our economic well-being. Politicians should learn that setting objectives is their job; achieving them is the market's.



Never the Twain Shall Meet

The debate between climate skeptics and those who subscribe to the climate orthodoxy, the belief that human activities are creating a climate apocalypse through the consumption of fossil fuels, has been ongoing for close to three decades and shows no signs of abating. On a range of topics, debates among reasonable people lead to greater understanding and common ground.

That has not happened in the climate debate, even though empirical evidence in the form of the climate's behavior has rebutted a number of the advocates' claims. Temperature increases essentially halted after 1998. Extreme weather events, like hurricanes, have not increased. And sea level rise has not accelerated. Why has this contrary evidence not led advocates to moderate their position?

Psychologists and some communication experts have done research on why some people refuse to accept evidence that is contrary to their beliefs. They label this phenomenon "confirmation bias." Climate advocates want to believe that the human activities that produce CO2 emissions are the primary cause of global warming or climate change. The climate orthodoxy leaves little room for reflection or reassessment.

Researchers have concluded that when someone wants a concept to be true, they often stop searching for information that would refute their belief. Shahram Heshmat, a professor emeritus at the University of Illinois, has written, "Confirmation bias suggests that we don't perceive circumstances objectively. We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions." This is characteristic of environmental zealotry.

More recent research on confirmation bias demonstrates the difficulty of getting some people to accept corrective information and facts. Attempts to counter confirmation bias often backfire and actually strengthen the biases. This is confirmed in a research paper, When Corrections Fail: The Persistence of Political Misperceptions, by Nyhan and Reifler. Based on their experiments, they conclude that "corrective information in news reports may fail to reduce misperceptions and can sometimes increase them for the ideological group most likely to hold those misperceptions" and that "responses to corrections about controversial political issues vary systematically by ideology."

A related article in the Harvard Business Review, Why Debunking Myths About Vaccines Hasn't Convinced Dubious Parents, by Christopher Graves, a communication expert, reinforces the conclusions of Nyhan and Reifler about why it is so hard to change strongly held views. Graves concludes that arguing facts makes the situation worse for those who will accept only evidence that is consistent with their beliefs. He also believes that people who hold incorrect views are more likely to change their minds through a process that does not challenge them and that makes it easy to replace their incorrect narrative with an alternative one. In the case of climate advocates, that would require an epiphany by some who subscribe to the climate orthodoxy.

The research on confirmation bias makes it clear that common ground in the climate debate will remain elusive and the debate will continue for some time to come. The continued accumulation of evidence that CO2 emissions are not the primary cause of climate change may be necessary, but it won't be sufficient. Since the Club of Rome and the Limits to Growth movement, environmental elites have moved from one predicted catastrophe to another. In time, some other narrative is likely to come along to replace climate change, but that will not change the deeply held belief that human activities are threatening life on earth. Climate change proves that dread is wealth-generating.

Magicians and Climate Modelers

Magicians possess skills that make us want to believe that their illusions—pulling a rabbit out of a hat, making someone disappear, or sawing someone in half—are real. Climate modelers possess similar talents. They build elaborate and complex computer models that are used to forecast what the climate will be like decades in the future. The forecast is always gloomy, and they and the climate establishment want us to believe that the forecasts are accurate.

Over the past 28 years, these models have been used to tell us that the globe’s temperature would increase more than 6 degrees C, that coastal areas and cities like New York and Washington DC would be underwater, and that there would be increases in extreme weather events like hurricanes. These forecasts have been consistently wrong and yet policy makers, the media, and climate advocates continue to push a message of doom because their real agenda is to increase central political power and to wage war on carbon.

This may seem like a harsh indictment, but environmental zealots have been using computer model wizardry since the time of the Club of Rome, which also predicted doom: famine, the exhaustion of natural resources, and population growth that would outstrip the world's capacity to sustain it. The forecasts of dread are wrong for at least three simple reasons: advances in technology, complexity that defies the ability to capture it in a model, and the "fatal conceit" of those who believe that they are smart enough to reduce a complex world to a mathematical model and develop policies for how the world should work and how we should live our lives.

The climate system is one of the most complex systems being studied. It comprises the atmosphere, oceans, ice, and land surface. Each of these components is itself made up of complex, interconnected parts and processes that are not fully understood. Lack of understanding about these components and how they interact makes it impossible to write the thousands of equations needed to construct an accurate climate model.

The Intergovernmental Panel on Climate Change, in its most recent report, lists 11 factors that contribute to warming, including CO2, other greenhouse gases, aerosols, ozone, clouds, land use, water vapor, and solar effects. It then ranks the understanding of each from Very High to Low. Of the 11, only 5, or 45%, are ranked as High or Very High. Translating the uncertainty associated with these processes and other variables into the probability that models accurately represent how the climate system operates would show that it is quite small. For example, if a model contained just the 11 factors listed by the IPCC and the probability of each being correct was 0.75, which is a generous assumption, the probability of the model output being correct would be about 0.04, or 4%. But models have many more than 11 variables, which means that the probability that their output is correct is close to zero.
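The compounding arithmetic behind that estimate is easy to check. The short sketch below is only an illustration of the argument in the text, and it rests on the same simplifying assumption the text makes: that the factors are independent, so the chances multiply.

```python
# Illustrative sketch: probability that ALL n independent factors in a
# model are represented correctly, if each is correct with probability p.
# (Independence is an assumption for the sake of the argument.)

def joint_probability(p, n):
    """Chance that all n independent factors are modeled correctly."""
    return p ** n

# The 11 IPCC factors at the text's generous 0.75 apiece:
print(round(joint_probability(0.75, 11), 4))  # 0.0422

# With 50 variables the joint probability collapses to roughly
# one in a million, which is the text's "close to zero" point:
print(joint_probability(0.75, 50) < 1e-5)  # True
```

The point of the sketch is not the exact numbers but the shape of the curve: the joint probability falls off exponentially as the variable count grows.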

As Professor Ed Lorenz of MIT, the father of chaos theory, demonstrated, the climate system is chaotic: it is non-linear, and small changes can have large effects. Sensitivity to initial conditions, according to MIT's Technology Review, "has a profound corollary: forecasting the future can be nearly impossible." Lorenz himself concluded, "In view of the inevitable inaccuracy and incompleteness of weather observations, precise very-long-range forecasting would seem to be nonexistent."

There is only one reason why the climate establishment uses models as its foundation: to create illusions, like magicians, so that their policy prescriptions and limits on fossil fuels appear justified as the way to avoid catastrophe.