The Groucho Marx Explanation for Climate Change

The late comedian is known for a number of one-liners.  One that is often repeated is “Who are you going to believe, me or your lying eyes?”  That question captures the approach often taken by climate advocates.

A case in point is the annual ritual of claiming that the latest year’s temperature is one of the highest on record even though satellite records since 1998 show that temperatures have not risen.

Advocates are quick to criticize satellite temperature measurements because they don’t conform to their orthodoxy, cover too short a time period, and are only an indirect way of estimating surface temperature.  And they have ready-made explanations for the adjustments they routinely make to surface measurements, adjustments that always make surface temperatures warmer.

In the past few months, two analyses have shown how these advocates engage in scientific and statistical manipulation to bolster their argument that human activities are the primary cause of warming that has occurred over the past 50 years and that the warming hiatus is a mirage.

Climate researchers James Wallace, Joe D’Aleo, and Craig Idso have prepared a report—On the Validity of NOAA, NASA, & Hadley Surface Temperature Data—that examined the adjustments made to three surface data sets: NOAA’s, NASA’s, and that of the Hadley Centre in the UK.  In their study, they make the point that “the notion that some ‘adjustments’ to historical data might need to be made is not challenged.”  They go on to make the logical observation that adjustments would be expected to raise temperatures in some cases and lower them in others.  Instead, what they found was that the adjustments consistently increased temperatures and had the result of “removing the previously existing cyclical temperature pattern.”  Clearly, climate advocates at these agencies know what they want the temperature record to look like—steadily rising—and adjustments are the way to get the desired result.

John Brignell has written a wonderful book—Sorry, Wrong Number—that covers in great depth the abuse of measurement, which has been going on for a very long time and extends far beyond climate.  Conclusions don’t flow from data; data are arranged to confirm conclusions.

The steady increase in temperatures reported, generally with great fanfare, by these three organizations is used as a way of validating climate models, which on their face should be viewed with a healthy dose of skepticism.  As John Christy showed in recent testimony before the House Committee on Science, Space, and Technology, the models that predict ever-increasing temperatures require the addition of “extra GHGs.”  When the extra GHGs are removed, model results “overlap the observations almost completely.”

Examples like this provide a strong justification for the Red Team exercise proposed by Steve Koonin and embraced by the EPA Administrator.



A Gift to Environmentalists

President Trump’s Unleashing American Energy speech, especially the portion dealing with the Outer Continental Shelf (OCS), will no doubt unleash a fundraising blitz by environmental organizations, which will use scare tactics about OCS development off the Atlantic coast to fill their coffers.

The five-year plan that the Interior Department has prepared, and which is open for comment, would not go into effect until 2022.  If leases were offered, there is no guarantee that companies would bid on them or that oil and gas would be found.  Much depends on the outlook for natural gas and crude oil prices, advances in technology, and the positions on leasing taken by coastal states.  For decades, coastal states and communities on the Atlantic coast have opposed any drilling out of fear of spills, which would impose severe economic costs because many of those communities rely on tourism.

The Coastal Zone Management Act (CZMA) provides a mechanism for states to be actively involved in the leasing process.  In reauthorizing the CZMA in 1990, Congress gave states greater authority by eliminating the requirement that a proposed lease sale have a “direct” effect on a state’s coastal zone before the state can demand a consistency determination that the sale will not affect land or water use or natural resources in its coastal zone.  If a state disagrees with the consistency determination, Interior must follow certain procedures to achieve resolution before holding the lease sale.  In addition to the CZMA, other environmental laws could limit natural gas and oil activities by imposing additional requirements.  These include the National Environmental Policy Act, the Clean Air Act, the Endangered Species Act, the Clean Water Act, and the National Fishing Enhancement Act.

The fears expressed by coastal communities are not grounded on a solid foundation.  Since the Santa Barbara accident in 1969, there has been only one major production accident, the Deepwater Horizon in the Gulf of Mexico.  The fact that 40 years elapsed between major production accidents, with more than 50,000 wells drilled in US waters, is testament to the technology and safety of offshore drilling.  Oil spills cause damage, but the damage is not permanent, and there are few things in the modern world that are risk free.

Seismic testing to determine the resource base of the Atlantic offshore, which occurs before leasing, does not automatically translate into lease sales and drilling.  The abundance of onshore oil and gas made available through fracking technology, combined with coastal politics and the cost of offshore platforms, affects the decision calculus of companies considering whether to bid on an offshore lease.

So, while environmental organizations will make money from the decision about the next five-year plan, the reality is that the outburst of opposition is at the least premature and not well informed.  Coastal communities’ own interests will best be served by emotion-free analysis and an honest consideration of trade-offs.


Cutting the Nuclear Waste Gordian Knot

For 30 years, the Federal Government has been studying, debating, and procrastinating on the permanent storage of nuclear waste.  According to Secretary of Energy Perry, more than 70,000 tons of nuclear waste are stored at 120 sites in 39 states.  That is clear evidence of irresponsibility.  The time has come to cut the Gordian Knot, whether with a scalpel or a meat ax.

The history of nuclear waste disposal goes back not 30 years but 60, to when the National Academy of Sciences recommended that the best disposal would be in a deep rock formation.  No study since then has changed that recommendation.  DOE began studying Yucca Mountain in 1978.  President Reagan, based on reports covering about 10 years of study, selected three sites for intensive evaluation, and in 1987 Congress directed DOE to consider only Yucca Mountain.  Yucca Mountain sits on the edge of the former Nevada nuclear test site and has no value beyond the planned storage of nuclear waste.  The quagmire of politics, federal arrogance, and procrastination has been the main obstacle to moving forward.

It is understandable why Nevada objects.  No state wants to be known as the nation’s nuclear waste capital.  But the alternative of continuing to store waste at nuclear energy facilities is far worse, especially given the growing threat of terrorist attacks.  There may not be any good and politically acceptable options, but in that case the least worst is clearly the best.

The outlook for nuclear power is not encouraging, and more facilities are closing because of cost and politics, especially the politics of subsidized renewables.  Electric utilities simply are unwilling to take the economic and political risks associated with building and operating nuclear facilities.  Low natural gas prices, which are likely to persist for a very long time, make natural gas’s future much brighter than nuclear’s.

As our nation becomes even more of a service economy, the demand for reliable and affordable electricity continues to grow.  Although environmental advocates and a number of states are pushing for greater reliance on alternatives, the future of those alternatives is tied primarily to continued subsidies and mandates.  Without those, it is doubtful that many utilities would make them the cornerstone of their generating capacity.  Since subsidies are likely to be reduced or eliminated as the government confronts serious budget issues and the embarrassment of growing crony capitalism, the real future of power generation is either nuclear or natural gas.

The outlook for nuclear power will not improve until its cost is substantially reduced and the waste disposal dilemma is resolved.  The fact is that Yucca Mountain is a safe solution, and it needs to be used to store the existing waste from 120 less safe sites unless some other state with deep geological formations is willing to accept waste storage.  There may be no economic incentive that would cause a state to step forward, but nothing is lost in trying.

Since the Yucca Mountain facility has no other practical use, the problem, and it is a major one, is political opposition by Nevada.  It may be that there is nothing the Federal Government can do now to reduce that opposition, but a good faith effort should be made by engaging the state’s leaders, political and public, while exploring the option of another, more willing state.  The route to common ground is not easy, but trying is certainly better than the status quo.  If such an effort fails, the federal government should move forward on the process of using Yucca Mountain for some period of time while exploring a range of alternatives for a permanent facility.  Nevada could receive financial payments that would escalate significantly if the agreed-upon storage period were exceeded without an alternative solution being adopted.

Until there is a politically viable solution to waste storage, no new nuclear plants should be approved, and the federal government should take ownership of all commercial nuclear waste.  That would relieve utilities, and taxpayers, of the continued cost of storage and protection.

Persistent Myths

The Washington Post recently ran an article by Jason Samenow that cavalierly dismissed EPA Administrator Pruitt’s call for a Red Team approach to refine what we know about climate science.  While the author accurately summarized the justification offered by Steven Koonin, a highly regarded physicist who was a senior DOE official in the Obama Administration, most of the article was an attempt to discredit the value of a Red Team exercise.  Given the certitude of the climate establishment, its members should welcome the opportunity to shame climate skeptics.

The writer uses remarks by David Titley, a climate scientist at Penn State, “Science already has a red team: peer review,” to make the case that peer review has already settled the argument over climate science.  Is it a coincidence that Titley is a colleague of hockey-stick creator Michael Mann?  Perhaps, but don’t bet on it.

Claiming that the science is settled because of “peer review” is at best disingenuous.  For peer review to be a credible standard, reviewers should be anonymous to authors, and reviewed research should be replicated.  That is rarely the case, especially for articles on climate change, where authors often suggest or pick reviewers.  In 2010, in response to Climategate, The Guardian published an article, Climate change emails between scientists reveal flaws in peer review, that exposed how the peer review process was being manipulated.  Flaws in peer review are not limited to climate science.  The Journal of the Royal Society of Medicine published a broader critique, Peer review: a flawed process at the heart of science and journals, which made the point that peer review is like “democracy: a system full of problems but the least worst we have.”  The evidence is clear: to paraphrase Samuel Johnson, parroting peer review is the last refuge of scoundrels.

The claim that 97% of climate scientists believe that humans are mainly responsible for warming over the past 50-plus years has been shown over and over to be a myth.  But the climate establishment has kept repeating it until the media and public accept it as fact.  As a result, journalists have gotten lazy and don’t bother to do the hard work that follows from the view, “I’m not convinced.”  If they did, they would easily find that the papers purporting to show this overwhelming consensus have been thoroughly debunked, not just once or twice but dozens of times.  In 2014, Roy Spencer and Joe Bast published an insightful article in the Wall Street Journal that concluded, “There is no basis for the claim that 97% of scientists believe that man-made climate change is a dangerous problem.”  Friends of Science also published a detailed study, 97% Consensus? No! Global Warming Math Myths & Social Proofs, that provided a detailed critique of the consensus meme.

Rather than denigrate the call for a Red Team exercise, Mr. Samenow and the Post should spend some time looking at the evidence that supports Dr. Koonin’s proposal.  As the late Daniel Patrick Moynihan once observed, we are all entitled to our own opinions, just not our own facts.  The Red Team review would go a long way to making clear what is opinion and what is fact.







Carbon Dioxide Facts Versus Alternative Facts

The reaction of hard-core environmentalists, their allies in Congress, and the mainstream media to the President’s statement that the US “will continue to be cleanest and most environmental friendly country on Earth” demonstrates the lengths they will go to mislead and distort in pursuit of their agenda.  An Associated Press article claims that “the US is among the dirtiest countries when it comes to … carbon pollution.”  That statement is a clear example of fake news and alternative facts.

Why is this the case?

Most important, CO2 is not a pollutant.  It is a nutrient that is essential for the growth of plants, crops, and trees.  According to a NASA study, “From a quarter to half of Earth’s vegetated lands has shown significant greening over the last 35 years largely due to rising levels of atmospheric carbon dioxide.”  And the notion that current CO2 levels in the atmosphere constitute a health risk is countered by the fact that the US Navy allows up to 5000 ppm in submarines, more than an order of magnitude higher than current atmospheric levels.

The classification of CO2 as a pollutant is the result of a poorly reasoned Supreme Court decision in 2007 that relied on the “Chevron doctrine” that gives deference to regulatory agencies in cases of legislative ambiguity.  However, the Court ignored the fact that in passing the 1990 amendments to the Clean Air Act, Congress explicitly denied EPA the authority to regulate CO2.  There was no ambiguity.

Contrary to the impression created by the climate establishment, CO2 is not the dominant atmospheric greenhouse gas.  The atmosphere is composed primarily of nitrogen and oxygen—99%.  Argon accounts for 0.9%, CO2 for about 0.04%, and water vapor, which varies from 0 to 4%, is the more potent greenhouse gas.  According to the International Environmental Data Rescue Organization, “Water vapor accounts for 60-70% of the greenhouse effect while CO2 accounts for 25%.”

Climate advocates have created the image of ever-increasing levels of CO2 leading to ever-increasing global temperatures.  That implies a linear relationship between atmospheric concentrations and increases in temperature.  In fact, the warming potential of CO2 is non-linear (roughly logarithmic), which means that the warming caused by the next increment of CO2 is less than that caused by the increment that preceded it.  There is reason to believe that the world is on the flat end of the curve.
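As a rough illustration of that non-linearity, here is a minimal sketch using the simplified radiative-forcing expression ΔF ≈ 5.35 ln(C/C0) W/m² that appears widely in the technical literature; the formula, the constant, and the concentration values are assumptions chosen for illustration, not figures taken from the studies discussed here.

```python
import math

def co2_forcing(c_new_ppm, c_ref_ppm):
    # Simplified radiative-forcing expression (an illustrative assumption):
    # delta-F = 5.35 * ln(C / C0), in watts per square meter.
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

# Equal 100 ppm steps produce progressively smaller forcing increments.
increments = [co2_forcing(high, low)
              for low, high in [(300, 400), (400, 500), (500, 600)]]
print([round(f, 2) for f in increments])  # -> [1.54, 1.19, 0.98]
```

Each 100 ppm step adds less forcing than the one before, which is the sense in which further increments sit on the flat end of the curve.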

Although advocates point to the increase in CO2 levels and temperatures since the Industrial Revolution, they conveniently omit the fact that in the earth’s history CO2 levels have been significantly higher.  Geologists estimate that 200 million years ago, average CO2 concentrations were about 1800 ppm, almost 5 times today’s level.  The highest concentrations are estimated to have reached nearly 7000 ppm.  And earlier still, over 400 million years ago, the earth experienced an ice age with CO2 concentrations at 4400 ppm.  The original “hockey stick” graph in Al Gore’s Earth in the Balance was revealing when plotted objectively and not to prove a point.  It showed that over a 160,000-year period there were times when CO2 changes preceded warming, followed warming, and coincided with warming.

Today’s global temperatures are not unprecedented.  Within the past millennium, temperatures were warmer during the Medieval Warm Period (800-1400), and during the Little Ice Age (1400-1850) they were colder.  Tracing modern warming from the end of the Little Ice Age creates an erroneous picture of warming.

The effect of CO2 concentrations on warming is a function of climate sensitivity—the temperature effect of doubling CO2.  Absent positive feedbacks, the warming from a doubling of CO2 is estimated by most scientists to be about 1 degree C.  The IPCC in its most recent report estimates that climate sensitivity ranges from 1.5C to 4.5C.  In 1985, UNEP put out a statement on greenhouse gases and climate that estimated a doubling of CO2 would lead to a temperature increase of 1.5C to 4.5C.  Since research is intended to expand knowledge and reduce uncertainty, it is telling that in spite of spending well over $100 billion, we seem to know no more about the effect of CO2 on warming than we did over 30 years ago.  Curious indeed.
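To see what those sensitivity numbers imply, here is a small sketch that assumes, purely for illustration, that warming scales with the number of doublings of the CO2 concentration; the function name, the 280 ppm preindustrial baseline, and the 410 ppm present-day value are assumptions, not figures from the IPCC or UNEP documents.

```python
import math

def warming_c(sensitivity_per_doubling, c_ppm, c_ref_ppm=280.0):
    # Assumes warming is proportional to the number of CO2 doublings
    # relative to a preindustrial baseline (illustrative only).
    doublings = math.log(c_ppm / c_ref_ppm, 2)
    return sensitivity_per_doubling * doublings

# One full doubling (280 -> 560 ppm) returns the sensitivity itself,
# so the IPCC range of 1.5C to 4.5C spans a factor of three in outcomes.
low, high = warming_c(1.5, 560), warming_c(4.5, 560)
print(round(low, 2), round(high, 2))
```

The threefold spread between the low and high values is exactly the uncertainty that, as the paragraph above notes, has not narrowed since 1985.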

EPA has set standards for air quality, and records over the past 40 years show steady and continuing improvement.  Between 1980 and 2010, the major pollutants defined by EPA were reduced by between 27% and 82%.  Ozone and particulates had the smallest reductions, but their reductions continue as new technology is developed.  It is grossly misleading to challenge US global leadership in air quality by focusing on an emission that is not a pollutant and not a major component of the air we breathe.

The fact that the scary predictions made over the past 30 years have not come true, along with the advancement of knowledge, should lead thoughtful people to realize that we do not face a climate catastrophe.  Pretending to know more than we really do, and pretending that we can control a complex system by controlling one variable, is what Friedrich Hayek termed the fatal conceit.


Much Ado About Nothing

The liberal media, environmental organizations, crony capitalists, and some members of Congress are in a hand-wringing dither at the prospect that President Trump will withdraw from the 2015 Paris Accord.  Rarely has so much air time, print space, and rhetoric been devoted to something so irrelevant.

In the early 1990s, when Al Gore was pushing for mandatory emission reduction actions, a number of individuals and organizations called for a voluntary program to address what was then called global warming.  Bob Reinstein, a climate negotiator under George H. W. Bush, was the leading proponent of what he called a Pledge and Review approach.  The Clinton Administration rejected voluntary measures out of hand and went all out in support of the Kyoto Treaty.  The Senate unanimously passed Senate Resolution 98, the Byrd-Hagel resolution, which rejected any treaty like Kyoto.

Since 1997, the climate club has tried every year at its annual Conference of the Parties (COP) to build on Kyoto with tougher goals and mandates.  And each year it has failed, although each COP finds a way to claim victory.  The 2015 Paris meeting became the occasion for achieving the grand illusion.  The adopted accord is just another version of Pledge and Review.  The notion that it will have any effect on climate is a fraud.

The Agreement maintains the fiction that emission reductions will keep the rise in global temperatures to 2 degrees C above preindustrial levels or lower, commits to zero net emissions by achieving a balance between emissions and absorption in the second half of the century, commits to assisting developing countries in adapting and in reducing emissions, and sets a long-term goal of a low-carbon future.  It is always easy to set goals for a time horizon that is far distant.

Since Al Gore’s Earth in the Balance, we have learned that the assumed correlation between CO2 emissions and temperature increases isn’t all that simple.  Global emissions have continued to rise, but global temperatures peaked in 1998 even though climate models have them continuing to increase.  And none of the scary scenarios involving extreme weather and flooding caused by global warming have materialized.

Proponents of the Paris Accord point to the fact that 195 countries, virtually the entire world, have accepted it.  They conveniently neglect to point out that developing country participation was bought with a $100 billion a year bribe.  As long as money will be thrown at them to deal with the climate problem, why not go along?

The climate establishment needed a victory, and Paris was turned into one.  In fact, it is an illusion.  As the late historian Daniel Boorstin pointed out, events like Paris are synthetic novelties called pseudo-events.  In today’s jargon, they are fake news and alternative facts.

Environmentalist Bjorn Lomborg’s analysis of the agreement concluded, “if every nation fulfills every promise by 2030, and continues to fulfill these promises faithfully until the end of the century, and there is no ‘CO₂ leakage’ to non-committed nations, the entirety of the Paris promises will reduce temperature rises by just 0.17°C (0.306°F) by 2100”.  An analysis by American Enterprise Institute scholar Ben Zycher reached a similar conclusion: a whopping 0.17 degree reduction, one that could be even smaller since the measurement error is 0.1 degrees.

Believing that all nations will do what they promised is, like second marriages, a triumph of hope over experience.  Rhetoric will far exceed performance, and participating nations will be extremely creative in explaining away their shortfalls.  Under Kyoto, cheating became an art form.  Countries importing coal-fired electricity omitted the emissions from their reports because another country produced them; producing countries also omitted the emissions because the importing country should own them.  Companies accumulated emission credits by investing in offsets in developing countries under the Clean Development Mechanism, but many of those credits were the result of creative accounting.  The same kind of game playing will take place under the Paris Accord, with the participants willingly engaging in wink-and-nod compliance.

Single-minded pursuit of reducing CO2 emissions will cause serious economic harm while doing nothing to affect climate change.  The charade that the Paris Agreement represents needs to be exposed for what it is: a full employment act for those who believe in “global governance” and who profit from marketing environmental doom.



Lessons from the 2017 Hurricane Forecast

NOAA has just released its forecast for this year’s hurricane season, predicting a 45% probability that it will be stronger than last year’s.  The forecasts from Colorado State University and North Carolina State University are not that pessimistic, although all three could be considered similar given a reasonable margin of error in probability estimates.

Yet much is known about Atlantic hurricanes, El Niños, and sea surface temperatures.  And, with almost three decades of forecasting experience, it would be reasonable to assume that the models used would by now show fewer differences and that the forecasts would reflect a consensus.  The fact that differences remain is instructive when it comes to forecasting climate change.

The climate system is far more complex and contains far more uncertainties than hurricane formation, which is governed by a relatively small set of variables.

Climate is a chaotic system, meaning that it is non-linear and essentially unpredictable.  The father of chaos theory, Professor Edward Lorenz of MIT, according to a 2011 issue of the MIT Technology Review, demonstrated that “the dream of perfect knowledge founders in reality.”  He showed that small changes in a simulation mattered a great deal and that “the imprecision inherent in any human measurement could become magnified into wildly incorrect forecasts.”
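Lorenz’s point about magnified imprecision can be demonstrated in a few lines.  The sketch below integrates his famous three-variable system with a simple Euler step; the parameter values, step size, and starting points are standard textbook choices used purely for illustration, not anything drawn from the Technology Review article.

```python
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One explicit-Euler step of the Lorenz system (illustrative,
    # not production-grade numerical integration).
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def x_history(x0, steps=4000):
    xs = []
    x, y, z = x0, 1.0, 1.05
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
        xs.append(x)
    return xs

# Two runs whose starting points differ by one part in a million.
a = x_history(1.000000)
b = x_history(1.000001)
early = max(abs(p - q) for p, q in zip(a[:200], b[:200]))
late = max(abs(p - q) for p, q in zip(a[-1000:], b[-1000:]))
print(early < 0.01, late > 1.0)  # tiny difference early, large later
```

A difference of one part in a million is invisible at first and then swamps the trajectory entirely, which is precisely the “wildly incorrect forecasts” Lorenz described.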

Climate scientists, along with many others, know this but pretend that chaos theory doesn’t apply to their climate forecasts.  In doing so, they are manifesting what Friedrich Hayek called the fatal conceit and the pretence of knowledge.  In his address accepting the Nobel Prize, he said, “If man is not to do more harm than good…, he will have to learn that in this, as in all other fields where essential complexity of an organized kind prevails, he cannot acquire the full knowledge which would make mastery of the events possible. … The recognition of the insuperable limits to his knowledge ought indeed to teach the student of society a lesson of humility which should guard him against becoming an accomplice in men’s fatal striving to control society… “

A healthy dose of humility and acknowledging the limits to our state of knowledge would allow us to better understand the climate system and more importantly what we can do about the risks associated with climate change.






Red Team EPA’s Endangerment Finding

The foundation for existing EPA regulations controlling greenhouse gases, primarily carbon dioxide—CO2—is the agency’s Endangerment Finding, which was issued following the 2007 Supreme Court decision that greenhouse gases could be considered pollutants under the Clean Air Act (CAA) and that EPA could regulate them if it found that they endangered human health and the environment.  In 2009, EPA did exactly that.  A challenge that followed was rejected by the DC Court of Appeals, which found that the “Endangerment Finding … (is) neither arbitrary nor capricious … and EPA’s interpretation of the governing CAA provisions is unambiguously correct.”

Although the appeals court decision appears to be an insurmountable hurdle, since the Supreme Court has shown no interest in revisiting its 2007 decision, the Finding is still being challenged, even though Administrator Pruitt seems unwilling to initiate the rulemaking process to withdraw it.  And, based on commentary by environmental lawyers on the prospects for challenging the Finding under the procedures of the Administrative Procedure Act, success is not certain.  However, the significance of the Endangerment Finding makes the effort worthwhile.  Otherwise, a new Administration can undo the regulatory rollback efforts that are now underway.

Nonetheless, the Texas Public Policy Foundation (TPPF) has filed a petition with EPA challenging the Endangerment Finding, claiming that “in its rush to regulate greenhouse gases in 2009, the Obama Administration missed an important step. It failed to submit the greenhouse gas endangerment finding to the Science Advisory Board for peer review, as required by statute, and that violation is fatal to the endangerment finding.”

In addition to challenging the Finding on the basis of a procedural flaw, it would be prudent to attack its very foundation.  In issuing the Finding, EPA concluded, based on peer-reviewed research from the Intergovernmental Panel on Climate Change, the U.S. Global Climate Research Program, and the National Research Council, that there was compelling evidence that carbon dioxide emissions endanger public health and welfare by warming the planet, leading to extreme weather events and increased mortality rates.

Last month, Steven Koonin, a well-respected physicist and former Under Secretary of Energy in the Obama Administration, wrote in the Wall Street Journal proposing that climate science be subjected to a “Red Team” exercise.  As Dr. Koonin wrote, “The national-security community pioneered the ‘Red Team’ methodology to test assumptions and analyses, identify risks, and reduce—or at least understand—uncertainties. The process is now considered a best practice in high-consequence situations.”

Taking on the entire field of climate science would be a daunting undertaking, and it is doubtful that Congress would set up a commission to do so.  Instead, petitioners should focus on just the foundation of the Finding, which would be much more manageable and potentially more successful.  First, the assertion that carbon dioxide is a pollutant is undercut by a large body of science and by the greening of the planet that has been taking place for decades.  Second, the contention that increasing levels of CO2 lead to temperature increases that cause extreme weather events like hurricanes and premature deaths is contradicted by accumulating empirical evidence.  Hurricanes have not been increasing, and claims of increased premature deaths are products of computer models that cannot withstand careful scientific scrutiny.

In 1993, the Supreme Court in Daubert v. Merrell Dow Pharmaceuticals established standards for the admissibility of scientific evidence in judicial proceedings.  The Court’s decision provides guidance that can be used to rebut EPA’s findings about CO2.  In particular, the Court stated, “in order to qualify as scientific knowledge, an inference or assertion must be derived by the scientific method.”  It went on to state, “Scientific methodology today is based on generating hypotheses and testing them to see if they can be falsified.”  The hypotheses that are the foundation for the Finding can be falsified as a Red Team exercise would show.

Professional Fakery

A major source of fake news and alternative facts is professionals who engage in political reality-spinning while using their credentials as a source of legitimacy.  A recent example is an opinion piece in the Wall Street Journal by Alan Blinder, Princeton economist, former vice chairman of the Federal Reserve, and former member of the President’s Council of Economic Advisers.

Blinder’s snarky assessment of the Trump tax plan (well, it’s not a plan but a set of principles) is the work of a political hack, not that of a well-respected academic.  Since the President released only a one-page summary of reform principles, Blinder must think he is clairvoyant to divine the form legislation will take and the breakdown between winners and losers.  Since he is not clairvoyant, he relies on what he calls the “Republican Tax Cut Formula.”

Like all progressives, Blinder calls the proposal, which he treats as a plan, “remarkably regressive” because it lowers the top individual tax rate from 39.6% to 35%, lowers the corporate rate from the highest in the developed world to 15%, and makes that rate apply to Subchapter S corporations.

According to a report from Yardeni Research, the top 10% of taxpayers, those with incomes over $100,000—hardly rich—pay 77% of federal income taxes.  The top 50% of taxpayers pay a whopping 97% of federal income taxes, leaving the bottom 50% paying 3%.  How can reducing the top rate while also lowering the lowest rates be regressive?  Blinder must have an econometric model that makes it so.

The average corporate tax rate in developed countries is 22.5%, while ours is 39% when state taxes are included.  This rate makes US corporations less competitive and creates an incentive to move investment offshore and keep foreign earnings overseas.  It is estimated that US companies hold about $2.5 trillion overseas, untaxed.  We live in a global economy, so making our corporations more competitive increases domestic capital investment, the route to increased productivity and job creation.  Blinder dismisses the change for Subchapter S corporations as a sop to hedge funds, real estate developers, and law firms—organizations that don’t have great public images.  The reality is far different from Blinder’s illusion.  According to the Tax Foundation, there are 24.7 million US corporations; 23 million are Subchapter S corporations, which are subject to personal tax rates.  They are far more numerous and far more diverse than the hedge funds and others that the public scorns.

In his opinion piece, Blinder says, “The system would remain complicated, unfair, and inefficient.  But the richest would pay much less.”  Since legislation has not been written, how can he know this?  He doesn’t; it’s just alternative facts and fake news.

Republicans have a once-in-a-generation opportunity to begin the process of putting our fiscal house in order, since they control both the Congress and the White House.  They realize this and also realize that the picture could change next year.

That reality should give a sense of urgency to comprehensive tax reform.  The current tax code has 4 million words, which according to the Washington Times is seven times the length of War and Peace.  The Times also points out that 75 years ago the IRS Form 1040 had 2 pages; now it has 206.  Major reform that involves simplification is not an impossible objective.  Part of achieving it involves abandoning the use of the code to push industrial policy or other social objectives, and both political parties working together.

Since entitlements represent two-thirds of federal spending, fixing Social Security, which Blinder didn’t mention, is an imperative.  The 1983 bipartisan commission on Social Security reform represented a good start, but it is clear that without another effort, both the national debt and the deficit will continue to grow until our economic wellbeing is undermined.

People like Blinder and his colleague Paul Krugman should be using their professional talents and analytical rigor to address real fiscal problems instead of marketing partisan hobgoblins.



The Misguided March for Science

Although the Earth Day March for Science was billed as emphasizing that “science upholds the common good” and as a call for evidence-based policy in the public’s best interest, it was nothing more than a reaction to the Trump Administration’s agenda and to comments by the President that have been interpreted as hostility to science.  The only difference between the President’s remarks and the actions of the Obama Administration is that he has been blunt, even if misguided and ill-informed, where the Obama Administration wrapped its abuse of science in politically correct language.

Obama’s war on climate change was cloaked in science but was nothing more than a war against fossil fuels, especially coal.  EPA used science much as a drunk uses a lamp post—for support and not illumination.  Its Clean Power Plan regulation claimed benefits that were preposterous on their face.  The claim that reductions in air pollutants would reduce the incidence of asthma and premature deaths was accepted without challenge by many of those involved in the March for Science.  If they really wanted science-based public policy, they would have challenged the basis for those claims by pointing out that the incidence of asthma has been increasing even though air quality kept improving, and that the estimates of premature deaths avoided implied epidemiological precision greater than 99%.  There is no scientific basis, beyond political science, that can justify those claims.

EPA’s abuse of science has been going on for a long time and has been accepted by many in the scientific community because it advanced the agenda of environmentalist elites, who use science as a tool to increase the political power of government to promote their policy preferences.  How else can you explain black-box modeling and the one-hit, linear dose-response approach to toxic impacts?

Here are a few examples.

In evaluating the health effects of air pollutants and carcinogens, EPA takes the most conservative approach possible by assuming that there is no safe exposure and that dose response is linear.  Professor Judith Curry of Georgia Tech described one example this way: “… EPA decided that there is no safe level of ambient PM 2.5 – however near to zero – at which risk of ‘early’ death ceases.  Statisticians call this analytic approach a ‘no threshold linear regression to zero analytic model.’ …  This methodology … contradicts a foundational principle of toxicology, that it’s the dose that makes the poison. …”

In 2014, a former EPA scientist wrote Confessions of a Computer Modeler.  The example he used was his development of a large model to assess sewer treatment and drinking water quality.  As the regulations became more stringent, his analysis concluded that the agency “had hit the point of diminishing returns.”  EPA didn’t want to hear that.  He was told to rework the analysis and “sharpen my pencil.”  That kept happening until he asked his supervisor what result he was looking for.  In the end, he concluded, “my work for the EPA wasn’t that of a scientist, at least in the popular imagination of what a scientist does.  It was more like that of a lawyer.  My job, as a modeler, was to build the best case for my client’s position.”

The agency draws on scientists to conduct research and to participate on its advisory panels.  But many of these panels end up reviewing the work of their own members, a clear conflict of interest.  Groupthink virtually guarantees that these groups will pursue enlightened self-interest by being collegial and agreeing with each other’s research.

Past attempts to mandate transparency and data quality have not brought an end to EPA mischief.  And it is the prospect that Congress will be more vigorous and vigilant going forward that rattled some of the organizers of the recent march.  If the marchers are serious about promoting “evidence-based policy,” they will coalesce around accepted principles of science and complete transparency.  A good place to start would be support for the Reference Manual on Scientific Evidence and the rules set out by the Supreme Court in its 1993 opinion in Daubert v. Merrell Dow.