Voodoo Scholarship

Harvard professor Naomi Oreskes has just published an article in Environmental Research Letters (ERL) that is supposed to show that ExxonMobil misled the public on climate change.  Professor Oreskes has earned the reputation of being an academic character assassin and is supported by the Rockefeller Family Fund, which has been funding anti-ExxonMobil initiatives.

She first gained notoriety with her book Merchants of Doubt, which attempted to discredit the reputations of three well-respected but deceased scientists who could not defend themselves—Fred Seitz, Bob Jastrow, and Bill Nierenberg.  Their crime was to challenge climate orthodoxy with scientific reasoning that has not been refuted scientifically.  The George C. Marshall Institute, which these three scientists founded, published a critique of her book, Clouding the Truth: A Critique of Merchants of Doubt.  The conclusion of that critique says all that needs to be said about the Oreskes research model: "Merchants of Doubt is long on innuendo and short on evidence or compelling logic.  It fits well with Mark Twain's classic observation about gathering facts and then distorting them as the gatherer desires. If it were not for the attack on the Institute's founders who cannot now defend themselves, the book could be dismissed for the partisan history it is."

The methodology used by Oreskes in the Harvard article is essentially the same one she and John Cook used in their attempt to demonstrate a scientific consensus on anthropogenic global warming. The technique is known as content analysis.  The Pew Research Center notes that content analysis has been defined as "the systematic, objective, quantitative analysis of message characteristics."  In a short piece on the technique, UC Davis points out that content analysis "describes what is there, but may not reveal the underlying motives for the observed pattern ('what' but not 'why')."  Oreskes started off knowing what she wanted to conclude and then structured her review to support her preconceived opinions.  A self-fulfilling prophecy lacking the required objectivity!
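For readers unfamiliar with the technique, here is a minimal sketch of keyword-based content coding; the documents, categories, and keywords are hypothetical illustrations, not the coding scheme used in any of the studies discussed.

```python
# Minimal sketch of keyword-based content analysis (illustrative only).
# Real studies use trained human coders, codebooks, and inter-coder
# reliability checks; this just counts category keywords per document.
from collections import Counter

CATEGORIES = {
    "acknowledges_warming": ["warming", "greenhouse", "co2 emissions"],
    "emphasizes_uncertainty": ["uncertain", "doubt", "unknown", "debate"],
}

def code_document(text: str) -> Counter:
    """Count how many keywords from each category appear in a document."""
    text = text.lower()
    counts = Counter()
    for category, keywords in CATEGORIES.items():
        counts[category] = sum(text.count(kw) for kw in keywords)
    return counts

docs = [
    "Rising CO2 emissions contribute to warming, though projections remain uncertain.",
    "The science is marked by doubt and ongoing debate about greenhouse effects.",
]

for i, doc in enumerate(docs, 1):
    print(f"doc {i}: {dict(code_document(doc))}")
```

As the UC Davis description quoted above suggests, a tally like this records what appears in the text but says nothing about why it appears.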

The weakness of her work and of the content-analysis technique was laid bare in critiques of the surveys on the so-called scientific consensus.  A Canadian organization, Friends of Science, reviewed several studies, including Oreskes's and Cook's, that claim a 97% consensus among scientists that human activities are the primary cause of climate change.  Its conclusion bluntly stated, "The deconstruction of the surveys that follow shows the claim of a 97% consensus is pure spin and 'statisticulation' – mathematical manipulation" and "there is no 97% consensus on human-caused global warming as claimed in these studies. None of these studies indicate any agreement with a catastrophic view of human-caused global warming."  That conclusion was shared in an article in Popular Technology titled "97 Articles Refuting The '97% Consensus.'"

In her article, Oreskes concludes that "ExxonMobil contributed to advancing climate science—but promoted doubt about it in advertorials.  Given this discrepancy, we conclude that ExxonMobil misled the public."  Research on climate science is intended to advance knowledge, while advertorials are intended to communicate on public policy issues. The criterion for judging them should be whether they are based on a solid foundation.

Oreskes and her climate orthodoxy colleagues assert with certitude that human use of fossil fuels is the primary cause of global warming over the past 50 or so years.  But that is not consistent with the state of the science.  In its most recent report, the IPCC lists a number of climate factors for which knowledge is limited. And its range of climate sensitivity, which has steadily been reduced, varies by a factor of three. Hardly a basis for certitude. A National Academy of Sciences report stated, "Because there is considerable uncertainty in current understanding of how the climate system varies naturally and reacts to emissions of greenhouse gases and aerosols, current estimates of the magnitude of future warming should be regarded as tentative and subject to future adjustments (either upward or downward).  Reducing the wide range of uncertainty inherent in current model predictions of global climate change will require major advances in understanding and modeling of both (1) the factors that determine atmospheric concentrations of greenhouse gases and aerosols, and (2) the so-called 'feedbacks' that determine the sensitivity of the climate system to a prescribed increase in greenhouse gases."  Those advances have not taken place.

In addressing policy proposals that were based on greater certainty than actually exists, it is quite reasonable to use advertorials to focus on uncertainty and the economic consequences of implementing those policies.  Oreskes points out that ExxonMobil, and Mobil before the merger, ran advertorials in the New York Times every week from 1972 to 2001.  But it was Mobil that ran the weekly ads before the merger in 1998; that didn't deter her from assigning guilt to ExxonMobil.  Surely, if the advertorials were outside the bounds of legitimate discourse, the Times, given its position on climate change, would have rejected them or used its editorials to dispute them.  It is also telling that Oreskes's article covers advertorials starting in 1972, well before climate change/global warming became a major public policy issue.  In the 1970s, the oil industry was concerned about price and allocation controls, divestiture, and energy independence.  This is another point demonstrating that Oreskes was simply looking for any way to support her biased, preconceived point of view.

There are only two possible uses for this article—as a case study in how not to do content analysis and as liner for bird cages.  Harvard should be embarrassed!


Polls, Illusions, and Bad Analysis

Gallup and Bloomberg polling tell us that tax reform is not an important priority for the American people.  CNBC says, "Many Americans aren't sure they even need tax reform."  The Gallup poll, taken each April, shows that only about 50% of the people polled believe that taxes are too high, while only 4% of the people Bloomberg polled believe it is a high priority.

Since individual tax reform is supposed to lower tax rates and make the tax system simpler, why would people show a lack of interest?  The poll results seem to defy common sense.   The answer lies in the fact that the top 40% of taxpayers pay 97% of federal income taxes while 45% pay no federal income taxes.  A 2016 Tax Foundation report on compliance concludes, "the latest official estimates of the eye-popping amount of time and money that Americans lose each year in complying with IRS paperwork—8.9 billion hours and $409 billion in lost productivity—indicate that the most important benefit of tax simplification may be the gift of time."

Since 45% pay no income taxes, the way the poll was conducted and the structure of the questions may have foreordained the results.  Random sampling is a fundamental principle of statistics.  However, in the case of the Gallup poll, a random sample of sufficient size necessarily includes the nearly half of respondents who pay no income taxes, which skews the headline result on questions about tax burdens.  Gallup never connected its results with the skewed distribution of individual tax payments.  In addition, in explaining polling results that fail the smell test, pollsters blame "the growth of cellphones and the decline in people willing to answer surveys."  In 2014, the Huffington Post ran an article by Frank Wall, Severe Flaws in Polls.  In it he said, "The basic problem with polls is that public sentiment is quite like our body temperature; it goes up and down, somewhat randomly, and taking action too quickly in response to changes can lead to serious unintended consequences. … It is also increasingly clear that many polls are tainted by respondents lying to pollsters because they believe they cannot trust where their answers might end up …  That can lead to unintended consequences of self-fulfilling promises because misinformed poll results obviously do not tell the real story … ." Amen to that!
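To illustrate the composition effect described above, here is a small simulation sketch; the population shares and response probabilities are hypothetical assumptions chosen only to show how a single headline number can mask the split between taxpayers and non-payers.

```python
# Illustrative sketch (hypothetical numbers): how one headline figure can
# mask very different answers from taxpayers and non-payers in the same sample.
import random

random.seed(0)

def simulate_poll(n=10_000, share_nonpayers=0.45,
                  p_too_high_payers=0.80, p_too_high_nonpayers=0.15):
    """Simulate 'are your taxes too high?' responses for a mixed population."""
    yes_payers = yes_nonpayers = n_payers = n_nonpayers = 0
    for _ in range(n):
        if random.random() < share_nonpayers:
            n_nonpayers += 1
            yes_nonpayers += random.random() < p_too_high_nonpayers
        else:
            n_payers += 1
            yes_payers += random.random() < p_too_high_payers
    overall = (yes_payers + yes_nonpayers) / n
    return overall, yes_payers / n_payers, yes_nonpayers / n_nonpayers

overall, payers, nonpayers = simulate_poll()
print(f"headline 'taxes too high': {overall:.0%}")
print(f"among taxpayers:           {payers:.0%}")
print(f"among non-payers:          {nonpayers:.0%}")
```

With these assumed inputs the headline lands near 50%, much like the Gallup figure, even though the two groups answer very differently.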

Michael Traugott, a University of Michigan political science professor who specializes in polling and opinion surveys, got to the heart of bad polling very succinctly: "Put simply, if you ask the right sampling of people the wrong thing … you'll get a bad result."  Traugott could have added that you also get bad results if you don't dig deep enough into the polling to get at the real meaning, or if you cherry-pick results to support an agenda.  Bloomberg and CNBC, which wrote about the Gallup results, could be guilty of both.  Neither was balanced or insightful in its article.

In Daniel Boorstin's The Image, he says of public opinion that it "became itself a kind of pseudo-event, forced into existence for the primary purpose of being reported.  Expressions of public opinion became among the most powerful, the most interesting and the most mysterious of pseudo-events."  Others, like Darrell Huff, who wrote How to Lie With Statistics, and John Brignell, who wrote Sorry, Wrong Number, provide us a way to identify attempts to bamboozle as well as poor methodologies and analysis.

Gallup should be especially sensitive to its polling methodologies and analysis since it gave us President Thomas Dewey and more recently led Mitt Romney to believe that he was going to win the presidency.

Stories based on polling results probably should carry a warning, “reader beware.”


Gasoline Prices: Onward and Upward?

Gasoline prices have been steadily increasing over the past year, going from a national average of $1.87 early in 2016 to $2.51 in July.  The end of the driving season should provide some relief, but it won't be enough to offset upward pressures.

There are four factors that drive gasoline prices:  crude oil prices, demand, refinery enhancements, and, more recently, exports, which have more than doubled since 2010.  Crude oil prices are roughly $6 a barrel higher this year than last, although OPEC was expecting a larger increase from its production cut.  A $6 per barrel increase translates into an increase of about 14 cents per gallon.  According to EIA, domestic demand and product exports have been running above their five-year averages.  As long as the economy continues to improve, gasoline demand will remain robust.  Although refinery production of gasoline has been strong, it has not been sufficient to meet demand, so there have been draws on product inventories.  Reductions in inventories also put upward pressure on gasoline prices because of expectations about the future.
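The pass-through arithmetic is straightforward: a barrel contains 42 gallons, so dividing a crude price change by 42 gives the approximate change at the pump before refining margins and taxes. A quick check of the $6-per-barrel figure:

```python
# Rough pass-through of a crude oil price change to gasoline, ignoring
# refining margins, taxes, and yield losses (one barrel = 42 gallons).
GALLONS_PER_BARREL = 42

def pump_impact(crude_change_per_barrel: float) -> float:
    """Approximate change in gasoline price per gallon."""
    return crude_change_per_barrel / GALLONS_PER_BARREL

print(f"${pump_impact(6.00):.3f} per gallon")  # ~$0.143, i.e. about 14 cents
```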

In addition to the increase in the average price of gasoline, the price difference between regular and premium has also been increasing.  According to the Energy Information Administration, as of July 2017 the premium-versus-regular differential reached $0.53 per gallon — more than double the differential in 2012.  Although historically 90% of gasoline sales have been for regular, midgrade and premium demand has grown in importance as sales of higher-performance and turbocharged vehicles have increased.  EIA also cites higher fuel economy standards as a driver of increased demand for midgrade and premium gasoline.  Demand alone cannot explain why premium prices have increased more rapidly than regular.  The rest of the answer is the cost of production.

The octane level of gasoline before ethanol blending or chemical reforming is less than the 87 octane of regular gasoline.  For the past decade, ethanol, which has a blending octane rating of about 115, has been blended with straight-run gasoline to achieve the octane levels needed for gasoline sold at the pump.  Now, with growing gasoline demand, ethanol blending has reached the legal 10% limit and other sources of octane enhancement are needed.  Refiners have made refinery modifications to increase alkylate production; alkylate is the octane booster of choice because it is low in sulfur but high in octane.  Producing alkylate is expensive, resulting in a price per gallon that is 30 cents more than gasoline at the refinery gate.
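As a rough illustration of the blending arithmetic (the 84-octane figure for the base gasoline is an assumption, and simple volume-weighted blending is a simplification of what refiners actually calculate), 10 percent ethanol at a blending octane of about 115 lifts an 84-octane base to roughly 87:

```python
# Simplified linear octane blending: volume-weighted average of component
# octane numbers. The 84-octane base gasoline is an illustrative assumption.
def blend_octane(base_octane: float, booster_octane: float, booster_frac: float) -> float:
    """Volume-weighted octane of a two-component blend."""
    return (1 - booster_frac) * base_octane + booster_frac * booster_octane

# 10% ethanol (blending octane ~115) with 84-octane straight-run gasoline
print(f"{blend_octane(84, 115, 0.10):.1f}")  # ~87.1, i.e. regular-grade octane
```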

So, even though gasoline prices have been rising, today's price has to be considered a bargain when compared to the $4.00 per gallon that motorists were paying in 2011.  No one knows what the price of gasoline will be in the months ahead or next year, but the likelihood is that it will continue its steady pace upward if the economy remains strong.  What is important looking to the future is to keep in place policies that do not restrict domestic production or the efficient operation of refineries.  That will allow market forces instead of political forces to determine the price at the pump.

DARPA’s Threat to Climate Catastrophism

A recent article in Wired—"BS Detector"—described how DARPA is seeking proposals for "ways to determine what findings from the social and behavioral sciences are … credible."  The request for proposals states, "There may be new ways to create automated or semi-automated capabilities to rapidly, accurately, and dynamically assign Confidence Levels to specific … results or claims."  The goal is to help experts and non-experts separate scientific wheat from wrongheaded chaff using "machine reading, natural language processing, automated meta-analyses, statistics-checking algorithms, sentiment analytics, crowdsourcing tools, data sharing and archiving platforms, network analytics, etc."  Boiling this down to its essence, the article's author suggests that DARPA is "trying to build a real, live, bullshit detector."
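One of the listed ingredients, "statistics-checking algorithms," can be quite simple in practice. The sketch below implements a basic consistency check in the spirit of the published GRIM test, which is not mentioned in the DARPA solicitation but illustrates the idea: it flags a reported mean that could not have been produced by any set of whole-number responses of the reported sample size.

```python
# A simple statistics-consistency check inspired by the GRIM test: for data that
# are whole numbers (e.g., Likert responses), a reported mean must be within
# rounding distance of some integer total divided by N. This is only one small
# example of the kind of "statistics-checking algorithms" the solicitation mentions.
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Return True if reported_mean (to `decimals` places) is achievable with n integer values."""
    for total in range(0, 1000 * n):            # candidate integer sums (bounded for the sketch)
        if round(total / n, decimals) == round(reported_mean, decimals):
            return True
        if total / n > reported_mean + 1:       # passed all plausible candidates
            break
    return False

print(grim_consistent(5.19, 28))  # False: no 28 whole-number scores average to 5.19
print(grim_consistent(5.18, 28))  # True: 145 / 28 = 5.178..., which rounds to 5.18
```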

It might seem that this is a far-out notion except for two things.  First, DARPA has a solid reputation for innovation and solving the unsolvable.  Its work on communications became what we now call the internet.  Second, artificial intelligence technology is already being used in business, especially in investing, where it is achieving above-average returns, and in the development work on autonomous cars.

Machine learning, a rapidly advancing branch of artificial intelligence, may be the means by which DARPA achieves the ability to assign confidence levels to scientific results and claims.  When that day arrives, hopefully not that far in the future, the proponents of an impending climate catastrophe will be in serious trouble.

If there were a working BS Detector, a recent article in the Washington Post titled "We only have a 5 percent chance of avoiding 'dangerous' global warming" might not have been published.   Written by avowed climate activist Chris Mooney, the article is based on two peer-reviewed papers published in Nature Climate Change.  One analysis is based on the hypothesis that additional warming is embedded in the climate system through an energy imbalance from past emissions but has "not yet arrived."  The other applies past statistics to the Kaya Identity to estimate emissions and warming in 2100.  The Kaya Identity is a simple way to project future carbon emissions based on changes in population, GDP per capita, energy intensity, and carbon intensity.  The problems with this work are so obvious that Nature should have rejected it.  The Kaya Identity can be roughly right for near-term periods but is useless for making estimates decades into the future.  No one can accurately estimate carbon intensity or energy efficiency 83 years into the future; it is foolish to try.  In addition, the article implies an unjustified accuracy in climate sensitivity—the warming from doubling CO2.  The IPCC estimate has changed over time, with the lower bound being reduced, and the current estimate varies by a factor of three.
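For reference, the Kaya Identity expresses emissions as the product of four factors: population, GDP per capita, energy intensity of GDP, and carbon intensity of energy. The sketch below shows the decomposition; the input values are illustrative placeholders, not figures from either Nature paper.

```python
# Kaya Identity: CO2 = population x (GDP / population) x (energy / GDP) x (CO2 / energy).
# The figures below are illustrative placeholders, not values from the studies discussed.
def kaya_emissions(population: float,
                   gdp_per_capita: float,      # dollars per person
                   energy_intensity: float,    # MJ per dollar of GDP
                   carbon_intensity: float):   # tonnes CO2 per MJ
    """Annual CO2 emissions implied by the four Kaya factors."""
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Example: 7.5e9 people, $10,000 per capita, 7 MJ/$, 6e-5 t CO2/MJ (all hypothetical)
print(f"{kaya_emissions(7.5e9, 1.0e4, 7.0, 6.0e-5):.2e} tonnes CO2 per year")
```

The identity itself is just multiplication; the projections come entirely from assumptions about how each factor evolves, which is why its usefulness falls off rapidly over multi-decade horizons.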

The media assigns too much credibility to peer review.  The peer review process has been misused in all fields of science, and its shortcomings are well documented.  In part, peer review fails because reviewers are not blinded, are too busy to invest enough time to do a thorough review, and, most important, because the essential steps of data validation and replication are not performed.

If DARPA's initiative is successful, climate catastrophism will be exposed for what it really is: advocacy.  Fact Checker gives Pinocchios; maybe the BS Detector will assign piles of dung.


A Triumph of Hope Over Experience

The White House has laid out an aggressive schedule for passing tax reform legislation sometime this fall.  That schedule means that tax reform will have to be tied to the budget and passed under reconciliation.  After the health care debacle, enthusiasm and certitude need to be tempered by a large dose of reality and a good understanding of what would be deemed acceptable by the Senate Parliamentarian.

The House's Better Way tax reform plan is based on solid principles, but it also involves a number of elements, such as border adjustment, that represent significant departures from the current tax code.  That is going to make it a hard sell in the Senate.

The White House supports eliminating or curbing numerous tax breaks to offset some of the lost revenue from cutting tax rates, but those "tax breaks" are not identified and surely will cause some Senators to balk. The White House plan includes cutting the corporate tax rate from 35 percent to 15 percent, simplifying the taxes individuals and families pay, eliminating the estate tax, and jettisoning the alternative minimum tax.  Senator Hatch has dismissed the 15 percent rate as unrealistic.

After the embarrassment of not being able to pass a Republican-only health care bill, there should be some reluctance to attempt the same approach with tax reform.  But bringing Democrats on board also represents a major challenge.  Democrats have issued a letter that identifies three basic demands: any tax legislation should be drafted under the regular bipartisan committee process; the legislation should deliver relief to middle-class families without tax cuts for the rich; and any tax changes should raise at least as much revenue as is collected today.  Although this is the Democrats' opening gambit, it is not clear whether the two parties can come together on a reasonable set of changes.

It is a virtual certainty that both parties are applying game theory or something analogous to it.  The Democrats don't want to give the Republicans any victories, so that they can make a stronger case in next year's elections.  At the same time, they cannot be seen as total obstructionists on taxes, which have a high priority with the public.

The Republicans would prefer to pass tax legislation without compromising, but after the health care vote they cannot count on 50 sure votes for comprehensive legislation along the lines of the House's Better Way.  Further, comprehensive tax reform involves a time-consuming process of tradeoffs, which the Republicans cannot afford.  They need to pass something this year to bolster their re-election prospects in 2018.  The Tax Reform Act of 1986 took two years from start to finish.  The Republicans cannot take that much time after failing to pass any major legislation that appeals to the public.

Given all of the political constraints that exist and the number of legislative days remaining, there is no realistic basis for believing that anything as comprehensive as the Better Way will get passed.  Corporate tax reform is essential to help US companies compete in a world where the average OECD corporate tax rate is 22.5%.  Lowering corporate rates will probably be paid for by eliminating some business deductions.  The business community will probably go along—sullen but not mutinous—as long as the result is a level playing field and avoidance of double taxation on foreign earnings.

On the individual side, lower rates and simplification should generate strong support, especially if the top rate remains the same or close to it.  Mortgage deductions, state tax deductions, the inheritance tax, and the capital gains tax could be candidates for revision as long as the changes do not penalize the middle class and are seen as benefitting them more than the top 10%.

One change that Congress should consider, and which would be very popular with the average taxpayer, is for the IRS to prepare draft returns for review and approval by taxpayers.  The agency already collects the information needed to do so.  According to an article in The Atlantic, "The 10-Second Tax Return," "Eight OECD countries, including Finland and Norway, fully prepare returns for the majority of its taxpayers. In Estonia, it famously takes the average person five minutes to file taxes."   That one reform measure would save taxpayers time and money spent completing their annual tax returns and go a long way toward restoring confidence in Congress.

The electorate is rightfully unhappy with the performance of our federal government.  Unless the Trump Administration and Congress start doing the people's work, both could pay a heavy price at the polls.  Modest tax reform now, followed by a major simplification initiative, would help to restore faith in our system of government.  The current tax code has grown from 26,000 pages at the time of the last comprehensive reform to over 74,000 pages today.  It needs major streamlining and simplification.


The Groucho Marx Explanation for Climate Change

The late comedian is known for a number of one-liners.  One that is often repeated is "Who are you going to believe, me or your lying eyes?"  That question captures the approach often taken by climate advocates.

A case in point is the annual ritual of claiming that the latest year’s temperature is one of the highest on record even though satellite records since 1998 show that temperatures have not risen.

Advocates are quick to criticize satellite temperature measurements because they don't conform to their orthodoxy, cover too short a time period, and are only an indirect way of estimating surface temperature.  And they have ready-made explanations for the adjustments that they routinely make to surface measurements, adjustments that always make the surface temperatures warmer.

In the past few months, two analyses have shown how these advocates engage in scientific and statistical manipulation to bolster their argument that human activities are the primary cause of warming that has occurred over the past 50 years and that the warming hiatus is a mirage.

Climate scientists James Wallace, Joe D'Aleo, and Craig Idso have prepared a report—On the Validity of NOAA, NASA, & Hadley Surface Temperature Data—that examined the surface data adjustments in three data sets: NOAA, NASA, and the Hadley Centre in the UK.  In their study, they make the point that "the notion that some 'adjustments' to historical data might need to be made is not challenged."  They go on to make the logical observation that adjustments would be expected sometimes to raise temperatures and other times to lower them.  Instead, what they found was that the adjustments consistently increased temperatures and had the result of "removing the previously existing cyclical temperature pattern."  Clearly, climate advocates at these agencies know what they want the temperature record to look like—keep going up—and adjustments are the way to get the desired result.
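Purely to illustrate the kind of effect the report describes, the following sketch uses synthetic, invented numbers (not any actual station record) to show how a series of small, always-upward adjustments can impose an apparent warming trend on an otherwise trendless cyclical series.

```python
# Synthetic illustration only: invented data showing how consistently upward
# adjustments can impose a trend on an otherwise cyclical, trendless series.
# This is not real station data and makes no claim about any actual adjustment.
import math

years = range(1900, 2021, 10)
raw = [0.3 * math.cos(2 * math.pi * (y - 1900) / 60) for y in years]   # trendless 60-year cycle
adjust = [0.05 * (y - 1900) / 10 for y in years]                        # +0.05 C per decade, always upward
adjusted = [r + a for r, a in zip(raw, adjust)]

def decadal_trend(values):
    """Least-squares slope per decade for evenly spaced decadal values."""
    n = len(values)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2, sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

print(f"raw trend:      {decadal_trend(raw):+.3f} C/decade")
print(f"adjusted trend: {decadal_trend(adjusted):+.3f} C/decade")
```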

John Brignell has written a wonderful book—Sorry, Wrong Number—that covers in great depth the abuse of measurement, which has been going on for a very long time and extends far beyond climate.  Conclusions don't flow from data; data are arranged to confirm conclusions.

The steady increase in temperatures reported, generally with great fanfare, by these three organizations is used as a way of validating climate models, which on their face should be viewed with a healthy dose of skepticism.  As John Christy showed in recent testimony before the House Committee on Science, Space, and Technology, the models that predict ever-increasing temperatures require the addition of "extra GHGs." When the extra GHGs are removed, model results "overlap the observations almost completely."

Examples like this provide a strong justification for the Red Team exercise proposed by Steve Koonin and embraced by the EPA Administrator.


A Gift to Environmentalists

President Trump's Unleashing American Energy speech, especially the portion dealing with the Outer Continental Shelf (OCS), will no doubt unleash a fundraising blitz by environmental organizations, who will use scare tactics about OCS development off the Atlantic coast to fill their coffers.

The five-year plan that the Interior Department has prepared, and which is open for comment, would not go into effect until 2022.  If leases were offered, there is no guarantee that companies would bid on them or that oil and gas would be found.  Much depends on the outlook for natural gas and crude oil prices and advances in technology, as well as the positions on leasing taken by coastal states.  For decades, coastal states and communities on the Atlantic coast have opposed any drilling because of fears of spills that would impose severe economic costs, since many rely on tourism.

The Coastal Zone Management Act (CZMA) provides a mechanism for states to be actively involved in the leasing process.  In reauthorizing the CZMA in 1990, Congress gave states greater authority by eliminating the requirement that there be a "direct" effect on a state's coastal zone before the state can require a consistency determination that a proposed lease sale will not impact land or water use or natural resources in its coastal zone.  If the state disagrees with the consistency determination, Interior must follow certain procedures to achieve resolution before holding the lease sale. In addition to requiring compliance with the CZMA, other environmental laws could limit natural gas and oil activities by imposing additional requirements. These include the National Environmental Policy Act, the Clean Air Act, the Endangered Species Act, the Clean Water Act, and the National Fishing Enhancement Act.

The fears expressed by coastal communities are not grounded on a solid foundation.  Since the Santa Barbara accident in 1969, there has been only one major production accident, the Deepwater Horizon in the Gulf of Mexico.  The fact that 40 years elapsed between major production accidents, with more than 50,000 wells drilled in US waters, is testament to the technology and safety of offshore drilling.  Oil spills cause damage, but it is not permanent, and there are few things in the modern world that are risk free.

Seismic testing to determine the resource base of the Atlantic offshore, which occurs before leasing, does not automatically translate into lease sales and drilling.  The abundance of onshore oil and gas made available through fracking technology, combined with coastal politics and the cost of offshore platforms, affects the decision calculus of companies considering whether to bid on an offshore lease.

So, while environmental organizations will make money from the decision about the next five-year plan, the reality is that the outburst of opposition is at least premature and not well informed.  Coastal communities' own interests will best be served by emotion-free analysis and an honest consideration of trade-offs.

Cut the Gordian Knot

For 30 years, the federal government has been studying, debating, and procrastinating on the permanent storage of nuclear waste.  According to Secretary of Energy Perry, there are over 70,000 tons of nuclear waste stored at 120 sites in 39 states.  That is clear evidence of irresponsibility.  The time has come to cut the Gordian knot, either with a scalpel or a meat ax.

The history of nuclear waste disposal goes back not 30 years but 60, to when the National Academy of Sciences recommended that the best disposal would be in a deep rock formation.  No study since then has changed that recommendation.  DOE began studying Yucca Mountain in 1978.  President Reagan, based on reports covering about 10 years of study, selected three sites for intensive evaluation, and in 1987 Congress directed DOE to consider only Yucca Mountain.  Yucca Mountain is an abandoned nuclear test site that has no value besides the planned storage of nuclear waste.  The quagmire of politics, federal arrogance, and procrastination has been the main obstacle to moving forward.

It is understandable why Nevada objects.   No state wants to be known as the nation's nuclear waste capital.  But the alternative of continuing to store waste at nuclear energy facilities is far worse, especially given the growing threat of terrorist attacks.  There may not be any good and politically acceptable options, but in that case the least bad option is clearly the best.

The outlook for nuclear power is not encouraging, and more facilities are closing because of cost and politics, especially the politics of subsidized renewables.  Electric utilities simply are unwilling to take the economic and political risks associated with building and operating nuclear facilities.  Low natural gas prices, which are likely to persist for a very long time, make natural gas's future much brighter than nuclear's.

As our nation becomes even more of a service economy, the demand for reliable and affordable electricity continues to grow.   Although environmental advocates and a number of states are pushing for greater reliance on alternatives, their future is tied primarily to continued subsidies and mandates.  Without those, it is doubtful that many utilities would make them the cornerstone of their power generating capacity.  Since subsidies are likely to be reduced or eliminated as the government confronts serious budget issues and the embarrassment of growing crony capitalism, the real future of power generation is either nuclear or natural gas.

The outlook for nuclear power will not improve until its cost is substantially reduced and the waste disposal dilemma is resolved.  The fact is that Yucca Mountain is a safe solution, and it needs to be used to store the existing waste from 120 less safe sites unless some other state with deep geological formations is willing to accept waste storage.  There may be no economic incentives that would cause a state to step forward, but nothing is lost in trying.

Since the Yucca Mountain facility has no other use because of the prior nuclear tests, the problem, and it is a major one, is political opposition by Nevada.  It may be that there is nothing the federal government can do now to reduce that opposition, but a good faith effort should be made by engaging the state's leaders, political and public, while exploring the option of another, more willing state.  The route to common ground is not easy, but trying is certainly better than the status quo.  If such an effort fails, the federal government should move forward on the process of using Yucca Mountain for some period of time while exploring a range of alternatives for a permanent facility.  Nevada could receive financial payments that would significantly escalate if the agreed-upon storage period were exceeded without an alternative solution being adopted.

Until there is a politically viable solution to waste storage, no new nuclear plants should be approved, and the federal government should take ownership of all commercial nuclear waste.  That would relieve utilities, and taxpayers, of the continued cost of storage and protection.

Persistent Myths

The Washington Post recently ran an article by Jason Samenow that cavalierly dismissed EPA Administrator Pruitt's call for a Red Team approach to refine what we know about climate science.  While the author accurately summarized the justification offered by Steven Koonin, who was a senior DOE official in the Obama Administration and is a highly regarded physicist, most of the article was an attempt to discredit the value of a Red Team exercise.  Given the certitude of the climate establishment, they should welcome the opportunity to shame climate skeptics.

The writer uses remarks by David Titley, a climate scientist at Penn State ("Science already has a red team: peer review"), to make the case that peer review has already settled the argument over climate science. Is it a coincidence that Titley is a colleague of Hockey Stick slick Michael Mann?  Perhaps, but don't bet on it.

Claiming that the science is settled because of "peer review" is at best disingenuous.  For peer review to be a credible standard, reviewers should be anonymous to authors and reviewed research should be replicated.  That is rarely the case, and especially not for articles on climate change, where authors often suggest or pick reviewers.  In 2010, in response to Climategate, The Guardian published an article, "Climate change emails between scientists reveal flaws in peer review," that exposed how the peer review process was being manipulated.   Flaws in peer review are not limited to climate science. The Journal of the Royal Society of Medicine published a broader critique, "Peer review: a flawed process at the heart of science and journals."  In this article, the point was made that peer review is like "democracy: a system full of problems but the least worst we have."  The evidence is clear; to paraphrase Samuel Johnson, parroting peer review is the last refuge of scoundrels.

The claim that 97% of climate scientists believe that humans are mainly responsible for warming over the past 50-plus years has been shown over and over to be a myth.  But the climate establishment has kept repeating it until the media and public accept it as fact.  As a result, journalists have gotten lazy and don't bother to do the hard work that follows from the view, "I'm not convinced."  If they did, they would easily find that the papers purporting to show this overwhelming consensus have been thoroughly debunked, not just once or twice but dozens of times. In 2014, Roy Spencer and Joe Bast published an insightful article in the Wall Street Journal that concluded, "There is no basis for the claim that 97% of scientists believe that man-made climate change is a dangerous problem."  Friends of Science also published a detailed study titled 97% Consensus? No! Global Warming Math Myths & Social Proofs that provided a detailed critique of the consensus meme.

Rather than denigrate the call for a Red Team exercise, Mr. Samenow and the Post should spend some time looking at the evidence that supports Dr. Koonin's proposal.  As the late Daniel Patrick Moynihan once observed, we are all entitled to our own opinions, just not our own facts.  A Red Team review would go a long way toward making clear what is opinion and what is fact.


Carbon Dioxide Facts Versus Alternative Facts

The reaction of hard-core environmentalists, their allies in Congress, and the mainstream media to the President's statement that the US "will continue to be the cleanest and most environmentally friendly country on Earth" demonstrates the lengths they will go to mislead and distort in pursuing their agenda.  An Associated Press article claims that "the US is among the dirtiest countries when it comes to … carbon pollution."  That statement is a clear example of fake news and alternative facts.

Why is this the case?

Most important, CO2 is not a pollutant.  It is a nutrient that is essential for the growth of plants, crops, and trees.  According to a NASA study, "From a quarter to half of Earth's vegetated lands has shown significant greening over the last 35 years largely due to rising levels of atmospheric carbon dioxide."  And the notion that current CO2 levels in the atmosphere constitute a health risk is countered by the fact that the US Navy allows up to 5,000 ppm in submarines, more than an order of magnitude higher than current atmospheric levels.

The classification of CO2 as a pollutant is the result of a poorly reasoned Supreme Court decision in 2007 that relied on the "Chevron doctrine," which gives deference to regulatory agencies in cases of legislative ambiguity.  However, the Court ignored the fact that in passing the 1990 amendments to the Clean Air Act, Congress explicitly denied EPA the authority to regulate CO2.  There was no ambiguity.

Contrary to the impression created by the climate establishment, CO2 is not the dominant or primary atmospheric greenhouse gas.  The atmosphere is comprised primarily of nitrogen and oxygen—about 99%.  Argon is 0.9%, CO2 is about 0.04%, and water vapor, which varies from 0 to 4%, is the more potent greenhouse gas. According to the International Environmental Data Rescue Organization, "Water vapor accounts for 60-70% of the greenhouse effect while CO2 accounts for 25%."

Climate advocates have created the image of ever-increasing levels of CO2 leading to ever-increasing global temperatures.  That implies a linear relationship between atmospheric CO2 levels and increases in temperature.  In fact, the warming potential of CO2 is non-linear: the warming caused by the next increment of CO2 is less than that caused by the increment that preceded it.  There is reason to believe that the world is on the flat end of the curve.
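The diminishing effect is usually expressed with the simplified logarithmic forcing relationship, delta-F = 5.35 ln(C/C0) W/m^2, a standard approximation in the literature. The sketch below uses that formula with illustrative concentration steps to show each additional 50 ppm of CO2 adding less forcing than the step before it.

```python
# Logarithmic CO2 radiative forcing (simplified expression commonly used in the
# literature): delta_F = 5.35 * ln(C / C0) in W/m^2. The 50 ppm steps below are
# illustrative; the point is that each equal step adds less forcing than the last.
import math

def forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Radiative forcing in W/m^2 relative to a 280 ppm baseline."""
    return 5.35 * math.log(c_ppm / c0_ppm)

levels = [280, 330, 380, 430, 480]
for prev, curr in zip(levels, levels[1:]):
    step = forcing(curr) - forcing(prev)
    print(f"{prev} -> {curr} ppm: +{step:.2f} W/m^2")
```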

Although advocates point out the increase in CO2 levels and temperatures since the Industrial Revolution, they conveniently omit the fact that in the earth's history CO2 levels have been significantly higher.  Geologists estimate that 200 million years ago average CO2 concentrations were about 1,800 ppm, almost five times today's level. The highest concentrations of CO2 are estimated to have reached nearly 7,000 ppm.  And earlier, over 400 million years ago, the earth experienced an ice age with CO2 concentrations at 4,400 ppm. The original "hockey stick" graph in Al Gore's Earth in the Balance was revealing when plotted objectively and not to prove a point: over a 160,000-year period there were times when CO2 changes preceded warming, followed warming, and coincided with warming.

Today's global temperatures are not unprecedented.  Within the past millennium, temperatures were warmer during the Medieval Warm Period (800-1400), and during the Little Ice Age (1400-1850) they were colder.  Tracing modern warming from the end of the Little Ice Age creates an erroneous picture of warming.

The effect of CO2 concentrations on warming is a function of climate sensitivity—the temperature effect of doubling CO2.  Absent positive feedbacks, the warming from a doubling of CO2 is estimated by most scientists to be about 1 degree C.  The IPCC, in its most recent report, estimates that climate sensitivity ranges from 1.5C to 4.5C.  In 1985, UNEP put out a statement on greenhouse gases and climate that estimated a doubling of CO2 would lead to a temperature increase of 1.5C to 4.5C.  Since research is intended to expand knowledge and reduce uncertainty, it is telling that, in spite of spending well over $100 billion, we seem to know no more about the effect of CO2 on warming than we did over 30 years ago.  Curious indeed.

EPA has set standards for air quality, and records over the past 40 years show steady and continuing improvement. Between 1980 and 2010, the major pollutants defined by EPA were reduced by between 27% and 82%.  Ozone and particulates had the smallest reductions, but their reductions continue as new technology is developed.  It is grossly misleading to challenge US global leadership in air quality by focusing on an emission that is not a pollutant and not a major component of the air we breathe.

The fact that the scary predictions made over the past 30 years have not come true, along with the advancement of knowledge, should lead thoughtful people to realize that we do not face a climate catastrophe.  Pretending to know more than we really do, and pretending that we can control a complex system by controlling one variable, is what Friedrich Hayek termed the fatal conceit.