Give Steve Koonin a Sunlight Award

In 1913, Louis Brandeis, later to be Justice Brandeis, wrote, “Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.”  He was referring to the banking business of his day, but his insight remains a strong principle in support of openness and transparency.  On November 2 in the Wall Street Journal, Steve Koonin, a well-respected physicist and former undersecretary for science at the Department of Energy, critiqued the most recent U.S. Global Change Research Program Climate Science Special Report.

His basic criticism is that the congressionally mandated report reinforces alarmism and the narrative that human activities are mainly responsible for climate change since the mid-20th century. The charter for the research program is to conduct a “comprehensive and integrated United States research program which will assist the Nation and the world to understand, assess, predict, and respond to human-induced and natural processes of global change.” Koonin’s opinion piece explains the failure of the just-released report to do that.  It is advocacy masked as science.

The alarmist tone of the report is not really surprising, since the agencies involved in managing the program have been populated by bureaucrats and scientists who operate in a structure and system created by Al Gore when he was Vice President.  During the early 2000s, the Bush Administration attempted to get control of the process but was thwarted by the bureaucracy, especially as the Administration’s priorities changed dramatically after 9/11.  Bureaucrats know that political appointees come and go every couple of years, but they stay for what seems like forever.  What is especially alarming is the complicity of the National Academies of Sciences in endorsing the report.

Yogi Berra is erroneously credited with the observation that in theory, theory and practice are the same, but in practice they are not.  The wisdom of that insight is borne out here: the transparency with which the Special Report was prepared and reviewed should have been an incentive for those involved to do their jobs honestly, but that was not the case for those who promote the climate orthodoxy.  Climate change has for over two decades been dominated by environmental zealots, groupthink, actions that reinforce it, and confirmation bias.  Changing the current situation will not happen quickly, but it needs changing for the sake of sound policy and the integrity of the scientific establishment.

Members of the Academy should demand a review of why and how its review process failed, openly admit that failure, and withdraw support for the Special Report.  Koonin has been calling for a Red Team/Blue Team process for at least a year.  Although his call has gotten support from the EPA Administrator, it clearly is not one of the Administrator’s highest priorities.  Congress should step in and initiate an independent review of the Global Change Research Program, including a mandate for a Red Team/Blue Team review of the Special Report.  If Congress concludes that the program should continue, which is not a foregone conclusion, it should mandate corrective changes to enhance the program’s integrity.  Investing in climate science is a worthwhile activity, but not the way it has been pursued since the early 1990s.

So, Steve Koonin began the application of the disinfectant and his initiative needs a great deal of support from others.  He deserves a Brandeis.


Filling the Swamp Isn’t Draining It!

President Trump may be a teetotaler, but when it comes to corn alcohol he is all in, and in a big way.  Instead of draining the swamp, he is filling it with ethanol, pandering to corn-state senators by backing off from abolishing the renewable fuel standard.

The ethanol mandate originated in the Clean Air Act Amendments of 1990 and was added as a way of buying agriculture’s support.  It was not needed for clean-air purposes then, and it is not needed for greenhouse gas emission reductions now.  Its only benefit is to enrich corn farmers and ethanol manufacturers.  At the time of the Clean Air debate, the oil and auto industries urged Congress to set tailpipe emission standards and let the two industries work together to meet them.  Instead, Congress wrote a formula for gasoline into legislation.  When the rationale for the ethanol mandate switched from clean air to climate change, advocates claimed that it would result in lower greenhouse gas emissions.  The evidence demonstrates the opposite.

In 2005, when gasoline demand was rising, Congress tried to deal with complaints about the ethanol tax credit by mandating annual volumetric requirements.  Of course, shortly thereafter gasoline demand peaked.  EPA attempted to raise the 10%-per-gallon blending limit, which would have voided many new car warranties, and later attempted to reduce the annual volume requirement but was rebuffed by a court.  Congress could simply end the volumetric requirement, but the ethanol lobby has proven too powerful.  No matter which party controls Congress, it needs support from agricultural representatives and the environmental lobby to pass legislation, so nothing happens.

With 40% of the domestic corn crop dedicated to ethanol production, the effect has been to drive up the prices of food products based on corn.  The ripple effect of higher prices has an especially harmful impact on people in countries where corn-based food is a staple.  In addition, the ethanol mandate means higher gasoline prices, especially in the summer when congressionally mandated reformulated gasoline is required.

To ensure compliance with the renewable fuel standard, EPA created a tracking system of renewable identification numbers—RINs—that are assigned to each gallon of biofuel.  Blenders of gasoline are required to have enough RIN credits for each gallon of gasoline they sell.  If they don’t have enough credits, they have to buy them.  And, since in recent years the volumetric mandate has exceeded the demand for gasoline, refiners have had to buy additional credits for fuels that did not exist, driving up the price of credits and gasoline.  Hence, regulatory compliance created a market in credits that would increase in value.
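The compliance arithmetic just described can be sketched in a few lines.  This is a minimal illustration with hypothetical numbers, not the actual EPA formula for renewable volume obligations, which is considerably more complex:

```python
# Simplified sketch of RIN compliance under a hypothetical
# renewable volume obligation (RVO); real EPA rules differ.

def rin_shortfall(gallons_sold: float, rvo_fraction: float,
                  rins_held: float) -> float:
    """RIN credits a blender must still buy to cover its obligation."""
    obligation = gallons_sold * rvo_fraction   # RINs required
    return max(0.0, obligation - rins_held)    # credits to purchase

# A refiner selling 1,000,000 gallons under a 10% obligation
# that holds 80,000 RINs must buy 20,000 more on the market.
print(rin_shortfall(1_000_000, 0.10, 80_000))  # → 20000.0
```

When the mandated volume exceeds actual gasoline demand, the shortfall term grows for every refiner at once, which is what bids up the price of credits.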

Anytime there is a trading market, Wall Street speculators will find a way to become involved and make money in the process.  Credits created for compliance purposes quickly became a commodity that could be bought and sold like any other commodity.

This is a classic example of the Bootleggers and Baptists theory of public choice.  Farmers, ethanol manufacturers, and now traders align themselves with environmentalists who are wedded to biofuels, enriching themselves by selling a product that is not needed.


The End of the Internal Combustion Engine, But Just Not Yet

A wave of prospective bans on gasoline and diesel engines seems to be sweeping through Europe and is on the drawing board in China.  The bans announced by France and Britain are decades off, but not those in Norway or India.  China will probably pick a target far enough off that it has no near-term effects.

Environmentalists are obviously cheering, but you have to ask how serious these prospective bans are.  If they are mainly aspirational, they will undoubtedly slip as they confront reality.  If they are serious deadlines, you have to wonder whether there has been any serious analysis of the economic impact that a mandated shift to electric vehicles would have.

While some of these bans may reflect deeply held beliefs among ruling elites, it is hard not to conclude that they are driven by politics, in the hope that the fervor driving the climate lobby will have dissipated by 2040.  Making commitments for a relatively far-off period makes it easier to avoid tough actions now.

Advocates for electric vehicles claim that costs are falling and that EVs are almost competitive with internal combustion vehicles, but if that is true, why would bans be needed?  If the stated optimism were genuine, advocates should believe that market forces would bring about the transition from gasoline and diesel more efficiently and faster.

In 2016, global auto sales were close to 90 million, of which EVs and PHEVs accounted for about 1.5 million.  Until recently, most forecasts for EVs showed steady but very optimistic growth over the next couple of decades, with estimates ranging from one-third to over fifty percent of vehicles sold in 2040.  But the IMF has raised the outlook to an incredible 90%.  The IMF clearly shows life in an ivory tower and the wonders of assumptions, equations, and extrapolation!

Auto manufacturers have been playing the EV game because of subsidies and for branding purposes.  All plan to roll out an increasing number of EVs and PHEVs in the coming years, with GM stating that it will have 20 all-electric vehicles by 2023 and Volvo planning to manufacture only EVs and PHEVs beginning in 2019.  As long as subsidies are in place, manufacturers will continue to produce these vehicles and to monitor and participate in advancing the technology in case there is a breakthrough.  But there is no evidence that any manufacturer except Volvo is shifting its emphasis away from the internal combustion engine and advancing its technology.

After all of the hype and optimism, reality and economics will rule the day.  Electric vehicles cost more to manufacture than internal combustion engine cars, as documented in a comparative analysis by ADL.  While lithium battery costs have been dropping, they have not dropped enough to offset the higher manufacturing costs of EVs and PHEVs.  Without a major breakthrough in battery technology, EV/PHEV potential will continue to be limited.  A 2015 article in MIT’s Technology Review made this point very clear: “… while countless breakthroughs have been announced over the last decade, time and again these advances have failed to translate into commercial batteries with anything like the promised improvements in cost and energy storage … One difficult thing about developing better batteries is that the technology is still poorly understood. Changing one part of a battery … can produce unforeseen problems, some of which can’t be detected without years of testing.”

And until that breakthrough takes place, range will be a serious limitation.  There is also the issue of charging: how much additional power plant capacity will be needed, what will it cost, and how much will it add back to the GHG emissions that EVs avoid?

Hype and hope are not substitutes for the needed large-scale capital investments and actual advances in technology, which cannot be mandated to meet a political time frame.  In the meantime, the world does not face any near-term shortage of oil, and actual advances in internal combustion engine technology are providing increased efficiency and lower emissions.


Three Cheers for Ending Sue and Settle and Goodbye Br’er Rabbit

EPA Administrator Pruitt’s decision to end “sue and settle” is a step toward restoring the rule of law at EPA.  As expected, environmental groups howled in protest, and with good reason.

For too many years, Democrat administrators have taken settlement agreements, which can be a good form of resolution, a step too far by agreeing to terms that imposed regulatory burdens outside of the rulemaking process.

Several years ago, the Washington Examiner ran an article describing how sue and settle had become a “cottage industry.”  It described the process this way: “First, the private environmental group sues the EPA in federal court seeking to force it to issue new regulations by a date certain. Then agency and group officials meet behind closed doors to hammer out a deal. Typically in the deal, the government agrees to do whatever the activists want. The last step occurs when the judge issues a consent decree that makes the deal the law of the land. No messy congressional hearings. No public comment period. No opportunity for anybody outside the privileged few to know how government regulatory policy is being shaped until it’s too late.”  And, to add insult to injury, the government has to pay the plaintiffs’ legal fees, which in turn are used to sue the government to achieve more settlements.  According to a GAO study, between 1998 and 2010 the government shelled out $16 million in taxpayer dollars.

Sue and settle provides a way for environmental groups to short-circuit the Administrative Procedure Act, which lays out a required process that allows all interested and affected parties to participate in rulemaking.  Regulations that flow from this process are supposed to be based on existing law and an objective review of all comments on a proposed regulation.  As the Examiner piece and others have documented, the process doesn’t work this way with sue and settle.

To make matters worse, many regulatory analysts have pointed to instances of apparent collusion between EPA and environmental organizations, with the two sides agreeing on the terms of a settlement before litigation was even filed.  This became a classic example of doing what Br’er Rabbit did in the Uncle Remus stories: getting authority figures to act against what should be their own best interest, which is the public interest.
Now Br’er Rabbit can be retired, if the Administrator’s action is followed up by passage of the Sunshine for Regulatory Decrees and Settlements Act of 2017.


Rebuilding Puerto Rico—The Marshall Plan Model

The devastation to Puerto Rico from Hurricane Maria has left the bankrupt island with no functioning infrastructure or means of economic recovery.  Although steps are being taken to get food and water to Puerto Rico’s citizens and to restore its electrical grid, it will take months to restore basic services and capabilities, and much longer to rebuild its economy and infrastructure.

Puerto Rico’s economic condition severely limits its ability to raise funds for rebuilding, as was true of western European countries after World War II.  The Marshall Plan provided the needed resources under clear but strict requirements, including repayment.  The same approach could be used with Puerto Rico.

Restoring electricity is the highest priority after saving lives.  Without electricity, Puerto Rico’s economy cannot recover and its citizens’ quality of life will be among the worst in the developed world.  There should be a two-step process for restoring the electrical grid, even if that is more expensive.  The first step is to restore basic electric service throughout the commonwealth as quickly as possible; presently, only about 6% of electrical service has been restored.  The second step, far more ambitious, expensive, and time-consuming, involves building a grid for the future that is hardened and resilient enough to recover quickly from hurricanes or any other event that can disrupt power.

The destroyed grid comprised 15 power plants of various sizes.  In 2016, 47% of Puerto Rico’s electricity came from petroleum, 34% from natural gas, 17% from coal, and 2% from renewable energy; two wind farms provided half of the renewable energy and four solar facilities the other half.  According to the L.A. Times, the Puerto Rico Electric Power Authority (PREPA) “appears to be running on fumes, and … desperately requires an infusion of capital — monetary, human and intellectual — to restore a functional utility.”  However, the utility has a history of poor maintenance, poor staffing, and allegations of corruption, on top of a mountain of debt.  Given that history, PREPA is not the vehicle for rebuilding Puerto Rico’s electrical system, and simply restoring the existing system would be foolhardy, as the residents of the U.S. Virgin Islands can attest: four times over the past 30 years, hurricanes have destroyed their electrical systems.

A grid for the future should minimize above-ground poles and transmission lines to the extent feasible and, where undergrounding is not feasible, harden the poles and lines to the extent possible.  Generating facilities that remain economic will also need to be reinforced and perhaps converted from oil and coal to natural gas.  Because of the population distribution outside of major cities, dispersed areas could plan on microgrids once they become economical and reliable.  These are localized electric grids that allow communities to keep power when centralized power plants stop functioning; they incorporate small-scale power plants as well as energy storage such as batteries.  When central power is lost, microgrids would provide resiliency by becoming the primary source of power.  Microgrid development is being supported by DOE, and by DOD for use by the military in remote locations.

While solar and wind power can play a role in a rebuilt electric power system, it would be a serious mistake to count on them (as some environmental advocates do) for more than minor and backup roles.  Category 4 and 5 hurricanes are too strong for wind turbines and can uproot solar panels, which can also be damaged by excessive rain and debris.

Rebuilding a robust electrical power system can provide Puerto Rico a solid foundation for rebuilding its economy and becoming a model for the Caribbean region.


Politics and the Bankruptcy of Truth

As soon as the White House released its framework for tax reform, Democrats who like to get media attention condemned it as a sellout that will mainly benefit the rich.  This gives validity to what used to be a political joke but no longer is: How do you know when a politician is lying?  When his or her lips start moving.

John Adams two centuries ago said, “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”  He was absolutely right as was the late Daniel Patrick Moynihan who said that we are all entitled to our own opinions; just not our own facts.

The case for tax reform has been evident since the end of the Great Recession, as the economy has grown less than 2% annually.  With an aging population and a slowdown in productivity, achieving 4% growth would be quite a challenge, but there is little doubt that we can do better.  We need more domestic investment, we need our companies to be more competitive, and we need more good-paying jobs.

That is where tax reform comes into play.  Contrary to Democrat assertions that it would only benefit the wealthy, there is a strong analytical case for reducing the corporate tax rate and simplifying the tax code.

The Heritage Foundation just released a paper reviewing the economic literature on the impact of taxes on workers’ pay.  It starts with the seminal work of Arnold Harberger in 1962 and includes analyses from the 2000s.  The case is compelling: cutting corporate taxes will lead to higher wages.  Some analyses suggest that upwards of 75% of the benefits flow to workers.

These same Democrat politicians also have a strange interpretation of fairness.  Demanding that the tax code be made fairer conveniently ignores the fact that, according to the Tax Foundation, the top 10% of taxpayers pay 70% of the individual income tax while the bottom 50% pay 2%.  This does not mean that only Democrats lie.  Distorting the truth and waterboarding facts until they confess to anything and everything is bipartisan.

The erosion of truth by blind adherence to political ideology and expediency helps to explain why respect for politicians and Congress is so low: 20% in the latest Gallup poll.  As a nation we face a large number of serious problems, the economy being a major one.  Those problems will multiply and get worse until those chosen to govern put the national interest ahead of self-interest.

Arctic National Wildlife Refuge: The Original Fake News

The New York Times reported earlier this month that the Department of the Interior is proposing to lift restrictions on seismic studies in the Arctic National Wildlife Refuge (ANWR).  Predictably, this has caused the environmental community to become apoplectic.  One enviro called the decision, which only involves finding out how much oil and gas might exist in an area called 1002, “reckless and irresponsible.”  Another said, “the Arctic is the holy grail.”

Since the early 1980s, potential drilling in ANWR has been the equivalent of a casus belli for the environmental community.  The claims made about damage and harm to wildlife have been extreme and not credible, given the experience at Prudhoe Bay, where wildlife has flourished.  Pictures of flowers, rabbits, and other wildlife in fields of grass are designed to have an emotional impact by implying that this is what is at risk from drilling.  Those pictures are what Daniel Boorstin termed pseudo-events and what is now called fake news.  They are not of the area where drilling would take place; they are of an area near the Brooks Range, about 100 miles away.


ANWR is roughly 32,000 square miles in size, while the portion of the 1002 area where drilling would actually take place is roughly 19 square miles.  To provide a clearer comparison, ANWR is about the size of South Carolina, and the drilling area about the size of Dulles Airport.  Can any reasonable person really believe that drilling in such a small area would be environmentally devastating?  The assertion is mind-boggling.
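Taking the figures above at face value (they are the figures cited here, not independently verified), the share of ANWR at issue works out to well under a tenth of one percent:

```python
# Share of ANWR represented by the ~19 sq mi drilling area,
# using the figures cited above.
anwr_sq_mi = 32_000
drilling_sq_mi = 19

share = drilling_sq_mi / anwr_sq_mi
print(f"{share:.4%}")  # → 0.0594%
```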

Drilling, if it eventually takes place, would be on the coastal plain, which, as anyone who has been there knows, is a frozen desert.  Although Interior proposes seismic research, many may not know or recall that one well has already been drilled there.  Chevron drilled an exploratory well at the edge of the coastal plain, and very few know the results of that drilling.  It is one of our best-kept secrets; even WikiLeaks doesn’t know.  That may suggest the results were not encouraging, but the results from one well are not conclusive.  Before the oil at Prudhoe Bay was discovered, a number of dry holes had been drilled, and there was talk of abandoning the enterprise as a result.

Even if the seismic tests are promising, drilling would not be immediate, assuming Congressional approval.  Permitting and environmental impact analysis could take 7-12 years, according to an EIA assessment in the early 2000s.  Companies interested in bidding for drilling rights would base their decisions on their best estimates of crude oil prices over a 20-30 year period.  That is a tough calculus given the abundance of shale oil, potential advances in alternative energy technologies, and political risk.

Opposition to obtaining information represents a view that ignorance is better than knowledge.  It is always better to err on the side of too much knowledge than too little; future energy policy will be better informed if we are smarter.


Voodoo Scholarship

Harvard professor Naomi Oreskes has just published an article in Environmental Research Letters (ERL) that purports to show that ExxonMobil misled the public on climate change.  Professor Oreskes has earned a reputation as an academic character assassin and is supported by the Rockefeller Family Fund, which has been funding anti-ExxonMobil initiatives.

She first gained notoriety with her book Merchants of Doubt, which attempted to discredit the reputations of three well-respected but deceased scientists who could not defend themselves: Fred Seitz, Bob Jastrow, and Bill Nierenberg.  Their crime was to challenge the climate orthodoxy with scientific reasoning that has not been refuted scientifically.  The George C. Marshall Institute, which these three scientists founded, published a critique of her book, Clouding the Truth: A Critique of Merchants of Doubt.  The conclusion of that critique says all that needs to be said about the Oreskes research model: “Merchants of Doubt is long on innuendo and short on evidence or compelling logic.  It fits well with Mark Twain’s classic observation about gathering facts and then distorting them as the gatherer desires. If it were not for the attack on the Institute’s founders, who cannot now defend themselves, the book could be dismissed for the partisan history it is.”

The methodology used by Oreskes in the Harvard article is essentially the same one she and John Cook used in attempting to demonstrate a scientific consensus on anthropogenic global warming.  The technique is known as content analysis.  The Pew Research Center notes that content analysis has been defined as “the systematic, objective, quantitative analysis of message characteristics.”  A short piece on the technique from UC Davis points out that content analysis “describes what is there, but may not reveal the underlying motives for the observed pattern (‘what’ but not ‘why’).”  Oreskes started off knowing what she wanted to conclude and then structured her review to support her preconceived opinions: a self-fulfilling prophecy lacking the required objectivity!

The weakness of her work, and of content analysis as a technique, was laid bare in critiques of the surveys on the so-called scientific consensus.  A Canadian organization, Friends of Science, reviewed several studies, including Oreskes’s and Cook’s, that claim a 97% consensus among scientists that human activities are the primary cause of climate change.  Its conclusion bluntly stated, “The deconstruction of the surveys that follow shows the claim of a 97% consensus is pure spin and ‘statisticulation’ – mathematical manipulation,” and “there is no 97% consensus on human-caused global warming as claimed in these studies. None of these studies indicate any agreement with a catastrophic view of human-caused global warming.”  That conclusion was shared in an article in Popular Technology titled “97 Articles Refuting the ‘97% Consensus.’”

In her article, Oreskes concludes that “ExxonMobil contributed to advancing climate science—but promoted doubt about it in advertorials.  Given this discrepancy, we conclude that ExxonMobil misled the public.”  Research on climate science is intended to advance knowledge, while advertorials are intended to communicate on public policy issues.  The criterion for judging them should be whether they are based on a solid foundation.

Oreskes and her climate-orthodoxy colleagues assert with certitude that human use of fossil fuels is the primary cause of global warming over the past 50 or so years.  But that is not consistent with the state of the science.  In its most recent report, the IPCC lists a number of climate factors for which knowledge is limited, and its range of climate sensitivity, which has steadily been reduced, still varies by a factor of 3.  That is hardly a basis for certitude.  A National Academy of Sciences report stated, “Because there is considerable uncertainty in current understanding of how the climate system varies naturally and reacts to emissions of greenhouse gases and aerosols, current estimates of the magnitude of future warming should be regarded as tentative and subject to future adjustments (either upward or downward).  Reducing the wide range of uncertainty inherent in current model predictions of global climate change will require major advances in understanding and modeling of both (1) the factors that determine atmospheric concentrations of greenhouse gases and aerosols, and (2) the so-called ‘feedbacks’ that determine the sensitivity of the climate system to a prescribed increase in greenhouse gases.”  Those advances have not taken place.

In addressing policy proposals based on greater certainty than actually exists, it is quite reasonable to use advertorials to focus on uncertainty and on the economic consequences of implementing those policies.  Oreskes points out that ExxonMobil, and Mobil before the merger, ran advertorials in the New York Times every week from 1972 to 2001.  But it was Mobil alone that ran the weekly ads before the 1998 merger; that didn’t deter her from assigning guilt to ExxonMobil.  Surely, if the advertorials were outside the bounds of legitimate discourse, the Times, given its position on climate change, would have rejected them or used its editorials to dispute them.  It is also telling that Oreskes’s article covers advertorials starting in 1972, well before climate change/global warming became a major public policy issue.  In the 1970s, the oil industry was concerned about price and allocation controls, divestiture, and energy independence.  This is another point demonstrating that Oreskes was simply looking for any way to support her biased, preconceived point of view.

There are only two possible uses for this article: as a case study in how not to do content analysis, and as liner for bird cages.  Harvard should be embarrassed!


Polls, Illusions, and Bad Analysis

Gallup and Bloomberg polling tell us that tax reform is not an important priority for the American people.  CNBC says, “Many Americans aren’t sure they even need tax reform.”  The Gallup poll, taken each April, shows that only about 50% of the people polled believe that taxes are too high, while only 4% of the people Bloomberg polled believe it is a high priority.

Since individual tax reform is supposed to lower tax rates and make the tax system simpler, why would people show a lack of interest?  The poll results seem to defy common sense.  The answer lies in the fact that the top 40% of taxpayers pay 97% of federal taxes while 45% pay no federal income taxes.  A 2016 Tax Foundation report on compliance concludes that “the latest official estimates of the eye-popping amount of time and money that Americans lose each year in complying with IRS paperwork—8.9 billion hours and $409 billion in lost productivity—indicate that the most important benefit of tax simplification may be the gift of time.”

Since 45% pay no federal income taxes, the way the poll was conducted and the structure of the questions may have foreordained the results.  Random sampling is a fundamental principle of statistics, but in the case of the Gallup poll a random sample of sufficient size would necessarily skew the results, since almost half of those polled would be in the pay-no-taxes category.  Gallup never connected its results with the skewed distribution of individual tax payments.  In addition, in explaining polling results that fail the smell test, pollsters blame “the growth of cellphones and the decline in people willing to answer surveys.”  In 2014, the Huffington Post ran an article by Frank Wall, Severe Flaws in Polls, in which he said, “The basic problem with polls is that public sentiment is quite like our body temperature; it goes up and down, somewhat randomly, and taking action too quickly in response to changes can lead to serious unintended consequences. … It is also increasingly clear that many polls are tainted by respondents lying to pollsters because they believe they cannot trust where their answers might end up …  That can lead to unintended consequences of self-fulfilling promises because misinformed poll results obviously do not tell the real story … .”  Amen to that!

Michael Traugott, a University of Michigan political science professor who specializes in polling and opinion surveys, got to the heart of bad polling very succinctly: “Put simply, if you ask the right sampling of people the wrong thing … you’ll get a bad result.”  Traugott could have added that you also get bad results if you don’t dig deep enough into the polling to get at the real meaning, or if you cherry-pick results to support an agenda.  Bloomberg and CNBC, which wrote about the Gallup results, could be guilty of both; neither was balanced or insightful in its article.

In The Image, Daniel Boorstin says of public opinion that it “became itself a kind of pseudo-event, forced into existence for the primary purpose of being reported.  Expressions of public opinion became among the most powerful, the most interesting and the most mysterious of pseudo-events.”  Others, like Darrell Huff, who wrote How to Lie With Statistics, and John Brignell, who wrote Sorry, Wrong Number!, provide us a way to identify attempts to bamboozle as well as poor methodologies and analysis.

Gallup should be especially sensitive to its polling methodologies and analysis, since it gave us President Thomas Dewey and more recently led Mitt Romney to believe that he was going to win the presidency.

Stories based on polling results probably should carry a warning, “reader beware.”


Gasoline Prices: Onward and Upward?

Gasoline prices have been steadily increasing in the past year, going from a national average of $1.87 early in 2016 to $2.51 in July.  The end of the driving season should provide some relief but it won’t be enough to offset upward pressures.

There are four factors that drive gasoline prices: crude oil prices, demand, refinery enhancements, and, more recently, exports, which have more than doubled since 2010.  Crude oil prices are roughly $6 a barrel higher this year than last, although OPEC was expecting a larger increase from its production cut.  A $6-per-barrel increase translates into an increase of about 14 cents per gallon.  According to EIA, domestic demand and product exports have been running above their five-year averages.  As long as the economy continues to improve, gasoline demand will remain robust.  Although refinery production of gasoline has been strong, it has not been sufficient to meet demand, so there have been draws on product inventories.  Reductions in inventories also put upward pressure on gasoline prices because of expectations about the future.
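The per-barrel-to-per-gallon conversion behind the 14-cent figure is simple arithmetic, since one barrel of crude is 42 U.S. gallons:

```python
# Rule-of-thumb pass-through of crude price changes to gasoline:
# a per-barrel price change divides by 42 gallons per barrel.
GALLONS_PER_BARREL = 42

def crude_to_gasoline_cents(dollars_per_barrel: float) -> float:
    """Approximate cents-per-gallon impact of a crude price change."""
    return dollars_per_barrel / GALLONS_PER_BARREL * 100

print(round(crude_to_gasoline_cents(6.0), 1))  # → 14.3
```

This is a pass-through approximation only; taxes, refining margins, and distribution costs mean the pump price does not move one-for-one with crude.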

In addition to the increase in the average price of gasoline, the price difference between regular and premium has also been growing.  According to the Energy Information Administration, as of July 2017 the premium-vs.-regular differential reached $0.53 per gallon, more than double the differential in 2012.  Although historically 90% of gasoline sales have been for regular, midgrade and premium demand has grown in importance as sales of higher-performance and turbocharged vehicles have increased.  EIA also cites higher fuel economy standards as a driver of increased demand for midgrade and premium gasoline.  Demand alone, however, cannot explain why premium prices have increased more rapidly than regular.  The rest of the answer is the cost of production.

The octane level of gasoline before ethanol blending or chemical reforming is less than the 87 octane of regular gasoline.  For the past decade, ethanol, which has an octane rating of about 115, has been blended with straight-run gasoline to achieve the octane levels needed for gasoline sold at the pump.  Now, with growing gasoline demand, ethanol blending has reached the legal 10% limit, and other sources of octane enhancement are needed.  Refiners have made refinery modifications to increase alkylate production; alkylate is the octane booster of choice because it is low in sulfur but high in octane.  Producing alkylate is expensive, resulting in a price per gallon roughly 30 cents more than gasoline at the refinery gate.
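Ethanol’s role as an octane booster can be seen with a standard linear-blending approximation (real octane blending is not perfectly linear, so this is only a back-of-envelope sketch):

```python
# Linear-blend approximation for octane: with 10% ethanol at 115 octane,
# solve 0.9 * base + 0.1 * 115 = 87 for the straight-run base octane.
ETHANOL_OCTANE = 115
ETHANOL_SHARE = 0.10
TARGET_OCTANE = 87  # regular gasoline at the pump

base_octane = (TARGET_OCTANE - ETHANOL_SHARE * ETHANOL_OCTANE) / (1 - ETHANOL_SHARE)
print(round(base_octane, 1))  # → 83.9
```

In other words, the 10% ethanol blend lets refiners start from a base stock of roughly 84 octane; once that blending share is capped, the remaining octane gap has to be closed with more expensive components such as alkylate.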

So even though gasoline prices have been rising, today’s price has to be considered a bargain compared to the $4.00 per gallon that motorists were paying in 2011.  No one knows what the price of gasoline will be in the months ahead or next year, but the likelihood is that it will continue a steady pace upward if the economy remains strong.  What is important, looking to the future, is to keep in place policies that do not restrict domestic production or the efficient operation of refineries.  That will allow market forces instead of political forces to determine the price at the pump.