A Time of Greater Peril?

The majority of Democrats running for president have embraced the Green New Deal or something like it to address the claimed impending climate catastrophe. Vice President Biden got roundly criticized when it was suggested that he would propose a moderate climate agenda: rejoin the Paris Accord, promulgate emission reduction regulations, and support nuclear power and natural gas. He quickly backtracked and now is promising an environmental revolution.

The fact that these people are babbling know-nothings doesn’t mean much when the climate narrative is being pushed more aggressively than ever. Senator Schumer wants to keep the focus on climate because it is the one issue where Democrats are seen to have a big advantage for the 2020 elections.

A poll by Engagious and Focus Pointe Global found general agreement that weather is getting weirder and that President Trump should do more on clean energy innovation and carbon emission reductions. According to Axios, a poll by the Yale Program on Climate Change Communication and George Mason University found that the “alarmed” segment of the American public is at an all-time high of 29%, double its size in a 2013 survey. These results are not surprising given media coverage, regular “science” reports, and the media’s conflation of extreme weather with climate change.

Support in polls is not surprising, since polls can be structured to produce almost any result and climate change is being talked about more than at almost any time in the last two decades. The strength of public opinion will be tested when the price tag of Green New Deal-like actions is better known, along with the climate facts.

As proponents and many economists discovered several decades ago when they were championing Contingent Valuation to put prices on environmental values, there is a big difference between what people say they are willing to do or accept and how much they are actually willing to pay. Look at the public reaction to proposals to raise the gasoline tax. As the price of actions favored by Democratic candidates becomes more apparent, and as people better understand the disconnect between model results used as policy drivers and the real climate system, their level of support and enthusiasm is likely to shrink.

The reason that this is a time of greater peril is that Republicans do not have a credible alternative to the progressives’ narrative. What could such an alternative consist of?
• To the extent that CO2 emissions are a problem, the problem is a global one that won’t be solved as long as China, India, and other countries continue to rely on coal for electric power generation. Increased exports of energy technology could cost-effectively reduce emissions
• Nuclear power can be an important source of future energy if the problems of cost and public fear can be addressed.
• Any barriers to natural gas replacing coal need to be removed.
• The Dutch have demonstrated that there are practical solutions to the problem of sea level rise.
• R&D on energy and mitigation technologies should be increased.
• Research should focus on areas where it is clear that there are real problems in understanding climate change: reducing the range of estimated climate sensitivity, improving our understanding of natural variability and solar impacts, determining whether it is possible to make climate models more accurate, and correcting the well-documented problems with temperature and weather data.

None of these actions, or others that could be added, are attention grabbers or easily achieved, but they demonstrate that the choice is not between a radical, economically destructive agenda and doing nothing. Polarization is the enemy and the real reason why common-sense, scientifically defensible policies are not being pursued.

The End of the Oil Age Keeps Getting Postponed

Oil production in the Permian Basin dates back to the 1920s although production data from the Texas Railroad Commission only goes back to 1940 when production reached 84 million barrels annually.  The Permian Basin is one of the oldest as well as one of the largest and thickest deposits of sedimentary rocks in the country, covering the western part of Texas and eastern portion of New Mexico.

Annual production grew to about 550 million barrels in 1957 and then began a decline; it took about a decade to get back to the 1957 level. Production then continued to increase and didn’t peak until about 1975, when it hit 750 million barrels. The 70s were a time of price and wage controls and a belief that domestic oil production was on the verge of exhaustion. By 1998, oil production had fallen to less than 400 million barrels annually.

Today, as a result of horizontal drilling and “fracking,” production has reached almost 4 million barrels a day, meaning the basin now produces more in a little over six months than it did in all of its 1975 peak year. Not only has technology unlocked a tremendous amount of output, it has also helped to lower the breakeven cost per barrel from $60 to $33. This will permit high levels of production under any realistic crude oil price scenario.
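The six-month comparison is easy to verify with the figures already cited (750 million barrels in the 1975 peak year, roughly 4 million barrels per day now); a quick sketch:

```python
# Figures are from the text above; this is only a back-of-envelope check.
PEAK_1975_ANNUAL = 750e6   # barrels produced in all of 1975
CURRENT_DAILY = 4e6        # barrels per day today

days_to_match_peak = PEAK_1975_ANNUAL / CURRENT_DAILY
print(f"Days of current output to match 1975: {days_to_match_peak:.0f}")
print(f"In months: {days_to_match_peak / 30.4:.1f}")  # a little over six months
```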

EIA’s Annual Energy Outlook projects continued increases in domestic production, rising above 15 million barrels per day by 2022 and staying above 14 million all the way to 2040. The main driver: the Permian Basin. At the beginning of the 21st century, most oil and gas companies believed that the Permian was past its prime and were shifting investments to more promising areas. That has all changed thanks to advances in technology.

Renewable fuel and climate change advocates like to assert that the oil and gas industry is going the way of the horse and buggy. A survey by the UK Sustainable Investment and Finance Association claims that “there is growing consensus of the threat of climate change-related financial risks to investments in Integrated Oil Companies unless they change their business model within 10 years.” In spite of predictions that the end is near, energy companies are demonstrating that science and engineering will continue to make oil and gas the fuels of choice for decades to come. The threat is not wind, solar, and electric vehicles; it is wrongheaded government policy predicated on a climate catastrophe that is always in the distant future.

Muddle Thinking

Democrats running for President are selling a narrative of growing income inequality that is being exacerbated by the Trump tax cuts, deregulation, and favoritism toward big business. There is what I call a political truism: the stronger the rhetoric, the weaker the factual foundation. The media rarely challenge these assertions or drill down to discover the underlying facts. If they did, they would discover that the allegation rests on a lot of sloppy work that embellishes reality. Income inequality is not a myth, but its extent is not well defined, nor is how much is too much. If we can’t accurately answer those two questions, we’ll never find an answer.

Income inequality is inevitable, and some forms are accepted without question. Gardeners make a fraction of what professional athletes make. So, what is the dividing line between acceptable inequality and unacceptable? Democratic candidates will never articulate an acceptable level but forcefully assert that today’s situation is unacceptable.

What do we know about income inequality and the underlying data?

In an article, “Never Mind the 1 Percent, Let’s Talk About the 0.01 Percent,” Howard Gold makes the following points: “Some argue that income needs to be distributed more equitably, while others say governments should focus less on taking actions that could inhibit top earners and more on addressing the reasons others aren’t as successful. … As the debate continues, members of the 0.01 percent continue on their course.” According to the Tax Foundation, about 1.4 million tax returns were filed by the top 1%, meaning that the top 0.01% represents about 14,000 returns. To qualify for the top 0.01%, someone needs to earn about $2 million annually.

Economists who have studied the data on the top 0.01%, whose income has risen faster than the rest of the 1%, have concluded that they “are most likely benefiting from what economists call ‘skill-biased technological change’—the increasing return on certain skills in an economy driven by technology and globalization.” Under established theory, a shortage of in-demand skills will raise the price of those skills. Professional athletes and celebrity entertainers would have little trouble exceeding the $2 million entry point. I doubt that Elizabeth Warren and her fellow confiscator candidates would openly advocate tax penalties on Tom Brady, Tiger Woods, Tom Hanks, or Angelina Jolie. It’s the other unnamed rascals that they are after, whoever they are. But that does nothing to improve the lives and well-being of those in the middle and low income categories.

Two major factors contributing to income inequality are technology and globalization. Technology contributes to economic growth by increasing labor efficiency, but it also makes some employees less employable as technology substitutes for labor. Training and education help mitigate the adverse effects of technology. And globalization obviously leads companies to outsource some production because they can get goods at a lower cost.

Although income inequality is real, its extent is being exaggerated by data problems and by analyses relying on that data. The Congressional Budget Office report, The Distribution of Household Income, 2014, provides a detailed examination of income inequality by assessing income after taxes and government benefits. Government redistribution of hundreds of billions of dollars from taxes on the rich to low-income households is minimized in discussions of inequality. It concluded, “Under the adjustments, the top 10 percent income share seldom strays more than 5 percentage points away from a century-long average of about 35 percent.”

In addition, Phillip Magness, in his paper The Myth of Spiraling Inequality, pointed to the impact of flawed and outdated statistics: “… the widely reported explosion of inequality in the past three decades is likely a myth built upon outdated and flawed statistics. … These problems are sufficiently severe that they call into question the accuracy of the century-long inequality pattern at the heart of the U.S. policy discussion.”

Although income inequality is not a myth, it is likely far less severe than asserted by candidates for president. Debate would be better served by focusing first on the accuracy of measurements, the quality of data used in assessing inequality, and realistic policies that can help to lessen it.

Too Big to Fail?

The commercial aircraft manufacturing industry is dominated by Boeing and Airbus. Until recently, not much thought may have been given to the potential unintended consequences of this dominance. That perhaps is changing. Airbus bet on a plane, the A-380, that no airline wants to fly, and Boeing bet on one, the 737 Max, that no passenger wants to fly on.

Perhaps the current situation is just a temporary bump in the road. But in a rapidly changing world, it could also be that Boeing and Airbus suffer from holding too much of the market captive, from too cozy relationships with government, and from the unintended consequences of being too big.

The following chart shows the significant concentration in the current aircraft manufacturing industry.

This data can be slightly misleading since it includes all sizes of commercial aircraft. When it comes to large commercial planes (seating capacity of 140 or more), Boeing and Airbus control over 90% of the market. Both firms have worked hard to achieve their current dominance, which is aided by their activities in the political marketplace. The latter pays off handsomely: Boeing, for example, receives about 40% of Export-Import Bank funding.

In addition, economics makes it difficult for competitors to gain inroads. Beyond the high cost of developing a new plane, airlines have incurred large sunk costs for maintenance facilities, training, and parts. Changing from one manufacturer to another is very expensive, which makes airlines reluctant to do so.

The cost of bringing an aircraft to market and the time required to develop one represent a significant market barrier. Given the challenge of predicting the market and consumer tastes, investors have little incentive to commit billions to develop a new aircraft. There are a few exceptions, one of which is projected shifts in passenger demand. The International Air Transport Association estimates that over the next few decades, half of new passengers will live in Asia. This represents a potential opening for the Chinese and perhaps the Russians to gain a substantial foothold in emerging markets. In addition, regional jet manufacturers Bombardier and Embraer are moving into the large commercial aircraft market. While both are likely to occupy only a small share of the global market in the short run, if their new aircraft prove to be as cost efficient as promised and Boeing and Airbus continue to stumble, they can become strong competitors in the large commercial jet segment.

Boeing’s problems are not limited to the 737 Max; it is also experiencing problems in the manufacture of the Dreamliner. It could be that the Airbus-Boeing duopoly, along with substantial barriers to entry, has led to complacency, hubris, and carelessness in manufacturing quality. If that is the case, changes in the marketplace and a loss of confidence by the flying public may significantly weaken Boeing’s market share.

What’s Surprising About the Rise in Gasoline Prices?

Last year around this time the average retail price of gasoline was about $3 per gallon.  It then began a steady decline and reached about $2.40 in December and slightly lower in January.  Since then, it has been steadily rising, reaching $2.74 this month. Some have speculated that $3 per gallon will be reached in the near future, perhaps around Memorial Day.  We should be grateful that the price is not higher.

Since December the average price of crude oil has risen almost $20 per barrel, from $45 to $64. That alone would translate into a 45-cent-per-gallon increase, which covers the price increases since the start of the year. Crude oil prices are the major driver of prices at the pump, but not the only one. In addition to the effects of supply and demand imbalances, the price of ethanol comes into play, as does the regulatory requirement to produce summer gasoline to counter smog formation, and spring refinery maintenance.
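The crude-to-pump pass-through can be checked directly: a barrel holds 42 gallons, so dividing the crude increase by 42 gives the per-gallon effect. All figures are from the paragraph above; refining margins and taxes are ignored, so this is an approximation only:

```python
# Back-of-envelope check of the crude-to-pump arithmetic.
GALLONS_PER_BARREL = 42
crude_increase = 64 - 45  # dollars per barrel since December

per_gallon = crude_increase / GALLONS_PER_BARREL
print(f"Implied pump-price increase: {per_gallon * 100:.0f} cents/gallon")  # ~45
```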

For anyone who doesn’t remember, the formula for summer, or reformulated, gasoline (RFG) was written into the Clean Air Act Amendments of 1990. The legislative specifications covering oxygenates, benzene levels, olefins, sulfur, and evaporation were included to mask the ethanol mandate. Since producing RFG is more expensive, it normally results in an increase of about 10 cents a gallon during the summer months. All other things being equal, the price of gasoline should be higher.

Although Russia and Saudi Arabia appear to be adhering to the OPEC-agreed production cutback, adherence by other members is always problematic. Production has been exceeding the stated 30 million barrel-per-day quota because, as crude prices rise, members have an incentive to cheat. And even though production by some members has declined recently, those reductions won’t last long, except for Venezuela’s. Most OPEC members like to get while the getting is good.

US shale production is up 25% from this time last year and will continue to increase as prices do. The loss of Venezuelan heavy crude imposes additional costs on refiners. The next best source is Canada, but the lack of pipeline capacity and efficient distribution alternatives limits supplies to US refiners. Constraints on Canadian distribution will be removed in the coming years as a result of increased pipeline investments.

While OPEC and Russia would like to see crude oil prices around $80 a barrel, the downward pressure from a slowing global economy, cheating, increased domestic production and increased crude from Canada will likely keep that price target from being achieved absent some unknown and unpredictable event.

Resort to Principles of Economics

According to the Wall Street Journal, neither the Federal Communications Commission (FCC) nor the Federal Trade Commission (FTC) has been capable of reining in the increasing annoyance of robocalls, scammers, and the like. The only thing the Telephone Consumer Protection Act seems to have accomplished is proving the ineptness of the bureaucracy in enforcing it.

Since 2015, the FCC has levied over $200 million in fines for violations of the Do Not Call Registry. That would be a deterrent if collected, but to date the FCC has collected less than $7,000, a collection rate of roughly 0.0035%. The FTC has obtained court judgments of $1.5 billion since 2004 in civil penalties for robocalls or violations of the Do Not Call Registry. It has done better than the FCC in collections, but only because the FCC success rate is so pathetic: the FTC has recovered an unimpressive 8%, or $121 million.
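The collection rates follow directly from the dollar figures reported above (all approximate, as quoted from the Journal’s reporting):

```python
# Collection rates implied by the figures in the text.
fcc_fines, fcc_collected = 200e6, 7000        # dollars, approximate
ftc_judgments, ftc_collected = 1.5e9, 121e6   # dollars, approximate

fcc_rate = fcc_collected / fcc_fines
ftc_rate = ftc_collected / ftc_judgments
print(f"FCC collection rate: {fcc_rate:.6%}")  # about 0.0035%
print(f"FTC collection rate: {ftc_rate:.1%}")  # about 8%
```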

The FCC claims that it lacks enforcement authority and has to rely on the Justice Department. The FTC told the Journal that it was proud of its “strong enforcement.” No wonder that the number of robocalls has soared. Robocalls to mobile phones reached 50 billion last year according to the Pew Center.

Carriers are supposed to be working on verification systems that will help consumers decide if calls are legitimate. Of course, those systems won’t prevent the annoying ringing of calls that always seem to come at the worst of times. Consumers can purchase equipment or apps that will block unwanted calls, but devices for landlines can cost over $100, and while some mobile systems are free, subscription apps cost at least several dollars a month. While apps provide some level of protection, they’re not foolproof. By some estimates, no service flags more than two-thirds of robocalls, because many spoof their identities.

A professor at the University of Texas, Roger Meiners, suggests that economic principles might provide a better answer: if you want less of something, in this case robocalls, raise its cost by imposing a tax on calls. Meiners has proposed the Penny for Sanity Tax. A 1-cent tax on 50 billion calls would raise $500 million annually. He makes the point that since the tax applies to all calls, it would avoid litigation over discrimination and would be very difficult to avoid.
According to the Pew Center, the average adult cell phone owner makes and receives around 5 voice calls a day. If we assume that average holds for landlines as well, 3 calls made per day would cost 3 cents, or less than $1 per month. At 5 cents per call, the monthly cost would be slightly less than $5. While $60 a year is steep relative to other options, it probably is more effective, and many would gladly pay a reasonable price to avoid the annoyance. Congress ought to be able to find a creative way to make that tax deductible, or to authorize consumer rebates, to significantly reduce the burden on all of us. Congress has an incentive to do so because it is voters who are receiving these calls.
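The tax arithmetic above can be sketched in a few lines. The call volumes and tax rates are the ones used in the text; the 30-day month is a simplifying assumption:

```python
# Penny for Sanity arithmetic, using the figures from the text.
ROBOCALLS_PER_YEAR = 50e9
DAYS_PER_MONTH = 30  # simplifying assumption

def monthly_cost(calls_per_day, tax_cents):
    """Monthly cost to one consumer of a per-call tax, in dollars."""
    return calls_per_day * DAYS_PER_MONTH * tax_cents / 100

revenue = ROBOCALLS_PER_YEAR * 0.01  # 1 cent per call
print(f"Annual revenue at 1 cent/call: ${revenue / 1e6:.0f} million")
print(f"3 calls/day at 1 cent: ${monthly_cost(3, 1):.2f}/month")
print(f"3 calls/day at 5 cents: ${monthly_cost(3, 5):.2f}/month")
```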

Making the Theory Work

Charles Kettering, a former head of General Motors Research, once observed, “In theory, there is no difference between theory and practice. In practice, there is one difference. Practice won’t let you forget anything or leave anything out. In theory, problems are easily solved because you can leave something out.”

While he made that statement at least 72 years ago, climate advocates have latched onto the part about problems being easily solved by leaving something out. What they leave out is observational data that conflicts with their preferred assumptions.

Those who rely on the climate orthodoxy to promote their agenda are wedded to the results of complex computer models that have been constructed to demonstrate that the increase in CO2 emissions inevitably leads to dangerous warming. What they leave out is the fact that their model projections overstate warming. This is shown in the following graph produced by Professor John Christy.

The explanation for this difference is that the models assume a greater climate sensitivity than is demonstrated by the climate itself. According to the National Academy of Sciences, climate sensitivity is “the equilibrium global mean surface temperature change that occurs in response to a doubling of atmospheric carbon dioxide (CO2) concentration. Climate sensitivity is a function of numerous feedbacks among clouds, water vapor, and many other components of the earth’s climate system. It is presently one of the largest sources of uncertainty in projections of long-term global climate change.” The NAS went on to say that some uncertainties could be reduced or removed if there were better temperature records and better estimates of past radiative forcing. That falls into the category of wishful thinking because the records and estimates cannot be rehabilitated. There are too many variables involved with past temperature records to significantly improve their accuracy.
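The NAS definition can be illustrated with the standard simplified relation between radiative forcing and equilibrium warming, dT = lambda * F. The numbers below are illustrative textbook values, not figures from this essay: roughly 3.7 W/m^2 is the commonly cited forcing for a CO2 doubling, and the feedback parameters are chosen to roughly span the long-standing 1.5-4.5 C sensitivity range:

```python
# Illustrative sketch of how feedback uncertainty drives sensitivity uncertainty.
F_2X = 3.7  # W/m^2, commonly cited forcing for doubled CO2

# Low / central / high feedback parameters (K per W/m^2), chosen to span
# the canonical equilibrium-sensitivity range; not derived from observations here.
for lam in (0.4, 0.8, 1.2):
    ecs = lam * F_2X  # equilibrium warming per CO2 doubling
    print(f"lambda = {lam:.1f} -> ECS = {ecs:.1f} C per doubling")
```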

Analysis by Professor Ray Bates, University College Dublin, concludes that models underestimate the amount of heat radiated into space from the tropics. This conclusion is consistent with Dick Lindzen’s research demonstrating an “iris” effect in the tropics. In addition, Pat Michaels’ work on climate sensitivity shows that studies since 2011 estimate a lower sensitivity than the IPCC or the models that are used to project warming.

So, why do advocates of the climate orthodoxy cling to predictions of catastrophic warming when more recent research, like the climate itself, points to a lower climate sensitivity? There are several plausible explanations. One is that many are very risk averse and believe in the Precautionary Principle, which in Dick Lindzen’s words holds: “Everything is uncertain, thus anything may cause anything, and thus we should do something about it.” This is taking an abundance of caution to the extreme.

Another explanation is that some environmentalists have strong objections to economic progress and the way in which it is achieved. They want to control the means of production and how the economy evolves. Of course, they also happen to be high up on the economic ladder, so they can be cavalier in wanting to deny the benefits of economic growth to others.

In the end, if extreme energy policies are the mechanism for responding to climate change, resources will be wasted and the impact on warming and its climate effects will be imperceptible.

The Slow Walk to Silent Surrender

More and more businesses and industries are buying into the climate orthodoxy that human activities are the primary cause of climate change. And Democratic candidates for President are making it a wedge issue that will put further pressure on business and industry to get on board.

While a large number of scientists support this hypothesis, for a variety of reasons, it is still only a hypothesis, one that relies on complex computer models built using a large number of assumptions that attempt to fill gaps in knowledge. While short-term business objectives may justify going along to get along, there is an unexplored alternative that would not compromise the business community or damage our economic system.

Instead of being politically correct and accepting the climate orthodoxy and the actions that flow from it, industries and businesses should lay out an action-oriented agenda that neither accepts nor rejects the orthodoxy.

The United States is making more progress than most developed countries in reducing greenhouse gas emissions, some of which is the natural evolution of technology and some the result of wrongheaded policies that suppress fossil fuel use.

Instead of accepting renewable standards biased in favor of wind and solar, a focus on incentives to shift to natural gas and revive the nuclear option would be more cost-effective. Nuclear has two big hurdles: fear and cost. The fear that has resulted from nuclear accidents that did not produce any casualties can be addressed by a well-developed communication initiative focused on why nuclear should be the preferred option for reducing emissions and why it is in consumers’ interests to support it. The cost issue is more difficult, but it is not an insurmountable hurdle. Two major cost drivers are the regulatory approval process and the lack of reactor standardization. The progress being made with smaller modular reactors holds promise for lowering costs, increasing public comfort, and competing with wind and solar. The case needs to be made that a level playing field in power generation benefits consumers and allows alternatives to be judged fairly.

The unneeded and wasteful subsidies for ethanol should be eliminated by demonstrating that tailpipe emission standards can be met without an ethanol mandate and that the production process actually leads to an increase in CO2 emissions. Ethanol manufacturers should not be given a free ride.

Sea level rise, independent of the human component, is a serious problem but one for which near-term solutions are readily available. Coastal regions need to revise building codes so that new structures are not allowed so close to the water’s edge that damage from sea rise and coastal storms is almost inevitable. Currently, flood insurance is subsidized by the Federal Government, lowering the true cost of insuring coastal structures. This subsidy should be eliminated. Most of the Netherlands is below sea level, and yet the Dutch have developed technology for mitigating the effects of flooding. Industry should support a vigorous program to adopt some of that technology.

While the natural process of decarbonization is taking place, industry ought to support and participate in research to better clarify and define the extent of human influence on climate, as well as that of other factors identified by the IPCC. The uncertainties the IPCC has identified provide a solid basis for a collaborative research program for demonstrating that the science is not settled and for developing a better understanding of factors beyond CO2 that influence the climate system.

For over 20 years, estimates of climate sensitivity have varied by a factor of three. Research should be pursued to make that estimate more precise. Over the same period, the Danish scientist Henrik Svensmark has been conducting research to better understand the effect of solar activity on climate and to demonstrate how solar-related mechanisms affect cloud cover and cloud formation. Additional solar-related research should make clear whether the effect of solar activity on warming has been underestimated in making attribution determinations.

A recent audit of temperature data by Dr. John McLean has raised serious questions about the databases that are the foundation for models and projections of future climate catastrophes. The databases from the Hadley Centre and NOAA should be independently audited to validate or refute McLean’s findings. We already know from work by Professor John Christy that US temperature measurements have seriously overestimated actual temperatures.

Industry should also support collaborative research on natural variability, on the actual impact of increasing CO2 levels (since the warming effect is not linear), and on improving models, in order to add to our state of knowledge and demonstrate that industry is part of the solution, not the problem.
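The non-linearity of the warming effect can be sketched with the widely used logarithmic forcing approximation, F = 5.35 * ln(C/C0). The 280 ppm baseline is the conventional preindustrial concentration; the concentrations below are illustrative steps, not values from the essay:

```python
import math

def forcing(c_ppm, c0_ppm=280):
    """Approximate CO2 radiative forcing in W/m^2 (Myhre et al. fit)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each equal 70-ppm step adds less forcing than the one before it,
# which is why the warming effect of CO2 is not linear.
for c in (280, 350, 420, 490, 560):
    print(f"{c} ppm -> {forcing(c):.2f} W/m^2")
```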

Americans sorely need to be educated on what is realistic in terms of emission reduction impacts. If the US adopted all measures that are even minimally economically plausible, the effect on global warming and climate would be marginal, because the main sources of emissions and of increased atmospheric CO2 levels are China, India, and developing nations that show no real inclination to reduce their use of coal or to accept lower levels of economic growth.

The alternative to taking a stand on principle and engaging in a constructive, realistic action agenda is to get rolled and rolled often.  The threat to the capitalist economic model is growing as evidenced by the percentage of people who believe socialism is preferable and by the support for the Green New Deal.  Proponents of the Green New Deal and similar programs must be challenged to show the cost of these programs, their effect on the economy, as well as on global warming.  That information would be sobering.

Past business strategies for confronting the climate orthodoxy have not worked; they have resulted in losing, but losing gradually. It is time to try a third way based on challenging climate advocates to join in a collaborative research and policy initiative. They will most likely reject such an approach because they are winning, but that would demonstrate that they are more interested in scoring points than in developing cost-effective solutions. That would put them on the defensive.

The Distortion of Rose Colored Glasses

 Some political scientists are postulating that authoritarian countries can offer their citizens high incomes by adopting or supporting private enterprise.  Since authoritarians have the power to get things done, authoritarian capitalism could cause democracy to lose its appeal.

Since World War II, the most economically successful countries have been democracies: mainly the US, Western Europe, Japan, Canada, and South Korea. Two political scientists, Roberto Foa, University of Melbourne, and Yascha Mounk, Johns Hopkins University, recently wrote that “sometime in the next five years, the total GDP of countries rated ‘not free’ … will surpass that of Western democracies.” They include Russia, China, Turkey, and Saudi Arabia among the “not free.” They believe that a growing number of countries are learning to combine autocratic rule with market-oriented institutions. To bolster their case, they point out that 60 years ago western democracies accounted for two-thirds of global GDP, while today it is one-third. This is a non sequitur: it was western democracies and their capital investments that promoted the globalization that led to the wider distribution of global wealth.

Authoritarian capitalism is characterized by strict state control while the means of production are owned privately and operated for profit. The benefits of capitalism come at the expense of personal freedom. There is an explicit partnership between government and big business and a willingness by the government to support capitalists at the expense of citizens and consumers. China’s President Xi’s hostility to the private economy is evident in his state-first philosophy and in his favoring of government-connected enterprises.

There is no doubt that authoritarian governments can channel investments into favored industries and eliminate obstacles that could hinder their performance. Turkey is a country that has gone through rapid industrialization and now has the world’s 17th largest GDP. And China has experienced double-digit economic growth for the better part of three decades, at least as reported. Authoritarian capitalist countries have outperformed the US and other democratic capitalist countries over the past decade, and that has led to questions about the staying power of democratic capitalism.

But Turkey, China, and Russia are experiencing economic problems that raise questions about how well they can maintain private capitalism while repressing human rights and personal freedom. High rates of economic growth are the price of maintaining political power, yet political interference in economic decision making can lead to a misallocation of resources because decisions are concentrated rather than dispersed. High growth is also necessary to keep their populations sullen but not mutinous. The quality of life in these countries is not good: in one quantitative quality-of-life ranking, the US was number 5 while Russia, China, and Turkey ranged from 73 to 81. So, while the economic performance of these three authoritarian capitalist countries might seem impressive, it comes at a very high price for their citizens.

For authoritarian capitalism to replace democratic capitalism, top-down centralized economic direction would have to prove more lasting and superior to the dispersed economic decision making that characterizes democratic capitalism and enables it to respond quickly and accurately to consumer tastes and shifts in demand. Friedrich Hayek's The Road to Serfdom made the convincing case for why democratic capitalism is superior.

Democratic capitalism in the US is beset with a number of challenges, which are giving socialism its recent appeal. Some of our problems are the consequence of the influence of big companies on laws and regulations, which distorts market forces and breeds crony capitalism. The increasing power of Congress over how companies and industries act is what promotes crony capitalism and creates the view that the system is rigged.

The High Price of Staying Alive

In recent decades, medical science has made spectacular advances in treating life-threatening diseases like cancer. A growing percentage of new chemotherapy treatments are now in the form of oral drugs, immunotherapy is being used more widely, and gene-altering therapies are becoming ever more promising. Advances in other fields of medicine are just as spectacular. But, and there always seems to be a but, these new treatments are incredibly expensive and getting more so.

Our ability to increase our knowledge about treating life-threatening diseases is outstripping our ability to keep their costs reasonable. This dilemma is ethically challenging as well as a major public policy challenge. Some European countries use a cost-effectiveness standard to approve very expensive cancer treatments. If a treatment costs hundreds of thousands of dollars but only adds a few months of life expectancy, it may not be approved. As a society, we find this type of rationing abhorrent. At the same time, we have not resolved how best to handle the high price of new life-saving drugs, which are also regressive and result in involuntary rationing.

Pharmaceutical companies probably come in for more criticism over high drug prices than they deserve. Some price drivers are beyond their control. Developing new drugs is expensive, and some research suggests that only 1 in 10 new drugs proves safe and effective. The cost of failures needs to be recovered or a company will go out of business. For companies that develop multiple cancer drugs, development costs can range from $2 billion to $5 billion per drug.

The largest cost in new drug development comes from regulatory requirements to prove a drug is safe and effective. The FDA requires preclinical testing and then three phases of clinical trials before a company can apply for approval. This whole process can take 10 years. After approval, a company has to monitor for side effects and conduct related tests.

Although there are a number of pharmaceutical companies, the only competition is over who can get to market first with an FDA-approved drug. Once a drug is approved and under patent, there is a legal monopoly. Price is based on what the market will bear because there is no competition. Manufacturers maintain a monopoly until their patents expire, but too often generics are priced close to the patented drug's price. Again, no competition, because usually only one firm is approved to manufacture the generic.

Manufacturers give steep discounts of 30% or more to private purchasers: retailers like Sam's Club, Pharmacy Benefit Managers (PBMs) that administer prescription drug programs for employers, insurance companies, and Medicare Part D, and hospitals. With a lack of pricing transparency and poor communication, these drug providers can charge above competitive prices. PBMs, for example, receive a share of the discounts they negotiate, so they have an incentive to keep list prices high.

Since these specialty drugs sell for less in many foreign countries than in the US, it is clear that lower prices can be achieved here.

Since manufacturers are monopolies, they should be regulated as other monopolies are. The utility regulation model might be a useful approach. A body similar to a State Corporation Commission would approve the prices charged and a rate of return that recognizes risk and the need to recover R&D expenses. Strong but reasonable returns would provide the incentive for continued investment.

The cost of FDA approval is the major component of drug pricing. But the FDA is highly risk averse. It doesn't get blamed for the consequences of delays in bringing drugs to market, but it does when a drug has unintended consequences. Reducing red tape, streamlining the clinical trial process, and modifying agency incentives would reduce approval costs. In approving applications to produce generics, the FDA should conduct a form of Dutch auction, in which awards go to one or more firms that offer to charge the lowest retail price.

Insurance companies should be required to provide patients with information about the prices that a range of suppliers charge, and to better explain their formulary tiers and the rationale for the drugs included in them. Greater transparency about sources of supply and prices will engender greater competition. The FTC should regularly review the vertical prescription drug supply structure to identify anticompetitive practices and antitrust violations. Currently, each component in the supply chain has an incentive to keep prices high and to limit transparency.

Finally, barriers to drug importation should be reduced, and Medicare and Medicaid should be allowed to negotiate prices as other purchasers do.

The insurance and drug lobbies are clear examples of Bootleggers-and-Baptists dynamics that result in poorer treatment of life-threatening diseases.