Wink and Nod Tax Policy

Although the proposals for a special tax on the very wealthy by Alexandria Ocasio-Cortez (AOC) and Elizabeth Warren have received a lot of criticism, they have also gained a lot of support.  Like the dumb farmer who doesn't learn from experience or adapt to agricultural conditions, these proposals assume that the very wealthy are dumb and won't adapt to changes in tax law.

The super wealthy like Warren Buffett and Bill Gates have helped give these confiscatory tax proposals life by claiming that they are undertaxed.  Bill Gates says, "I need to pay higher taxes," and Warren Buffett joined him by saying, "I don't need a tax cut."  They are not alone: Patriotic Millionaires, a group founded in 2010, also advocates higher taxes on the wealthy.  According to an article in The Atlantic, a survey by US Trust found that 48% of those with several million dollars in assets are willing to pay more taxes for the common good.  No doubt there is also a survey showing that Venezuelans love Maduro.

Implicit in the US Trust survey and the views of the very wealthy like Gates and Buffett is the belief that the Federal Government can spend their money more efficiently and wisely than they can.  That is a triumph of hope over experience.  But this implied belief is contradicted by those like Gates who have set up foundations for philanthropic objectives as a way to spend their wealth before the government gets its hands on it.

What is driving these wealthy individuals to act against their self-interest?  One explanation is fairness: the super wealthy seem to believe that they benefit disproportionately from tax cuts.  They also seem to believe that lost federal revenue means higher deficits and a growing debt.  Neither explanation passes the red-face test.  Tax Foundation reports show that the tax system is already very progressive.  Forty-four percent of tax filers pay no federal income tax, while the top 1% pays 37.3% of federal income taxes and the top 10% pays 69%.  The tax rate for the top 1% is 26.9%, which is 7 times higher than the rate paid by the bottom 50%.  To make the point more compelling, a Cato Institute analysis found that "the Top 400 paid $29.4 billion in federal income taxes in 2014, an average of $74 million each." This represents 2.13% of all taxes paid.  While that might seem like a small percentage, it represents 70% of the taxes paid by the bottom 50% in 2016, the latest year available.
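The relationships among these figures can be checked with simple arithmetic.  The sketch below uses only the numbers quoted above; the derived totals are rough implications of those numbers, not independently sourced data.

```python
# Back-of-envelope check of the Top 400 tax figures quoted above.
# All inputs come from the text; the derived totals are implications
# of those numbers, not independent data.

top400_total = 29.4e9   # federal income taxes paid by the Top 400 in 2014
top400_share = 0.0213   # stated share of all federal income taxes (2.13%)

# Average paid per Top 400 filer (the text rounds this to $74 million)
avg_per_filer = top400_total / 400

# Implied total federal income tax collected
implied_total = top400_total / top400_share

# If $29.4B is 70% of what the bottom 50% paid, their implied total is:
implied_bottom50 = top400_total / 0.70

print(f"Average per Top 400 filer: ${avg_per_filer / 1e6:.1f} million")
print(f"Implied total income tax: ${implied_total / 1e9:.0f} billion")
print(f"Implied bottom-50% taxes: ${implied_bottom50 / 1e9:.0f} billion")
```

The per-filer average works out to $73.5 million, consistent with the quoted "$74 million each" after rounding.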

As for the notion that paying higher taxes will help reduce the deficit and debt, the proponents of higher taxes on the wealthy have made clear that they want more revenue for increased spending, not fiscal responsibility.

The wealthy who say they want to pay more taxes know these facts. So, there must be some other reason, especially since Bill Gates and Warren Buffett have pledged to give away half of their wealth and have recruited a number of others to do likewise.  As of last year, 183 people or couples had signed onto Bill Gates' Giving Pledge, which involves a commitment "to dedicate the majority of their wealth to giving back."

A more compelling reason may be the groupthink effect of being viewed as socially responsible by the public and their social network.  In fact, those who have created philanthropic foundations to give away the majority of their wealth are showing the shallowness of their support for higher tax rates by giving away money for which they have received tax deductions that resulted in lower taxes.

People like Bill Gates and Warren Buffett, and perhaps most of the super wealthy, derive most of their income from capital gains, which makes it easier to advocate higher income tax rates.  They know that the effect will be small and would encourage those subject to it to search for other ways to shelter income.  At some point, the AOCs and progressive presidential candidates will turn to raising the capital gains tax as a way of increasing taxes on the very wealthy.  That will make for much political theater but won't produce much in the way of revenue or economic benefit.

Increasing the capital gains tax rate would be counterproductive in that it would provide an incentive to hold capital assets longer, thus delaying when the government would get tax revenue.  Economists call this the locked-in effect, which means that more profitable investments can't be pursued.  Further, economic research has shown that lower, not higher, capital gains rates increase economic growth.  While the case is made that lower capital gains rates encourage more sheltering because of the larger difference between capital gains and regular income rates, the evidence supports the economic case for a low rate because it increases capital investment and new business formation.

At least some of the politicians advocating for these taxes understand behavioral economics and the tax policy effects because they are already wealthy and know how to protect their wealth.  Wink and nod policy proposals make for grand illusions and great politics, but they are lousy and dangerous economics.  What people like Warren, AOC, and the other progressives running for president are really doing is using one of Saul Alinsky's Rules for Radicals ("Pick the target, freeze it, personalize it, and polarize it") to demonize the wealthy for their own political gain.  Class warfare has unintended consequences.

Oops Journalism

The coverage of the Covington HS incident during this year's March for Life may have shown journalism at its worst. Depending on your personal bias, you could find a version of the event to confirm it.

In this age of 24/7 news and instantaneous coverage of events, journalists and the media outlets that employ them have incentives to be first with a piece of news.  If they get it wrong, many don't engage in introspection; they simply move on. Partly this is a result of two conflicting definitions of what constitutes journalism: writing characterized by a direct presentation of facts or description of events without an attempt at interpretation, and writing designed to appeal to current popular taste or public interest.  The second definition, which seems to be the one followed today, puts no emphasis on facts without attempted interpretation.  Interpretation moves an article from reporting to opinion without admitting it.

The pejorative "fake news" is not new, but today more journalists appear to follow Mark Twain's counsel: "Get your facts first, then you can distort them as you please."  That this is taking place so frequently can be attributed to the 24/7 news cycle and the explosion of media sources.  Seventy years ago, George Orwell wrote 1984.  Almost 60 years ago, Daniel Boorstin, historian and former Librarian of Congress, wrote The Image: A Guide to Pseudo-Events in America.  One insightful observation by Boorstin was "By harboring … and enlarging our extravagant expectations, we create the demand for the illusions with which we deceive ourselves.  And which we pay others to make to deceive us."  This was not a new finding.  Boorstin points out that P.T. "Barnum's great discovery was not how easy it was to deceive the public, but rather, how much the public enjoyed being deceived."

Unfortunately, Boorstin offers no quick fix.  “There is no formula for mass disenchantment. … Each of us must disenchant himself, must moderate his expectations, must prepare himself to receive messages coming in from the outside.”

The lesson that not only journalists should take from the Covington incident and similar ones is one that I learned from a former colleague, Phil Goulding, who wrote a book, Confirm or Deny, after his tenure as Assistant Secretary of Defense for Public Affairs.  That lesson is that "first reports are always wrong, or so often wrong that they always must be considered suspect."  He amended that later to say that even third reports should be taken with a grain of salt.  At least that is my recollection. The volume of what is described as news is so great and comes at us so rapidly that all of us would be better served by starting with the view that first reports are suspect.

Free is Expensive

In 2010 President Obama gave us the Affordable Care Act, which it isn't.  Republicans pledged to repeal and replace, which they didn't.  Now, progressive democrat candidates for president are promising healthcare, or Medicare, for all.

Milton Friedman once said something like: if you want to know how expensive something is, make it free.  The Brookings Institution and the Mercatus Center have estimated that healthcare for all would cost over $32 trillion in its first decade.  While in theory it would reduce other healthcare spending by $22 trillion, the net effect remains a $1 trillion a year addition to the growing deficit and debt.
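The $1 trillion a year figure follows directly from the two estimates above; here is the arithmetic spelled out, using only the numbers in the text.

```python
# Arithmetic behind the $1-trillion-a-year figure, using the
# Brookings/Mercatus estimates quoted above.
gross_new_spending = 32e12    # new federal cost over the first decade
theoretical_savings = 22e12   # theoretical reduction in other health spending
years = 10

net_per_year = (gross_new_spending - theoretical_savings) / years
print(f"Net annual addition to the deficit: ${net_per_year / 1e12:.0f} trillion")
```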

Healthcare in the United States is a mess and various attempts to “fix it” have only made it worse.  If Congress and a President ever get serious about improving the delivery of healthcare and reducing its cost, they first have to understand its many problems.  Tinkering doesn’t work.

Advocates for change often refer to systems in European countries like the Netherlands, Norway, and Switzerland.  While those programs can't be adopted here in wholesale fashion, US healthcare does not compare well with those countries in terms of cost and quality, so elements of their programs are worth taking into account.

What's the answer for us?  It begins with recognizing that we don't have insurance as it is normally understood, nor do market forces come into play.  Employer-provided health care, along with government subsidies for Medicare and Medicaid, has blunted normal market incentives. In the case of employer-provided "health insurance," wage controls during World War II led employers to provide medical coverage as a benefit to get around those controls.  Much of our health insurance involves third-party payers, meaning consumers have little or no incentive to control costs. In addition, the market is balkanized and bureaucratically controlled, resulting in no real competition.

Here are some suggestions for consideration.

Employer-provided health coverage should no longer be tax deductible.  Employees who receive employer-provided insurance should be taxed on its value or, alternatively, could receive the cash equivalent of that insurance.  Instead of providing insurance, employers should be encouraged to collaborate in forming non-profit exchanges, which would give insurance companies incentives to compete for a large pool of employees' business.  With a reformed health insurance market, employees would have more options to choose from.  Insurance companies should be allowed to sell across state lines to increase competition and provide employees the widest range of insurance options.

The government should draw on the Swiss model, which produces good outcomes at a national cost well below ours.  The Swiss mandate that all citizens purchase insurance from private insurance companies, establish a minimum package of benefits, and mandate coverage of preexisting conditions.  Those provisions are similar to ones in the Affordable Care Act.  Republicans hate those features, but there are strong reasons for them.

Medicaid should be reformed along the lines of the Swiss model to subsidize premiums for lower-income people to keep their costs less than 10 percent of their incomes.  The subsidy should cover 100% of the premiums for the unemployable and permanently disabled. As a rich nation, we should take care of those who can’t take care of themselves.

The Swiss model also has other features worth adopting.  The Swiss require insurance companies to offer minimal policies on a nonprofit basis. They are based on relatively high out-of-pocket expenses to encourage consumers to spend wisely.  The Swiss also mandate that prices be made public, which helps consumer markets function efficiently.

To avoid adverse selection by those with preexisting conditions, states should establish high-risk pools similar to pools for uninsured motorists.  And as with auto policies, a fee would be added to health insurance premiums to fund the high-risk pools.

The reform of the health insurance market must also include Medicare reform.  Medicare spends much more money than it takes in and as a result is one of the drivers of the growing deficit and debt.

To change that, the age of eligibility should be increased in line with increases in life expectancy.  Also, the cap on Social Security and Medicare taxes should be raised, and means testing of benefits should be expanded.  There have been a number of proposals for making Medicare more efficient and for doing a better job of lowering costs.  Those need to be pursued aggressively instead of just gathering dust.

Being Dumb Cost-Effectively

Environmentalists, progressives, climate change advocates, and Green New Deal proponents are determined to back fossil energy out of the US energy budget. The Green New Deal would have us all believe that the energy system can get to zero CO2 emissions by 2030 relying on renewables alone, excluding hydropower and nuclear. A more realistic estimate comes from Professor Joshua Goldstein's book, A Bright Future: How Some Countries Have Solved Climate Change and the Rest Can Follow, which estimates that pursuing renewables at a pace matching Germany's, decarbonizing "would take 150 years."

Advocates show no concern that its cost has been estimated to be as high as $15 trillion, nor do they give any consideration to cost-effective power generation. Further, since China, India, and many developing countries are not making serious efforts to reduce their emissions, this radical program would have virtually no effect on atmospheric CO2 concentrations.

US CO2 emissions peaked in 2005 and have been going down since, although there was a spike in 2018. Technology is allowing us to use fossil energy more efficiently, and an abundance of natural gas is leading to a shift away from coal. The trend in decarbonization is not new and did not require government regulations or mandates. Jesse Ausubel of Rockefeller University has documented that the U.S. economy decoupled from carbon during the 1940s, long before congressional hearings about it.

The pursuit of zero emissions is based on a hypothesis that CO2 emissions are having a detrimental climate effect, even though empirical observational data makes that hypothesis questionable. Nonetheless, state legislatures and Congress are pursuing a costly and questionable energy policy to substitute wind and solar for more reliable and less costly power generation.

Wind and solar, in addition to being intermittent, take up tremendous amounts of land. Producing all of our electrical power with them would require the equivalent of several New England states.

EIA data on the levelized cost of electricity (LCOE) tells an important story. Solar, depending on whether it is thermal or photovoltaic, ranges in cost from $34 to $188 per MWh, and wind, depending on whether it is onshore or offshore, ranges from $30 to $168 per MWh. For both solar and wind, there is a need for backup during periods of cloudiness and no or low wind, and backup increases costs. Nuclear, on the other hand, ranges from $89 to $97 per MWh.
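To see how backup requirements change the comparison, here is a hedged sketch. The LCOE ranges are the EIA figures cited above; the $25/MWh firming adder is purely an illustrative assumption of mine, not an EIA number, since backup costs vary widely by grid.

```python
# Illustrative LCOE comparison. Ranges ($/MWh) are the EIA figures
# quoted in the text; the backup adder is an assumed value for
# illustration only.
lcoe_ranges = {
    "solar (PV to thermal)": (34, 188),
    "wind (onshore to offshore)": (30, 168),
    "nuclear": (89, 97),
}
BACKUP_ADDER = 25  # assumed $/MWh cost of firming intermittent output

for source, (low, high) in lcoe_ranges.items():
    if source != "nuclear":  # nuclear is dispatchable; no backup adder
        low, high = low + BACKUP_ADDER, high + BACKUP_ADDER
    print(f"{source}: ${low}-${high} per MWh")
```

Under this assumption, the low end of solar and wind moves above $50/MWh, narrowing but not closing the gap with nuclear's $89-$97 range.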

The MIT Technology Review, in an article on carbon-free electricity, included the following quote from Jesse Jenkins, a postdoctoral fellow at Harvard: "You don't confront a crisis with a limited tool set… You throw everything you've got at it." That means that nuclear has to be an option. Those who do not subscribe to the climate orthodoxy may see these options as a choice between dumb and dumber. At least with nuclear, there is a proven technology, and R&D has continued to advance it. The two biggest hurdles are cost and public phobia.

Professor Goldstein points out that over more than 60 years of nuclear experience, the only accident to cause nuclear-related fatalities was Chernobyl, which was the result of Soviet incompetence. The accidents at Three Mile Island and Fukushima were hyped by nuclear opponents in ways that simply added to public fears.

Nuclear energy is a long way from being cost competitive with natural gas for power generation, but progress is being made. The development of small and medium-sized reactors might be one way to respond to nuclear's high capital cost and public fears. But in the long run, nuclear is not going to be widely accepted until it can be convincingly demonstrated to be as safe as, or safer than, the alternatives.

One small reactor design is NuScale's. It is designed to generate just 50 megawatts of power, a fraction of the power generated by installed reactors. But being small means that it contains much less fuel and could be operated with much lower risk. As a modular unit, it can be built in a factory and shipped by truck. Power plants could activate one reactor at a time to generate the revenue needed to purchase the next one.

While the nuclear industry attempts to gain greater public acceptance, it also needs to find ways to lower costs and reduce construction time. It should look to South Korea, which has developed a world-class nuclear industry and demonstrated that it is possible to bring projects to completion on time and on budget. According to Professor Goldstein, South Korea "has built 10 of its reactors based on the same design, … (and) produces nuclear power at or below fossil-fuel prices."

Nuclear can also serve as a critical link in the further decarbonization of energy. Jesse Ausubel points out that this is the natural consequence of technological development. In a paper titled Density, Ausubel notes that the shift "to natural gas and nuclear power … together with relentlessly rising efficiency and changing industry composition, will carry us to a low-carbon economy in another 50 years or so. … The global energy system has been evolving toward hydrogen but perhaps not fast enough, especially for those most anxious about climate change." Nuclear plants, in addition to generating electricity, can make hydrogen on the scale needed to meet our energy needs.

Irresponsible Political Rhetoric

The Green New Deal (GND) is receiving a lot of attention, in part because of the outlandish statements by Alexandria Ocasio-Cortez, or AOC as she is referred to by the media. To paraphrase what Mark Twain is supposed to have observed, it's not all the things she doesn't know that bother me; it's all the things she knows that just aren't so. In her case, she might be excused because of youthful immaturity and judgment clouded by passionate exuberance. Others are just being politically irresponsible.

But she is not alone in her ignorance of how the world works or of the economic consequences of her ideas. The actual size of the Green New Deal Movement is unclear but it is made up of a lot of young activists who have gotten the attention of the progressives in the democrat party who now parrot its ideology.

Its platform on energy wouldn't be taken seriously if a lot of progressives were not also promoting Medicare for all at a cost of $32 trillion over 10 years. The GND energy plank calls for zero emissions of CO2 by 2030. To achieve this goal, the GND calls for a dramatic expansion of renewable power to meet 100% of national power demand from renewable sources. According to EIA, fossil fuels provide 63% of our power needs, while renewables as defined by the GND provide 8%. To be clear, its definition of renewables does not include hydroelectric or nuclear power, which currently provide 27%.

The cost estimates for converting our power generation system to renewables are staggering. One estimate, from NextBigFuture, puts the cost at $15 trillion. Other estimates are in the range of $5 trillion, but that is just to reach the goal of cutting greenhouse gas emissions by 80% by 2050. All estimates should be taken with a grain of salt because, given the radical nature of this transformation, no estimate can accurately capture all of the costs of scrapping the current generation system and manufacturing and installing the renewable replacements on a forced timeline.

According to the Edison Electric Institute, the US generated 4,017,555 gigawatt-hours of electric power in 2017. EIA has concluded that 1 gigawatt of generating capacity requires 3.1 million PV panels or 431 utility-scale wind turbines. The math is simple, and it demonstrates the foolishness of the GND proposal.
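Here is that math made explicit. The generation figure (EEI) and the panels/turbines-per-gigawatt figures (EIA) are the numbers above; the capacity factors are rough assumptions of mine, since panels and turbines produce well below their nameplate ratings on average.

```python
# The "simple math" of the GND renewables claim, spelled out.
# Generation (EEI) and equipment-per-GW (EIA) figures are from the text;
# the capacity factors are assumed, illustrative values.
annual_gwh = 4_017_555        # US electric generation in 2017, in GWh
HOURS_PER_YEAR = 8760

avg_demand_gw = annual_gwh / HOURS_PER_YEAR   # average power demand, ~459 GW

PANELS_PER_GW = 3.1e6   # PV panels per GW of nameplate capacity (EIA)
TURBINES_PER_GW = 431   # utility-scale wind turbines per GW (EIA)

SOLAR_CF = 0.25         # assumed solar capacity factor
WIND_CF = 0.35          # assumed wind capacity factor

solar_panels = (avg_demand_gw / SOLAR_CF) * PANELS_PER_GW
wind_turbines = (avg_demand_gw / WIND_CF) * TURBINES_PER_GW

print(f"Average demand: {avg_demand_gw:.0f} GW")
print(f"All-solar build-out: {solar_panels / 1e9:.1f} billion panels")
print(f"All-wind build-out: {wind_turbines:,.0f} turbines")
```

Even with these generous capacity factors, the build-out runs to billions of panels or hundreds of thousands of turbines, which is the point the paragraph makes.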

It may be that the socialists and progressives who support the GND know that it would totally wreck the economy and are using it as a tool to move public policy further to the left. Independent of whether GND proponents are being Machiavellian or just economically ignorant, their proposals should be exposed for what they are. Proponents should be pushed to explain the costs and economic consequences of their proposals. Radical ideas just don’t go away.

The opportunity cost of not swatting down foolish and irresponsible proposals is that we are not seriously addressing high priority problems in a bipartisan manner.

Bring Back the Office of Technology Assessment

The Office of Technology Assessment—OTA—was created by the Technology Assessment Act of 1972. It was defunded after Newt Gingrich became Speaker of the House, presumably because the Contract with America called for reducing Congressional spending.

One of Gingrich's rationales was that members of Congress could talk directly with scientists without a filter (OTA). A nanosecond of reflection on that reason, if you can stop laughing long enough to reflect, makes clear that it is nonsense. What defunding did was open the door to an army of lobbyists who could bamboozle members without being challenged by internal experts on matters of science and technology. Bootleggers had free rein to align themselves with Baptists by profiting from promoting a variety of "national interests."

OTA was governed by a board that consisted of an equal number of democrat and republican senators and representatives. OTA provided Congress, at its request, with the objective, comprehensive information and options it needed on the issues under consideration. Since it went out of existence, the Government Accountability Office (GAO) has expanded its mission to include technology assessments. But GAO is a large organization that focuses on ongoing programs, whereas OTA focused on comprehensive, longer-term technical analyses.

In creating OTA, Congress stated that "technology continues to change and expand rapidly, its applications are large and growing in scale; and increasingly extensive, … and critical in their impact, … on the natural and social environment." It also found that "the Federal agencies … responsible directly to the Congress are not designed to provide the legislative branch with adequate … information, independently developed, relating to the potential impact of technological applications…". The impact of science and technology has continued to grow in terms of importance and costs to society.

There is no shortage of important topics on which Congress needs informed and objective information. The need for improved cybersecurity is evident daily. The electrical grid is woefully out of date and needs to be modernized; whether advances in nuclear energy make it a potentially competitive addition to our electric power mix is an open question; and the scientific rationale for battery electric vehicles and solar/wind power is wanting. Congress and DOD have been sold a bill of goods by contractors on advanced weapon systems. The F-35, costing over $100 million a copy, has been judged as failing to deliver "the full Block 3F capabilities" (full combat capabilities), and the Navy's $23 billion Gerald Ford carrier has been described as "a monument to the Navy's and defense industry's ability to justify spending billions on unproven technologies that often deliver worse performance at a higher cost." The same is true of the Navy's Littoral Combat Ship, which was intended to replace frigates. After 16 years and billions of dollars, the Navy may abandon the program because its high-tech systems don't work, its turbine propulsion system keeps breaking down, and it is highly vulnerable. It would be foolish to assume that only DOD has been captured by the defense complex. Any department with billions of dollars to spend on advanced systems is a ripe target to be exploited.

A strong case for re-establishing OTA has been made by former Congressman Rush Holt (PhD in physics), who now leads the American Association for the Advancement of Science (AAAS). The new Congress should make restoring OTA a high priority, one that should be able to attract bipartisan support.

Meet the Press is Now Meet the Propagandist

At the start of the last show of 2018, Meet the Press host Chuck Todd proclaimed, dare I say ex cathedra: "We're not going to debate climate change, the existence of it. The earth is getting hotter, and human activity is a major cause. Period. We're not going to give time to climate deniers. … The science is settled even if political opinion is not."

During the time that Tim Russert hosted Meet the Press, it achieved almost universal acclaim as the best news talk show on TV. It achieved that position because Russert worked very hard in preparing for each show.  Whoever the guest, whatever the topic, Russert schooled himself on the facts.  He displayed a level of journalistic rigor and honesty that is long gone and totally absent in Chuck Todd.

As Roy Spencer pointed out, Todd set up a straw man, because virtually no one denies that climate change is real or that human activities have affected the climate.  So, who is Todd talking about?

In saying that the “science is settled”, Todd revealed his ignorance. Someone found a way to make him a shill for the climate orthodoxy.  If he continues down this road, Meet the Press will become just another purveyor of fake news.

There is one thing that is certain.  Either Todd did not read the most recent IPCC Working Group 1 scientific assessment, which is different from the political tome released in October, or he did read it but has no comprehension of what it says.

On pages 7 and 14 of that scientific assessment, the IPCC presents two charts: one on extreme weather events, the other on radiative forcing elements.  For both, the IPCC uses subjective probability estimates of high, medium, and low confidence. For extreme weather events, it has only medium confidence that humans are the cause of heavy precipitation, and low confidence in human causality for the intensity and duration of droughts and for increases in tropical cyclone activity.  For radiative forcing factors, it has only medium confidence in the effects of short-lived greenhouse gases, the albedo effect caused by land use changes, and solar irradiance, and low confidence in the effect of aerosols on clouds.

If the science were indeed settled, the level of confidence would be much higher.  Also, the estimate of climate sensitivity (the effect on temperature of doubling CO2) would not vary by a factor of 3. Not only does Chuck Todd not know what he is talking about, but his comments are sophomoric.  The only interesting question is who or what got to Todd.  Was it his superiors at NBC, or was it groupthink pressure from his social circle?

Doubling Down on Dread

The most recent IPCC Special Report attempts to make the case, again, that mankind is running out of time to avoid an impending climate catastrophe.  The report gives us 12 years to keep temperatures from going beyond 1.5 degrees C, which it treats as a line in the sand separating survival from an environmental apocalypse.

One of the advantages of a look back is recognizing that this movie has played before.  At the time of the first Earth Day, we were warned about the exhaustion of natural resources and a population explosion that would lead to mass starvation because the food supply would not be able to keep up with demand.  At the same time, many environmentalists were claiming that the introduction of industrial chemicals in the postwar period represented a cancer time bomb that would lead to a cancer epidemic.

In looking back, it is easy to find an overlap between the purveyors of dread from the first Earth Day, The Limits to Growth, the impending cancer epidemic, and the climate change catastrophe.  People like Paul Ehrlich, Stephen Schneider, Maurice Strong, and John Holdren were part of a network that saw evil in capitalism and technology.  These and the other Apocalyptics have established a spectacular record of being consistently wrong.

Instead of running out of oil by 2000, we are awash in an abundance thanks to technology.  Instead of mass starvation, "The world has made great progress in reducing hunger: There are 216 million fewer hungry people than in 1990-92, despite a 1.9 billion increase in the world's population," according to the World Food Programme.  Norman Borlaug's Green Revolution was a major reason for this progress.

The predicted cancer epidemic never arrived.  The CDC reports that between 1998 and 2017, the rate of new cancers dropped from 481 per 100,000 to 437.  The scary prediction was predicated on several important but unproven theories: the one-hit theory, the linear dose response, and the assumption that rats are good models for humans. Now there is a recognition that genetics, in addition to environmental exposures, is a cause of cancer.

The predictions of an impending climate catastrophe have fared no better than all of the false ones that preceded them.

A recent paper by Judith Curry found that rising sea levels are not abnormal, nor are they the result of human-caused climate change.  As Carl Wunsch of MIT has pointed out, sea level has been rising since the last Ice Age and will continue to rise until the next one.

Extreme weather predictions continue to be peddled in spite of data to the contrary. The number of violent tornados has dropped from a 15-year average of 13.7 in 1970 to a predicted 5.9 at the end of this decade.  Roy Spencer has shown that the number of major hurricanes making landfall has been declining since 1940. Since extreme weather is supposed to be the result of unprecedented temperature increases, it should come as no surprise that those increases come only from model projections and not from temperature observations, which continue to show a pause when the most recent El Nino is taken into account.

In spite of decades of failed predictions of doom, the Apocalyptics show no signs of quitting.  Although they have succeeded in getting climate change seen as one of the most important environmental issues, climate still does not rank high on the overall list of issues that concern most Americans.  Since the Apocalyptics' agenda is political and climate change is a powerful weapon for attacking capitalism and technology, they will stay the course and probably become even more shrill.

Their rhetoric will be offset by the fact that oil, natural gas, and coal will continue to be the world’s primary sources of energy because they are abundant and affordable.  

What Ever Happened to Force Planning?

The US defense budget is larger than those of the next seven countries combined, according to the Peterson Foundation, and yet every year we are told that the military is underfunded, aircraft and ships are not well maintained, training is not adequate, and the state of readiness is not good.

The current year DOD budget is $610 billion, while in 2017 China had a budget of $228 billion and Russia $66 billion. Part of the difference in these budgets is manpower costs: we have a volunteer force; China and Russia have conscripts.  Nonetheless, there is a mismatch between spending and capability, and the solution is not simply to increase the budget.

The Peterson Foundation, in Strength at Home and Abroad, makes the case that "Our own defense planning is wanting. Due to parochial interests and a lack of political will, we have failed to fully modernize our national security policies to reflect the complex challenges posed by the world today. … Absent reforms, the growth of these costs will either swell the defense budget unsustainably, or squeeze out other areas of national security spending, leading to a hollowing of the force." These views have been echoed by the Commission on National Defense Strategy.

The basic structure of our military forces is essentially the same as it was during the height of the Cold War, even though the threats we face are much different. The structures that comprise our military establishment influence planning and strategies, but the inertia in any large organization can obstruct the changes needed to meet future needs. So, instead of just building more ships, aircraft, and missiles, we should rebuild our military to address the threats we are going to face in the coming decades.  That might lead to the conclusion that we don't need four branches with overlapping roles, $13 billion aircraft carriers, or $100-$300 million F-35 fighters.

The F-35 and the Gerald Ford carrier show what is wrong with the current force planning and weapon acquisition systems.  Because of crony capitalism and a military brass that wants more and bigger, these two platforms are not only extremely costly, they show little promise of being mission capable.  A DOD review of the F-35 concluded, "In fact, the [F-35] program is actually not on a path toward success, but instead is on a path toward failing to deliver the full Block 3F capabilities [i.e., full combat capabilities]." The San Diego Union-Tribune described the construction of the Ford as "a monument to the Navy's and defense industry's ability to justify spending billions on unproven technologies that often deliver worse performance at a higher cost. Serious questions have been raised as to whether the Ford class will be able to conduct the high-intensity flight operations expected during wartime."

So, it is fair to ask: what conflicts are they being built for?  At one time, the US force structure was intended to be able to fight two full-scale conventional conflicts plus a smaller conflict.  Whether or not that was realistic in the past, it is not now. In addition to Russia and China, the two major military competitors, there are the threats from Iran and North Korea, and conflicts along the lines of Iraq and Afghanistan.

The Peterson Foundation set up a Defense Advisory Committee that recommended a strategy of a more flexible U.S. global presence. Such a strategy would be based on agility, technological superiority, and global reach. 

The existing services have strong incentives to engage in political horse trading while attempting to win larger budgets and become larger.  That has resulted in budgets much larger than those of our adversaries and in reduced operational readiness.  The time is ripe to design an armed force that has the capabilities identified by the Peterson Foundation and that is more cost effective.  That may mean a complete overhaul of the existing services. Do we need both an army and a marine corps, an air force as well as naval and marine air?  What about the triad?  Does it still make sense to have ICBMs in fixed sites or nuclear bombs on aircraft that are over 50 years old?  Why do we need over 3,700 nuclear weapons?

These are all tough questions, and trying to answer them will run into tremendous opposition from members of Congress, the armed services, and the defense industry. But if they are not addressed, the current readiness situation will get worse, and the size of the national debt will constrain how much the DOD budget can grow.  In the 1960s, the RAND Corporation published two books, The Economics of Defense in the Nuclear Age and How Much Is Enough?  Those need to be dusted off and used to plan for the future with imagination and innovation.

There Is Less Than Meets the Eye

This is a repost because the graphics in the original did not show up

Climate alarmists and those who blindly follow them often refer to graphs that show an increasing temperature trend over recent decades.  Michael Mann's Hockey Stick, which has been discredited, is perhaps the most infamous graph depicting the alleged human influence on temperature.

As with all graphs, it is important to know the quality of the data used to construct them.  In the case of temperature graphs, it turns out that there is clearly less than meets the eye.  MIT's Dick Lindzen, using work from the late Stan Grotch of the Lawrence Livermore Laboratory, made it abundantly clear in a CO2 Coalition paper that these graphs represent the art of statistical magic.  The scattergram he reproduces contains 26,000 data points representing temperature measurements from the 1850s to 1984.

See figures 1a and 1c in the Lindzen paper

Grotch calculated the signal contained in the scattergram and, according to Frederick Colbourne in Geoscience-Environment, determined that eliminating the top and bottom 10% of these anomalies resulted in an annual variance of about ±0.2 °C.  He concluded that a data range 10 times greater than the signal range raises serious questions of confidence.
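The trimming step described above can be sketched with simulated numbers (these are illustrative values, not Grotch's actual data): generate noisy anomalies, drop the top and bottom 10%, and compare the remaining spread with an assumed ±0.2 °C signal.

```python
import random

# Hypothetical sketch of Grotch-style trimming, using simulated anomalies
# rather than real station data.
random.seed(1)
signal_range = 0.4                 # assumed signal: +/-0.2 C
anomalies = [random.gauss(0, 1.0) for _ in range(26000)]

# Drop the top and bottom 10% (2,600 points each) and measure what remains.
trimmed = sorted(anomalies)[2600:-2600]
data_range = trimmed[-1] - trimmed[0]

print(f"trimmed data range: {data_range:.2f} C")
print(f"ratio to signal:    {data_range / signal_range:.1f}x")
```

Even after trimming the extremes, the spread of the simulated anomalies remains several times larger than the assumed signal, which is the mismatch Colbourne flags.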

It is clear that these anomalies don't provide a convincing picture of warming.  To get that picture, Lindzen demonstrates with the following graph that the anomalies are first averaged and then the temperature scale is stretched "by almost a factor of 10 so as to make … minuscule changes … look more significant."
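The averaging step can also be sketched numerically (again with simulated, hypothetical numbers): station anomalies with a small underlying trend and large scatter look flat, but yearly averages have far less scatter, so plotting them on a narrow, stretched axis makes the same small trend look steep.

```python
import random
import statistics

# Hypothetical illustration, not real temperature data: many noisy
# station readings per year around a small warming trend.
random.seed(0)
years = range(100)
stations = 50
trend_per_year = 0.005   # assumed trend: 0.5 C over the century

raw = {y: [trend_per_year * y + random.gauss(0, 1.5) for _ in range(stations)]
       for y in years}

# Spread of the raw points dwarfs the trend...
all_points = [v for vals in raw.values() for v in vals]
raw_spread = statistics.stdev(all_points)

# ...but yearly averages scatter far less, so a rescaled axis
# makes the same trend dominate the plot.
yearly_means = [statistics.mean(raw[y]) for y in years]
mean_spread = statistics.stdev(yearly_means)

print(f"raw spread:  {raw_spread:.2f} C")
print(f"mean spread: {mean_spread:.2f} C")
```

Nothing about the underlying trend changes between the two views; only the visual presentation does, which is Lindzen's point.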

Going from raw data that does not show anything dramatic to graphs like this one or the Hockey Stick that do is what Ross McKitrick, who provided a devastating critique of the Hockey Stick, labeled parlor tricks. This is especially true when the data sets contain their own sources of error and are not comparable.  In the scattergram, ocean temperatures were added to land-based measurements.  But as Lindzen points out, ocean measurements came from buckets in old ship data and then from ship intakes after WWI. And the surface measurements combine land-based devices, weather balloon data, and, after 1979, satellite measurements. Prior to 1900, the land-based devices were sparsely distributed.

Inaccuracies in climate data sets have been documented by many scientists and bring to mind the observation of Sir Josiah Stamp: "The Government are extremely fond of amassing great quantities of statistics. These are raised to the nth degree, the cube roots are extracted, and the results are arranged into elaborate and impressive displays. What must be kept in mind, however, is that in every case, the figures are first put down by a village watchman, and he puts down anything he damn well pleases!"

Analysts are fond of saying, torture the data until it confesses, but torture it too much and it will confess to anything.  In view of all the noise in the temperature record and Lindzen's CO2 Coalition paper, it is clear that the data has been tortured on the scale of waterboarding.

There is a lot that can be done to improve the current situation, a standoff between skeptics and alarmists before a general public that, by its inaction, is saying: we are not convinced.  First, the NAS could assemble a committee of skeptic and alarmist scientists, including some statisticians, to develop a set of guidelines for adjusting historical data and for developing a more robust data set.  A similar approach could be taken with models, although it may be that the complexity of the climate system, combined with known uncertainties, makes it impossible to develop a reliable forecasting model.

Finally, those of us who challenge the orthodoxy need to find a more effective way to communicate.  Climate alarmists have managed to marginalize us as "deniers" and "skeptics."  As Nate Silver observed in his book The Signal and the Noise: "In science, dubious forecasts are likely to be exposed, and the truth is more likely to prevail.  In politics, a domain in which the truth enjoys no privileged status, it's anybody's guess."