Posts

A Deadly Caution: How the FDA’s Precautionary Principle Is Killing Patients

by Alexander Tabarrok

I have long argued that the FDA has an incentive to delay the introduction of new drugs because approving a bad drug (Type I error) has more severe consequences for the FDA than does failing to approve a good drug (Type II error).

In the former case, at least some victims are identifiable and the New York Times writes stories about them and how they died because the FDA failed. In the latter case, when the FDA fails to approve a good drug, people die but the bodies are buried in an invisible graveyard.

In an excellent new paper (also here), Vahid Montazerhodjat and Andrew Lo use a Bayesian analysis to model the optimal tradeoff in clinical trials between sample size, Type I and Type II error.

Failing to approve a good drug is more costly, for example, the more severe the disease. Thus, for a very serious disease, we might be willing to accept a greater Type I error in return for a lower Type II error. The number of people with the disease also matters. Holding severity constant, for example, the more people with the disease the more you want to increase sample size to reduce Type I error. All of these variables interact.

In an innovation, the authors use the US Burden of Disease Study to find the number of deaths and the disability severity caused by each major disease. Using this data, they estimate the costs of failing to approve a good drug. Similarly, using data on the costs of adverse medical treatment, they estimate the cost of approving a bad drug.

Putting all this together, the authors find that the FDA is often dramatically too conservative:

We show that the current standards of drug-approval are weighted more on avoiding a Type I error (approving ineffective therapies) rather than a Type II error (rejecting effective therapies).

For example, the standard Type I error of 2.5% is too conservative for clinical trials of therapies for pancreatic cancer — a disease with a 5-year survival rate of 1% for stage IV patients (American Cancer Society estimate, last updated 3 February 2013).

The BDA-optimal size for these clinical trials is 27.9%, reflecting the fact that, for these desperate patients, the cost of trying an ineffective drug is considerably less than the cost of not trying an effective one.

(The authors also find that the FDA is occasionally a little too aggressive, but these errors are much smaller: for example, the authors find that for prostate cancer therapies the optimal significance level is 1.2% compared to a standard rule of 2.5%.)
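The tradeoff the authors formalize can be sketched with a toy Bayesian decision calculation. This is only an illustration of the logic, not the paper’s actual model: the sample size, effect size, prior probability, and cost ratios below are all made-up assumptions. The idea is to pick the significance level that minimizes the expected cost of the two error types, where rejecting an effective drug (Type II) is weighted by disease severity.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def inv_normal_cdf(p):
    """Inverse standard normal CDF by bisection (adequate for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def expected_cost(alpha, n, effect, p_effective, cost_type2, cost_type1=1.0):
    """Expected cost of a one-sided trial run at significance level alpha.

    Type I cost is incurred when an ineffective drug is approved;
    Type II cost when an effective drug is rejected (missed power).
    All parameters are illustrative assumptions.
    """
    z_crit = inv_normal_cdf(1.0 - alpha)
    power = 1.0 - normal_cdf(z_crit - effect * math.sqrt(n))
    beta = 1.0 - power
    return ((1.0 - p_effective) * alpha * cost_type1
            + p_effective * beta * cost_type2)

def optimal_alpha(cost_type2, n=100, effect=0.25, p_effective=0.5):
    """Grid-search the significance level that minimizes expected cost."""
    grid = [i / 1000.0 for i in range(5, 500, 5)]
    return min(grid, key=lambda a: expected_cost(a, n, effect,
                                                 p_effective, cost_type2))
```

With a severe disease (Type II errors, say, ten times as costly as Type I), the cost-minimizing significance level comes out far above the conventional 2.5%, echoing the paper’s qualitative finding for pancreatic cancer; with equal costs, it stays much lower.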

The result is important especially because, in a number of respects, the authors underestimate the costs of FDA conservatism.

Most importantly, the authors are optimizing at the clinical trial stage assuming that the supply of drugs available to be tested is fixed. Larger trials, however, are more expensive, and the greater the expense of FDA trials, the fewer new drugs will be developed. Thus, a conservative FDA reduces the flow of new drugs to be tested.

In a sense, failing to approve a good drug has two costs: the opportunity cost of lives that could have been saved and the cost of reducing the incentive to invest in R&D.

In contrast, approving a bad drug, while still an error, at least has the advantage of helping to incentivize R&D (similarly, a subsidy to research incentivizes R&D largely by covering the costs of failed ventures).

The Montazerhodjat and Lo framework is also static: there is one test and then the story ends.

In reality, drug approval has an interesting asymmetric dynamic. When a drug is approved for sale, testing doesn’t stop but moves into another stage, a combination of observational testing and sometimes more RCTs — this, after all, is how adverse events are discovered. Thus, Type I errors are corrected.

On the other hand, for a drug that isn’t approved, the story does end. With rare exceptions, Type II errors are never corrected.

The Montazerhodjat and Lo framework could be interpreted as the reduced form of this dynamic process, but it’s better to think about the dynamism explicitly because it suggests that approval can come in a range of forms — for example, approval with a black box warning, approval with evidence grading, and so forth. As these procedures tend to reduce the costs of Type I errors, they tend to increase the costs of FDA conservatism.

Montazerhodjat and Lo also don’t examine the implications of heterogeneity in preferences or in disease morbidity and mortality. Some people, for example, are severely disabled by diseases that on average aren’t very severe — the optimal tradeoff for these patients will be different than for the average patient. One size doesn’t fit all.

In the standard framework, it’s tough luck for these patients. But if the non-FDA reviewing apparatus (patients/physicians/hospitals/HMOs/USP/Consumer Reports, and so forth) works relatively well — and this is debatable, but my work on off-label prescribing suggests that it does — this weighs heavily in favor of relatively large samples but low thresholds for approval.

What the FDA is really providing is information, and we don’t need product bans to convey information. Thus, heterogeneity (plus a reasonably effective post-testing choice process) militates in favor of a Consumer Reports model for the FDA.

The bottom line, however, is that even without taking into account these further points, Montazerhodjat and Lo find that the FDA is far too conservative, especially for severe diseases. FDA regulations may appear to be creating safe and effective drugs, but they are also creating a deadly caution.

Hat tip: David Balan.

A version of this post first appeared at the Marginal Revolution blog.

Alex Tabarrok

Alex Tabarrok is a professor of economics at George Mason University. He blogs at Marginal Revolution with Tyler Cowen.

Electricity from New Wind Three Times More Costly than Existing Coal

WASHINGTON – The Institute for Energy Research released a first-of-its-kind study calculating the levelized cost of electricity from existing generation sources. Our study shows that on average, electricity from new wind resources is nearly four times more expensive than from existing nuclear and nearly three times more expensive than from existing coal. These are dramatic increases in the cost of generating electricity. This means that the premature closures of existing plants will unavoidably increase electricity rates for American families.

Until now, almost all measures of the cost of electricity have assessed only the building of new plants. Using data from the Energy Information Administration and the Federal Energy Regulatory Commission, we offer a useful comparison between existing plants and new plants.
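The existing-versus-new comparison turns on how levelized cost is computed. Here is a minimal sketch of the standard LCOE definition (discounted lifetime costs divided by discounted lifetime generation); the function name and every figure in the examples are illustrative assumptions, not numbers from the IER study. The key point: for an existing plant the construction cost is sunk, so its levelized cost reflects only ongoing operating and fuel costs.

```python
def lcoe(capex, annual_fixed_om, fuel_cost_per_mwh, annual_mwh,
         discount_rate=0.07, lifetime_years=30):
    """Levelized cost of electricity in $/MWh.

    capex: up-front construction cost (pass 0.0 for an existing plant,
    whose construction cost is already sunk).
    """
    costs = float(capex)
    energy = 0.0
    for year in range(1, lifetime_years + 1):
        d = (1.0 + discount_rate) ** year
        costs += (annual_fixed_om + fuel_cost_per_mwh * annual_mwh) / d
        energy += annual_mwh / d
    return costs / energy
```

Running the same plant with and without the capital term shows mechanically why existing generation comes out cheaper than new construction in any such comparison.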

America’s electricity generation landscape is rapidly changing. Federal and state policies threaten to shutter more than 111 GW of existing coal and nuclear generation, while large amounts of renewables, such as wind, are forced on the grid. To understand the impacts of these policies, it is critical to understand the cost difference between existing and new sources of generation.

The following chart shows the sharp contrast in the cost of electricity from existing sources vs. new sources:

[Chart: levelized cost of electricity from existing vs. new generation sources]

Click here to view the full study.

This study was conducted by Tom Stacy, a former member of the ASME Energy Policy Committee, and George Taylor, PhD, the director of Palmetto Energy Research. The source of the calculations used in this study is a compilation of data reported by the generators themselves to FERC and EIA.

FLORIDA: 25 Reasons NOT to Take Federal Dollars to Expand Medicaid

On Monday, the Florida Legislature opened a special session to decide on the state budget and debate how Florida should move forward with regard to our healthcare future. The Senate offered a plan that supporters, including many business interests, sugarcoated in conservative buzzwords such as “a free market approach,” even though the plan is anything but. As we say here at The James Madison Institute, pro-business isn’t always pro-free market. House Republicans and Governor Rick Scott, for good reason, oppose expanding federal control and a flawed program in our state. The Senate approved its plan Wednesday, and the House is set to debate the bill today and vote on it this Friday [May 5th].

The Tampa Bay Times recently released an editorial giving 25 reasons Florida should take the money and encouraging Floridians to “tell (lawmakers) to listen to the powerful moral and financial arguments for taking the money and providing access to affordable health care.” Yes, there is a powerful moral and financial argument to be made. Yes, solutions exist to provide access to affordable healthcare. No, the Times does not have the right answers for either.

As Forbes opinion editor, senior fellow at the Manhattan Institute for Policy Research, and friend of JMI, Avik Roy points out, “Progressives have long enjoyed wielding the straw man. ‘If you oppose expanding Medicaid,’ they say, ‘you oppose health care for the poor. Plain and simple.’ But the truth is, if you support expanding Medicaid, you’re doubling down on a failed system, one that shuts the door on real reforms that could provide quality health care to those who most need it.”

The James Madison Institute offers “25 Reasons NOT to Take Federal Dollars to Expand Medicaid.” Share our infographic today and tomorrow through social media. RT on Twitter here. Share through Facebook here. Find on our website here.

  1. Medicaid already takes up more than 30% of Florida’s budget: Currently, Medicaid takes up more than 30 percent of Florida’s budget and crowds out other public priorities such as education, public safety and infrastructure.
  2. Medicaid payment rates are well below market rates: Payments to healthcare providers under Medicaid are well below market rates. Expanding this system would be anathema to free-market reforms in healthcare.
  3. The federal government is already $18 trillion in debt; Obamacare costs rise daily: The federal government is $18 TRILLION in debt with the cost of Obamacare rising daily, requiring even more money from taxpayers to feed the beast.
  4. The supply of doctors accepting Medicaid is shrinking: As a consequence of federal Medicaid price controls, the supply of doctors that will accept Medicaid patients is shrinking — this shrinkage will become more rapid under an expansion of Medicaid.
  5. Medicaid expansion leads to greater use of ERs, not less: A March 2015 survey of 2,098 emergency-room doctors showed Medicaid recipients newly insured under the health law are struggling to get appointments or find doctors who will accept their coverage, and consequently wind up in the ER.
  6. Arkansas’s “private option” costs state taxpayers tens of millions: Medicaid expansion is not working in Arkansas. The Arkansas legislature passed a “private option” healthcare plan similar to what the Senate in Florida is proposing. The price tag is rising by the month under Obamacare’s Medicaid expansion, and state taxpayers will now have to pay tens of millions to cover the unexpected costs. The proposed plan in Florida could cost far more than projections indicate.
  7. Mandated premiums create inefficiencies in supply and demand for healthcare services: When premiums for healthcare plan participants are mandated and set by legislative action, the result is nothing more than market-distorting price controls, which ultimately create inefficiencies in the supply and demand for healthcare services.
  8. Feds won’t approve Senate’s special waivers; Florida left with traditional Obamacare expansion: The Senate’s plan includes a requirement that enrollees work, attend classes or prove they are seeking work in order to maintain eligibility for healthcare coverage. However, to date the federal government has rejected all state-run expansion plans with a work requirement. It will deny this special waiver, and we’ll be left with traditional Medicaid expansion.
  9. Oregon study revealed Medicaid enrollees hardly better off than uninsured: Medicaid expansion is not working in Oregon. A study of Medicaid enrollees there found that Medicaid “generated no significant improvements in measured physical health outcomes.”
  10. Medicaid Expansion will do nothing to lower cost of overall healthcare delivery: Medicaid expansion would not lead to any type of price transparency in healthcare delivery, which does nothing to help lower the cost of healthcare delivery.
  11. Medicaid expansion does not lead to better health outcomes for the poor: Research consistently shows Medicaid patients frequently receive inferior medical treatment, are assigned to less-skilled surgeons, receive poorer postoperative instructions, and often suffer worse outcomes for identical procedures than similar patients both with and without health insurance.
  12. New Hampshire feels the financial burn and is reconsidering Medicaid expansion: Medicaid expansion is not working in New Hampshire. According to the National Association of State Budget Officers’ annual report, in New Hampshire Medicaid grew from 24 percent of the overall state budget in 2012 to 27 percent in 2014. In January 2015, the state’s Department of Health and Human Services announced that it was $82 million over budget, thanks to Obamacare, Medicaid expansion and to the original Medicaid program expanding with additional enrollees. Lawmakers are now deciding whether to continue the expanded Medicaid program which sunsets in 2016.
  13. The federal government’s promises aren’t reliable: The U.S. Supreme Court told the federal government mandating Medicaid expansion was unconstitutional. Yet the federal government signaled this year that if Florida didn’t expand Medicaid under Obamacare, it would not continue the Low Income Pool funding. If it would pull funding from some of the most vulnerable in the system, what wouldn’t it do?
  14. Florida taxpayers will foot the bill for billions: Florida taxpayers will be responsible for a tab of billions of dollars as the federal government requires increasing shares from Florida’s budget after a certain point if the state expands Medicaid under Obamacare. Even if the federal government keeps its “promise” on the funding percentage, Florida taxpayers will be responsible for 10 percent of the total cost of expansion, a tab that will run into the billions based on even the most conservative estimates.
  15. Having health insurance isn’t the same as receiving healthcare: Medicaid is socialized health insurance, not access to healthcare. There is no guarantee that just receiving socialized insurance means an individual receives quality service.
  16. The majority of the Medicaid expansion population consists of working-age adults: The overwhelming majority of the Medicaid expansion pool is made up of childless, able-bodied, working-age adults. Expanding a failing entitlement program for this population will only lock people into the cycle of dependence.
  17. Medicaid expansion creates a perverse disincentive to improving one’s financial status: In many cases, making just a few more dollars per year will push a person off the Medicaid rolls, costing them thousands in copayments, deductibles, and out-of-pocket expenses.
  18. Illinois faced unanticipated cost increases in the billions: Medicaid expansion isn’t working in Illinois. Forbes’s Akash Chougule reports, “Health officials originally estimated it would cost $573 million from 2017 through 2020 when the state’s funding obligation kicked in. But nearly 200,000 more people enrolled in the program in 2014 than originally projected. State budget officials were forced to revise their cost estimates to $2 billion—more than triple initial estimates.”
  19. Medicaid will cost Florida way more than anticipated: The cost projections for a Medicaid expansion in Florida are unreliable and grossly underestimated. Several states are experiencing the financial strain of Medicaid enrollment figures well above initial projections.
  20. Medicaid expansion wouldn’t necessarily result in more coverage or access to care: Florida’s own Medicaid director stated that he couldn’t guarantee the expansion would result in more coverage or access to care.
  21. Medicaid expansion increases private insurance rates: Expanding Medicaid rolls will inevitably distort the risk pool causing private insurance premiums to rise, effectively shifting more of the cost of expansion onto taxpayers and those not receiving Medicaid benefits.
  22. Ohio taxpayers face a $400 million bill: Medicaid expansion isn’t working in Ohio. Ohio’s Medicaid expansion is expected to be nearly $1 billion over budget in June. With Ohio on the hook for 10 percent of the expansion’s cost by 2020 (if the federal government keeps its promise) that will result in an annual cost of over $400 million for Ohio taxpayers.
  23. Expanding Medicaid will likely increase fraud: Medicaid expansion will increase the amount of fraud and abuse within an already strained government program.
  24. The systemic issues in the healthcare system will not go away: Expanding Medicaid does absolutely NOTHING to address systemic issues facing Florida’s healthcare system that impact everyone.
  25. Dependency cycle will expand beyond true safety net intent: The idea behind the safety net programs has always been to serve individuals in need, while providing mechanisms to pull out of dependence into productivity, not to create generations of citizens who know nothing except government reliance. By expanding Medicaid to populations that are outside the typical safety net composition, we effectively encourage the cycle of dependency to grow and become more ingrained in our culture.

Full Stream Ahead: Why EPA’s Water Rule Goes Too Far

The Obama administration didn’t listen. Instead, it went ahead with its regulatory overreach over America’s waters. This worries farmers, ranchers, and other businesses.

EPA and the Army Corps of Engineers released their final Waters of the United States (WOTUS) rule (known as the “Clean Water Rule” in EPA lingo), which claims jurisdiction over vast swaths of the country.

In a statement EPA Administrator Gina McCarthy claimed, “This rule will make it easier to identify protected waters.” In reality, the rule does this by claiming federal jurisdiction over a huge number of waters.

Inside the 299 pages of regulations, definitions, explanations, and justifications for the rule, “adjacent” waters now under federal regulatory authority “include wetlands, ponds, lakes, oxbows, impoundments, and similar water features” that are “in the 100-year floodplain and that are within 1,500 feet” (five football fields) of a navigable water. The entire body of water is “adjacent” even if only a portion of it falls within the 100-year floodplain or within 1,500 feet of a navigable water.

While EPA and the Army Corps claim that WOTUS clarifies what waters are under federal jurisdiction, in agriculture’s case, nothing is clarified. The rule states [emphasis mine]:

Waters in which normal farming, ranching, and silviculture activities occur instead will continue to be subject to case-specific review, as they are today.

In fact, under this new definition, bodies of water or wetlands over three-quarters of a mile from a navigable water could fall under federal jurisdiction if the federal government decides that they significantly affect another body of water [emphasis mine]:

[W]aters within 4,000 feet of the high tide line or the ordinary high water mark of a traditional navigable water, interstate water, the territorial seas, impoundments, or covered tributary are subject to case-specific significant nexus determinations.

The agencies claim they “do not anticipate that there will be numerous circumstances in which this provision will be utilized,” but who is to say the ever-growing Regulatory State won’t make this the default tool in its water-regulation toolbox? Regulators’ best wishes are no guarantee that an agency’s power will be limited.

With federal jurisdiction comes costly federal permitting. “Over $1.7 billion is spent each year by the private and public sectors obtaining wetlands permits,” wrote the U.S. Chamber and 375 other associations in a comment on WOTUS to EPA and the Army Corps.

William Kovacs, the U.S. Chamber’s Senior Vice President of Environment, Technology, & Regulatory Affairs, said the process the agencies used to write the rule was “fundamentally flawed.”

Since issuing the proposed rule for public comment in April 2014, the agencies have somehow maintained that the proposal will have no significant regulatory or economic impact and that they are simply ‘clarifying’ the current state of federal jurisdiction over waters. Such statements fly in the face of reality.

Despite appeals from constituents and lawmakers across the country; countless business owners, farmers and industry leaders; and the Small Business Administration, the EPA and the Army Corps of Engineers failed to conduct any meaningful regulatory or economic impact analyses prior to issuing a final rule.

The Chamber filed lengthy public comments identifying exactly how the proposal could affect businesses of all sizes, including local municipalities, and requested the agencies convene a small business review panel to study and evaluate those impacts. Numerous state, local and business stakeholders and the Small Business Administration (twice) echoed that request, to no avail.

In a blog post prior to WOTUS being released, Kovacs worried that the water rule “would put [EPA] effectively in charge of zoning the entire country.”

Kovacs isn’t alone in criticizing the rulemaking process. While explaining that WOTUS will expand federal authority, Charles Maresca, Director of Interagency Affairs for the Small Business Administration’s Office of Advocacy, told a Senate Committee it was “incorrect” for EPA and the Corps to claim that the regulation won’t have “a significant economic impact on a substantial number of small businesses.”

It was no holds barred in the administration’s defense of its controversial rule. President Obama’s top environmental advisor Brian Deese said, “The only people with reason to oppose the rule are polluters who threaten our clean water.”

Tell that to farmers, ranchers, home builders, and other businesses. They understand that clean water means everything to their customers and their businesses. Federal regulators going over the heads of local and state officials accomplishes little but adding more barriers to job creation and economic development.

With WOTUS businesses will be up a creek without a paddle.

The Sweet, Sweet Privilege of the Maple Syrup Federation

“Quebec is the Saudi Arabia of maple syrup.”

That’s the money quote from an exposé on the provincial maple syrup cartel.

Quebec’s maple syrup “marketing board” first made big news in 2012 when the culprits of an $18 million maple syrup heist (really) from the Global Strategic Maple Syrup Reserve (really) were caught by provincial police. (See the whole story in a feature film, coming soon, starring Jason Segel — …really.) The board is making news again thanks to recent crackdowns on producers who want to sell syrup outside its control.

Quebec’s maple syrup is managed by the Federation of Maple Syrup Producers, a legal cartel that has been given special privileges under provincial law — including the power to enforce a monopoly in syrup sales — since 1966. All syrup goes through the Federation, which withholds enough from the market to keep the price high, takes a 12% cut of sales, and then (eventually) passes the remainder on to producers.

This month, the Federation stationed security guards in several sugar shacks to ensure that producers don’t sell their own syrup because some of them have begun to fight back.

In defiance of the rules, [Angèle] Grenier exported hundreds of barrels of maple syrup to New Brunswick from 2002 through 2014 — just so she could collect the money she earned.

Last year she knew she was being watched. Her brother and her neighbour each brought a tractor. In under an hour, with three tractors, the team loaded 40 barrels of syrup on a truck.

“All the children came and we did it quick, quick,” she says. The truck sped off down her dirt road.

The federation has now shut down Grenier’s exports.

Last month Justice Clément Sampson of Superior Court in Ste-Joseph-de-Beauce wrote an order permitting “a sheriff to penetrate into the sugar shack of [Grenier], without notice, at any time judged reasonable and as frequently as they judge appropriate … to verify the inventory of maple syrup, take photos and videos of the locale and the inventory, and put a mark, a seal, a tag or any other necessary identifying label on every maple syrup container.”

Advocates for the syrup quota say it’s needed because prices aren’t enough to keep supply stable for a crop with a harvest that fluctuates from year to year. But if bad seasons are the danger, what syrup producers need is insurance (or futures markets). That could be accomplished simply and cheaply by setting aside money, rather than warehouses full of syrup guarded by state-of-the-art security to prevent Mission Impossible-esque heists.

The marketing board doesn’t protect producers from a bad harvest. It protects big producers from consumers — from you, me, and our dollars that could tempt sellers to compete with them outside the system.

The Federation estimates that only 75% of producers support its fixed prices, but it has the legal power to strong-arm the 25% who don’t. Dissenting producers don’t — can’t — have the same rights under the law if it’s to be enforced. Without equal rights under the law, there cannot be secure rights to property.

One rebellious seller remarks that since he defied the Federation, “They can come into my house anytime they want.” Perhaps that’s why producers in Ontario and New Brunswick, who still benefit from the price supports, have declined to join Quebec’s Federation.

How did this happen? Quebec producers sought legal privileges for themselves by organizing into the Federation. Now that that privilege exists, it’s been seized not only by maple syrup producers, but by a specific contingent who benefit most from it. Special powers, once created, benefit the especially powerful.

The good news is that prices still work. High prices have tempted others to enter the market, and as a result Quebec’s share of global production has fallen from 78% ten years ago to 69% today. If the trend continues, the cartel may become unstable — or it may try to convince (or, failing that, force) others to join in the racket.

In the end, it’s not just those who love maple syrup on a warm stack of flapjacks or that sweet, messy maple taffy who lose out when Big Syrup gets special privileges. Rather, it’s the people who want to act and innovate without permission. Innovation is the great enemy of the status quo, and those who benefit from that status quo, like the Syrup Federation, know this. Their message is clear: conform, syrup slingers, or find yourself in a sticky situation.

Janet Neilson

Janet Neilson is a founding member and program developer for the Institute for Liberal Studies and a contract researcher in Ottawa, Ontario, Canada.

Earth Month: 22 Ways to Think about the Climate-Change Debate

Reasoned agnosticism is a welcome antidote to hysteria

by Max Borders

Reasonable people can disagree about the nature and extent of climate change. But no one should sally forth into this hostile territory without reason and reflection.

“Some scientists make ‘period, end of story’ claims,” writes biologist and naturalist Daniel Botkin in the Wall Street Journal, “that human-induced global warming definitely, absolutely either is or isn’t happening.”

These scientists, as well as the network of activists and cronies their science supports, I will refer to as the Climate Orthodoxy. These are the folks who urge, generally, that (a) global warming is occurring, (b) it is almost entirely man-made, and (c) it is occurring at a rate and severity that makes it an impending planetary emergency requiring political action. A Climate Agnostic questions at least one of those premises.

Trying to point out the problems of the Climate Orthodoxy to its adherents is like trying to talk the Archbishop of Canterbury into questioning the existence of God. In that green temple, many climatologists and climate activists have become one and the same: fueled by both government grants and zealous fervor.

Room for debate

But the debate must go on, even as the atmosphere for dialogue gets increasingly polluted. The sacralization of climate is being used as a great loophole in the rule of law, an apology for bad science (and even worse economics), and an excuse to do anything and everything to have and keep power.

Those with a reasoned agnosticism about the claims of the Climate Orthodoxy will find themselves in debate. It’s April 22nd — Earth Day. So I want to offer 22 ways to think about the climate-change debate. I hope these points will give those willing to question man-made climate change some aid and comfort.

1. Consider the whole enchilada

First, let’s zoom out a few orders of magnitude to look at the Climate Orthodoxy as a series of dots that must be connected, or better, a series of premises that must be accepted in their totality.

  • The earth is warming.
  • The earth is warming primarily due to the influence of human beings engaged in production and energy use.
  • Scientists are able to limn most of the important phenomena associated with a warming climate, disentangling the human from the natural influence, extending backward well into the past.
  • Scientists are able then to simulate most of the phenomena associated with a warming earth and make reasonable predictions, within the range of a degree or two, into the future about 100 years.
  • Other kinds of scientists are able to repackage this information and make certain kinds of global predictions about the dangers a couple of degrees will make over that hundred years.
  • Economists are able to repackage those predictions and make yet further predictions about the economic costs and benefits that accompany those global predictions.
  • Other economists then make further predictions based on what the world might be like if the first set of economists is right in its predictions (which were based on the other scientists’ predictions, and so on) — and then they propose what the world might look like if certain policies were implemented.
  • Policymakers are able to take those economists’ predictions and set policies that will ensure what is best for the people and the planet on net.
  • Those policies are implemented in such a way that they work. They have global unanimity, no defections, no corruption, and a lessening of carbon-dioxide output that has a real effect on the rate of climate change — enough to pull the world out of danger.
  • Those policies are worth the costs they will impose on the peoples of the world, especially the poorest.

That is a lot to swallow. And yet, it appears that the Climate Orthodoxy requires we accept all of it. Otherwise, why would the Intergovernmental Panel on Climate Change (IPCC) publish a document called “Summary for Policymakers”?

2. Models are not evidence

The problem with models is that they are not reality. Whenever we try to model complex systems like the climate, we’re only getting a simulacrum of a system, designed to represent projected scenarios. So when a climatologist presents a model as evidence, he is playing a kind of game. He wants you to think, by dint of computer wizardry, that he has drawn for you a picture of the world as it is. But he hasn’t. And if observation of surface temperatures over the last 18 years has shown one thing, it’s that climate models have been inadequate tools for forecasting complex natural phenomena.

3. Forecast is not observation 

In the IPCC's 1992 assessment, the authors wrote, "Scenarios are not predictions of the future and should not be used as such." Yet whether one treats the models as predictions or as scenarios, observed temperatures barely fall within the most conservative of those scenarios in the most recent assessment, a range that seems designed to hide good news.

When one attempts to forecast — that is, to tell the future — one is not engaging in observation. That is not to claim that prediction isn’t a part of the scientific enterprise; it’s simply to say that when one’s predictions (or scenarios) are off, one’s theory is suspect, and it must be modified and tested again. Any theory, and any forecast scenarios on which it’s based, have to be tested in the crucible of observation. The Climate Orthodoxy has thus far failed that test.

4. Climate systems are complex

As I alluded to above, climate systems are complex systems. And complex systems are notoriously immune to certain types of prediction and forecast. As Edward Lorenz famously taught us when he coined the term “butterfly effect,” the slightest changes in initial conditions can give rise to wild, unpredictable outcomes in the system. It’s no different for a simulation. “I realized,” said Lorenz of his findings, “that any physical system that behaved non-periodically would be unpredictable.” Now, those concerned about climate change will try to use this perspective to suggest changes to the atmosphere could cause wild, unpredictable climatic catastrophes. And that might turn out to be true. (But it might not. We’ll discuss Pascal’s Climate Wager later.) What we should be concerned about for now is how easy it is for a single tiny error (or purposeful fudge) in a climate model to generate ranges that, though they can feed hysteria, are out of touch with reality.
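Lorenz's point about initial conditions is easy to demonstrate with a toy model. The sketch below is a hypothetical illustration of my own, not a climate model: it iterates the chaotic logistic map from two starting points that differ by one part in ten million and tracks how fast the trajectories diverge.

```python
# Toy demonstration of sensitivity to initial conditions (the "butterfly effect").
# The logistic map x -> 4x(1 - x) is a standard chaotic system; it is NOT a
# climate model, just an illustration of how tiny input errors explode.

def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the logistic map and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4, 50)
b = logistic_trajectory(0.4 + 1e-7, 50)  # perturb by one part in ten million

divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {divergence[0]:.1e}, max gap over 50 steps: {max(divergence):.3f}")
```

Despite a starting gap of one ten-millionth, the two runs soon differ by order one, the full range of the system; beyond a short horizon the perturbed trajectory tells you nothing about the original.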

5. Garbage in equals garbage out

Complex systems also make modeling difficult because a model is a kind of simulation whose success turns on the accuracy of its inputs. Computer scientists have an apt saying for such simulations: "Garbage in, garbage out." If any of your variables are in error, your results are suspect. And the more variables you introduce, the more likely you are to introduce errors. But for the model to resemble reality, you have to be more granular, including more and more variables that represent causal relationships in the world. As more variables get introduced, the likelihood of false inputs rises, and those errors compound. In The Black Swan, Nassim Nicholas Taleb writes:

Simply, we are facing nonlinearities and magnifications of errors coming from the so-called butterfly effects … actually discovered by Lorenz using weather forecasting models. Small changes in input, coming from measurement error, can lead to massively divergent projections — and that generously assumes we have the right equations.

In other words, the lower “res” the model, the less it conforms to reality’s details. The higher “res” the model, the more likely it is to be infected with errors. This is one of the great paradoxes of modeling.

6. Data can be detached

The problem with numbers is that they’re sometimes detached from the phenomena they’re meant to describe. If we see a record of a person’s body temperature from 1969 — at 99.1 degrees — we might assume he had a fever. But knowing the context of that measurement may lead us to tell a different story about what caused his temperature at that time: for example, that the man had been sitting in a hot tub. Climate data from the past can offer even less context, clarity, and accuracy.

But let’s suppose all the world’s thermometers — both satellite and land — suffer neither heat-island effects nor any other distortions, and that they offer an accurate description of the earth’s temperature. Let us also assume that the temperature readings over the last hundred years are completely accurate and represent the planet as a whole, and that the temperature data derived from inferential methods such as ice-core samples and tree rings also paint an accurate picture of surface temperatures well into the past, itself a doubtful assumption.

We are still left with a problem: we cannot simply look at the outputs of the climate system (temperature), because they are linked to all-important inputs — that is, the factors that caused any changes in temperature. The inability of climate scientists to tell a more conclusive causal story about the factors behind past warming is another reason to remain agnostic about trends over longer timescales.

7. Decomposability is a virtual impossibility

Another serious problem with the theory of anthropogenic global warming (AGW) is that, if it is a theory at all, it seems to be a cluster of interconnected theories and interconnected models. Let that settle for a moment. Consider that the IPCC, the central climate-science organization whose job is to give the definitive word on climate change, has to assemble the work of hundreds, maybe thousands, of scientists and weave it into a comprehensive report. But as Norgaard and Baer write in BioScience, “Models developed and heretofore interpreted within individual scientific communities are taken out of their hands, modified, and used with other models in ways over which the original scientific communities no longer have control.”

Now, in stitching together the various individual theories, studies, and models of such a diverse and inevitably error-prone community, the problem goes deeper. Never mind that the IPCC central committee has deep incentives to interpret the data in a way that creates the impression of a single, uniform theory. Suppose that every climate scientist the IPCC picks for its report claims 95 percent confidence. Even if each scientist were 95 percent certain of his particular prediction or set of parameters, we cannot be so certain about the agglomeration of 10 scientists’ opinions about disparate phenomena, much less 50. Nor can any given scientist be 95 percent confident about the work of any other scientist.
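The compounding at issue here is simple to check. A back-of-envelope sketch of my own, taking the 95 percent figure from the text and charitably assuming the findings are independent:

```python
# If each of n independent findings is correct with probability 0.95,
# the chance that ALL of them hold falls off exponentially with n.

def joint_confidence(p, n):
    """Probability that n independent claims, each with confidence p, all hold."""
    return p ** n

for n in (1, 10, 50):
    print(f"{n:2d} findings at 95% each -> joint confidence {joint_confidence(0.95, n):.1%}")
```

Ten findings at 95 percent each yield roughly 60 percent joint confidence, and fifty yield under 8 percent — and independence is itself a generous assumption when the models feed into one another.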

8. Stats stand in for certainty

People crave certainty, and politicians want to provide it. So when we hear that a scientist is 95 percent confident about his or her conclusions, we feel like that’s close enough, derived as it presumably is through some sort of statistical analysis. “Yet since things are ultimately uncertain,” writes theoretical mathematician William Byers:

We satisfy this need by creating artificial islands of certainty. We create models of reality and then insist that the models are reality. It is not that science, mathematics, and statistics do not provide useful information about the real world. The problem lies in making excessive claims for the validity of these methods and models and believing them to be absolutely certain.

Byers’s book The Blind Spot: Science and the Crisis of Uncertainty is a welcome antidote to this sort of scientific hubris.

Climatologist Judith Curry put matters a little differently. When a journalist asked her how the 95 percent number was determined, she replied, “The 95% is basically expert judgment, it is a negotiated figure among the authors. The increase from 90–95% means that they are more certain. How they can justify this is beyond me.”

The reporter then asked if it was really all so subjective. Curry’s reply: “As far as I know, this is what goes on. All this has never been documented.”

9. AGW might not be a theory at all

What makes a scientific theory a theory at all? This has been debated among philosophers of science, but most people generally agree that a certain set of minimum criteria should be in place. Among them, at least, are these:

  1. Is the theory testable? Can we formulate hypotheses grounded in the theory, then figure out a way to test the hypotheses?
  2. Is the theory falsifiable? Is there evidence that could call the theory into question? What evidence would exclude the theory?
  3. Does the theory unify? Does the theory unify seemingly unrelated phenomena under a single explanatory framework?

AGW is not testable in any laboratory sense, of course, but then neither are many theories about natural phenomena. Still, we have already discussed the problems of testing models against available evidence — taking the models’ hypotheses and seeing whether they track with what we can observe. One might argue that the models stand in for hypotheses and suffice for the testability criterion, but this is unclear.

Perhaps the most damning of the three for AGW is the falsifiability criterion. The Orthodoxy has created a situation in which models play a major role in the theoretical framework. But when the models fail to track with observation, the Orthodoxy claims the timescales are not sufficient to determine a climate trend — for example, that discussing the pause of the last 18 years is “cherry picking.” Fair enough. But then what sort of data would count to falsify the theory? And what, going forward, is a timescale sufficient to determine a climate trend? One hundred years?

If we accept these longer timescales as sufficient to smooth out natural variability, we might reasonably ask the Orthodoxy to remain agnostic about AGW while another 70 years of data come in. (After all, they have had to rely on spurious proxies to “trick” temperature trends in the past.) But the Orthodoxy then changes tack and argues that’s too long to wait! After all, we might be going through an emergency that requires immediate action. So, despite the insufficient timescale, they expect everyone to accept the climate consensus as the basis for policymakers’ faith-based initiatives.

Finally, does AGW unify diverse phenomena under a single explanatory framework? AGW is meant to explain everything from ocean acidification to melting sea ice, rising sea levels, and regional desertification. The trouble is with the explanatory part. Taken in isolation, each of these purported consequences of global warming either isn’t happening as predicted or, if it is, can be explained by factors outside AGW theory. So it’s not clear that AGW satisfies any unification criterion, either.

10. It’s a matter of degree

What if the Climate Orthodoxy is wrong and the “lukewarmists” like Judith Curry turn out to be right? If we look at the empirical data over the last 30 years or so, they might be. As Rational Optimist author Matt Ridley writes, “I found myself persuaded by the middle-of-the-road, ‘lukewarm’ argument — that CO2-induced warming is likely but it won’t be large, fast or damaging.” The Climate Orthodoxy might be hyperventilating over a degree of warming over a century. (And, of course, policies driven by hysteria could keep the poorest people from joining the middle class, all for the sake of an almost imperceptible change.)

11. Pascal’s Climate Wager

Suppose we all agreed that 100 years of accurate temperature data would be sufficient to determine a climate trend. The Climate Orthodoxy argues that we must act now to prevent climate change, in case they are right. People familiar with theology will recognize this as analogous to Pascal’s Wager, in which the 17th-century Christian philosopher Blaise Pascal tells us we had better believe in God, Heaven, and Hell. If we believe and we’re wrong, we haven’t lost anything, according to Pascal. But if we disbelieve and we’re wrong, we have an eternity to suffer. Similarly, we must believe, suffer, and sacrifice now to stave off climate change.

There are a number of problems with this rationale, but the biggest one is rather ironic. There is no viable political climate solution currently on the table that is capable of mitigating any predicted warming. Taking the IPCC’s own assumptions, Patrick Michaels and Paul “Chip” Knappenberger found that there is no winning “wager” here:

Assuming the IPCC’s value for climate sensitivity (i.e., disregarding the recent scientific literature) and completely stopping all carbon dioxide emissions in the US between now and the year 2050 and keeping them at zero, will only reduce the amount of global warming by just over a tenth of a degree (out of a total projected rise of 2.619°C between 2010 and 2100).

If you think that a rise of 2.482°C is vastly preferable to a rise of 2.619°C then all you have to do is set the carbon tax large enough to drive U.S. emissions to zero by mid-century — oh yeah, and sell that tax to the American people.
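The arithmetic behind that "wager" is worth spelling out. The figures below are Michaels and Knappenberger's as quoted above; the quick check is mine:

```python
# Figures quoted from Michaels and Knappenberger (as cited in the text):
# projected warming 2010-2100 with no U.S. policy, and with U.S. emissions
# driven to zero by 2050.
baseline_rise_c = 2.619
with_zero_us_emissions_c = 2.482

savings_c = baseline_rise_c - with_zero_us_emissions_c
share_avoided = savings_c / baseline_rise_c

print(f"warming avoided: {savings_c:.3f} deg C ({share_avoided:.1%} of the projected rise)")
```

Eliminating all U.S. emissions by mid-century, on the quoted assumptions, trims the projected century-scale rise by about 0.137°C — roughly five percent of it.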

So even if all the models turn out to be true, there is little we can do with policy at this point. So unlike Pascal’s Wager, there is no amount of repenting and belief that could save us. We’re either all going to climate hell, anyway, or something ain’t right. The whole conversation about “climate action” appears to be moot at this point. Don’t believe it? Check the Handy Dandy Climate Temperature Savings Calculator.

12. The debate is not over, and the science is not settled

Freeman Dyson, a brilliant theoretical physicist, is no man of the right. But he is intellectually honest enough to wear the mantle of “heretic.” Here’s why:

I am especially unimpressed by the claim that a prediction of rapid and dangerous warming is “settled science,” as firm as evolution or gravity. How could it be? It is a prediction! No prediction, let alone in a multi-causal, chaotic and poorly understood system like the global climate, should ever be treated as gospel. With the exception of eclipses, there is virtually nothing scientists can say with certainty about the future. It is absurd to argue that one cannot disagree with a forecast. Is the Bank of England’s inflation forecast infallible?

Indeed. And to say that the debate is over is not to say that those willing to debate have nothing to say. It is rather to say that you have turned off your curiosity, your humility, and your willingness to engage in discourse so that you can get what you want.

And what should we say about all this “consensus” talk? Science writer Ronald Bailey (no agnostic about climate change) wisely says:

One should always keep in mind that a scientific consensus crucially determines and limits the questions researchers ask. And one should always worry about to what degree supporters of any given scientific consensus risk succumbing to confirmation bias. In any case, the credibility of scientific research is not ultimately determined by how many researchers agree with it or how often it is cited by like-minded colleagues, but whether or not it conforms to reality.

13. Climate science isn’t climate policy

One of the biggest problems with the Climate Orthodoxy is that one set of experts that is cocksure about the science really has no expertise in the economics of climate change or in climate-change policy. How in the world is an expert in albedo effects going to have anything meaningful to say about whether climate change is good or bad for the world today — much less 50 years into the future? This profound disconnect has never stopped scientists like James Hansen from advocating for certain types of policies.

Seeing this disconnect, however, the Orthodoxy has begun training up so-called specialists in the economics of climate change, led by such “experts” as Sir Nicholas Stern, whose models and predictions are the stuff of both speculation and spectacle. More tempered in his prognostications is Yale’s William Nordhaus, but economists such as Robert Murphy offer very good reasons to question Nordhaus’s almanac, as well.

If you think modeling the climate is hard, try modeling an economy. As economist Arnold Kling writes,

I think that if the press were aware of the intellectual history and lack of scientific standing of the models, it would cease rounding up these usual suspects. Macroeconometrics stands discredited among mainstream academic economists. Applying macroeconometric models to questions of fiscal policy is the equivalent of using pre-Copernican astronomy to launch a satellite or using bleeding to treat an infection.

Whatever the pedigree of the economist, his laurels, or his letters, mixing macrometeorology with macroeconomics is like trying to read tea leaves.

14. The climate orthodoxy is inherently corruptive

Here’s the heretic Dyson again:

The politicians and the public expect science to provide answers to the problems. Scientific experts are paid and encouraged to provide answers. The public does not have much use for a scientist who says, “Sorry, but we don’t know.”

He’s right. It is nearly impossible to insulate science from the influence of those who pay the bills. As I wrote in “The Climate Complex Strikes Back” (Freeman, February 2015), “That government money shouldn’t corrupt is just another application of the unicorn fallacy so common among well-meaning greens.” And it’s even tougher not to develop blind spots and biases when those who fund you claim to be on the side of the angels. That is why we must put our faith not in centralized hierarchies of experts but in the Republic of Science itself.

15. Reasoned agnosticism is not “denial”

Godwin’s law surfaces quickly in the debates about global warming. Here’s Botkin again:

For me, the extreme limit of this attitude was expressed by economist Paul Krugman, also a Nobel laureate, who wrote in his New York Times column in June, “Betraying the Planet,” that “as I watched the deniers make their arguments, I couldn’t help thinking that I was watching a form of treason — treason against the planet.” What had begun as a true scientific question with possibly major practical implications had become accepted as an infallible belief (or, if you’re on the other side, an infallible disbelief), and any further questions were met, Joe McCarthy style, “with me or agin me.”

Of course, the term “denier” is meant to evoke Holocaust denial.

16. AGW might be beneficial on net

If Stern and Nordhaus (see #13) can engage in economic speculation, then we can, too. In fact, when we look back at warmer periods in the history of civilization, we see relative flourishing.

According to Matt Ridley, writing in the UK Spectator, Professor Richard Tol of Sussex University aggregated 14 major academic papers about the future effects of climate change. Tol determined that things look rosier than the Orthodoxy would have us believe:

Professor Tol calculated that climate change would be beneficial up to 2.2°C of warming from 2009 (when he wrote his paper). This means approximately 3°C from pre-industrial levels, since about 0.8°C of warming has happened in the last 150 years.
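Tol's two baselines are easy to confuse, so here is the conversion spelled out. The numbers come from the passage above; the sketch is mine:

```python
# Converting Tol's threshold from a 2009 baseline to a pre-industrial baseline,
# using the figures quoted in the text.
beneficial_up_to_from_2009_c = 2.2   # warming from 2009 that Tol finds net-beneficial
warming_since_preindustrial_c = 0.8  # warming already realized over ~150 years

threshold_from_preindustrial_c = beneficial_up_to_from_2009_c + warming_since_preindustrial_c
print(f"net-beneficial up to ~{threshold_from_preindustrial_c:.1f} deg C above pre-industrial")
```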

And in a more recent paper, Tol looks back over the last 100 years. He concludes that climate change raised human and environmental welfare during the 20th century:

By how much? He calculates by 1.4 per cent of global economic output, rising to 1.5 per cent by 2025. For some people, this means the difference between survival and starvation.

Sure, it’s speculative, even looking back. But isn’t it just as likely that there will be benefits as costs? It might turn out that if the planet does warm a couple of degrees, there will be new forms of flourishing.

17. One hundred years of certitude

One wonders what people in 1915 would have thought about our lives today. The pace of technological change has been staggering. And though a few people tried to make predictions, they were not cut out for the task. Likewise, we cannot readily say what forms of energy we’ll use, and what technologies they will power. As Troy University economist Daniel Sutter reminds us,

A dynamic market economy will feature too much creative destruction to allow detailed planning for the distant future. Nothing is sure in a market economy ten years from now, much less 100 years, and discounting in cost-benefit analysis simply reflects this reality. The economic future becomes more predictable when government controls economic activity, but then stagnation results. Discounting in climate change economics tells us to create wealth to protect future generations. Economic freedom and the institutions of the market economy, not central planning of energy use, is the prudent policy approach to a changing climate.

Inherent in our inability adequately to plan and predict is a recommendation that we adapt instead.

18. Adaptation as policy prescription

If the climate is warming some, and it might be, then what is the best policy? One can make a powerful case for adaptation. Adaptation is not about doing nothing. It means liberalizing the world on a number of dimensions of economic freedom to ensure that countries are rich enough to be resilient. A wealthy and adaptive people like the Dutch can figure out how to live with rising waters. A rich and resilient people like the Hong Kong Chinese can figure out how to build a city-state on a rock in 50 years. A rich and resilient citizenry of the world should be able to handle what a degree or two of change in average global temperature has in store for us — especially as we will undergo untold technological transformations over the next decade or two.

19. Climate policy has a defector problem

The problem with climate-change policies like carbon taxes is that they require near-global unanimity to work. That is, if the United States adopts a carbon tax, energy becomes more expensive for Americans. But if energy becomes more expensive here, it might be less expensive in other parts of the world. And, indeed, businesses and the energy industry will engage in energy arbitrage. Developing countries like India, China, Brazil, and Russia will welcome these energy arbitrageurs with open arms. They might develop even as we stagnate. And they should: they are lifting billions of people out of poverty. But there’s a problem here for climate policy. Every signatory to a climate treaty has strong incentives to defect. And as defectors do their thing, carbon continues to pour into the atmosphere. Nothing changes to mitigate climate change; industry simply shifts around.

20. Climate policy has an efficacy problem

Suppose we don’t accept Pascal’s Climate Wager and we conclude that no climate policy under consideration will do much to mitigate warming. Those who claim that action is vital respond to this claim by saying, “We have to start somewhere!” But if you’re conceding that no policy under consideration does very much, why would you start with a failed policy? It appears to be more empty rhetoric used to justify an unprecedented level of taxation designed to feed some of the most insatiable and predatory governments in the world.

21. Climate policy has a corruption problem

Earlier, I suggested that the Climate Orthodoxy has a corruptive influence on science. We shouldn’t stop there. The “climate industrial complex” is large and growing. Scores of green-energy companies are on the take, donating campaign contributions to politicians who control the purse strings at the Department of Energy. Legacy energy utilities lick their chops, seeing opportunities to game the system in a carbon-tax environment that is unfavorable to their competitors. Traders get in on energy-credit schemes. Green NGOs play “Baptists” to all the corporate “bootleggers,” and when you scrutinize it all — including the billions of dollars the federal government pours into the “science” — the whole thing starts to smell like one festering pile of corruption.

22. The confidence game

If you’re feeling uncertain, consider that the Climate Orthodoxy has to do everything it can to pull members of the public like you into assent. Here’s one final nod to Dyson:

The public prefers to listen to scientists who give confident answers to questions and make confident predictions of what will happen as a result of human activities. So it happens that the experts who talk publicly about politically contentious questions tend to speak more clearly than they think. They make confident predictions about the future, and end up believing their own predictions. Their predictions become dogmas, which they do not question. The public is led to believe that the fashionable scientific dogmas are true, and it may sometimes happen that they are wrong. That is why heretics who question the dogmas are needed.

If you are a Climate Agnostic, that’s okay. (You won’t burn at the stake; you’ll merely burn in the heat of a baking planet.)

Postscript: We are creative conservationists

As the world changes for this reason or that, we are growing richer, stronger, smarter, and more resilient. We are becoming more conscious about the environment and its natural treasures. On almost every environmental dimension — including air quality, water quality, the extent of forestland, and the return of wildlife — things are getting better. Whether you think most of these gains are a consequence of environmental regulations or improvements in market efficiencies, one thing is clear: wealthier is healthier. We should continue to cherish the beauty of the planet and continue to grow economically so we can afford to protect its wonders. Being agnostic about climate change does not require that we stop loving Planet Earth; it only means keeping a cool head and an open mind, even when the discourse overheats.

Max Borders

Max Borders is the editor of the Freeman and director of content for FEE. He is also co-founder of the event experience Voice & Exit and author of Superwealth: Why we should stop worrying about the gap between rich and poor.

New York Federal Reserve: Higher Health Costs, More Part-Time Workers from Obamacare

Obamacare puts employers in a bind, two New York Federal Reserve surveys show. Employers’ health care costs continue to rise, and the health care law is driving them to hire more part-time labor, CNBC reports:

The median respondent to the N.Y. Fed surveys expects health coverage costs to jump by 10 percent next year, after seeing a similar percentage increase last year.

Not all firms surveyed said the Affordable Care Act (ACA) is to blame for those cost increases to date. But a majority did, and the percentage of businesses that predicted the ACA will hike such costs next year is even higher than those that said it did this year.
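Those year-over-year increases compound. A quick sketch of my own arithmetic, using the roughly 10 percent annual figure from the survey quote above:

```python
# Compounding the median respondent's expected health-cost increases:
# a ~10% rise last year followed by a ~10% rise expected next year.
annual_increase = 0.10
years = 2

growth_factor = (1 + annual_increase) ** years
print(f"cumulative increase over {years} years: {growth_factor - 1:.0%}")
```

Two back-to-back 10 percent increases amount to a 21 percent rise in per-worker health costs, not 20.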

Obamacare’s higher costs will cascade down to consumers. The surveys found that “36 percent of manufacturers and 25 percent of service firms said they were hiking prices in response” to Obamacare’s effects.

The Empire State Manufacturing Survey polls New York State manufacturers, and the Business Leaders Survey polls service firms in the New York Federal Reserve District.

A June Gallup poll found that four in ten Americans are spending more on health care in 2014 than in 2013.

Let’s dig into the numbers.

When asked, “How would you say the ACA has affected the amount your firm is paying in health benefit costs per worker this year?” more than 73% of manufacturers and 58% of service firms said the health care law has increased costs this year.

Companies are also more pessimistic about Obamacare next year. Over 80% of manufacturers and 74% of service firms expect health plan costs to increase in 2015.

New York Federal Reserve Empire State Manufacturing and Business Leaders Surveys

Source: New York Federal Reserve.

Employers were also asked what effects Obamacare is having on their labor forces. Over 21% of manufacturers and nearly 17% of service firms say they reduced the number of employees because of the law, while only about 2% of each have hired more workers. What’s more, nearly 20% of both manufacturers and service firms say that Obamacare has pushed them to increase their proportion of part-time workers, while just under 5% of each type of firm said they have reduced it. Presumably this is due to the perverse incentives of Obamacare’s employer mandate.

This data fits with research from the Atlanta Federal Reserve that found that since the recession, 25% of firms have a greater share of part-time workers, while only 8% have a lower share. This data also fits with anecdotes from around the country of employers saying that they’re hiring more part-time workers because of Obamacare.

New York Federal Reserve Empire State Manufacturing and Business Leaders Surveys

Source: New York Federal Reserve.

The sad truth is that the health care law is pushing higher health costs onto employers and incentivizing them to hire more part-time workers. Despite Congress passing a law in 2010 loaded with rules, regulations, mandates, and taxes, health care reform is needed more than ever. For solutions that will control health care costs, improve quality, and expand access, check out the U.S. Chamber’s Health Care Solutions Council report.

Follow Sean Hackbarth on Twitter at @seanhackbarth and the U.S. Chamber at @uschamber.

Did the Florida Commissioner of Education lie about the cost to implement Common Core?

A question was posed to Pam Stewart, interim Florida Education Commissioner, during public hearings on Common Core State Standards at a public meeting in Tampa, FL.

The question posed by a parent was: Has any cost analysis been done on Common Core – what will it cost taxpayers?

Commissioner Stewart answered “It will cost nothing.”

Brenda Pastorick, who attended the Tampa, FL meeting, states, “Of course, there was loud applause. Now, is she that naive? Or, if she lied about this to the general public, what other lies are being told by the Florida Department of Education with regards to Common Core. We certainly don’t want liars dictating policy over the education of our children here in Florida, do we?”

The Florida Coalition to Stop Common Core issued a report on implementation in the Sunshine State. Chapter 6 is on “The Cost to Implement Common Core Standards.” According to the report:

Based on data from several sources, the Common Core standards and accompanying tests will be very expensive – both to implement and to maintain.

Florida is projected by the Pioneer Institute to spend $1,024,163,000 to pay for testing, technology, textbooks, and professional development in what they characterize as a “middle of the road” estimate compared to $905,838,000 in grants received, leaving at least $118,325,000 in costs to Florida taxpayers just for implementation.

Given that former Commissioner Bennett and the State Board of Education (SBOE) originally asked for $442 million in one year to implement assessments, which is more than what Florida has already spent on the FCAT between 1996 and 2008 combined, that $118 million amount might well be low and will serve as a huge unfunded mandate to already strapped county districts. Marion County has had to lay off 160 teachers, and Charlotte County was forced to discontinue physical education classes until parental outrage and funding shifts reversed that decision as costs for Common Core implementation continue to mount.

Even more concerning is that Bennett changed his education budget request to $100 million in the middle of the legislative session. This constitutes a $342 million swing, indicates enormous credibility problems, and appears to be an effort to hide the true costs of this capaciously expensive system. In addition, the commissioner later said that Florida may consider some other completely different testing scheme at an unknown cost, even though Florida is the fiscal agent for PARCC.
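The report's headline figures can be checked directly. All dollar amounts below are from the quoted report; the script is mine:

```python
# Pioneer Institute "middle of the road" estimate for Florida's Common Core
# implementation, versus federal grants received, as quoted in the report.
estimated_cost = 1_024_163_000   # testing, technology, textbooks, professional development
grants_received = 905_838_000

unfunded_gap = estimated_cost - grants_received
print(f"cost to Florida taxpayers: ${unfunded_gap:,}")
```

The difference matches the report's figure of at least $118,325,000 falling to Florida taxpayers for implementation alone.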

Read the full report here.

According to Pastorick, “Just to let you know the hearing on Common Core last night in Tampa was well attended, but with only one member of the FL Board of Education, John Cologne, present. He left early and did not even hear me, and I was the #26 speaker. Hillsborough County had an unending number of teachers, etc., praising the results they are having in their classrooms using Common Core — one even broke up in tears because her daughter, who has always had trouble reading, is now reading and excelling under Common Core. It was evident that the words of the two experts, Dr. Sandra Stotsky (English) and Ze’ev Wurman (Math), probably fell on deaf ears with the advocates of CC — credentials attached — each was allowed 15 minutes at the beginning of the hearing.”

There have been concerns that advocates would be given more time to speak than opponents.

Chrissy Blevio from the Florida Stop Common Core Coalition (FLCC) wrote in an email, “Dear FLCC Fighters, Please do your best to get to the remaining two hearings. Our opinion, after sitting in on the FDOE meetings in Tampa and the public hearing, is that the FDOE has no intentions to consider dropping Common Core but only to change the name or ‘rebrand.’ It’s the old ‘bait and switch’ routine.”

Saving Billions with Fly Ash

What is fly ash, you may ask? Have you ever heard of the Roman Pantheon? It stands today because it was built with volcanic ash, a natural pozzolan that strengthens concrete much the way fly ash does. Similarly, bridges built with fly ash can be designed to last a century and highways 80 years. Fly ash can double the lifespan of a construction or infrastructure project; significantly lower maintenance costs; allow more roads, bridges, and buildings to be built with fewer dollars; and ultimately create more jobs.

Why is fly ash important to both Florida and the United States?

According to Mike Murtha, President of the Florida Concrete and Products Association, “Currently, the federal transportation committee is considering an amendment allowing fly ash to continue to be used. This amendment is critical for Florida. Without this amendment, the fly ash industry will be heavily over-regulated by the federal government. If the industry is washed out it would cost 30,000 Floridians their jobs.”

The federal transportation bill is set to be decided by the end of June, so this is a hot topic for the building industry. According to a study by the American Road and Transportation Builders Association (ARTBA), recycled fly ash is used in 95% of the concrete products that build transportation infrastructure projects across Florida. Its use has saved the state more than $180 million over five years, because fly ash concrete makes structures stronger and longer lasting and reduces the need to mine virgin resources from the ground.

Where does fly ash come from?

Fly ash is one of the residues generated by combustion: the fine particles that rise with the flue gases. In an industrial context, fly ash usually refers to ash produced by burning coal. It is generally captured by electrostatic precipitators or other particle-filtration equipment before the flue gases reach the chimneys of coal-fired power plants; together with the bottom ash removed from the furnace, it is known collectively as coal ash.

Coal has become a target for environmentalists, President Obama, and former Florida Governor Charlie Crist. New coal-fired plants in Florida and across America are not being built, and existing ones are closing or converting to natural gas. As this occurs, fly ash is becoming scarce.

According to Murtha, “Fly ash is crucial to American transportation infrastructure — in 2010 alone, more than 55 million tons of fly ash was recycled for construction purposes. Concrete represents 15 percent of the total cost of building and maintaining transportation infrastructure in the United States each year. More than 75 percent of that concrete — $9.9 billion worth — utilizes fly ash as a partial cement replacement blend. In some states, fly ash is used for virtually all concrete projects. Without fly ash, many of our nation’s largest transportation projects would not have been possible.”

Closing coal-fired plants has implications beyond electricity generation, and the loss of fly ash is one of them.

WATCH DOG RADIO – FLORIDA: Mike Murtha, President of the Florida Concrete and Products Association, will be a guest on Watch Dog Radio – Florida on Wednesday, June 27th from 11:40 to Noon EST. You may tune in on WWPR AM 1490 or listen to the live stream over the Internet at www.DrRichShow.com.