Is the Scientific Process Broken? by Jenna Robinson

The scientific process is broken. The tenure process, “publish or perish” mentality, and the insufficient review process of academic journals mean that researchers spend less time solving important puzzles and more time pursuing publication. But that wasn’t always the case.

In 1962, chemist and philosopher Michael Polanyi described scientific discovery as a spontaneous order, likening it to Adam Smith’s invisible hand. In “The Republic of Science: Its Political and Economic Theory,” originally printed in the journal Minerva, Polanyi used an analogy of many people working together to solve a jigsaw puzzle to explain the progression of scientific discovery.

Polanyi begins: “Imagine that we are given the pieces of a very large jigsaw puzzle, and … it is important that our giant puzzle be put together in the shortest possible time. We would naturally try to speed this up by engaging a number of helpers; the question is in what manner these could be best employed.”

He concludes,

The only way the assistants can effectively co-operate, and surpass by far what any single one of them could do, is to let them work on putting the puzzle together in sight of the others so that every time a piece of it is fitted in by one helper, all the others will immediately watch out for the next step that becomes possible in consequence.

Under this system, each helper will act on his own initiative, by responding to the latest achievements of the others, and the completion of their joint task will be greatly accelerated. We have here in a nutshell the way in which a series of independent initiatives are organized to a joint achievement by mutually adjusting themselves at every successive stage to the situation created by all the others who are acting likewise.

Polanyi’s faith in this process, decentralized to academics around the globe, was strong. He claimed, “The pursuit of science by independent self-co-ordinated initiatives assures the most efficient possible organization of scientific progress.”

But somewhere in the last 54 years, this decentralized, efficient system of scientific progress seems to have veered off course. The incentives created by universities and academic journals are largely to blame.

The National Academy of Sciences noted last year that there has been a tenfold increase since 1975 in scientific papers retracted because of fraud. A popular scientific blog, Retraction Watch, reports daily on retractions, corrections, and fraud from all corners of the scientific world.

Some argue that such findings aren’t evidence that science is broken — just very difficult. News “explainer” Vox recently defended the process, calling science “a long and grinding process carried out by fallible humans, involving false starts, dead ends, and, along the way, incorrect and unimportant studies that only grope at the truth, slowly and incrementally.”

Of course, finding and correcting errors is a normal and expected part of the scientific process. But there is more going on.

A recent article in Proceedings of the National Academy of Sciences documented that the problem in biomedical and life sciences is more attributable to bad actors than human error. Its authors conducted a detailed review of all 2,047 retracted research articles in those fields, which revealed that only 21.3 percent of retractions were attributable to error. In contrast, 67.4 percent of retractions were attributable to misconduct, including fraud or suspected fraud (43.4 percent), duplicate publication (14.2 percent), and plagiarism (9.8 percent).
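For concreteness, the cited percentages can be turned back into approximate article counts. This is a back-of-envelope sketch: the 2,047 total and the percentage shares come from the PNAS review; the rounding to whole articles is my own.

```python
# Back-of-envelope check of the retraction figures from the PNAS review:
# 2,047 retracted articles, with the percentage shares cited in the text.
total = 2047

shares = {
    "error": 0.213,
    "fraud or suspected fraud": 0.434,
    "duplicate publication": 0.142,
    "plagiarism": 0.098,
}

# Implied number of retractions in each category, rounded to whole articles
counts = {cause: round(total * share) for cause, share in shares.items()}

# The three misconduct categories together account for the 67.4% figure
misconduct_share = sum(
    share for cause, share in shares.items() if cause != "error"
)

print(counts)
print(f"misconduct total: {misconduct_share:.1%}")  # ≈ 67.4%
```

The check confirms the article's internal arithmetic: fraud, duplicate publication, and plagiarism together account for the reported 67.4 percent.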

Even this article on FiveThirtyEight, which attempts to defend the current scientific community from its critics, admits, “bad incentives are blocking good science.”

Polanyi doesn’t take these bad incentives into account—and perhaps they weren’t as pronounced in 1960s England as they are in the modern United States. In his article, he assumes that professional standards are enough to ensure that contributions to the scientific discussion would be plausible, accurate, important, interesting, and original. He fails to mention the strong incentives, produced by the tenure process, to publish in journals of particular prestige and importance.

This “publish or perish” incentive means that researchers are rewarded more for frequent publication than for dogged progress towards solving scientific puzzles. It has also led to the proliferation of academic journals — many lacking the quality control we have come to expect in academic literature. This article by British pharmacologist David Colquhoun concludes, “Pressure on scientists to publish has led to a situation where any paper, however bad, can now be printed in a journal that claims to be peer-reviewed.”

Academic journals, with their own internal standards, exacerbate this problem.

Science recently reported that less than half of 100 studies published in 2008 in top psychology journals could be replicated successfully. The Reproducibility Project: Psychology, led by Brian Nosek of the University of Virginia, was responsible for the effort and included 270 scientists who re-ran other people’s studies.

The rate of reproducibility was likely low because journals give preference to “new” and exciting findings, damaging the scientific process. The Economist reported in 2013 that “‘Negative results’ now account for only 14% of published papers, down from 30% in 1990” and observed, “Yet knowing what is false is as important to science as knowing what is true.”

These problems, taken together, create an environment where scientists are no longer collaborating to solve the puzzle. They are instead pursuing tenure and career advancement.

But the news is not all bad. Recent efforts for science to police itself are beginning to change researchers’ incentives. The Reproducibility Project (mentioned above) is part of a larger effort called the Open Science Framework (OSF). The OSF is a “scholarly commons” that works to improve the openness, integrity, and reproducibility of research.

Similarly, the Center for Scientific Integrity was established in 2014 to promote transparency and integrity in science. Its major project, Retraction Watch, houses a database of retractions that is freely available to scientists and scholars who want to improve science.

A new project called Heterodox Academy will help address some research problems in the social sciences. The project was created to improve the diversity of viewpoints in the academy. Its work is of great importance; psychologists have demonstrated that such diversity enhances creativity, discovery, and problem solving.

These efforts will go a long way to restoring the professional standards that Polanyi thought were essential to ensure that research remains plausible, accurate, important, interesting, and original. But ultimately, the tenure process and peer review must change in order to save scientific integrity.

This article first appeared at the Pope Center for Higher Education.

Jenna Robinson

Jenna Robinson is director of outreach at the Pope Center for Higher Education Policy.

Policy Science Kills: The Case of Eugenics by Jeffrey A. Tucker

The climate-change debate has many people wondering whether we should really turn over public policy — which deals with fundamental matters of human freedom — to a state-appointed scientific establishment. Must moral imperatives give way to the judgment of technical experts in the natural sciences? Should we trust their authority? Their power?

There is a real history here to consult. The integration of government policy and scientific establishments has reinforced bad science and yielded ghastly policies.


There’s no better case study than the use of eugenics: the science, so called, of breeding a better race of human beings. It was popular during the Progressive Era and after, and it heavily informed US government policy. Back then, the scientific consensus was all in for public policy founded on high claims of perfect knowledge based on expert research. There was a cultural atmosphere of panic (“race suicide!”) and a clamor for the experts to put together a plan to deal with it. That plan included segregation, sterilization, and labor-market exclusion of the “unfit.”

Ironically, climatology had something to do with it. Harvard professor Robert DeCourcy Ward (1867–1931) is credited with holding the first chair of climatology in the United States. He was a consummate member of the academic establishment. He was editor of the American Meteorological Journal, president of the Association of American Geographers, and a member of both the American Academy of Arts and Sciences and the Royal Meteorological Society of London.

He also had an avocation. He was a founder of the Immigration Restriction League. It was one of the first organizations to advocate reversing the traditional American policy of free immigration and replacing it with a “scientific” approach rooted in Darwinian evolutionary theory and the policy of eugenics. Centered in Boston, the league eventually expanded to New York, Chicago, and San Francisco. Its science inspired a dramatic change in US policy over labor law, marriage policy, city planning, and, its greatest achievements, the 1921 Emergency Quota Act and the 1924 Immigration Act. These were the first-ever legislated limits on the number of immigrants who could come to the United States.

Nothing Left to Chance

“Darwin and his followers laid the foundation of the science of eugenics,” Ward alleged in his manifesto published in the North American Review in July 1910. “They have shown us the methods and possibilities of the production of new species of plants and animals…. In fact, artificial selection has been applied to almost every living thing with which man has close relations except man himself.”

“Why,” Ward demanded, “should the breeding of man, the most important animal of all, alone be left to chance?”

By “chance,” of course, he meant choice.

“Chance” is how the scientific establishment of the Progressive Era regarded the free society. Freedom was considered to be unplanned, anarchic, chaotic, and potentially deadly for the race. To the Progressives, freedom needed to be replaced by a planned society administered by experts in their fields. It would be another 100 years before climatologists themselves became part of the policy-planning apparatus of the state, so Professor Ward busied himself in racial science and the advocacy of immigration restrictions.

Ward explained that the United States had a “remarkably favorable opportunity for practising eugenic principles.” And there was a desperate need to do so, because “already we have not hundreds of thousands, but millions of Italians and Slavs and Jews whose blood is going into the new American race.” This trend could cause Anglo-Saxon America to “disappear.” Without eugenic policy, the “new American race” will not be a “better, stronger, more intelligent race” but rather a “weak and possibly degenerate mongrel.”

Citing a report from the New York Immigration Commission, Ward was particularly worried about mixing American Anglo-Saxon blood with “long-headed Sicilians and those of the round-headed east European Hebrews.”

Keep Them Out

“We certainly ought to begin at once to segregate, far more than we now do, all our native and foreign-born population which is unfit for parenthood,” Ward wrote. “They must be prevented from breeding.”

But even more effective, Ward wrote, would be strict quotas on immigration. While “our surgeons are doing a wonderful work,” he wrote, they can’t keep up in filtering out people with physical and mental disabilities pouring into the country and diluting the racial stock of Americans, turning us into “degenerate mongrels.”

Such were the policies dictated by eugenic science, which, far from being seen as quackery from the fringe, was in the mainstream of academic opinion. President Woodrow Wilson, America’s first professorial president, embraced eugenic policy. So did Supreme Court Justice Oliver Wendell Holmes Jr., who, in upholding Virginia’s sterilization law, wrote, “Three generations of imbeciles are enough.”

Looking through the literature of the era, I am struck by the near absence of dissenting voices on the topic. Popular books advocating eugenics and white supremacy, such as The Passing of the Great Race by Madison Grant, became immediate bestsellers. The opinions in these books — which are not for the faint of heart — were expressed long before the Nazis discredited such policies. They reflect the thinking of an entire generation, and are much more frank than one would expect to read now.

It’s crucial to understand that all these opinions were not just about pushing racism as an aesthetic or personal preference. Eugenics was about politics: using the state to plan the population. It should not be surprising, then, that the entire anti-immigration movement was steeped in eugenics ideology. Indeed, the more I look into this history, the less I am able to separate the anti-immigrant movement of the Progressive Era from white supremacy in its rawest form.

Shortly after Ward’s article appeared, the climatologist called on his friends to influence legislation. Restriction League president Prescott Hall and Charles Davenport of the Eugenics Record Office began the effort to pass a new law with specific eugenic intent. It sought to limit the immigration of southern Italians and Jews in particular. And immigration from Eastern Europe, Italy, and Asia did indeed plummet.

The Politics of Eugenics

Immigration wasn’t the only policy affected by eugenic ideology. Edwin Black’s War Against the Weak: Eugenics and America’s Campaign to Create a Master Race (2003, 2012) documents how eugenics was central to Progressive Era politics. An entire generation of academics, politicians, and philanthropists used bad science to plot the extermination of undesirables. Laws requiring sterilization claimed 60,000 victims. Given the attitudes of the time, it’s surprising that the carnage in the United States was so low. Europe, however, was not as fortunate.


Eugenics became part of the standard curriculum in biology; William Castle’s 1916 Genetics and Eugenics was commonly used for over 15 years, through four successive editions.

Literature and the arts were not immune. John Carey’s The Intellectuals and the Masses: Pride and Prejudice Among the Literary Intelligentsia, 1880–1939 (2005) shows how the eugenics mania affected the entire modernist literary movement of the United Kingdom, with such famed minds as T.S. Eliot and D.H. Lawrence getting wrapped up in it.

Economics Gets In on the Act

Remarkably, even economists fell under the sway of eugenic pseudoscience. Thomas Leonard’s explosively brilliant Illiberal Reformers: Race, Eugenics, and American Economics in the Progressive Era (2016) documents in excruciating detail how eugenic ideology corrupted the entire economics profession in the first two decades of the 20th century. Across the board, in the books and articles of the profession, you find all the usual concerns about race suicide, the poisoning of the national bloodstream by inferiors, and the desperate need for state planning to breed people the way ranchers breed animals. Here we find the template for the first-ever large-scale implementation of scientific social and economic policy.

Students of the history of economic thought will recognize the names of these advocates: Richard T. Ely, John R. Commons, Irving Fisher, Henry Rogers Seager, Arthur N. Holcombe, Simon Patten, John Bates Clark, Edwin R.A. Seligman, and Frank Taussig. They were the leading members of the professional associations, the editors of journals, and the high-prestige faculty members of the top universities. It was a given among these men that classical political economy had to be rejected. There was a strong element of self-interest at work. As Leonard puts it, “laissez-faire was inimical to economic expertise and thus an impediment to the vocational imperatives of American economics.”

Irving Fisher, whom Joseph Schumpeter described as “the greatest economist the United States has ever produced” (an assessment later repeated by Milton Friedman), urged Americans to “make of eugenics a religion.”

Speaking at the Race Betterment Conference in 1915, Fisher said eugenics was “the foremost plan of human redemption.” The American Economic Association (which is still today the most prestigious trade association of economists) published openly racist tracts such as the chilling Race Traits and Tendencies of the American Negro by Frederick Hoffman. It was a blueprint for the segregation, exclusion, dehumanization, and eventual extermination of the black race.

Hoffman’s book called American blacks “lazy, thriftless, and unreliable,” and well on their way to a condition of “total depravity and utter worthlessness.” Hoffman contrasted them with the “Aryan race,” which is “possessed of all the essential characteristics that make for success in the struggle for the higher life.”

Even as Jim Crow restrictions were tightening against blacks, and the full weight of state power was being deployed to wreck their economic prospects, the American Economic Association’s tract said that the white race “will not hesitate to make war upon those races who prove themselves useless factors in the progress of mankind.”

Richard T. Ely, a founder of the American Economic Association, advocated segregation of nonwhites (he seemed to have a special loathing of the Chinese) and state measures to prohibit their propagation. He took issue with the very “existence of these feeble persons.” He also supported state-mandated sterilization, segregation, and labor-market exclusion.

That such views were not considered shocking tells us so much about the intellectual climate of the time.

If your main concern is who is bearing whose children, and how many, it makes sense to focus on labor and income. Only the fit should be admitted to the workplace, the eugenicists argued. The unfit should be excluded so as to discourage their immigration and, once here, their propagation. This was the origin of the minimum wage, a policy designed to erect a high wall to the “unemployables.”

Women, Too

Another implication follows from eugenic policy: government must control women.

It must control their comings and goings. It must control their work hours — or whether they work at all. As Leonard documents, here we find the origin of the maximum-hour workweek and many other interventions against the free market. Women had been pouring into the workforce throughout the last quarter of the 19th century, gaining the economic power to make their own choices. Minimum wages, maximum hours, safety regulations, and so on passed in state after state during the first two decades of the 20th century and were carefully targeted to exclude women from the workforce. The purpose was to control contact, manage breeding, and reserve the use of women’s bodies for the production of the master race.

Leonard explains:

American labor reformers found eugenic dangers nearly everywhere women worked, from urban piers to home kitchens, from the tenement block to the respectable lodging house, and from factory floors to leafy college campuses. The privileged alumna, the middle-class boarder, and the factory girl were all accused of threatening Americans’ racial health.

Paternalists pointed to women’s health. Social purity moralists worried about women’s sexual virtue. Family-wage proponents wanted to protect men from the economic competition of women. Maternalists warned that employment was incompatible with motherhood. Eugenicists feared for the health of the race.

“Motley and contradictory as they were,” Leonard adds, “all these progressive justifications for regulating the employment of women shared two things in common. They were directed at women only. And they were designed to remove at least some women from employment.”

The Lesson We Haven’t Learned

Today we find eugenic aspirations to be appalling. We rightly value the freedom of association. We understand that permitting people free choice over reproductive decisions does not threaten racial suicide but rather points to the strength of a social and economic system. We don’t want scientists using the state to cobble together a master race at the expense of freedom. For the most part, we trust the “invisible hand” to govern demographic trajectories, and we recoil at those who don’t.

But back then, eugenic ideology was conventional scientific wisdom, and hardly ever questioned except by a handful of old-fashioned advocates of laissez-faire. The eugenicists’ books sold in the millions, and their concerns became primary in the public mind. Dissenting scientists — and there were some — were excluded by the profession and dismissed as cranks attached to a bygone era.

Eugenic views had a monstrous influence over government policy, and they ended free association in labor, marriage, and migration. Indeed, the more you look at this history, the more it becomes clear that white supremacy, misogyny, and eugenic pseudoscience were the intellectual foundations of modern statecraft.


Why is there so little public knowledge of this period and the motivations behind its progress? Why has it taken so long for scholars to blow the lid off this history of racism, misogyny, and the state?

The partisans of the state regulation of society have no reason to talk about it, and today’s successors of the Progressive Movement and its eugenic views want to distance themselves from the past as much as possible. The result has been a conspiracy of silence.

There are, however, lessons to be learned. When you hear of some impending crisis that can only be solved by scientists working with public officials to force people into a new pattern that is contrary to their free will, there is reason to raise an eyebrow. Science is a process of discovery, not an end state, and its consensus of the moment should not be enshrined in the law and imposed at gunpoint.

We’ve been there and done that, and the world is rightly repulsed by the results.

Jeffrey A. Tucker

Jeffrey Tucker is Director of Digital Development at FEE, CLO of the startup Liberty.me, and editor at Laissez Faire Books. Author of five books, he speaks at FEE summer seminars and other events. His latest book is Bit by Bit: How P2P Is Freeing the World.  Follow on Twitter and Like on Facebook.

Is Cheap Gas a Bad Thing? by Randal O’Toole

Remember peak oil? Remember when oil prices were $140 a barrel and Goldman Sachs predicted they would soon reach $200? Now, the latest news is that oil prices have gone up all the way to $34 a barrel. Last fall, Goldman Sachs predicted prices would fall to $20 a barrel, which other analysts argued was “no better than its prior predictions,” but in fact they came a lot closer to that than to $200.

Low oil prices generate huge economic benefits. Low prices mean increased mobility, which means increased economic productivity. The end result, says Bank of America analyst Francisco Blanch, is “one of the largest transfers of wealth in human history” as $3 trillion remains in consumers’ pockets rather than going to the oil companies. I wouldn’t call this a “wealth transfer” so much as a reduction in income inequality, but either way, it is a good thing.

Naturally, some people hate the idea of increased mobility from lower fuel prices. “Cheap gas raises fears of urban sprawl,” warns NPR. Since “urban sprawl” is a made-up problem, I’d have to rewrite this as, “Cheap gas raises hopes of urban sprawl.” The only real “fear” is on the part of city officials who want everyone to pay taxes to them so they can build stadiums, light-rail lines, and other useless urban monuments.

A more cogent argument is made by UC Berkeley sustainability professor Maximilian Auffhammer, who argues that “gas is too cheap” because current prices fail to cover all of the external costs of driving. He cites what he calls a “classic paper” that calculates the external costs of driving to be $2.28 per gallon. If that were true, then one approach would be to tax gasoline $2.28 a gallon and use the revenues to pay those external costs.

The only problem is that most of the so-called external costs aren’t external at all but are paid by highway users. The largest share of calculated costs, estimated at $1.05 a gallon, is the cost of congestion. This is really a cost of bad planning, not gasoline. Either way, the cost is almost entirely paid by people in traffic consuming that gasoline.

The next largest cost, at 63 cents a gallon, is the cost of accidents. Again, this is partly a cost of bad planning: remember how fatality rates dropped nearly 20 percent between 2007 and 2009, largely due to the reduction in congestion caused by the recession? This decline could have taken place years before if cities had been serious about relieving congestion rather than ignoring it. In any case, most of the cost of accidents, like the cost of congestion, is largely internalized by auto drivers through insurance.

The next-largest cost, pegged at 42 cents per gallon, is “local pollution.” While that is truly an external cost, it is also rapidly declining as shown in figure 1 of the paper. According to EPA data, total vehicle emissions of most pollutants have declined by more than 50 percent since the numbers used in this 2006 report. Thus, the 42 cents per gallon is more like 20 cents per gallon and falling fast. [Ed. note: And pollution is also mostly due to congestion.]

At 12 cents a gallon, the next-largest cost is “oil dependency,” which the paper defines as exposing “the economy to energy price volatility and price manipulation” that “may compromise national security and foreign policy interests.” That problem, which was questionable in the first place, seems to have gone away thanks to the resurgence of oil production within the United States, which has made other oil producers, such as Saudi Arabia, more dependent on us than we are on them.

Finally, at a mere 6 cents per gallon, is the cost of greenhouse gas emissions. If you believe this is a cost, it will decline when measured as a cost per mile as cars get more fuel efficient under the current CAFE standards. But it should remain fixed as a cost per gallon as burning a gallon of gasoline will always produce a fixed amount of greenhouse gases.

In short, rather than $2.28 per gallon, the external cost of driving is closer to 26 cents per gallon. Twenty cents of this cost is steadily declining as cars get cleaner, and all of it is declining when measured per mile as cars get more fuel-efficient.
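The arithmetic behind these figures can be laid out explicitly. This is a sketch: the per-gallon components are those the cited paper reports, the revised pollution figure is the author's rough adjustment, and the fuel-economy values are purely illustrative.

```python
# Components of the external cost of driving, in dollars per gallon,
# as reported in the paper cited in the text.
cited = {
    "congestion": 1.05,
    "accidents": 0.63,
    "local pollution": 0.42,
    "oil dependency": 0.12,
    "greenhouse gases": 0.06,
}
print(f"cited total: ${sum(cited.values()):.2f}/gallon")  # $2.28

# The author's revision: congestion, accidents, and oil dependency are
# treated as internalized or moot, and the pollution figure is roughly
# halved to reflect post-2006 emissions declines.
revised = {"local pollution": 0.20, "greenhouse gases": 0.06}
print(f"revised total: ${sum(revised.values()):.2f}/gallon")  # $0.26

# Even a fixed per-gallon cost falls per mile as fuel economy improves
# (mpg values are illustrative, not from the text):
for mpg in (25, 35, 50):
    cents_per_mile = sum(revised.values()) / mpg * 100
    print(f"{mpg} mpg -> {cents_per_mile:.2f} cents/mile")
```

The per-mile loop makes the closing point concrete: at 50 mpg, the same 26 cents per gallon works out to about half a cent per mile.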

It’s worth noting that, though we are seeing an increase in driving due to low fuel prices, the amount of driving we do isn’t all that sensitive to fuel prices. Real gasoline prices doubled between 2000 and 2009, yet per capita driving continued to grow until the recession began. Prices have fallen by 50 percent in the last six months or so, yet the 3 or 4 percent increase in driving may be as much due to increased employment as to more affordable fuel.

This means that, though there may be some externalities from driving, raising gas taxes and creating government slush funds with the revenues is not the best way of dealing with those externalities. I’d feel differently if I felt any assurance that government would use those revenues to actually fix the externalities, but that seems unlikely. I actually like the idea of tradeable permits best, but short of that the current system of ever-tightening pollution controls seems to be working well at little cost to consumers and without threatening the economic benefits of increased mobility.

This post first appeared at Cato.org.

Randal O’Toole

Randal O’Toole is a Cato Institute Senior Fellow working on urban growth, public land, and transportation issues.

Zika Virus Shows It’s Time to Bring Back DDT by Diana Furchtgott-Roth

The Zika virus is being spread northward through Latin America by mosquitoes, and it is possibly correlated with birth defects such as microcephaly in infants. Stories and photos of their abnormally small skulls are making headlines. The World Health Organization reports that four million people could be infected by the end of 2016.

On Monday, the WHO is meeting to decide how to address the crisis. The international body should recommend that the ban on DDT be reversed, in order to kill the mosquitoes that carry Zika and malaria, a disease caused by a protistan parasite and one that has no cure.

Zika is in the news, but it is dwarfed by malaria. About 300 million to 600 million people suffer each year from malaria, and it kills about 1 million annually, 90 percent in sub-Saharan Africa. We have the means to reduce Zika and malaria — and we are not using it.

Under the Global Malaria Eradication Program, which started in 1955, DDT was used to kill the mosquitoes that carried the parasite, and malaria was practically eliminated. Some countries such as Sri Lanka, which started using DDT in the late 1940s, saw profound improvements. Reported cases fell from nearly 3 million a year to just 17 cases in 1963. In Venezuela, cases fell from over 8 million in 1943 to 800 in 1958. India saw a dramatic drop from 75 million cases a year to 75,000 in 1961.

This changed with the publication of Rachel Carson’s 1962 book, Silent Spring, which claimed that DDT was hazardous. After lengthy hearings between August 1971 and March 1972, Judge Edmund Sweeney, the EPA hearing examiner, decided that there was insufficient evidence to ban DDT and that its benefits outweighed any adverse effects. Yet, two months afterwards, then-EPA Administrator William D. Ruckelshaus overruled him and banned DDT, effective December 31, 1972.

Other countries followed, and DDT was banned in 2001 for agriculture by the Stockholm Convention on Persistent Organic Pollutants. This was a big win for the mosquitoes, but a big loss for people who lived in Latin America, Asia, and Africa.

Carson claimed that DDT, because it is fat soluble, accumulated in the fatty tissues of animals and humans as the compound moved through the food chain, causing cancer and other genetic damage. Carson’s concerns and the EPA action halted the program in its tracks, and malaria deaths started to rise again, reaching 600,000 in 1970, 900,000 in 1990 and over 1,000,000 in 1997 — back to pre-DDT levels.

Some continue to say that DDT is harmful, but others say that DDT was banned in vain. There remains no compelling evidence that the chemical has produced any ill public health effects. According to an article in the British medical journal the Lancet by Professor A.G. Smith of Leicester University,

The early toxicological information on DDT was reassuring; it seemed that acute risks to health were small. If the huge amounts of DDT used are taken into account, the safety record for human beings is extremely good. In the 1940s many people were deliberately exposed to high concentrations of DDT through dusting programmes or impregnation of clothes, without any apparent ill effect… In summary, DDT can cause toxicological effects but the effects on human beings at likely exposure are very slight.

Even though nothing is as cheap and effective as DDT, it is not a cure-all for malaria. But a study by the Uniformed Services University of the Health Sciences concluded that spraying huts in Africa with DDT reduces the number of mosquitoes by 97 percent compared with huts sprayed with an alternative pesticide. Those mosquitoes that do enter the huts are less likely to bite.

By forbidding DDT and relying on more expensive, less effective methods of prevention, we are causing immense hardship. Small environmental losses are a modest price for saving thousands of human lives and potentially increasing economic growth in developing nations.

We do not yet have data on the economic effects of the Zika virus, but we know that countries with a high incidence of malaria can suffer a 1.3 percent annual loss of economic growth. According to a Harvard/WHO study, sub-Saharan Africa’s GDP could be $100 billion greater if malaria had been eliminated 35 years ago.
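The compounding behind such estimates is easy to illustrate. The sketch below is only that: the starting GDP and the 4% baseline growth rate are assumed round numbers for illustration, not figures from the Harvard/WHO study; only the 1.3% annual drag comes from the text above.

```python
def gdp_with_drag(gdp0, growth, drag, years):
    """Project GDP compounding at (growth - drag) per year."""
    return gdp0 * (1 + growth - drag) ** years

# Illustrative assumptions (not from the study): a $300bn regional
# economy, 4% baseline growth, projected over 35 years.
baseline = gdp_with_drag(300, 0.04, 0.0, 35)       # malaria eliminated
with_malaria = gdp_with_drag(300, 0.04, 0.013, 35)  # 1.3% annual drag
gap = baseline - with_malaria
print(f"GDP gap after 35 years: ${gap:.0f}bn")
```

Even a small annual drag, compounded over decades, opens a gap of hundreds of billions of dollars, which is why the study's $100 billion figure is, if anything, conservative in spirit.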

Rachel Carson died in 1964, but the legacy of Silent Spring and its recommended ban on DDT live with us today. Millions are suffering from malaria and countless others are contracting the Zika virus as a result of the DDT ban. They were never given the choice of living with DDT or dying without it. The World Health Organization should recognize that DDT has benefits, and encourage its use in combating today’s diseases.

This article first appeared at E21, a project of the Manhattan Institute.

Diana Furchtgott-Roth

Diana Furchtgott-Roth, former chief economist of the U.S. Department of Labor, is director of Economics21 and senior fellow at the Manhattan Institute.

The Ethanol Mandate Is Literally Impossible by Alan Reynolds

In recent years, politicians set impossibly high mandates for the amounts of ethanol motorists must buy in 2022, while also setting impossibly high standards for the fuel economy of cars sold in 2025. To accomplish these conflicting goals, motorists are now given tax credits to drive heavily-subsidized electric cars, even as they will supposedly be required to buy more and more ethanol-laced fuel each year.

Why have such blatantly contradictory laws received so little criticism, if not outrage? Probably because ethanol mandates and electric car subsidies are lucrative sources of federal grants, loans, subsidies and tax credits for “alternative fuels” and electric cars. Those on the receiving end lobby hard to keep the gravy train rolling while those paying the bills lack the same motivation to become informed, or to organize and lobby.

With farmers, ethanol producers and oil companies all sharing the bounty, using subsidies and mandates to pour ever-increasing amounts of ethanol into motorists’ gas tanks has been a win-win deal for politicians and the interest groups that support them and a lose-lose deal for consumers and taxpayers.

The political advantage of advocating contradictory future mandates is that the goals usually prove ridiculous only after their promoters are out of office. This is a bipartisan affliction.

In his 2007 State of the Union Address, for example, President Bush called for mandating 35 billion gallons of biofuels by 2017, an incredible target equal to one-fourth of all gasoline consumed in the United States in 2006. Not to be outdone, “President Obama said during the presidential campaign that he favored a 60 billion gallon-a-year target.”

The Energy Independence and Security Act of 2007 (EISA) did not go quite as far as Bush or Obama, at least in the short run. It required 15 billion gallons of corn-based ethanol by 2015 (about 2 billion more than were actually sold), but 36 billion gallons of all biofuels by 2022 (which would be more than double last year’s sales). The 2007 energy law also raised corporate average fuel economy (CAFE) standards for new cars to 35 miles per gallon by 2020, which President Obama in 2012 ostensibly raised to 54.5 mpg by 2025 (a comically precise guess, since requirements are based on the size of vehicles we buy).

The 36-billion-gallon biofuel mandate for 2022 is the mandate Iowa Governor Terry Branstad (and Donald Trump) now vigorously defend against the rather gutsy opposition of Sen. Ted Cruz. But it is impossible to defend the impossible: ethanol consumption can’t possibly double while fuel consumption falls.

From 2004 to 2013, cars and light trucks consumed 11% less fuel. The Energy Information Administration likewise predicts that fuel consumption by light vehicles will fall by another 10.1% from 2015 to 2022. So long as ethanol is no more than 10% of each gallon (already a higher share than Canada or Europe allows), ethanol use must fall as we use less gasoline, rather than rise as the mandates require. If we ever buy many electric cars, or switch from corn to cellulosic sources of ethanol as other impossible mandates pretend, then corn-based ethanol must fall even faster.

If raising ethanol’s mandated share above 10% is any politician’s secret plan, nobody dares admit it. Most pre-2007 cars can’t handle more than 10 percent ethanol without damage, and drivers of older cars often lack the income or wealth to buy new ones. Since a gallon of ethanol contains about a third less energy than a gallon of gasoline, adding more ethanol would also make it even harder for car companies to comply with Obama’s wildly ambitious fuel economy standards (which, if they work, must also reduce ethanol use).
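The blend-wall arithmetic above can be sketched in a few lines. The gasoline volume is an assumed round number (roughly U.S. scale), not an official EIA figure; the 10.1% decline, the 10% cap, and the mandate volumes are from the text.

```python
def max_ethanol(gasoline_gallons, blend_cap=0.10):
    """Ceiling on ethanol use when it can be at most blend_cap of each gallon."""
    return gasoline_gallons * blend_cap

# Illustrative assumptions, in billions of gallons per year:
gasoline_2015 = 140.0                          # assumed round number
gasoline_2022 = gasoline_2015 * (1 - 0.101)    # the cited 10.1% decline
ceiling_2022 = max_ethanol(gasoline_2022)      # ~12.6 billion gallons
corn_mandate_2015 = 15.0
total_mandate_2022 = 36.0
print(ceiling_2022, corn_mandate_2015, total_mandate_2022)
```

However the starting volume is chosen, the ceiling shrinks as gasoline use falls, while the mandate nearly triples past it; the two curves cannot meet.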

The 2007 law also mandated an astonishing 16 billion gallons of then-nonexistent “cellulosic” ethanol by 2022, made from corn husks or the like. We were already supposed to be using a billion gallons of this marvelous snake oil by 2013. Despite lavish taxpayer subsidies, however, production of cellulosic biofuel was running at only about 7.8 million gallons a month as of April 2015 (about 94 million a year). The EPA mandate announced on June 10, 2015 called for 230 million gallons in 2016, which is more fantasy.
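Annualizing the monthly production figure and comparing it with the mandates (all in the gallon-denominated units the mandates use) shows the scale of the shortfall; a minimal sketch:

```python
monthly_production = 7.8e6                    # per month, April 2015, as cited
annual_production = monthly_production * 12   # roughly 94 million a year
epa_mandate_2016 = 230e6                      # EPA's 2016 requirement
law_mandate_2022 = 16e9                       # the 2007 law's 2022 target
print(annual_production / epa_mandate_2016)   # well under half the 2016 mandate
print(annual_production / law_mandate_2022)   # under 1% of the 2022 target
```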

It doesn’t help that the Spanish firm Abengoa – which received $229 million from U.S. taxpayers to produce just 1.7 million gallons of ethanol – is trying to sell its plant in Kansas to avoid the bankruptcy fate of cellulosic producer KiOR. It also doesn’t help that a $500,000 federally funded study found that biofuels made with corn residue release 7% more greenhouse gases than gasoline.

The contradictory, fantastic and often scandalous history of ethanol mandates illustrates the increasing absurdity of mandates from Congress and the EPA.

The 2007 biofuel mandate was not just bad policy. It was and remains an impossible, bizarre policy.

This post first appeared at Cato.org.

Alan Reynolds

Alan Reynolds is one of the original supply-side economists. He is Senior Fellow at the Cato Institute and was formerly Director of Economic Research at the Hudson Institute.

Lesbians Castrating 11-Year Old Tommy to become Tammy

Lesbians adopting little boys only to castrate them… welcome to Obama’s new America. Katie McGuire reported:

Pauline Moreno and Debra Lobel, a lesbian couple from California, claimed their 11-year-old son Thomas didn’t want to be a boy. Thomas, who prefers to go by “Tammy,” wanted to be a girl. So his mothers gave him hormone treatments to delay puberty so that he could fully “transition” into a female through surgery when he is old enough.

Wake up America! What are we doing to our children? Where are the Christians of America?

RELATED ARTICLE: Lesbian couple in California chemically alter their 11-year old boy to prep for sex-change surgery

Education Emergency: Our Children (and U.S.) at Risk

As an independent physicist I’ve spent 40-plus years on environmental advocacy and energy education. In the latter part of this journey I’ve become increasingly distressed about what is happening in our education system.

After speaking out about this several times, in 2013 I was asked to give a presentation to the US House Science, Space and Technology Committee, as well as to North Carolina legislators. The unabridged version of both of those talks is online at ScienceUnderAssault.info.

Since then, most of what I’ve seen indicates that the situation is getting worse rather than better. This is a summary of the key educational problems that need to be addressed immediately. Hopefully it will encourage citizens to get more involved in rectifying this extraordinarily important matter.

1 – We cannot effectively fix anything until we are on the same page. The place to start, I believe, is to fully agree on the overall objective of the education system. Exactly what is the product we expect to get at the end of a laborious 12-plus-year assembly line?

In my view, the number one criterion for determining whether the educational system has been a success is: do these graduates have the ability and inclination to do Critical Thinking?

Internet pioneer and Google vice president Vint Cerf says that there is no more important skill to teach than Critical Thinking. He calls it the one tool we have to defend ourselves from the onslaught of misinformation we are saturated with today. He argues that Critical Thinking would enable citizens to be more thoughtful about what information they accept, process, and use. That skill is a major benefit in literally every aspect of life.

My experience is that while the education system gives lip service to Critical Thinking, when the rubber meets the road it’s not really happening. An easy test is to ask any college or high school student today what they think about global warming. Do they provide a thoughtful, thorough analysis, or simply regurgitate propaganda?

My first recommendation is that this be adopted by every state education department, every local school board, every academic institution, etc:

“It is our obligation to produce critically thinking graduates.”

2 – I’m a zealous defender of my profession, Science. Most people are not aware of it, but Science is under ferocious attack worldwide. The reason is that individuals and organizations promoting political agendas, or their own economic interests, are acutely aware that real Science is not their friend, as it will expose them for what they are.

Those self-serving parties realize that even though most citizens have faith in Science, very few actually understand what Science is. So they take advantage of that discrepancy by purposefully making false Science claims. They are fully aware that only a small number of people will recognize the fraud, and even fewer will say anything publicly about it.

From what I’ve seen, the most egregious assaults on Science are taking place in newer branches such as Environmental Science, Earth Science, and Ecology.

This campaign is being supported by slick internet video “science” series like Crash Course, Bozeman Science, etc. Listen carefully to the Crash Course founder explaining why they made over 200 education videos: “We don’t really have a coherent answer.” SAY WHAT?! I call these QVC Science, as (in my opinion) they are effectively polished sales pitches.

Propagandizing Science starts in our local schools. The good news is that the solution is also there, and is entirely under our control (see #3). My second recommendation is that every state education department, every local school board, and every academic institution formally adopt and implement this standard:

“Science education will be apolitical.”

3 – In my countrywide travels and correspondence I’ve heard from many parents of students. Quite a few have complained about various matters going on in their districts. I asked them what response they got when they expressed their concerns to the teacher, principal, school board or superintendent. Most said essentially the same thing: they were reluctant to speak out for fear of retribution against their child. What a wonderful system.

The remaining citizens are those with no school children. Those people understandably believe that the school system is being held accountable by those with the most at stake: parents of current children. But no!

My wife and I are in the second group. We were warned that because we had no kids in the system, defenders of the status quo would attack us personally if we spoke up publicly about the secondary school system. We’d be accused of being anti-superintendent, anti-school board, anti-teacher, and/or anti-children.

It seems rather hypocritical that school districts that pride themselves on enforcing a zero-tolerance bullying policy among students would tolerate intimidation of citizens who have the temerity to speak up about school system improvements…

Most people (including us) would like the federal government to stay out of the education business, and we would prefer that the state have minimal involvement in the education process as well. We want the ability to decide locally what is best for our children and our community. We rarely hear about the flip side of this freedom: responsibility. If we want to control things ourselves, in our own interests, then there has to be real community involvement, which includes unfettered and unpenalized input from parents and citizens.

So my third suggestion is that every state education department and school district officially adopt the following position for their interfaces with parents and the public (prominently putting it on their websites, letterhead, etc):

“Please tell us how we can do a better job!”

When input from the public is received, the choice is very simple. The recipients can be genuinely appreciative that citizens take the time to make constructive suggestions to improve student education, or they can circle the wagons and defend the status quo. Ironically, it’s the latter response that necessitates more higher-level intervention…

Whether you have children in the education system or not is irrelevant. The future of our country is literally at stake here. We are all going to sink or swim based on whether we have an effective education system. Please carefully investigate what is happening in your community.

“The function of education is to teach one to think intensively, and to think critically.” — Martin Luther King, Jr.

EDITORS NOTE: The featured image is courtesy of ShutterStock.com.

Greece, Cyprus and Israel to build Eastern Mediterranean Gas Pipeline

Auspicious meetings were held in Nicosia, Cyprus with members of the emerging Trilateral Eastern Mediterranean Gas Pipeline alliance: Israeli Prime Minister Benjamin Netanyahu, Cyprus President Nicos Anastasiades and Greek Prime Minister Alexis Tsipras.

Watch this Jerusalem Post news video of the historic triple alliance meeting in Nicosia:


Israeli Prime Minister Benjamin Netanyahu, left, Cyprus President Nicos Anastasiades and Greek Prime Minister Alexis Tsipras at Nicosia trilateral meeting, January 27, 2016.

The Jerusalem Post reported the triple alliance leaders announcing plans to set up the long delayed Eastern Mediterranean gas pipeline:

NICOSIA – Israel, Cyprus and Greece decided at their first ever tripartite meeting to set up a steering committee to look into laying a gas pipeline from Israel to Cyprus, and then to Greece for further export to Europe.

The decision was announced by Prime Minister Benjamin Netanyahu, standing next to Cyprus President Nikos Anastasiades and Greek Prime Minister Alexis Tsipras.

Each leader delivered a statement noting the historic nature of the meeting, and highlighting the possibilities this emerging alliance has for the region. They did not answer any questions from the press.

While both Anastasiades and Tsipras stressed, without mentioning Turkey by name, that this cooperation was not “against anyone else,” Netanyahu did not make a reference at all to Turkey, either directly or indirectly.

National Infrastructure, Water and Energy Minister Yuval Steinitz, who was part of the Israeli delegation, told reporters on the plane en route to Nicosia that Israel wanted to have the ability to export the gas both through Greece and Turkey. Laying the pipeline to Turkey is considerably cheaper than through Cyprus and Greece.

Anastasiades, as host of the summit, spoke first, and said this cooperation was based on an appreciation that “it is imperative to work collectively through coordination.” He said that the three leaders signed a joint declaration, which he termed a “historic document” that deals with cooperation in the energy, tourism, research, water-management, anti-terrorism and immigration spheres. He said that a trilateral steering committee will monitor the agreement.

Netanyahu, who said that as the son of a historian he was averse to using the term “historic,” said that the term did, however, fit the meeting. “I believe this meeting has historic implications,” he said. “The last time Greeks, Cypriots and Jews sat around a table and talked about a common framework was 2,000 years ago.”

In addition to the gas pipeline, Netanyahu also spoke of a plan to lay an underwater cable to connect the electric grids of all three countries. “You can export gas through electricity,” he said.

Tsipras said that cooperation with Israel and Cyprus was a “strategic choice” for Athens.

In a January 2015 New English Review article, “Could Israel Lose the Energy Prize in the Eastern Mediterranean,” we noted this about the prospects for the Triple Alliance Eastern Mediterranean Pipeline.

“On December 9, 2014, Israel, Cyprus and Greece pitched the Eastern Mediterranean pipeline a day before a conference organized jointly by Natural Gas Europe, the Greek Energy Forum, ESCP Europe, RCEM and the European Economic and Social Committee (EESC). The conference was titled “2030 EU Energy Security, the Role of the Eastern Mediterranean Region” and took place at EESC headquarters in Brussels. Natural Gas Europe, in an article on the EESC conference, noted the comments of Greek Energy Minister Ioannis Maniatis:

Europe will need an extra 100 bcm of natural gas in the next 15 years, and in light of Europe’s increasing dependence on imports to fulfill its energy needs, the EU must find a sustainable model to ensure it is a competitive economy.

The EU needs to reduce external dependence, increase efficiency, diversify its sources and routes of supply, and improve interconnectors, he added. Fully connected energy grids, greater transparency, good governance and a thorough understanding of global events should also be the focus of the EU, according to Maniatis. He explained that Greece’s importance is growing. The East Med pipeline pitched by Israel, Cyprus and Greece, which would run from Israel and Cyprus via Greece to Italy and then on to the rest of Europe, is technically feasible and has attractive prospects, said Maniatis. He told the audience that the results of a feasibility study on the East Med pipeline would be released the following year and that the pipeline would serve as a new source and provider of natural gas comparable to the Southern Corridor. The attractiveness of the East Med Pipeline, said Maniatis, is that unlike the Southern Corridor it would pass exclusively through four member states, and hence deserves strong EU backing for its materialization.

The Eastern Mediterranean Pipeline had received the endorsement of the EC as a priority project for underwriting in November 2013. According to The Guardian that could provide the Eastern Mediterranean pipeline project “access to a €5.85bn fund, and preferential treatment from multilateral banks.”

Natural Gas Europe reported at the time the options under consideration:

The basic plan will see the pipeline stretch from the Leviathan field offshore Israel on to Cyprus, ending in the eastern part of the island of Crete in Greece. Three alternate routes were discussed:

  • To the Peloponnesus Peninsula joint via spur with the Trans-Adriatic Pipeline (TAP)
  • From Crete to northern Greece where it would join the Interconnector Greece-Bulgaria (IGB)
  • From Crete to the Revythousa LNG terminal close to Athens. The terminal would be significantly upgraded to accommodate large amounts of gas exports thereafter.

The technically difficult 1,880-kilometer submarine pipeline project, reaching depths of more than 2,000 meters, would connect the Leviathan and Aphrodite gas fields ultimately to Italy. Cost for the project was estimated at over $20 billion, and it would likely not be concluded until 2020 at the earliest, assuming that production at the Leviathan field in the Israeli EEZ begins in 2017. With the demise of both the Turkish Leviathan-Ceyhan pipeline and the Australian Woodside Pty. Ashdod-Eilat LNG pipeline for delivery of gas to the Asian markets, the Eastern Mediterranean pipeline project may receive serious consideration. There is the alternative of the onshore LNG facility at Vassilikos on Cyprus’ south shore, to be built by the Consortium at an estimated cost of $10 billion. A Memorandum of Understanding for planning the Vassilikos LNG complex was signed by Cyprus and the Consortium in June 2013. In the interim, offshore floating LNG processing platforms might be leased to ship processed gas via pressurized LNG vessels to receiving terminals in Greece and Italy. However, Noble Energy was not initially supportive of the Eastern Mediterranean pipeline option, instead concentrating on sales from Leviathan to regional users like Jordan and Egypt and building the proposed Cypriot LNG processing facility.”

Israel has overcome the ruling of its former Antitrust Authority general director, approving an offshore gas development plan with U.S. partner Noble Energy, Inc. and Israeli partner Delek Group involving the Leviathan, Tamar and adjacent Aphrodite gas fields in the Cyprus Exclusive Economic Zone. With yesterday’s announcement in Nicosia by the Triple Alliance of Israel, Cyprus and Greece, a way can now be seen to go forward with the Eastern Mediterranean Gas Pipeline and the LNG facility in Cyprus. At the time we wrote the January 2015 NER article, Russian President Putin and Turkish President Erdogan had announced a $12 billion Turkey Stream pipeline deal to supply Europe with natural gas. Given the break in relations between Russia and Turkey over the latter’s downing of a Russian Su-24 bomber, Putin has suspended the project. That sent Erdogan scrambling to reopen diplomatic relations with Jerusalem, seeking supplies of Israeli gas. The dour circumstances propounded in our January 2015 article appear to be lifted by the geo-resource and political wars in the Syrian and ISIS conflicts. That is enhanced by the settlement of Israel’s plan for development and distribution of its offshore gas and oil fields.

RELATED ARTICLE: 10 Reasons Israel Is Not An ‘Apartheid’ State

EDITORS NOTE: This column originally appeared in the New English Review.

Climate Confusion

Many Americans are again confused over how the President and the United Nations can say we are at grave risk from man-made global warming (a.k.a. climate change) when we continue to get pummeled by brutal, record-shattering winter storms. If this situation has you confused, take heart. You are not alone.

Once again the natural world has slapped the ‘warmist’ community down hard with yet another record-breaking blizzard in the northeastern US between January 22 and January 24, 2016. Winter storm ‘Jonas’ (a Weather Channel designation) dumped record snow totals on the major cities of the USA with a storm that stretched from Arkansas to Massachusetts. Here are but a few examples of the storm’s wrath:

New York City saw 26.8 inches of snow fall in Central Park, the second highest ever recorded. It missed tying the all time record by one tenth of an inch. JFK Airport had 30.5 inches of snow. Washington’s Dulles airport measured 28.3 inches, the second highest ever. Baltimore had 29.2 inches, its largest snow total ever recorded. The list of snow events and the breadth of this winter calamity that dumped record snow from the central US to the mid-Atlantic states to the Northeast was truly one for the record books.

What is also shocking about this ‘snowmageddon’ is that according to the manmade global warming crowd, none of this was possible. We were told by United Nations scientists there was not supposed to be any snow anywhere on the planet after 2003!  And who can forget the previous terrible winter of 2014-2015 here in the US, where new temperature and snow records were routinely broken. Again, that mercilessly long and cold winter was not possible either according to the climate models from the UN and the U.S. government. How can the impossible happen so often?

We should not forget other monstrously bad predictions the ‘warmist’ community has proffered. NOAA scientists, along with Al Gore, were telling us that the Arctic sea ice would be completely gone by 2008, then revised that to 2013. Neither happened. Global sea ice, especially around Antarctica, is in fact growing rapidly.

Atmospheric CO2 recently passed 400 ppm, yet the predicted overheating of the Earth is not happening; on the contrary. The 800-lb gorilla in the climate laboratory that the manmade-warming community ignores is that there has been no meaningful growth in global temperatures for eighteen years! That includes the so-called warmest year ever, 2015. Unfortunately, my colleagues and I have observed that the US government can no longer be relied upon to tell the truth about the Earth’s climate or its temperature.

The United Nations certainly cannot be trusted either. The corruption of climate science via the climate reports it has issued since 1990 has been so deep-seated within that organization for so long that we must now conclude it is simply unable and unwilling to be truthful. Many of the UN climate models, of which the UN, the US media, and our government are so enamored, are well over 100% in error in predicting global temperature variation. Yet the predictions from these failed models are still offered up as evidence of the need to shut down coal and CO2 production worldwide. Further, recent data suggest that the Earth’s climate appears to be relatively insensitive to CO2! Even the UN is now confused.

It is my fondest hope that we can put the sad era of manmade global warming behind us soon and begin the preparations needed for the rapidly approaching cold epoch, a message I have been spreading since 2007. Starting this year, a long-term decline in global temperature begins. It will bottom out during the 2020s and 2030s. This time will be grim for our species as the cold era starts its destruction of crops around the world.

We humans are easily confused about the climate. Many of us believe what we want to believe, and not what the facts tell us we should. Worse, we are often intentionally deceived by our leaders.

The natural world does not suffer from these afflictions. It is never confused.

RELATED ARTICLE: 300 Scientists Want NOAA To Stop Hiding Its Global Warming Data

January 27, 1951 Operation Desert Rock: First test of a U.S. Nuclear Bomb in Nevada

‘Able’ was the first air-dropped nuclear device to be exploded on American soil. The test took place on 27 January 1951 at Frenchman Flat, a dry lakebed in the Nevada Test Site. The 1-kiloton explosion launched the fourth U.S. nuclear test series code-named ‘Ranger’, which consisted of five air-dropped nuclear tests in early 1951.


The vertical stripes are smoke trails from rockets used to signal the speed and distance of shock waves from the explosion in the early days of nuclear testing.

The initial post-war U.S. nuclear tests – including the similarly named Able test on 1 July 1946 at the Bikini atoll – had been conducted at remote atolls in the Pacific Ocean, far from the U.S. mainland. With the first Soviet nuclear test in 1949, the United States had lost its monopoly on nuclear weapons. The United States decided to significantly expand its nuclear testing programme and chose the Nevada Test Site as the main location for subsequent tests.

The Able test was followed by about 100 more atmospheric nuclear tests at the Nevada Test Site. By the end of the 1950s, the grave effects of radioactivity on personnel involved in the testing and on the surrounding population became evident. Public outrage helped to conclude the 1963 Partial Test Ban Treaty (PTBT), which banned all nuclear tests above ground, in the atmosphere, underwater and in outer space. Nuclear weapon testing underground, though, not only continued but increased in number. A total of 928 nuclear tests were conducted at the Nevada Test Site, more than anywhere else.

In a 1955 brochure on ‘Atomic Test Effects in the Nevada Test Site Region‘, the Atomic Energy Commission assured residents close to the test site that radiation levels were “only slightly more than normal radiation which you experience day in and day out wherever you may live.” The nuclear weapon tests in Nevada were even promoted as tourist attractions.

U.S. troops participated in nuclear testing with little or no protective clothing.

To this day, the scale of the harm caused by radioactive fallout from the Nevada Test Site remains controversial. A 2006 study (PDF) by Steven L. Simon, André Bouville and Charles E. Land finds that exposure to fallout from atmospheric testing will continue to have adverse health effects in the form of increased rates of certain types of cancer such as leukemia. The National Cancer Institute’s 1999 report finds that internal exposure to iodine-131 was the most serious health consequence for downwinders. Milk contaminated with iodine-131 was consumed by children in particular.

In 1990, the U.S. Congress adopted the Radiation Exposure Compensation Act (RECA), which allows downwinders from Utah, Nevada and Arizona to apply for a US$50,000 compensation payment in cases where a disease was caused by fallout from nuclear testing. There have been, however, repeated calls to expand the Compensation Act to boost payments and include other U.S. states in the compensation scheme. In November 2011, the U.S. Senate unanimously approved a resolution which designates 27 January as a National Day of Remembrance. The resolution recognizes that “downwinders paid a high price” for the development of the U.S. nuclear weapons program.

The United States conducted its last nuclear test ‘Divider‘ in September 1992. In 1996, it was the first country to sign the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. However, it has yet to ratify the Treaty, a step that is mandatory for its entry into force. The same applies to seven other nuclear-capable States: China, the Democratic People’s Republic of Korea, Egypt, India, Israel, Iran and Pakistan.

EDITORS NOTE: This column is courtesy of Special Forces Gear.

Recent Energy & Environmental News


How wind turbines can affect climate by creating fog. Photo courtesy of Professor E. A. Shinn, University of South Florida.

The latest Energy and Environmental Newsletter is now online.

Some of the more intriguing energy articles in this issue are:

Some of the most interesting Global Warming articles in this issue are:

PS: Please pass this on to open-minded citizens. If there are others who you think would benefit from being on our energy & environmental email list, please let me know. If at any time you’d like to be taken off the list, please let me know that too.

PPS: I am not an attorney, so no material appearing in any of the Newsletters should be construed as giving legal advice. My recommendation has always been: consult a competent attorney when you are involved with legal issues.

Why do we have an Oil Glut?

The world is awash in oil and gas. Amazing. Less than two decades ago, in 1998, the prediction was that by this time in 2016 oil production would be past its peak. In fact, the gloom-and-doom experts were called Oil Peakists. Note this from Science magazine back in 1998:

From Science magazine’s “The Next Oil Crisis Looms Large—and Perhaps Close,” Aug. 21, 1998:

This spring . . . the Paris-based International Energy Agency (IEA) of the Organization for Economic Cooperation and Development (OECD) reported for the first time that the peak of world oil production is in sight. Even taking into account the best efforts of the explorationists and the discovery of new fields in frontier areas like the Caspian Sea . . . sometime between 2010 and 2020 the gush of oil from wells around the world will peak at 80 million barrels per day, then begin a steady, inevitable decline, the report says.

However, technology, especially here in the U.S., relegated that prediction to the proverbial dustbin of history. With the private development of revolutionary shale fracking and horizontal drilling technology, vast new energy resources were opened up in places like North Dakota, Ohio, Pennsylvania and even the older Permian field in West Texas. The U.S. is now pumping 9 million barrels of oil a day, and trillions of cubic feet of gas. We are no longer dependent on importing Middle East oil; in fact, much of the oil we do import comes from our neighbors Canada and Mexico.

In the wake of lifting sanctions against nuclear Iran, oil is beginning to flow again to the European Union from Tehran, which says it could add another 500,000 barrels in production this year. U.S. oil is also flowing to Europe now that the 43-year-old ban on oil exports was lifted and signed into law late in 2015. The first shipment of sweet crude, drawn from the Eagle Ford Shale field in South Texas, left the port of Corpus Christi, Texas on New Year’s Eve and landed at the port of Marseilles on Friday. Another shipment out of Houston made it to Rotterdam on Thursday, and a third, also out of Houston, is on its way to Marseilles. The oil is the equivalent of so-called Saudi light or sweet crude, which doesn’t require as much refining and so yields better profit margins for refiners.

So, why do we have this glut? 

The world’s economies are slowing down, especially that of China, the big consumer of raw materials and energy. China’s slowdown in trade is hitting exporters of commodities like oil, gas, copper, aluminum and iron ore: Australia, Brazil, Canada, Russia, Venezuela and African countries. Where China was purportedly growing at more than 10 percent annually, the evidence is that growth has fallen to less than a third of that towering, inflated level. We have come to realize those growth estimates were based on questionable figures prepared by the Chinese government; some economic experts suggest the annual growth in GDP may be less than three percent. With that news came the sudden plummeting of world trading markets for commodities, especially oil.

There is also the great geo-political resource game going on in the Middle East between Saudi Arabia and Iran, and let’s not forget Russia. Saudi Arabia, the keystone of the OPEC oil cartel, is not listening to the other members of the group, who at meetings in Vienna have demanded that it reduce production. It is pumping oil and still making money, because its production costs are less than $5 a barrel, despite a yawning budget deficit of $98 billion. The Saudis have an estimated $600 billion in hard currency reserves, which provides a cushion to ride out the geo-political storm. They are using the oil weapon to beat back competitors: Iran across the Persian Gulf; Russia, which has forces in Syria supporting the Assad regime; and the newly resurgent producer, the U.S. Russia, as Shoshana Bryen of the Washington, D.C.-based Jewish Policy Center pointed out in a recent interview, mispriced its budget at $119 a barrel of oil, then redid the numbers at $87, only to see the price plummet to less than $30 at one point.

So what is the impact here in the U.S.?

When was your last trip to fill up your car at the gas station? Here on the Gulf Coast in the U.S., regular unleaded gas is currently selling for less than $1.80 a gallon. That means savings to consumers, who appear to be putting away the difference while awaiting a return to a more confident economy. Diesel, at one point priced nearly $1 a gallon above gasoline, now carries a differential of less than ten cents a gallon, which means that moving shipments via long-haul truckers has gotten cheaper. Lower jet fuel costs are reflected in the huge profits being declared by the major airlines, though some of that may be due to lagging airline ticket surcharges that remain in place. However, the drop in oil production is also squeezing the profit margins of rail carriers, who minted money hauling trainloads of combustible crude out of the Bakken formation in North Dakota. The drop in oil prices occasioned by the glut also means that cheaper petrochemical feedstocks are enhancing profit margins for plastics.

Remember the discussion about lifting the 43-year-old oil export ban?

One of the by-products was that U.S. crude pricing converged with world pricing. If you went onto the NYMEX oil trading floor in lower Manhattan, you would see traders vying for futures contracts in West Texas Intermediate (WTI) versus Brent, the North Sea crude oil benchmark. The lifting of the U.S. oil export ban virtually eliminated the difference, making Brent the world standard. As of Friday, January 22, 2016, WTI was $32.19 per barrel for March 2016 deliveries, a 9.0% jump, and Brent, priced out of the London ICE, was $32.18. Heavier grades like Canadian tar sands or Venezuelan heavy sulfur crude require more refining to produce various products; these grades actually sell at discounts from those benchmarks of as much as five dollars.
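The near-total convergence of the two benchmarks, and the discount applied to heavier grades, can be sketched with simple arithmetic (a minimal illustration using the January 22, 2016 quotes given above; the variable names are for this example only):

```python
# Benchmark quotes from the column (Jan. 22, 2016, March 2016 deliveries).
wti = 32.19    # West Texas Intermediate, dollars per barrel
brent = 32.18  # Brent, the North Sea benchmark, dollars per barrel

# With the export ban lifted, the WTI-Brent spread has all but vanished.
spread = round(wti - brent, 2)
print(f"WTI-Brent spread: ${spread:.2f}/barrel")  # one cent

# Heavier grades (Canadian tar sands, Venezuelan heavy sulfur crude)
# can sell at discounts of as much as $5 below the benchmarks.
max_discount = 5.00
heavy_floor = round(brent - max_discount, 2)
print(f"Heavy-grade floor vs. Brent: ${heavy_floor:.2f}/barrel")
```

A one-cent spread on a roughly $32 barrel is what "virtually eliminated the difference" means in practice.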

Can we expect the oil glut to last? Hardly. The current excess supply will work itself off, and oil futures will gradually begin to rise again. That will bring rigs back on stream here in the U.S., it may cause Iran to produce more than its declared 500,000 barrels, and the Saudis would just be minting more billions to add to their hard currency reserves. However, by mid-century those fabled Saudi sweet crude reserves may well begin to tail off. Energy, whether oil or gas, will reflect the cyclical demands of the world’s economies. The U.S. stands in pretty good shape to weather the current volatility in trading markets, thanks to technology, entrepreneurial prowess and the lifting of the oil export embargo. Don’t panic; consider investing in contrarian values in the equity and debt markets. That is what long-term value investors do: they buy when values are relatively cheap compared to long-term returns.

EDITOR’S NOTE: This column originally appeared in the New English Review. An earlier version was published in the Newsletter of the Lisa Benson Show National Security Task Force, January 23, 2016.

MIT: Incandescents Now More Efficient than LEDs by Jeffrey A. Tucker

Researchers at MIT are publicizing that they have fixed the incandescent light bulb with a brilliant improvement. They have wrapped the interior filament in a crystal glass that both bounces light and contains heat, recycling energy in a way that addresses the main complaint against Edison’s bulb: it burns far too much energy for the light it produces.

Why is this interesting? About a decade ago, governments around the world developed a fetish for banning incandescents (through an efficiency rule) and replacing them with expensive LED technology and fluorescent bulbs. It happened in Europe first but eventually came to the United States. The last American factory to produce them closed in 2010, and they are ever harder to find even in the big-box hardware stores. (As with all such bans, there are exceptions for elites who desire specialty bulbs.)

The change has been seriously annoying for many consumers. It has even given rise to hoarding and gray markets (in Germany, such bulbs were repackaged as “heat balls”). It has produced something of a political backlash, too.

On a personal note, my own dear mother replaced all her incandescents with fluorescents several years ago. I was sitting in her house feeling vaguely irritated by the searing lights in the room — cold and dreary — and had to turn them off. Sitting in the dimly lit room, my thought was: this is what the government has done to us. A great invention from the dawn of modernity is being driven out of use. Do I have to bring my own candles next holiday season?

Why should governments be in the position of deciding which technologies can and cannot be used, as if consumers are too stupid to make such decisions for themselves? Who is to decide what is efficient, and what the proper trade-off should be between the energy expended and the light produced?

Maybe some people don’t mind the “inefficiency” of incandescent bulbs relative to the warm and wonderful light they produce. Entrepreneurs need to be able to discern and serve their needs.

The bans have given rise to a vast debate about which bulb is best and what kind of light technology governments should and should not permit. But these are really the wrong questions. The real issue should be: Why should governments be in the business of picking right and wrong technologies at all?

As the MIT innovation in lighting suggests, there are possibilities yet undiscovered that regulators have not thought of. If you write detailed regulations about existing technologies, you are forestalling the possibilities that scientists and entrepreneurs will discover new ways of doing things in the future.

A vast regulatory apparatus on cell phone technology in 1990 could never have imagined something like a modern cellphone. Regulations on digital commerce in 2000 might have stopped the rise of peer-to-peer services like Uber. Indeed, one of the reasons that the digital world is so innovative is precisely because the regulators haven’t yet caught up with the pace of innovation.

Regulations on technology freeze the status quo in place and make it permanent. How, for example, will regulations respond to the news that a new and improved form of incandescent bulb is possible? Early tests show it to be more efficient than the replacements which the regulations favor. Will there be a new vote, a rewrite of the law, a governing body that evaluates new lightbulbs, the same way we approach prescription drugs? None of this can possibly match the efficiency of a market process of trial and error, of experimentation, rejection, and adoption.

In government, a ban is a ban, something to be enforced, not tweaked according to new discoveries and approaches.

Herein we see the problems with all attempts by government to tightly manage any technology. Bitcoin is a great example. As soon as the price began to rise and the crypto sector began to appear viable, government agencies got in the business of regulating them as if the sector was already taking a shape that would last forever. And because technology and industry are always on the move, there is never a rational time to intervene with the proclamation “this is how it shall always be.”

Regulatory interventions stop the progress of history by disabling the limitless possibilities of the human imagination.

By the time regulators get around to rethinking the incandescent, the industry will probably have moved on to something new and even better, something no one can imagine could exist today.

Jeffrey A. Tucker

Jeffrey Tucker is Director of Digital Development at FEE, CLO of the startup Liberty.me, and editor at Laissez Faire Books. The author of five books, he speaks at FEE summer seminars and other events. His latest book is Bit by Bit: How P2P Is Freeing the World.

Marco Rubio’s Recent Climate Change of Heart ‘Disingenuous’

NEW YORK, NY /PRNewswire-USNewswire/ — In response to Marco Rubio’s recent campaign event in New Hampshire, where the candidate appears to have had a climate change of heart and called for America to be “number one in wind, and number one in solar, and number one in biofuels, and number one in renewables, number one in energy efficiency. Let’s lead in all of these things,” independent presidential candidate Ken Fields responded by saying:

“For someone who has so vehemently opposed any acknowledgement of the scientific consensus backing the evidence of human-caused climate change due to our planet’s reliance on fossil fuels, Rubio’s change of heart seems disingenuous at best. He has voted against energy efficiency and clean energy tax incentives. It’s hard to believe him.”

When pressed for further comment, Fields stated, “The recent and continued volatility in global oil markets should be evidence enough that energy security is not simply a matter of having and exploiting our own fossil fuel resources, but rather being completely independent of fossil fuels altogether.”

Fields officially launched his campaign last week, on January 8th, 2016. His platform revolves around his slogan, “Greatness Must Be Earned,” and to do great things he has advocated transitioning the country to 100% renewable energy over the next 20 years. His policy plan includes, but is not limited to, creating public and private mechanisms to encourage financial markets to participate, a tax holiday for repatriated corporate capital that is invested in renewables, and a carbon tax-and-dividend plan.

For further information on his policies and positions feel free to visit www.kenfields.net.

Devastating Impact of Marijuana Legalization on Colorado’s Children

National Families in Action reports:

Past-month marijuana use among Colorado’s adolescents, ages 12-17, was 74 percent higher than the national average (12.56% vs. 7.22% nationwide) for the two years following legalization in the state, according to a new report from the Rocky Mountain High Intensity Drug Trafficking Area.

Further, the average usage rate in states that have not legalized marijuana for medical use is lower (5.99%) than the average in states that have (8.52%) and far lower than the states that have fully legalized pot (11.31%).

Past-month marijuana use among college-age young adults, ages 18-25, was 62 percent higher than the national average (31.24% vs. 19.32%). Use in full legalization states was nearly double that of use in non-medical pot states (27.86% vs. 16.43% in 2014).

Adult past-month use was 104 percent greater than the national average (12.45% vs. 6.11% in 2014). Adult use in full legalization states was 11.83%, vs. 4.7% in non-legalization states.
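The “percent higher than the national average” figures reported above follow from straightforward ratio arithmetic. A minimal sketch, using the rates quoted from the report (the helper function name is ours, for illustration only):

```python
def pct_above(state_rate: float, national_rate: float) -> int:
    """Percent by which a rate exceeds the national average, rounded."""
    return round((state_rate / national_rate - 1) * 100)

# Colorado vs. national past-month use rates, per the HIDTA report.
print(pct_above(12.56, 7.22))   # adolescents 12-17 -> 74
print(pct_above(31.24, 19.32))  # college-age 18-25 -> 62
print(pct_above(12.45, 6.11))   # adults -> 104
```

Each quoted percentage checks out against the underlying rates.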

Read the Rocky Mountain HIDTA Report here.


ABOUT NATIONAL FAMILIES IN ACTION (NFIA)

NFIA consists of families, scientists, business leaders, physicians, addiction specialists, policymakers, and others committed to protecting children from addictive drugs. Our vision is:

  • Healthy, drug-free kids
  • Nurturing, addiction-free families
  • Scientifically accurate information and education
  • A nation free of Big Marijuana
  • Smart, safe, FDA-approved medicines developed from the cannabis plant (and other plants)
  • Expanded access to medicines in FDA clinical trials for children with epilepsy

RELATED ARTICLE: Last Thing Struggling Students Need is More Marijuana – Hudson Institute