Recent Energy & Environmental News

The latest Energy and Environmental Newsletter is now online.

Three particularly revealing items from a very busy news cycle:

  1. As a reward for her efforts to assist New Englanders threatened by industrial wind energy, citizen advocate Annette Smith was sued for “practicing law.” Fortunately this sham was quickly resolved in favor of common sense. (See here and here.)
  2. In an attempt to promote fiscal responsibility(!), some 350 of Australia’s climate scientists were given layoff notices. The argument made to keep these positions was revealing. Before: we have high-confidence computer models and strong certainty that we understand the climate. After: there are many climate unknowns, and the models need much more work. (See here and here.)
  3. After dealing with thousands of adults on environmental and energy issues, it’s clear that our current education system is not working. We need to start someplace to fix this, so here are my initial recommendations.

Some of the more interesting energy articles in this issue are:

11 Ways To Kill Industrial Wind Projects

The Windmills of Bernie’s Mind

Archive Study: Renewables Won’t Save Us, So What Will?

Professor Investigating Flint: Greed has Killed Public Science

Proposed Oregon Law will monitor net impacts of energy policies (!!)

Commentary on US Supreme Court “Clean Energy” decision (and another)

Offshore Wind Turbine Maintenance Cost: “100 Times More Expensive Than A New Turbine Itself”!

Bad Incentives Undermine the Scientific Process

Some of the more informative Global Warming articles in this issue are:

What Do We Know About CO2 and Global Temperatures?

The Four Errors of Mann’s Recent Peer-Reviewed Study

Greens vs Transparency

Dr. Christy’s Congressional AGW Testimony

Climate Scientists Misapplied Basic Physics

Can we just hit the “restart” button with Climate Science?

300 Scientists Officially Protest NOAA Data Secrecy (+ more)

House Votes for Open, Accountable Science

Audubon goes over the edge

PS: As always, please pass this on to open-minded citizens. If there are others who you think would benefit from being on our energy & environmental email list, please let me know. If at any time you’d like to be taken off the list, please let me know that too.

Settled Science: Global warming causes sea levels to rise — oops — fall, er, slowdown? Whatever!


2016 Claim: Wait! What?! Study: There is so much global warming that it is slowing the rise of sea levels – ‘Is there anything global warming can’t do? Now it seems that there is so much global warming that it is slowing the rise of sea levels.’

Climate Astrology: ‘Global Warming’ commands sea level rise increases… & sea level rise slowdown: NASA discovers that ‘global warming’ is slowing, not increasing, sea level rise – NASA study claim: ‘Because the Earth has become more parched, partly because humans are pumping out more ground water, the rising oceans are being absorbed by lakes, rivers, and underground aquifers, much like a sponge absorbs water. An extra 3.2 trillion tons of water has thus been soaked up and stored and is not pouring into the streets of coastal cities.’

NASA Study Concludes ‘Global Warming’ Is Actually Slowing Sea Level Rise – A new NASA study concludes global warming increases the amount of water stored underground which, in turn, slows the rate of sea level rise. At a time when scientists are worried about accelerating sea level rise, NASA scientist John Reager and his colleagues found an extra 3,200 gigatons of water was being stored by parched landscapes from 2002 to 2014, slowing sea level rise by 15 percent.
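As a rough sanity check on the 3,200-gigaton figure, one can convert that stored water into an equivalent sea-level offset. The two constants below are standard round numbers assumed for illustration, not taken from the article: 1 gigaton of water occupies about 1 km³, and the global ocean surface area is about 3.6 × 10⁸ km².

```python
# Back-of-envelope: sea-level equivalent of 3,200 Gt of water
# withheld on land over the study's 2002-2014 window.

GT_WATER_KM3 = 1.0       # ~1 km^3 per Gt of water (density ~1 t/m^3)
OCEAN_AREA_KM2 = 3.6e8   # approximate global ocean surface area

stored_gt = 3200.0
years = 2014 - 2002      # 12-year study window

volume_km3 = stored_gt * GT_WATER_KM3
# Spread the volume over the ocean surface; convert km to mm (1 km = 1e6 mm)
equivalent_mm = volume_km3 / OCEAN_AREA_KM2 * 1e6
rate_mm_per_yr = equivalent_mm / years

print(f"{equivalent_mm:.1f} mm total, ~{rate_mm_per_yr:.2f} mm/yr")
# -> 8.9 mm total, ~0.74 mm/yr
```

Against a measured rise of roughly 3–4 mm/yr over the same period, an offset of ~0.7 mm/yr is broadly consistent with the ~15 percent slowdown the study reports.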

2016 Study: Parched Earth soaks up water, slowing sea level rise

Flashback: Prominent Dutch Scientist: ‘I find the Doomsday picture Al Gore is painting – a 6m sea level rise, 15 times IPCC number — entirely without merit’

Flashback 1987: FSU Professor: Global Warming Causes Sea Level To Fall

The Palm Beach Post – July 6, 1987: By Mary McLachlin – Palm Beach Post Staff Writer – Via Real Climate Science website

Excerpt: Florida State University Geology Professor William Tanner: “Tanner plotted 4,000 years of sea-level data on 5,000 years of climatological data published in last year’s Encyclopedia of Climatology and found some interesting correlations. Every time the climate warmed a couple of degrees, the sea level went down. Every time the climate cooled a couple of degrees, the sea level went up. This happened four times, each cycle taking about 100 years, and spaced about 900 years apart.”

“He says sea level rise has been about six inches over the past century, and he now expects that to slow down and even reverse itself if humans continue warming the Earth.”

“We’ve made the assumption — and it’s logical — that if things get warm, the glaciers get warm, the glaciers are going to melt,” Tanner said. “But that’s not what these two curves show, no matter how logical it may be. Everybody’s been depending on logic without much data.”

“Tanner says he believes that when the climate warms just a little, it causes more evaporation from the oceans and they go down. He sees two separate systems at work — a big one in which the climate gets very warm or very cold and the oceans rise or fall dramatically, and a small system in which minor changes in temperature cause the opposite reactions.”

“My colleagues here to whom I have presented it in detail think it’s reasonable and probably correct.”


More on Geologist Dr. William F. Tanner here. – William F. Tanner (1917–2000), Geologist, of Tallahassee, Florida, died on April 9, 2000. Tanner was an ASA fellow and a member of ASA’s Affiliation of Christian Geologists. A professor of geology at Florida State University with emphasis on sedimentology, he was born in Milledgeville, Georgia in 1917. He held a B.A. from Baylor University, an M.A. from Texas Technological College, and a Ph.D. from Oklahoma University, all in Geology. He served as an Instructor at Oklahoma University, a visiting Professor of Geology at Florida State University, and Associate Professor and Professor of Geology at Florida State University; from 1974 he was Regents Professor. He had geological experience in much of the U.S., mostly in the Southeast, Southwest, and Rocky Mountain areas; maritime eastern Canada and the Canadian Rockies; Mexico, Panama, Colombia, Ecuador, Peru, Chile, Uruguay, various parts of Brazil, and Venezuela. His specialties within geology included sedimentology, sediment transport (including beach and river erosion), paleogeography and paleoclimatology, history of the atmosphere, and petroleum geology. Dr. Tanner was Editor of “Coastal Research,” Science Editor for the New Atlas of Florida, and Editor of six volumes on coastal sedimentology. He was the author of 275 technical papers.

Real Climate Science website note: 

This picture of Boca Raton was in that issue of the paper.


And this is what that beach looks like today. Nothing has changed.

Background on sea level rise: 

Flashback 1977: West Antarctic Ice Sheet Melt To Raise Sea Level 20 Ft. – National Science Foundation reveals: ‘It has nothing to do with a warmer climate, just the dynamics of unstable ice’

1977: ROSS ICE SHELF, Antarctica – A huge portion of the Antarctic ice mass appears to be collapsing into the sea, a catastrophe that could raise the levels of the oceans by almost 20 feet.  “We’re seeing the West ice sheet on its way out,” said Richard Cameron of the National Science Foundation. “It seems to be doing something completely different than the east ice sheet. It has nothing to do with a warmer climate, just the dynamics of unstable ice.”…”We’re doing about the most we can do right now to study the possible collapse of the west ice sheet,” said Dr. Richard Cameron, NSF program manager for glaciology. “It has become an area of concern because we could be on the brink of a rise in sea levels.” SUCH A RAPID rise is not unprecedented. It may have caused the Great Deluge described in the Old Testament.

Flashback: Planet Healer Obama Calls It: In 2008, he declared his presidency would result in ‘the rise of the oceans beginning to slow’ — And By 2011, Sea Level Drops!

Climate Depot’s Morano: ‘It is just possible that Obama has powers and abilities far beyond those of mortal men — since sea levels actually cooperated with Obama’s pledge!’

Flashback 1986: Scientists Were “Sure” Sea Level Would Rise One Foot By 2016

Analysis of latest sea level rise claims: Examination of the data from the paper, however, shows the range of proxy sea levels is approximately 10 meters, far too large to discern the tiny ~1.5 mm/yr sea level rise over the past 150 years. The authors instead assume, from other published studies of tide gauge measurements, that the ~1.5 mm/yr sea level rise began roughly 150 years ago. Other papers find sea levels rising only 1.1–1.3 mm/yr over the past 203 years, and without acceleration.

Regardless, even the IPCC concedes that there was no significant anthropogenic influence on climate prior to 1950; thus man cannot be responsible for sea level rise that began 150–200 years ago, at the end of the Little Ice Age.

The sea level rise over the past ~200 years shows no evidence of acceleration, which is necessary to assume a man-made influence. Sea level rise instead decelerated over the 20th century, decelerated 31% since 2002, and decelerated 44% since 2004, to less than 7 inches per century. There is no evidence of an acceleration of sea level rise, and therefore no evidence of any man-made effect on sea levels. Sea level rise is primarily a local phenomenon related to land subsidence, not CO2 levels. Therefore, areas with groundwater depletion and land subsidence have much higher rates of relative sea level rise, but this has absolutely nothing to do with man-made CO2.
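The “less than 7 inches per century” figure follows directly from the rates quoted above; a one-line unit conversion (using 25.4 mm per inch) makes the equivalence explicit:

```python
def mm_per_year_to_inches_per_century(rate_mm_yr: float) -> float:
    """Convert a sea-level trend from mm/yr to inches per century."""
    MM_PER_INCH = 25.4
    return rate_mm_yr * 100.0 / MM_PER_INCH

# The rates quoted in the text, in mm/yr
for rate in (1.1, 1.3, 1.5):
    print(f"{rate} mm/yr = {mm_per_year_to_inches_per_century(rate):.1f} in/century")
# -> 1.1 mm/yr = 4.3 in/century
# -> 1.3 mm/yr = 5.1 in/century
# -> 1.5 mm/yr = 5.9 in/century
```

Even the highest quoted rate, 1.5 mm/yr, works out to about 5.9 inches per century, under the 7-inch figure.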


Read more:

VIDEO: The Supreme Court Pause of EPA’s Carbon Regulations Explained

Why did the Supreme Court pause EPA’s Clean Power Plan?

The Supreme Court granted a stay of EPA’s carbon regulations—the Clean Power Plan.

The Wall Street Journal editorial board called it an “important rebuke to the political method of the anticarbon activists in the EPA and White House.”

Ditching fossil fuels will be a capital-intensive and generation-long transition, to the extent it is possible, and states must submit compliance plans as soon as this September that are supposed to last through 2030, or be subject to a federal takeover.

The legal challenges will take years, but the EPA hopes to engineer a fait accompli by bullrushing the states into making permanent revisions immediately. Once the Clean Power Plan starts, it becomes self-executing. If the EPA loses down the road, it will shrug that the ruling came too late to matter.

[ … ]

The stay suggests that a majority of the Court won’t allow this deliberate gaming of the slow pace of the legal process to become de facto immunity for anything the EPA favors. It’s especially notable because courts tend to be highly deferential to executive regulation.

What exactly did the court do?

Why did the court do this?

And why have states, businesses, labor unions, and trade associations — including the U.S. Chamber — welcomed this decision as they fight EPA’s regulatory overreach?

I spoke with Heath Knakmuhs, senior director of policy at the Institute for 21st Century Energy to get some answers.

And to understand the international implications of the Supreme Court’s stay, read Stephen Eule’s piece.

New York’s Chilling Global Warming Witch Hunt by Walter Olson

New York Attorney General Eric Schneiderman is pursuing an investigation of the Exxon Corporation in part for making donations to think tanks and associations like the American Enterprise Institute and American Legislative Exchange Council, which mostly work on issues unrelated to the environment but have also published some views flayed by opponents as “climate change denial.”

Assuming the First Amendment protects a right to engage in scholarship, advocacy, and other forms of supposed denial, it is by no means clear that information about such donations would yield a viable prosecution. Which means, notes Hans Bader of the Competitive Enterprise Institute, that the New York probe raises an issue of constitutional dimensions not just at some point down the road, but right now:

A prolonged investigation in response to someone’s speech can violate the First Amendment even when it never leads to a fine. For example, a federal appeals court ruled in White v. Lee, 227 F.3d 1214 (9th Cir. 2000) that lengthy, speech-chilling civil rights investigations by government officials can violate the First Amendment even when they are eventually dropped without imposing any fine or disciplinary action.

It found this principle was so plain and obvious that it denied individual civil rights officials qualified immunity for investigating citizens for speaking out against a housing project for people protected by the Fair Housing Act.

In another case, in which a company had been sued seeking damages over its participation in trade-association-related speech, a federal appeals court found that the pendency of the lawsuit all by itself caused enough of a burden on the firm’s speech rights that the court used its mandamus power to order the trial judge to dismiss the claims, a remarkable step.

Moreover, Bader writes, a string of federal precedents indicate that the constitutional rights Schneiderman is trampling here are not just Exxon’s but those of the organizations it gave to, which have a right to challenge his action whether or not the oil company chooses to do so:

These groups themselves can sue Schneiderman under the First Amendment, if Schneiderman’s pressure causes them to lose donations they would otherwise receive. Government officials cannot pressure a private party to take adverse action against a speaker.

Meanwhile, writing at Liberty and Law, Prof. Philip Hamburger of Columbia Law School takes a different tack: the subpoenas imperil due process and separation of powers because they issue at the whim of Schneiderman’s office.

Earlier ideas of constitutional government “traditionally left government no power to demand testimony, papers, or other information, except under the authority of a judge or a legislative committee.” In more recent years executive subpoena power has proliferated; so has the parallel power of lawyers in private litigation to demand discovery, but the latter at least in theory goes on under judicial supervision that can check some of its abuse and invasiveness.

Extrajudicial subpoenas by AG offices are particularly dangerous, Hamburger argues, because of their crossover civil/criminal potential: the targets do not enjoy a high level of procedural protection when “attorneys general claim to be acting merely in a civil rather than a criminal capacity,” yet the same offices can and do threaten criminal charges. Especially dangerous is New York’s Martin Act, a charter for general invasion of the private papers of anyone and anything with a connection to New York financial transactions.

An attorney general’s concern about fraud or the “public interest” is no justification for allowing him to rifle through private papers.

When he thereby extracts the basis for a criminal prosecution, he evades the grand jury process. When he thereby lays the groundwork for a civil enforcement proceeding, he evades the due process of law, for there ordinarily is no discovery for a plaintiff until he commences a civil action.

Even worse, when a prosecutor uses a subpoena to get a remunerative settlement, it is akin to extortion — this being the most complete end run around the courts.

Previously on the probe here and here (and earlier here and here), and on the New York attorney general’s office here and here.

Cross-posted from Overlawyered.

Walter Olson
Walter Olson is a senior fellow at the Cato Institute’s Center for Constitutional Studies.

Is the Scientific Process Broken? by Jenna Robinson

The scientific process is broken. The tenure process, “publish or perish” mentality, and the insufficient review process of academic journals mean that researchers spend less time solving important puzzles and more time pursuing publication. But that wasn’t always the case.

In 1962, chemist and social scientist Michael Polanyi described scientific discovery as a spontaneous order, likening it to Adam Smith’s invisible hand. In “The Republic of Science: Its Political and Economic Theory,” originally printed in Minerva magazine, Polanyi used an analogy of many people working together to solve a jigsaw puzzle to explain the progression of scientific discovery.

Polanyi begins: “Imagine that we are given the pieces of a very large jigsaw puzzle, and … it is important that our giant puzzle be put together in the shortest possible time. We would naturally try to speed this up by engaging a number of helpers; the question is in what manner these could be best employed.”

He concludes,

The only way the assistants can effectively co-operate, and surpass by far what any single one of them could do, is to let them work on putting the puzzle together in sight of the others so that every time a piece of it is fitted in by one helper, all the others will immediately watch out for the next step that becomes possible in consequence.

Under this system, each helper will act on his own initiative, by responding to the latest achievements of the others, and the completion of their joint task will be greatly accelerated. We have here in a nutshell the way in which a series of independent initiatives are organized to a joint achievement by mutually adjusting themselves at every successive stage to the situation created by all the others who are acting likewise.

Polanyi’s faith in this process, decentralized to academics around the globe, was strong. He claimed, “The pursuit of science by independent self-co-ordinated initiatives assures the most efficient possible organization of scientific progress.”

But somewhere in the last 54 years, this decentralized, efficient system of scientific progress seems to have veered off course. The incentives created by universities and academic journals are largely to blame.

The National Academies of Science noted last year that there has been a tenfold increase since 1975 in scientific papers retracted because of fraud. A popular scientific blog, Retraction Watch, reports daily on retractions, corrections, and fraud from all corners of the scientific world.

Some argue that such findings aren’t evidence that science is broken — just very difficult. News “explainer” Vox recently defended the process, calling science “a long and grinding process carried out by fallible humans, involving false starts, dead ends, and, along the way, incorrect and unimportant studies that only grope at the truth, slowly and incrementally.”

Of course, finding and correcting errors is a normal and expected part of the scientific process. But there is more going on.

A recent article in Proceedings of the National Academy of Sciences documented that the problem in biomedical and life sciences is more attributable to bad actors than human error. Its authors conducted a detailed review of all 2,047 retracted research articles in those fields, which revealed that only 21.3 percent of retractions were attributable to error. In contrast, 67.4 percent of retractions were attributable to misconduct, including fraud or suspected fraud (43.4 percent), duplicate publication (14.2 percent), and plagiarism (9.8 percent).

Even this article on FiveThirtyEight, which attempts to defend the current scientific community from its critics, admits, “bad incentives are blocking good science.”

Polanyi doesn’t take these bad incentives into account—and perhaps they weren’t as pronounced in 1960s England as they are in the modern United States. In his article, he assumes that professional standards are enough to ensure that contributions to the scientific discussion would be plausible, accurate, important, interesting, and original. He fails to mention the strong incentives, produced by the tenure process, to publish in journals of particular prestige and importance.

This “publish or perish” incentive means that researchers are rewarded more for frequent publication than for dogged progress towards solving scientific puzzles. It has also led to the proliferation of academic journals — many lacking the quality control we have come to expect in academic literature. This article by British pharmacologist David Colquhoun concludes, “Pressure on scientists to publish has led to a situation where any paper, however bad, can now be printed in a journal that claims to be peer-reviewed.”

Academic journals, with their own internal standards, exacerbate this problem.

Science recently reported that less than half of 100 studies published in 2008 in top psychology journals could be replicated successfully. The Reproducibility Project: Psychology, led by Brian Nosek of the University of Virginia, was responsible for the effort and included 270 scientists who re-ran other people’s studies.

The rate of reproducibility was likely low because journals give preference to “new” and exciting findings, damaging the scientific process. The Economist reported in 2013 that “‘Negative results’ now account for only 14% of published papers, down from 30% in 1990” and observed, “Yet knowing what is false is as important to science as knowing what is true.”

These problems, taken together, create an environment where scientists are no longer collaborating to solve the puzzle. They are instead pursuing tenure and career advancement.

But the news is not all bad. Recent efforts for science to police itself are beginning to change researchers’ incentives. The Reproducibility Project (mentioned above) is part of a larger effort called the Open Science Framework (OSF). The OSF is a “scholarly commons” that works to improve openness, integrity and reproducibility of research.

Similarly, the Center for Scientific Integrity was established in 2014 to promote transparency and integrity in science. Its major project, Retraction Watch, houses a database of retractions that is freely available to scientists and scholars who want to improve science.

A new project called Heterodox Academy will help to address some research problems in the social sciences. The project has been created to improve the diversity of viewpoints in the academy. Their work is of great importance; psychologists have demonstrated the importance of such diversity for enhancing creativity, discovery, and problem solving.

These efforts will go a long way toward restoring the professional standards that Polanyi thought were essential to ensure that research remains plausible, accurate, important, interesting, and original. But ultimately, the tenure process and peer review must change in order to save scientific integrity.

This article first appeared at the Pope Center for Higher Education.

Jenna Robinson

Jenna Robinson is director of outreach at the Pope Center for Higher Education Policy.

Policy Science Kills: The Case of Eugenics by Jeffrey A. Tucker

The climate-change debate has many people wondering whether we should really turn over public policy — which deals with fundamental matters of human freedom — to a state-appointed scientific establishment. Must moral imperatives give way to the judgment of technical experts in the natural sciences? Should we trust their authority? Their power?

There is a real history here to consult. The integration of government policy and scientific establishments has reinforced bad science and yielded ghastly policies.

An entire generation of academics, politicians, and philanthropists used bad science to plot the extermination of undesirables.

There’s no better case study than the use of eugenics: the science, so called, of breeding a better race of human beings. It was popular in the Progressive Era and following, and it heavily informed US government policy. Back then, the scientific consensus was all in for public policy founded on high claims of perfect knowledge based on expert research. There was a cultural atmosphere of panic (“race suicide!”) and a clamor for the experts to put together a plan to deal with it. That plan included segregation, sterilization, and labor-market exclusion of the “unfit.”

Ironically, climatology had something to do with it. Harvard professor Robert DeCourcy Ward (1867–1931) is credited with holding the first chair of climatology in the United States. He was a consummate member of the academic establishment. He was editor of the American Meteorological Journal, president of the Association of American Geographers, and a member of both the American Academy of Arts and Sciences and the Royal Meteorological Society of London.

He also had an avocation. He was a founder of the American Restriction League. It was one of the first organizations to advocate reversing the traditional American policy of free immigration and replacing it with a “scientific” approach rooted in Darwinian evolutionary theory and the policy of eugenics. Centered in Boston, the league eventually expanded to New York, Chicago, and San Francisco. Its science inspired a dramatic change in US policy over labor law, marriage policy, city planning, and, its greatest achievements, the 1921 Emergency Quota Act and the 1924 Immigration Act. These were the first-ever legislated limits on the number of immigrants who could come to the United States.

Nothing Left to Chance

“Darwin and his followers laid the foundation of the science of eugenics,” Ward alleged in his manifesto published in the North American Review in July 1910. “They have shown us the methods and possibilities of the production of new species of plants and animals…. In fact, artificial selection has been applied to almost every living thing with which man has close relations except man himself.”

“Why,” Ward demanded, “should the breeding of man, the most important animal of all, alone be left to chance?”

By “chance,” of course, he meant choice.

“Chance” is how the scientific establishment of the Progressive Era regarded the free society. Freedom was considered to be unplanned, anarchic, chaotic, and potentially deadly for the race. To the Progressives, freedom needed to be replaced by a planned society administered by experts in their fields. It would be another 100 years before climatologists themselves became part of the policy-planning apparatus of the state, so Professor Ward busied himself in racial science and the advocacy of immigration restrictions.

Ward explained that the United States had a “remarkably favorable opportunity for practising eugenic principles.” And there was a desperate need to do so, because “already we have not hundreds of thousands, but millions of Italians and Slavs and Jews whose blood is going into the new American race.” This trend could cause Anglo-Saxon America to “disappear.” Without eugenic policy, the “new American race” will not be a “better, stronger, more intelligent race” but rather a “weak and possibly degenerate mongrel.”

Citing a report from the New York Immigration Commission, Ward was particularly worried about mixing American Anglo-Saxon blood with “long-headed Sicilians and those of the round-headed east European Hebrews.”

Keep Them Out

“We certainly ought to begin at once to segregate, far more than we now do, all our native and foreign-born population which is unfit for parenthood,” Ward wrote. “They must be prevented from breeding.”

But even more effective, Ward wrote, would be strict quotas on immigration. While “our surgeons are doing a wonderful work,” he wrote, they can’t keep up in filtering out people with physical and mental disabilities pouring into the country and diluting the racial stock of Americans, turning us into “degenerate mongrels.”

Such were the policies dictated by eugenic science, which, far from being seen as quackery from the fringe, was in the mainstream of academic opinion. President Woodrow Wilson, America’s first professorial president, embraced eugenic policy. So did Supreme Court Justice Oliver Wendell Holmes Jr., who, in upholding Virginia’s sterilization law, wrote, “Three generations of imbeciles are enough.”

Looking through the literature of the era, I am struck by the near absence of dissenting voices on the topic. Popular books advocating eugenics and white supremacy, such as The Passing of the Great Race by Madison Grant, became immediate bestsellers. The opinions in these books — which are not for the faint of heart — were expressed long before the Nazis discredited such policies. They reflect the thinking of an entire generation, and are much more frank than one would expect to read now.

It’s crucial to understand that all these opinions were not just about pushing racism as an aesthetic or personal preference. Eugenics was about politics: using the state to plan the population. It should not be surprising, then, that the entire anti-immigration movement was steeped in eugenics ideology. Indeed, the more I look into this history, the less I am able to separate the anti-immigrant movement of the Progressive Era from white supremacy in its rawest form.

Shortly after Ward’s article appeared, the climatologist called on his friends to influence legislation. Restriction League president Prescott Hall and Charles Davenport of the Eugenics Record Office began the effort to pass a new law with specific eugenic intent. It sought to limit the immigration of southern Italians and Jews in particular. And immigration from Eastern Europe, Italy, and Asia did indeed plummet.

The Politics of Eugenics

Immigration wasn’t the only policy affected by eugenic ideology. Edwin Black’s War Against the Weak: Eugenics and America’s Campaign to Create a Master Race (2003, 2012) documents how eugenics was central to Progressive Era politics. An entire generation of academics, politicians, and philanthropists used bad science to plot the extermination of undesirables. Laws requiring sterilization claimed 60,000 victims. Given the attitudes of the time, it’s surprising that the carnage in the United States was so low. Europe, however, was not as fortunate.

Freedom was considered to be unplanned, anarchic, chaotic, and potentially deadly for the race. 

Eugenics became part of the standard curriculum in biology, with William Castle’s 1916 Genetics and Eugenics commonly used for over 15 years, through four successive editions.

Literature and the arts were not immune. John Carey’s The Intellectuals and the Masses: Pride and Prejudice Among the Literary Intelligentsia, 1880–1939 (2005) shows how the eugenics mania affected the entire modernist literary movement of the United Kingdom, with such famed minds as T.S. Eliot and D.H. Lawrence getting wrapped up in it.

Economics Gets In on the Act

Remarkably, even economists fell under the sway of eugenic pseudoscience. Thomas Leonard’s explosively brilliant Illiberal Reformers: Race, Eugenics, and American Economics in the Progressive Era (2016) documents in excruciating detail how eugenic ideology corrupted the entire economics profession in the first two decades of the 20th century. Across the board, in the books and articles of the profession, you find all the usual concerns about race suicide, the poisoning of the national bloodstream by inferiors, and the desperate need for state planning to breed people the way ranchers breed animals. Here we find the template for the first-ever large-scale implementation of scientific social and economic policy.

Students of the history of economic thought will recognize the names of these advocates: Richard T. Ely, John R. Commons, Irving Fisher, Henry Rogers Seager, Arthur N. Holcombe, Simon Patten, John Bates Clark, Edwin R.A. Seligman, and Frank Taussig. They were the leading members of the professional associations, the editors of journals, and the high-prestige faculty members of the top universities. It was a given among these men that classical political economy had to be rejected. There was a strong element of self-interest at work. As Leonard puts it, “laissez-faire was inimical to economic expertise and thus an impediment to the vocational imperatives of American economics.”

Irving Fisher, whom Joseph Schumpeter described as “the greatest economist the United States has ever produced” (an assessment later repeated by Milton Friedman), urged Americans to “make of eugenics a religion.”

Speaking at the Race Betterment Conference in 1915, Fisher said eugenics was “the foremost plan of human redemption.” The American Economic Association (which is still today the most prestigious trade association of economists) published openly racist tracts such as the chilling Race Traits and Tendencies of the American Negro by Frederick Hoffman. It was a blueprint for the segregation, exclusion, dehumanization, and eventual extermination of the black race.

Hoffman’s book called American blacks “lazy, thriftless, and unreliable,” and well on their way to a condition of “total depravity and utter worthlessness.” Hoffman contrasted them with the “Aryan race,” which is “possessed of all the essential characteristics that make for success in the struggle for the higher life.”

Even as Jim Crow restrictions were tightening against blacks, and the full weight of state power was being deployed to wreck their economic prospects, the American Economic Association’s tract said that the white race “will not hesitate to make war upon those races who prove themselves useless factors in the progress of mankind.”

Richard T. Ely, a founder of the American Economic Association, advocated segregation of nonwhites (he seemed to have a special loathing of the Chinese) and state measures to prohibit their propagation. He took issue with the very “existence of these feeble persons.” He also supported state-mandated sterilization, segregation, and labor-market exclusion.

That such views were not considered shocking tells us so much about the intellectual climate of the time.

If your main concern is who is bearing whose children, and how many, it makes sense to focus on labor and income. Only the fit should be admitted to the workplace, the eugenicists argued. The unfit should be excluded so as to discourage their immigration and, once here, their propagation. This was the origin of the minimum wage, a policy designed to erect a high wall against the “unemployables.”

Women, Too

Another implication follows from eugenic policy: government must control women.

It must control their comings and goings. It must control their work hours — or whether they work at all. As Leonard documents, here we find the origin of the maximum-hour workweek and many other interventions against the free market. Women had been pouring into the workforce for the last quarter of the 19th century, gaining the economic power to make their own choices. Minimum wages, maximum hours, safety regulations, and so on passed in state after state during the first two decades of the 20th century and were carefully targeted to exclude women from the workforce. The purpose was to control contact, manage breeding, and reserve the use of women’s bodies for the production of the master race.

Leonard explains:

American labor reformers found eugenic dangers nearly everywhere women worked, from urban piers to home kitchens, from the tenement block to the respectable lodging house, and from factory floors to leafy college campuses. The privileged alumna, the middle-class boarder, and the factory girl were all accused of threatening Americans’ racial health.

Paternalists pointed to women’s health. Social purity moralists worried about women’s sexual virtue. Family-wage proponents wanted to protect men from the economic competition of women. Maternalists warned that employment was incompatible with motherhood. Eugenicists feared for the health of the race.

“Motley and contradictory as they were,” Leonard adds, “all these progressive justifications for regulating the employment of women shared two things in common. They were directed at women only. And they were designed to remove at least some women from employment.”

The Lesson We Haven’t Learned

Today we find eugenic aspirations to be appalling. We rightly value the freedom of association. We understand that permitting people free choice over reproductive decisions does not threaten racial suicide but rather points to the strength of a social and economic system. We don’t want scientists using the state to cobble together a master race at the expense of freedom. For the most part, we trust the “invisible hand” to govern demographic trajectories, and we recoil at those who don’t.

But back then, eugenic ideology was conventional scientific wisdom, and hardly ever questioned except by a handful of old-fashioned advocates of laissez-faire. The eugenicists’ books sold in the millions, and their concerns became primary in the public mind. Dissenting scientists — and there were some — were excluded by the profession and dismissed as cranks attached to a bygone era.

Eugenic views had a monstrous influence over government policy, and they ended free association in labor, marriage, and migration. Indeed, the more you look at this history, the more it becomes clear that white supremacy, misogyny, and eugenic pseudoscience were the intellectual foundations of modern statecraft.


Why is there so little public knowledge of this period and the motivations behind its progress? Why has it taken so long for scholars to blow the lid off this history of racism, misogyny, and the state?

The partisans of the state regulation of society have no reason to talk about it, and today’s successors of the Progressive Movement and its eugenic views want to distance themselves from the past as much as possible. The result has been a conspiracy of silence.

There are, however, lessons to be learned. When you hear of some impending crisis that can only be solved by scientists working with public officials to force people into a new pattern that is contrary to their free will, there is reason to raise an eyebrow. Science is a process of discovery, not an end state, and its consensus of the moment should not be enshrined in the law and imposed at gunpoint.

We’ve been there and done that, and the world is rightly repulsed by the results.

Jeffrey A. Tucker

Jeffrey Tucker is Director of Digital Development at FEE, CLO of the startup, and editor at Laissez Faire Books. Author of five books, he speaks at FEE summer seminars and other events. His latest book is Bit by Bit: How P2P Is Freeing the World.  Follow on Twitter and Like on Facebook.

Is Cheap Gas a Bad Thing? by Randal O’Toole

Remember peak oil? Remember when oil prices were $140 a barrel and Goldman Sachs predicted they would soon reach $200? Now, the latest news is that oil prices have gone up all the way to $34 a barrel. Last fall, Goldman Sachs predicted prices would fall to $20 a barrel, which other analysts argued was “no better than its prior predictions,” but in fact they came a lot closer to that than to $200.

Low oil prices generate huge economic benefits. Low prices mean increased mobility, which means increased economic productivity. The end result, says Bank of America analyst Francisco Blanch, is “one of the largest transfers of wealth in human history” as $3 trillion remain in consumers’ pockets rather than going to the oil companies. I wouldn’t call this a “wealth transfer” so much as a reduction in income inequality, but either way, it is a good thing.

Naturally, some people hate the idea of increased mobility from lower fuel prices. “Cheap gas raises fears of urban sprawl,” warns NPR. Since “urban sprawl” is a made-up problem, I’d have to rewrite this as, “Cheap gas raises hopes of urban sprawl.” The only real “fear” is on the part of city officials who want everyone to pay taxes to them so they can build stadiums, light-rail lines, and other useless urban monuments.

A more cogent argument is made by UC Berkeley sustainability professor Maximilian Auffhammer, who argues that “gas is too cheap” because current prices fail to cover all of the external costs of driving. He cites what he calls a “classic paper” that calculates the external costs of driving to be $2.28 per gallon. If that were true, then one approach would be to tax gasoline $2.28 a gallon and use the revenues to pay those external costs.

The only problem is that most of the so-called external costs aren’t external at all but are paid by highway users. The largest share of calculated costs, estimated at $1.05 a gallon, is the cost of congestion. This is really a cost of bad planning, not gasoline. Either way, the cost is almost entirely paid by people in traffic consuming that gasoline.

The next largest cost, at 63 cents a gallon, is the cost of accidents. Again, this is partly a cost of bad planning: remember how fatality rates dropped nearly 20 percent between 2007 and 2009, largely due to the reduction in congestion caused by the recession? This decline could have taken place years before if cities had been serious about relieving congestion rather than ignoring it. In any case, the cost of accidents, like the cost of congestion, is largely internalized by drivers through insurance.

The next-largest cost, pegged at 42 cents per gallon, is “local pollution.” While that is truly an external cost, it is also rapidly declining as shown in figure 1 of the paper. According to EPA data, total vehicle emissions of most pollutants have declined by more than 50 percent since the numbers used in this 2006 report. Thus, the 42 cents per gallon is more like 20 cents per gallon and falling fast. [Ed. note: And pollution is also mostly due to congestion.]

At 12 cents a gallon, the next-largest cost is “oil dependency,” which the paper defines as exposing “the economy to energy price volatility and price manipulation” that “may compromise national security and foreign policy interests.” That problem, which was questionable in the first place, seems to have gone away thanks to the resurgence of oil production within the United States, which has made other oil producers, such as Saudi Arabia, more dependent on us than we are on them.

Finally, at a mere 6 cents per gallon, is the cost of greenhouse gas emissions. If you believe this is a cost, it will decline when measured as a cost per mile as cars get more fuel efficient under the current CAFE standards. But it should remain fixed as a cost per gallon as burning a gallon of gasoline will always produce a fixed amount of greenhouse gases.

In short, rather than $2.28 per gallon, the external cost of driving is closer to 26 cents per gallon. Twenty cents of that is steadily declining as cars get cleaner, and all of it is declining when measured per mile as cars get more fuel-efficient.
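The arithmetic behind the paper’s $2.28 figure and the revision above can be checked directly. A minimal sketch (the component figures are those cited above; the ~20-cent pollution figure is the revised estimate discussed earlier, not a number from the paper itself):

```python
# External-cost components of driving, in dollars per gallon,
# as cited from the paper discussed above.
paper_costs = {
    "congestion": 1.05,
    "accidents": 0.63,
    "local_pollution": 0.42,
    "oil_dependency": 0.12,
    "greenhouse_gases": 0.06,
}
print(round(sum(paper_costs.values()), 2))  # 2.28 -- the paper's headline figure

# The revision: congestion, accidents, and oil dependency are treated as
# internalized (or moot), and local pollution is cut to roughly 20 cents
# to reflect post-2006 emission declines.
revised_costs = {
    "local_pollution": 0.20,
    "greenhouse_gases": 0.06,
}
print(round(sum(revised_costs.values()), 2))  # 0.26 -- "around 26 cents per gallon"
```

The exercise shows why the headline number is dominated by the congestion and accident components, which are the ones most plausibly borne by drivers themselves.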

It’s worth noting that, though we are seeing an increase in driving due to low fuel prices, the amount of driving we do isn’t all that sensitive to fuel prices. Real gasoline prices doubled between 2000 and 2009, yet per capita driving continued to grow until the recession began. Prices have fallen by 50 percent in the last six months or so, yet the 3 or 4 percent increase in driving may be as much due to increased employment as to more affordable fuel.
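The insensitivity claim above can be put in rough elasticity terms. A back-of-envelope sketch (the 3.5 percent figure is simply the midpoint of the 3 to 4 percent range above, and this ignores the employment effect the author mentions):

```python
# Implied price elasticity of driving from the figures above:
# fuel prices fell roughly 50% while driving rose roughly 3-4%.
price_change = -0.50     # ~50% drop in fuel prices
driving_change = 0.035   # midpoint of the 3-4% rise in driving (assumption)

elasticity = driving_change / price_change
print(round(elasticity, 2))  # -0.07: a very inelastic response
```

An elasticity that close to zero is consistent with the earlier observation that a doubling of real prices between 2000 and 2009 barely dented per capita driving.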

This means that, though there may be some externalities from driving, raising gas taxes and creating government slush funds with the revenues is not the best way of dealing with those externalities. I’d feel differently if I felt any assurance that government would use those revenues to actually fix the externalities, but that seems unlikely. I actually like the idea of tradeable permits best, but short of that the current system of ever-tightening pollution controls seems to be working well at little cost to consumers and without threatening the economic benefits of increased mobility.

This post first appeared at

Randal O’Toole

Randal O’Toole is a Cato Institute Senior Fellow working on urban growth, public land, and transportation issues.

Zika Virus Shows It’s Time to Bring Back DDT by Diana Furchtgott-Roth

The mosquito-borne Zika virus is spreading northward through Latin America and is possibly linked to birth defects such as microcephaly in infants. Stories and photos of the infants’ abnormally small skulls are making headlines. The World Health Organization reports that four million people could be infected by the end of 2016.

On Monday, the WHO is meeting to decide how to address the crisis. The international body should recommend that the ban on DDT be reversed, in order to kill the mosquitoes that carry Zika and malaria.

Zika is in the news, but it is dwarfed by malaria. About 300 million to 600 million people suffer each year from malaria, and it kills about 1 million annually, 90 percent in sub-Saharan Africa. We have the means to reduce Zika and malaria — and we are not using it.

Under the Global Malaria Eradication Program, which started in 1955, DDT was used to kill the mosquitoes that carried the parasite, and malaria was practically eliminated. Some countries such as Sri Lanka, which started using DDT in the late 1940s, saw profound improvements. Reported cases fell from nearly 3 million a year to just 17 cases in 1963. In Venezuela, cases fell from over 8 million in 1943 to 800 in 1958. India saw a dramatic drop from 75 million cases a year to 75,000 in 1961.

This changed with the publication of Rachel Carson’s 1962 book, Silent Spring, which claimed that DDT was hazardous. After lengthy hearings between August 1971 and March 1972, Judge Edmund Sweeney, the EPA hearing examiner, decided that there was insufficient evidence to ban DDT and that its benefits outweighed any adverse effects. Yet, two months afterwards, then-EPA Administrator William D. Ruckelshaus overruled him and banned DDT, effective December 31, 1972.

Other countries followed, and DDT was banned in 2001 for agriculture by the Stockholm Convention on Persistent Organic Pollutants. This was a big win for the mosquitoes, but a big loss for people who lived in Latin America, Asia, and Africa.

Carson claimed that DDT, because it is fat soluble, accumulated in the fatty tissues of animals and humans as the compound moved through the food chain, causing cancer and other genetic damage. Carson’s concerns and the EPA action halted the program in its tracks, and malaria deaths started to rise again, reaching 600,000 in 1970, 900,000 in 1990 and over 1,000,000 in 1997 — back to pre-DDT levels.

Some continue to say that DDT is harmful, but others say it was banned in vain: there remains no compelling evidence that the chemical has produced any ill public health effects. According to an article in the British medical journal The Lancet by Professor A.G. Smith of Leicester University,

The early toxicological information on DDT was reassuring; it seemed that acute risks to health were small. If the huge amounts of DDT used are taken into account, the safety record for human beings is extremely good. In the 1940s many people were deliberately exposed to high concentrations of DDT through dusting programmes or impregnation of clothes, without any apparent ill effect… In summary, DDT can cause toxicological effects but the effects on human beings at likely exposure are very slight.

DDT is not a cure-all for malaria, even though nothing is as cheap and effective. A study by the Uniformed Services University of the Health Sciences concluded that spraying huts in Africa with DDT reduces the number of mosquitoes by 97 percent compared with huts sprayed with an alternative pesticide, and the mosquitoes that do enter the huts are less likely to bite.

By forbidding DDT and relying on more expensive, less effective methods of prevention, we are causing immense hardship. Small environmental losses are a modest price for saving thousands of human lives and potentially increasing economic growth in developing nations.

We do not yet have data on the economic effects of the Zika virus, but we know that countries with a high incidence of malaria can suffer a 1.3 percent annual loss of economic growth. According to a Harvard/WHO study, sub-Saharan Africa’s GDP could be $100 billion greater if malaria had been eliminated 35 years ago.
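The scale of a 1.3 percent annual growth drag compounds dramatically over 35 years. A minimal sketch (the 4 percent baseline growth rate and the index of 100 are illustrative assumptions, not figures from the Harvard/WHO study):

```python
# How a 1.3% annual growth penalty compounds over 35 years.
years = 35
baseline_growth = 0.04   # hypothetical growth rate without malaria (assumption)
penalty = 0.013          # annual growth loss attributed to high malaria incidence

# GDP index starting from an arbitrary baseline of 100.
gdp_without_malaria = 100 * (1 + baseline_growth) ** years
gdp_with_malaria = 100 * (1 + baseline_growth - penalty) ** years

gap = gdp_without_malaria / gdp_with_malaria
print(round(gap, 2))  # ~1.55: the malaria-free economy ends up over 50% larger
```

Whatever the baseline, the ratio depends only on the 1.3-point wedge, which is why a seemingly small annual drag can plausibly translate into the $100 billion figure cited above.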

Rachel Carson died in 1964, but the legacy of Silent Spring and its recommended ban on DDT live with us today. Millions are suffering from malaria and countless others are contracting the Zika virus as a result of the DDT ban. They were never given the choice of living with DDT or dying without it. The World Health Organization should recognize that DDT has benefits, and encourage its use in combating today’s diseases.

This article first appeared at E21, a project of the Manhattan Institute.

Diana Furchtgott-Roth

Diana Furchtgott-Roth, former chief economist of the U.S. Department of Labor, is director of Economics21 and senior fellow at the Manhattan Institute.

The Ethanol Mandate Is Literally Impossible by Alan Reynolds

In recent years, politicians set impossibly high mandates for the amounts of ethanol motorists must buy in 2022, while also setting impossibly high standards for the fuel economy of cars sold in 2025. To accomplish these conflicting goals, motorists are now given tax credits to drive heavily subsidized electric cars, even as they will supposedly be required to buy more and more ethanol-laced fuel each year.

Why have such blatantly contradictory laws received so little criticism, if not outrage? Probably because ethanol mandates and electric car subsidies are lucrative sources of federal grants, loans, subsidies and tax credits for “alternative fuels” and electric cars. Those on the receiving end lobby hard to keep the gravy train rolling while those paying the bills lack the same motivation to become informed, or to organize and lobby.

With farmers, ethanol producers and oil companies all sharing the bounty, using subsidies and mandates to pour ever-increasing amounts of ethanol into motorists’ gas tanks has been a win-win deal for politicians and the interest groups that support them and a lose-lose deal for consumers and taxpayers.

The political advantage of advocating contradictory future mandates is that the goals usually prove ridiculous only after their promoters are out of office. This is a bipartisan affliction.

In his 2007 State of the Union Address, for example, President Bush called for mandating 35 billion gallons of biofuels by 2017, an incredible target equal to one-fourth of all gasoline consumed in the United States in 2006. Not to be outdone, “President Obama said during the presidential campaign that he favored a 60 billion gallon-a-year target.”

The Energy Independence and Security Act of 2007 (EISA) did not go quite as far as Bush or Obama, at least in the short run. It required 15 billion gallons of corn-based ethanol by 2015 (about 2 billion more than were actually sold), but 36 billion gallons of all biofuels by 2022 (which would be more than double last year’s sales). The 2007 energy law also raised corporate average fuel economy (CAFE) standards for new cars to 35 miles per gallon by 2020, which President Obama in 2012 ostensibly raised to 54.5 mpg by 2025 (a comically precise guess, since requirements are based on the size of vehicles we buy).

The 36-billion-gallon biofuel mandate for 2022 is the one Iowa Governor Terry Branstad (and Donald Trump) now vigorously defend against the rather gutsy opposition of Sen. Ted Cruz. But it is impossible to defend the impossible: ethanol consumption can’t double while fuel consumption falls.

From 2004 to 2013, cars and light trucks consumed 11% less fuel. The Energy Information Administration likewise predicts that fuel consumption of light vehicles will fall by another 10.1% from 2015 to 2022. So long as ethanol is capped at 10% of a gallon (a blend limit already higher than Canada’s or Europe’s), ethanol use must fall as we use less gasoline, rather than rise as the mandates require. If we ever buy many electric cars or switch from corn to cellulosic sources of ethanol, as other impossible mandates pretend, then corn-based ethanol must fall even faster.
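The impossibility argument above is simple arithmetic. A minimal sketch (the 140-billion-gallon gasoline figure is an assumed round number for illustration, roughly in line with EIA data for 2015; the decline rate and blend cap are the figures cited above):

```python
# If ethanol is capped at 10% of each gallon and gasoline use falls,
# the maximum possible ethanol volume falls too -- it cannot double.
gasoline_2015 = 140e9        # gallons of gasoline, rough round figure (assumption)
projected_decline = 0.101    # EIA's predicted drop in light-vehicle fuel use by 2022
blend_cap = 0.10             # E10: ethanol limited to 10% of a gallon

gasoline_2022 = gasoline_2015 * (1 - projected_decline)
max_ethanol_2022 = gasoline_2022 * blend_cap
mandate_2022 = 36e9          # gallons of biofuel required by EISA for 2022

print(round(max_ethanol_2022 / 1e9, 1))  # 12.6 billion gallons is the ceiling
print(mandate_2022 > max_ethanol_2022)   # True -- the mandate exceeds the ceiling
```

Under these assumptions the blend wall caps ethanol at roughly a third of the 36-billion-gallon mandate, which is the sense in which the target is arithmetically impossible rather than merely ambitious.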

If raising ethanol’s mandated share above 10% is any politician’s secret plan, nobody dares admit it. Most pre-2007 cars can’t handle more than 10 percent ethanol without damage, and drivers of older cars often lack the income or wealth to buy a new one. Since a gallon of ethanol contains about a third less energy than a gallon of gasoline, adding more ethanol would also make it even more impossible for car companies to comply with Obama’s wildly ambitious fuel economy standards (which must also reduce ethanol use, if they work).

The 2007 law also mandated an astonishing 16 billion gallons of nonexistent “cellulosic” ethanol by 2022, made from corn husks or whatever. We were already supposed to be using a billion gallons of this marvelous snake oil by 2013. Despite lavish taxpayer subsidies, however, production of cellulosic biofuel was running at only about 7.8 million gallons a month as of April 2015 (about 94 million a year). The Environmental Protection Agency (EPA) mandate issued on June 10, 2015 called for 230 million gallons in 2016, which is more fantasy.

It doesn’t help that the Spanish firm Abengoa – which received $229 million from U.S. taxpayers to produce just 1.7 million gallons of ethanol – is trying to sell its plant in Kansas to avoid the bankruptcy fate of cellulosic producer KiOR. It also doesn’t help that a $500,000 federally funded study finds that biofuels made with corn residue release 7% more greenhouse gases than gasoline.

The contradictory, fantastic and often scandalous history of ethanol mandates illustrates the increasing absurdity of mandates from Congress and the EPA.

The 2007 biofuel mandate was not just bad policy. It was and remains an impossible, bizarre policy.

This post first appeared at

Alan Reynolds

Alan Reynolds is one of the original supply-side economists. He is Senior Fellow at the Cato Institute and was formerly Director of Economic Research at the Hudson Institute.

Lesbians Castrating 11-Year-Old Tommy to become Tammy

Lesbians adopting little boys only to castrate them… welcome to Obama’s new America. Katie McGuire reported:

Pauline Moreno and Debra Lobel, a lesbian couple from California, claimed their 11-year-old son Thomas didn’t want to be a boy. Thomas, who prefers to go by “Tammy,” wanted to be a girl. So his mothers gave him hormone treatments to delay puberty so that he could fully “transition” into a female through surgery when he is old enough.

Wake up America! What are we doing to our children? Where are the Christians of America?

RELATED ARTICLE: Lesbian couple in California chemically alter their 11-year old boy to prep for sex-change surgery

Education Emergency: Our Children (and U.S.) at Risk

As an independent physicist I’ve spent some 40 years on environmental advocacy and energy education. In the latter part of this journey I’ve become increasingly distressed about what is happening in our education system.

After speaking out about this several times, in 2013 I was asked to give a presentation to the US House Science, Space and Technology Committee, as well as to North Carolina legislators. The unabridged version of both of those talks is online at

Since then, most of what I’ve seen indicates that the situation is getting worse, not better. This is a summary of the key educational issues that need to be addressed immediately. Hopefully it will encourage citizens to get more involved in rectifying this extraordinarily important matter.

1 – We cannot effectively fix anything until we are on the same page. I believe the place to start is to agree fully on the overall objective of the education system. Exactly what product do we expect at the end of a laborious 12+ year assembly line?

In my view, the number one criterion for determining whether the educational system has been a success is: do these graduates have the ability and inclination to do Critical Thinking?

Internet pioneer and Google vice president Vint Cerf says that there is no more important skill to teach than Critical Thinking. He calls it the one tool we have to defend ourselves from the onslaught of misinformation we are saturated with today. He argues that Critical Thinking would enable citizens to be more thoughtful about what information they accept, then process, and then use. That skill is a major benefit in literally every aspect of life.

My experience is that while the education system gives lip service to Critical Thinking, when the rubber meets the road, it’s not really happening. An easy test is to ask any college or high school student today what they think about global warming. Do they provide a thoughtful, thorough analysis — or simply regurgitate propaganda?

My first recommendation is that every state education department, local school board, and academic institution adopt this position:

“It is our obligation to produce critically thinking graduates.”

2 – I’m a zealous defender of my profession, Science. Most people are not aware of it, but Science is under ferocious attack worldwide. The reason is that individuals and organizations promoting political agendas, or their own economic interests, are acutely aware that real Science is not their friend — as it will expose them for what they are.

Those self-serving parties realize that even though most citizens have faith in Science, very few actually understand what Science is. So they take advantage of that discrepancy, by purposefully making false Science claims. They are fully aware that only a small number of people will understand the fraud — and even fewer will say anything public about it.

From what I’ve seen, the most egregious assaults on Science are taking place in newer branches of science such as Environmental Science, Earth Science, Ecology, etc.

This campaign is being supported by slick internet video “science” series like Crash Course, Bozeman Science, etc. Listen carefully to the Crash Course founder explaining why they made over 200 education videos. He says, “We don’t really have a coherent answer.” SAY WHAT?! I call this “QVC Science,” as (IMO) these series are effectively polished sales pitches.

Propagandizing Science starts in our local schools. The good news is that the solution is also there — and is entirely under our control (see #3). My second recommendation is that every state education department, local school board, and academic institution formally adopt and implement this standard:

“Science education will be apolitical.”

3 – In my countrywide travels and correspondence I’ve heard from many parents of students. Quite a few have complained about various matters going on in their districts. I asked them what response they got when they expressed their concerns to the teacher, principal, school board, or superintendent. Most said essentially the same thing: they were reluctant to speak out for fear of retribution against their child. What a wonderful system.

The remaining citizens are those with no school children. Those people understandably believe that the school system is being held accountable by those with the most at stake: parents of current children. But no!

My wife and I are in the second group. We were warned that because we had no kids in the system, defenders of the status quo would attack us personally if we spoke up publicly about the secondary school system. We’d be accused of being anti-superintendent, anti-school board, anti-teacher, and/or anti-children.

It seems rather hypocritical that school districts that pride themselves on enforcing a “no tolerance” bullying policy among students would tolerate intimidation of citizens who have the temerity to speak up about school system improvements…

Most people (including us) would like the federal government to stay out of the education business. We would also prefer that the state have minimal involvement in the education process. We want the ability to decide locally what is best for our children and our community. We rarely hear about the flip side of this freedom: responsibility. If we want to control things ourselves, for our own interests, then there has to be real community involvement — which includes unfettered and unpenalized input from parents and citizens.

So my third suggestion is that every state education department and school district officially adopt the following position for their interfaces with parents and the public (prominently putting it on their websites, letterhead, etc):

“Please tell us how we can do a better job!”

When input from the public is received, the choice is simple. The recipients can be genuinely appreciative that citizens take the time to make constructive suggestions to improve student education, or they can circle the wagons and defend the status quo. Ironically, it’s the latter choice that invites more higher-level intervention…

Whether you have children in the education system or not is irrelevant. The future of our country is literally at stake. We are all going to sink or swim based on whether we have an effective education system. Please carefully investigate what is happening in your community.

“The function of education is to teach one to think intensively, and to think critically.” — Martin Luther King, Jr.

EDITOR’S NOTE: The featured image is courtesy of

Greece, Cyprus and Israel to build Eastern Mediterranean Gas Pipeline

Auspicious meetings were held in Nicosia, Cyprus with members of the emerging Trilateral Eastern Mediterranean Gas Pipeline alliance: Israeli Prime Minister Benjamin Netanyahu, Cyprus President Nicos Anastasiades and Greek Prime Minister Alexis Tsipras.

Watch this Jerusalem Post news video of the historic triple alliance meeting in Nicosia:


Israeli Prime Minister Benjamin Netanyahu, left, Cyprus President Nicos Anastasiades and Greek Prime Minister Alexis Tsipras at Nicosia trilateral meeting, January 27, 2016.

The Jerusalem Post reported the triple alliance leaders announcing plans to set up the long delayed Eastern Mediterranean gas pipeline:

NICOSIA – Israel, Cyprus and Greece decided at their first ever tripartite meeting to set up a steering committee to look into laying a gas pipeline from Israel to Cyprus, and then to Greece for further export to Europe.

The decision was announced by Prime Minister Benjamin Netanyahu, standing next to Cyprus President Nikos Anastasiades and Greek Prime Minister Alexis Tsipras.

Each leader delivered a statement noting the historic nature of the meeting, and highlighting the possibilities this emerging alliance has for the region. They did not answer any questions from the press.

While both Anastasiades and Tsipras stressed, without mentioning Turkey by name, that this cooperation was not “against anyone else,” Netanyahu did not make a reference at all to Turkey, either directly or indirectly.

National Infrastructure, Water and Energy Minister Yuval Steinitz, who was part of the Israeli delegation, told reporters on the plane en route to Nicosia that Israel wanted to have the ability to export the gas both through Greece and Turkey. Laying the pipeline to Turkey is considerably cheaper than through Cyprus and Greece.

Anastasiades, as host of the summit, spoke first, and said this cooperation was based on an appreciation that “it is imperative to work collectively through coordination.” He said that the three leaders signed a joint declaration, which he termed a “historic document” that deals with cooperation in the energy, tourism, research, water-management, anti-terrorism and immigration spheres. He said that a trilateral steering committee will monitor the agreement.

Netanyahu, who said that as the son of a historian he was averse to using the term “historic,” said that the term did however fit the meeting. “I believe this meeting has historic implications,” he said. “The last time Greeks, Cypriots and Jews sat around a table and talked about a common framework was 2,000 years ago.”

In addition to the gas pipeline, Netanyahu also spoke of a plan to lay an underwater cable to connect the electric grids of all three countries. “You can export gas through electricity,” he said.

Tsipras said that cooperation with Israel and Cyprus was a “strategic choice” for Athens.

In a January 2015 New English Review article, “Could Israel Lose the Energy Prize in the Eastern Mediterranean,” we noted this about the prospects for the Triple Alliance Eastern Mediterranean Pipeline.

“On December 9, 2014, Israel, Cyprus and Greece pitched the Eastern Mediterranean pipeline a day before a conference organized jointly by Natural Gas Europe, the Greek Energy Forum, ESCP Europe, RCEM and the European Economic and Social Committee (EESC). The conference was titled “2030 EU Energy Security, the Role of the Eastern Mediterranean Region” and took place at EESC headquarters in Brussels. Natural Gas Europe, in an article on the EESC conference, noted the comments of Greek Energy Minister Ioannis Maniatis:

Europe will need an extra 100 bcm of natural gas in the next 15 years, and in light of Europe’s increasing dependence on imports to fulfill its energy needs, the EU must find a sustainable model to ensure it is a competitive economy.

The EU needs to reduce external dependence, increase efficiency, diversify its sources and routes of supply, and improve interconnectors, he added. Fully connected energy grids, greater transparency, good governance and a thorough understanding of global events should also be the focus of the EU, according to Maniatis. He explained that Greece’s importance is growing. The East Med pipeline pitched by Israel, Cyprus and Greece, which would run from Israel and Cyprus via Greece to Italy and then on to the rest of Europe, is technically feasible and carries attractive prospects, Maniatis said. He told the audience that the results of a feasibility study on the East Med pipeline would be released the following year, and that the pipeline would serve as a new source and provider of natural gas comparable to the Southern Corridor. The attractiveness of the East Med Pipeline, said Maniatis, is that unlike the Southern Corridor it would pass exclusively through four EU member states, and hence deserves strong EU backing.

The Eastern Mediterranean Pipeline had received the endorsement of the EC as a priority project for underwriting in November 2013. According to The Guardian that could provide the Eastern Mediterranean pipeline project “access to a €5.85bn fund, and preferential treatment from multilateral banks.”

Natural Gas Europe reported at the time the options under consideration:

The basic plan would see the pipeline stretch from the Leviathan field offshore Israel on to Cyprus, ending in the eastern part of the island of Crete in Greece. Three alternate routes were discussed:

  • To the Peloponnesus Peninsula, joined via a spur to the Trans-Adriatic Pipeline (TAP)
  • From Crete to northern Greece where it would join the Interconnector Greece-Bulgaria (IGB)
  • From Crete to the Revythousa LNG terminal close to Athens. The terminal would be significantly upgraded to accommodate large volumes of gas for export.

The technically difficult 1,880-kilometer submarine pipeline, reaching depths of more than 2,000 meters, would ultimately connect the Leviathan and Aphrodite gas fields to Italy. The cost of the project was estimated at over $20 billion, and it would likely not be completed before 2020 at the earliest, assuming that production at the Leviathan field in the Israeli EEZ begins in 2017. With the demise of both the Turkish Leviathan-Ceyhan pipeline and the Australian Woodside Pty. Ashdod-Eilat LNG pipeline for delivery of gas to Asian markets, the Eastern Mediterranean pipeline project may receive serious consideration. There is the alternative of an onshore LNG facility at Vassilikos on Cyprus’ south shore, to be built by the Consortium at an estimated cost of $10 billion. A Memorandum of Understanding for planning the Vassilikos LNG complex was signed by Cyprus and the Consortium in June 2013. In the interim, offshore floating LNG processing platforms might be leased to ship processed gas via pressurized LNG vessels to receiving terminals in Greece and Italy. However, Noble Energy was not initially supportive of the Eastern Mediterranean pipeline option, instead concentrating on sales from Leviathan to regional users like Jordan and Egypt and on building the proposed Cypriot LNG processing facility.”

Israel has overcome the ruling of the former director general of its Antitrust Authority, approving an offshore gas development plan with U.S. partner Noble Energy, Inc. and Israeli partner Delek Group involving the Leviathan and Tamar fields and the adjacent Aphrodite gas field in the Cyprus Exclusive Economic Zone. With yesterday’s announcement in Nicosia by the Triple Alliance of Israel, Cyprus and Greece, a way forward can now be seen for the Eastern Mediterranean gas pipeline and the LNG facility in Cyprus. At the time we wrote the January 2015 NER article, Russian President Putin and Turkish President Erdogan had announced a $12 billion Turkish Stream pipeline deal to supply Europe with natural gas. Given the breakdown in relations between Russia and Turkey over the latter’s downing of a Russian Su-24 bomber, Putin has suspended that project, sending Erdogan scrambling to re-open diplomatic relations with Jerusalem in search of supplies of Israeli gas. The dour prospects described in our January 2015 article appear to have lifted amid the geo-resource and political wars of the Syrian and ISIS conflicts, reinforced by the settlement of Israel’s plan for development and distribution of its offshore gas and oil fields.

RELATED ARTICLE: 10 Reasons Israel Is Not An ‘Apartheid’ State

EDITOR’S NOTE: This column originally appeared in the New English Review.

Climate Confusion

Many Americans are again confused over how the President and the United Nations can say we are at grave risk from man-made global warming (a.k.a. climate change) when we continue to get pummeled by brutal, record-shattering winter storms. If this situation has you confused, take heart: you are not alone.

Once again the natural world has slapped the ‘warmist’ community down hard with yet another record-breaking blizzard in the northeastern US between January 22 and January 24, 2016. Winter storm ‘Jonas’ (a Weather Channel designation) dumped record snow totals on the major cities of the US, with a snowstorm that stretched from Arkansas to Massachusetts. Here are but a few examples of the storm’s wrath:

New York City saw 26.8 inches of snow fall in Central Park, the second-highest total ever recorded there, missing the all-time record by one tenth of an inch. JFK Airport had 30.5 inches of snow. Washington’s Dulles Airport measured 28.3 inches, its second highest ever. Baltimore had 29.2 inches, its largest snow total ever recorded. The list of snow events, and the breadth of this winter calamity that dumped record snow from the central US to the mid-Atlantic states to the Northeast, was truly one for the record books.

What is also shocking about this ‘snowmageddon’ is that, according to the manmade global warming crowd, none of it was possible. We were told by United Nations scientists there was not supposed to be any snow anywhere on the planet after 2003! And who can forget the previous terrible winter of 2014-2015 here in the US, when new temperature and snow records were routinely broken. Again, that mercilessly long and cold winter was not possible either, according to the climate models from the UN and the U.S. government. How can the impossible happen so often?

We should not forget other monstrously bad predictions the ‘warmist’ community has proffered. NOAA scientists, along with Al Gore, were telling us that the Arctic sea ice would be completely gone by 2008, then revised that to 2013. Neither happened. Global sea ice, especially in Antarctica, is in fact growing rapidly.

Atmospheric CO2 concentrations recently reached 400 ppm, yet the predicted overheating of the Earth is not happening; quite the contrary. The 800-lb gorilla in the climate laboratory that the manmade warming community ignores is that there has been no meaningful growth in global temperatures for eighteen years. That includes the so-called warmest year ever, 2015. Unfortunately, my colleagues and I have observed that the US government can no longer be relied upon to tell the truth about the Earth’s climate or its temperature.

The United Nations certainly cannot be trusted either. The corruption of climate science via the climate reports it has issued since 1990 has been so deep-seated within that organization for so long that we must now conclude it is simply unable and unwilling to be truthful. Many of the UN climate models, of which the UN, the US media and our government are so enamored, are well over 100% in error in predicting global temperature variation. Yet the predictions from these failed models are still offered up as evidence of the need to shut down coal and CO2 production worldwide. Further, recent data suggest that the Earth’s climate appears to be relatively insensitive to CO2! Even the UN is now confused.

It is my fondest hope that we can soon put the sad era of manmade global warming behind us and begin the preparations needed for the rapidly approaching cold epoch, a message I have been spreading since 2007. Starting this year, a long-term decline in global temperature begins. It will bottom out during the 2020s through the 2030s. This time will be grim for our species as the cold era starts its destruction of crops around the world.

We humans are easily confused about the climate. Many of us believe what we want to believe, not what the facts tell us we should. Worse, we are often intentionally deceived by our leaders.

The natural world does not suffer from these afflictions. It is never confused.

RELATED ARTICLE: 300 Scientists Want NOAA To Stop Hiding Its Global Warming Data

January 27, 1951 Operation Desert Rock: First test of a U.S. Nuclear Bomb in Nevada

‘Able’ was the first air-dropped nuclear device to be exploded on American soil. The test took place on 27 January 1951 at Frenchman Flat, a dry lakebed in the Nevada Test Site. The 1-kiloton explosion launched the fourth U.S. nuclear test series code-named ‘Ranger’, which consisted of five air-dropped nuclear tests in early 1951.

The vertical stripes are smoke trails from rockets used to signal the speed and distance of shock waves from the explosion in the early days of nuclear testing.

The initial post-war U.S. nuclear tests, including the similarly named Able test on 1 July 1946 at Bikini Atoll, had been conducted at remote atolls in the Pacific Ocean, far from the U.S. mainland. With the first Soviet nuclear test in 1949, the United States had lost its monopoly on nuclear weapons. The United States decided to significantly expand its nuclear testing programme and chose the Nevada Test Site as the main location for subsequent tests.

The Able test was followed by about 100 more atmospheric nuclear tests at the Nevada Test Site. By the end of the 1950s, the grave effects of radioactivity on personnel involved in the testing and on the surrounding population became evident. Public outrage helped to conclude the 1963 Partial Test Ban Treaty (PTBT), which banned all nuclear tests above ground, in the atmosphere, underwater and in outer space. Underground nuclear weapon testing, though, not only continued but increased in number. A total of 928 nuclear tests were conducted at the Nevada Test Site, more than anywhere else.

In a 1955 brochure on ‘Atomic Test Effects in the Nevada Test Site Region‘, the Atomic Energy Commission assured residents close to the test site that radiation levels were “only slightly more than normal radiation which you experience day in and day out wherever you may live.” The nuclear weapon tests in Nevada were even promoted as tourist attractions.

U.S. troops participated in nuclear testing with little or no protective clothing.

To this day, the scale of the harm caused by radioactive fallout from the Nevada Test Site remains controversial. A 2006 study (PDF) by Steven L. Simon, André Bouville and Charles E. Land finds that exposure to fallout from atmospheric testing will continue to have adverse health effects in the form of increased rates of certain types of cancer, such as leukemia. The National Cancer Institute’s 1999 report finds that internal exposure to iodine-131 was the most serious health consequence for downwinders; milk contaminated with iodine-131 was consumed by children in particular.

In 1990, the U.S. Congress adopted the Radiation Exposure Compensation Act (RECA), which allows downwinders from Utah, Nevada and Arizona to apply for a US$50,000 compensation payment in cases where a disease was caused by fallout from nuclear testing. There have been, however, repeated calls to expand the Compensation Act to boost payments and to include other U.S. states in the compensation scheme. In November 2011, the U.S. Senate unanimously approved a resolution designating 27 January as a National Day of Remembrance. The resolution recognizes that “downwinders paid a high price” for the development of the U.S. nuclear weapons program.

The United States conducted its last nuclear test ‘Divider‘ in September 1992. In 1996, it was the first country to sign the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. However, it has yet to ratify the Treaty, a step that is mandatory for its entry into force. The same applies to seven other nuclear-capable States: China, the Democratic People’s Republic of Korea, Egypt, India, Israel, Iran and Pakistan.

EDITOR’S NOTE: This column is courtesy of Special Forces Gear.

Recent Energy & Environmental News


How wind turbines can affect climate by creating fog. Photo courtesy of Professor E. A. Shinn, University of South Florida.

The latest Energy and Environmental Newsletter is now online.

Some of the more intriguing energy articles in this issue are:

Some of the most interesting Global Warming articles in this issue are:

PS: Please pass this on to open-minded citizens. If there are others who you think would benefit from being on our energy & environmental email list, please let me know. If at any time you’d like to be taken off the list, please let me know that too.

PPS: I am not an attorney, so no material appearing in any of the Newsletters should be construed as giving legal advice. My recommendation has always been: consult a competent attorney when you are involved with legal issues.