Tag Archive for: economics

Not a Single U.S. State Is Requiring Kids to Get Vaccinated to Attend Public School. Why?

Economics may offer a clue as to why not one state is mandating vaccination to attend school in the 2022-2023 school year, even though many government officials support coercive vaccination policies.


September has arrived and many children are back in public schools (though fewer than in previous years).

At a recent event, one parent joked to me we’re now officially in “vaccine season.” The comment made me laugh, but there’s at least a kernel of truth to it. It’s not unusual for states to require that children receive an array of vaccinations—from polio, diphtheria, and chickenpox to measles, mumps, and meningitis—to be enrolled in a public school system.

One vaccine that parents will not find on any state’s required list in 2022 is the Covid-19 shot, which has been a source of great debate in the US and other countries.

While a few US cities continue to push vaccine mandates for school attendance, the Pew Charitable Trusts pointed out earlier this year that states have been surprisingly wary of mandating Covid shots for children.

“[Only] two states—California and Louisiana—have added COVID-19 vaccines to the list of immunizations mandated for schoolchildren,” Michael Ollove pointed out in January. “Both requirements would be enforced next school year, and then only if the vaccines receive full authorization by the U.S. Food and Drug Administration.”

Things have changed since then.

In May, Louisiana Gov. John Bel Edwards announced the Louisiana Department of Health would not require children attending the state’s daycares or K-12 schools to provide proof of vaccination. California, which in October 2021 became the first state to announce Covid vaccine requirements for school, announced in April that it would not require vaccination, noting the vaccines had not at that time been approved by the FDA for all school-age children. (They are now.)

The fact that not a single US state is requiring students to be vaccinated against Covid to attend K-12 school is probably a bit surprising to readers. (It was to this author.)

I’d like to think that policymakers and politicians finally woke up to the fact that vaccine mandates are immoral, inhumane, and a clear violation of bodily integrity. But that seems unlikely considering that many vaccine mandates remain in place, particularly at the federal and municipal levels.

It’s also possible that lawmakers have realized vaccinated individuals can still get sick and spread the virus, and therefore concluded vaccinations are a matter of personal health, not public health. Yet once again this theory is undermined by the presence of other vaccine mandates that remain in place. Some may contend that we’ve simply beaten the virus and mandates are no longer necessary, but official statistics show Covid deaths and cases remain stubbornly high.

So what’s the answer?

What’s most likely is that political considerations are at play. Yet this thesis too, at first blush, appears to be undermined by the reality that polls show Americans support Covid vaccine mandates in schools.

Some basic economics, however, can help us see that the politics are more complicated than that.

Public Choice Theory is a field of economics pioneered by the Nobel Prize-winning economist James M. Buchanan and economist Gordon Tullock. It rests on a simple assumption: politicians and bureaucrats make decisions primarily based on self-interest and incentives just like everyone else, not out of an altruistic goal of serving “the public good.” (This is why public choice economists have dubbed it “politics without romance.”)

I’ve previously pointed out that politicians were incentivized during the pandemic to embrace Covid restrictions even if they didn’t work because of the political climate in 2020. The absence of government regulations was viewed as actual violence by some public health experts, and those who didn’t embrace strict interventions were accused of genocide.

Moreover, the costs of these regulations tended to be dispersed, delayed, and hidden from view. Depression, drug overdoses, lost learning, and speech impediments were among the consequences of NPIs (non-pharmaceutical interventions) imposed by governments. But the results of these policies were relatively “unseen” (to use a term from the 19th-century economist Frédéric Bastiat), at least compared to Covid deaths, which public health officials, the media, and even ordinary citizens tracked obsessively.

The costs of NPIs were serious, but politically they were low for the reasons stated above. The political costs of keeping a state open were much higher. No politician wants to explain why Mrs. Jackson, the 60-year-old math teacher, died from Covid while schools in his state remained open. (It would be just as tragic if Mrs. Jackson had died at home while schools were closed, but in that case no politician would be blamed for her death.)

In other words, the incentive structure early in the pandemic encouraged interventions, even if those interventions were ineffective and ultimately ended up doing more harm than good.

The incentive structure for vaccines is very different, particularly for young people.

Children can and do die from Covid, of course, but their risk is extremely low compared to other age groups. Even more important, perhaps, is that the costs of mandatory vaccination are not delayed, dispersed, or hidden from view. They are immediate, concentrated, and highly visible.

The sad reality is that vaccine injuries, though rare, do occur, as the CDC notes. And when they occur, they are the opposite of “unseen,” which means the political repercussions have the potential to be swift—and severe.

After all, when a young person dies after taking a vaccine designed to protect him, it’s a tragedy. When a young person dies of myocarditis after taking a vaccine he was forced to take to attend school, it’s both a tragedy and a political disaster with a wide radius, even if some studies show the risk of myocarditis is greater after Covid infection than after Covid vaccination.

All of this analysis is dark and a bit troubling, of course. Now you see why they call public choice theory “politics without romance.”

But it might help explain why even state leaders comfortable with mandatory vaccination and vaccine passports have been reluctant to compel children to get the shot, even if they truly believe it could save lives.

Whether mandatory vaccination would have done more harm than good is a question we’ll never be able to answer definitively, though it’s a debate that will likely continue for years to come. But because vaccines have the power both to save lives and to claim them, the decision to accept or refuse them can only morally be made by one person: the individual (or the parents, if the decision concerns a child).

So at least state leaders are getting it right this time, even if they are doing so for the wrong reasons.

AUTHOR

Jon Miltimore

Jonathan Miltimore is the Managing Editor of FEE.org. His writing/reporting has been the subject of articles in TIME magazine, The Wall Street Journal, CNN, Forbes, Fox News, and the Star Tribune. Bylines: Newsweek, The Washington Times, MSN.com, The Washington Examiner, The Daily Caller, The Federalist, the Epoch Times.

EDITOR’S NOTE: This FEE column is republished with permission. © All rights reserved.

No, Slavery Did Not Make America Rich

The historical record of the post-war economy demonstrates slavery was neither a central driving force of, nor economically necessary for, American economic dominance.


In 1847, Karl Marx wrote that

Without slavery you have no cotton; without cotton you have no modern industry…cause slavery to disappear and you will have wiped America off the map of nations.

As with most of his postulations concerning economics, Marx was proven wrong.

Following the Civil War and the abolition of slavery in 1865, historical data show there was a recession, but after it, post-war economic growth rates rivaled or surpassed pre-war rates, and America continued on its path to becoming the number one political and economic superpower, ultimately overtaking Great Britain (see Appendix Figure 1).

The historical record of the post-war economy, one would think, obviously demonstrated slavery was neither a central driving force of, nor economically necessary for, American economic dominance, as Marx thought it was. And yet, somehow, even with the benefit of hindsight, there are many academics and media pundits still echoing Marx today.

For instance, in his essay published by The New York Times’ 1619 Project, Princeton sociologist Matthew Desmond claims the institution of slavery “helped turn a poor, fledgling nation into a financial colossus.”

“The industrial revolution was based on cotton, produced primarily in the slave labor camps of the United States,” Noam Chomsky similarly stated in an interview with the Times. Both claims give the impression that slavery was essential for industrialization and/or American economic hegemony, which is untrue.

The Industrial Revolution paved the way for modern economic development and is widely regarded to have occurred between 1760 and 1830, starting in Great Britain and subsequently spreading to Europe and the US.

As depicted in Figure 1, raw cotton produced by African-American slaves did not become a significant import in the British economy until 1800, decades after the Industrial Revolution had already begun.

Although the British later imported large quantities of American cotton, economic historians Alan L. Olmstead and Paul W. Rhode note that “the American South was a late-comer to world cotton markets,” and “US cotton played no role in kick-starting the Industrial Revolution.”

Nor was the revolution sparked by Britain’s involvement with slavery more broadly, as David Eltis and Stanley L. Engerman assessed that the contribution of British 18th-century slave systems to industrial growth was “not particularly large.”

There is also the theory that the cotton industry, dependent on slavery, triggered industrialization in the northern United States by facilitating the growth of textile industries. But as demonstrated by Kenneth L. Sokoloff, the Northern manufacturing sector was incredibly dynamic, and productivity growth was broad-based and in no way exclusive to cotton textiles.

Eric Holt has further elaborated, pointing out that

the vast literature on the industrial revolution that economic historians have produced shows that it originated in the creation and adoption of a wide range of technologies, such as the steam engine and coke blast furnace, which were not directly connected to textile trading networks.

The bodies of the enslaved served as America’s largest financial asset, and they were forced to maintain America’s most exported commodity… the profits from cotton propelled the US into a position as one of the leading economies in the world and made the South its most prosperous region.

This is the argument made by P.R. Lockhart of Vox.

While slavery was an important part of the antebellum economy, claims about its central role in the Industrial Revolution and in America’s rise to power via export-led growth are exaggerated.

Olmstead and Rhode have observed that although cotton exports comprised a tremendous share of total exports prior to the Civil War, they accounted for only around 5 percent of the nation’s overall gross domestic product, an important contribution but not the backbone of American economic development (see Appendix Figure 2).

One can certainly argue that slavery made the slaveholders and those connected to the cotton trade extremely wealthy in the short run, but the long-run impact of slavery on overall American economic development, particularly in the South, is undeniably and unequivocally negative.

As David Meyer of Brown University explains, in the pre-war South, “investments were heavily concentrated in slaves,” resulting in the failure “to build a deep and broad industrial infrastructure,” such as railroads, public education, and a centralized financial system.

Economic historians have repeatedly emphasized that slavery delayed Southern industrialization, giving the North a tremendous advantage in the Civil War.

Harvard economist Nathan Nunn has shown that across the Americas, the more dependent on slavery a nation was in 1750, the poorer it was in 2000 (see Appendix Figure 3). He found the same relationship in the US: in 2000, states that had more slaves in 1860 were poorer than states that had fewer, and much poorer than the free Northern states (see Appendix Figure 4).

According to Nunn,

looking either across countries within the Americas, or across states and counties within the U.S., one finds a strong significant negative relationship between past slave use and current income.

Slavery was an important part of the American economy for a time, but the reality is that it was economically unnecessary, stunted economic development, and left Americans poorer more than 150 years later.

The historical and empirical evidence is in accordance with the conclusion of Olmstead and Rhode—that slavery was

a national tragedy that…inhibited economic growth over the long run and created social and racial divisions that still haunt the nation.

Figure 1. US share of British Cotton Imports over time

Figure 2. Cotton Exports and Gross Domestic Product

Figure 3. Partial correlation plot between the slave population as a share of the total population in 1750 and national income per capita in 2000 of countries of the Americas

Figure 4. Bivariate plot showing the relationship between the slave population as a share of the total population in 1860 and state incomes per capita in 2000

AUTHOR

Corey Iacono

Corey Iacono is a Master of Business graduate student at the University of Rhode Island with a bachelor’s degree in Pharmaceutical Science and a minor in Economics.


CNN Medical Analyst Says Masking Stunted Her Toddler’s Language Development—and Taught Her an Important Lesson about Tradeoffs

A year ago, Dr. Leana Wen was arguing unvaccinated people shouldn’t be allowed to leave their homes. But now she says she’s abandoned her “extremely cautious” Covid views.


During the 1960s, the phrase “the personal is political” became a rallying cry for second-wave feminists challenging the social framework that existed at the time.

There was an unhealthy collectivist undercurrent to this idea—“There are no personal solutions at this time,” wrote Women’s Liberation Movement member Carol Hanisch in an essay on the topic, “There is only collective action for a collective solution”—but the phrase also contains an element of truth.

Personal experience does play an undeniable role in how many humans perceive politics and social structures, which brings me to CNN medical analyst Dr. Leana Wen.

Throughout the pandemic, Wen was in what I’ll call the “pro-mandate” camp.

In March 2021, she excoriated governors who rescinded or failed to pass mask mandates in their states.

“We are not out of the woods. We haven’t reached the end of the pandemic,” Wen said in a pro-mask CNN piece. “It’s counterproductive and truly infuriating these governors are treating this as if the pandemic is over. It’s not true.”

Later that year, she went so far as to argue that unvaccinated people shouldn’t be allowed to leave their homes.

“We need to start looking at the choice to remain unvaccinated the same as we look at driving while intoxicated,” Wen told CNN’s Chris Cuomo. “You have the option to not get vaccinated if you want, but then you can’t go out in public.”

A year later, Wen’s views have changed. In a recent Washington Post article, she explained why she’ll no longer be masking her children and how she shifted away from “being extremely cautious” with Covid protocols.

“I accept the risk that my kids will probably contract covid-19 this school year, just as they could contract the flu, respiratory syncytial virus and other contagious diseases,” she writes. “As for most Americans, covid in our family will almost certainly be mild; and, like most Americans, we’ve made the decision that following precautions strict enough to prevent the highly contagious BA.5 will be very challenging.”

Wen’s observations are not wrong. The new variants are less deadly, and children, who have always faced the lowest risk from Covid, are even less threatened by them.

A year ago, when Wen was still advocating strict mandates, we pointed out that the CDC’s own data showed small children were at far greater risk of dying from the flu, drowning, vehicle collisions, cancer, and other things than Covid.

These data apparently did little to persuade Wen in 2021, however. What does appear to have changed her mind is that her own child suffered under the mandates.

“Masking has harmed our son’s language development,” she bluntly asserts in the article.

Throughout the pandemic, few policies have been debated with more fury than mask mandates. The vast majority of these debates focus on a single point: does masking prevent or even reduce Covid transmission? Some studies say yes; others cast doubt on masks’ efficacy.

For many, however, the efficacy of masking became a sort of dogma that could not even be questioned. (If you doubt this, consider that until a few days ago one faced risk of suspension on YouTube for suggesting that masks don’t play a role in preventing Covid transmission.)

Far less discussion focused on the costs of forcing people to wear masks, and Wen now sees this as a mistake.

“There is a tradeoff,” Wen says.

Many, however, refused to acknowledge this and argued that masking is simply a moral imperative. I recently had a discussion at a family gathering with a person who supports mask mandates. He became indignant when my sister-in-law said she didn’t think it was right to force her children to wear masks at school all day long.

“It’s about protecting others,” he said. “It’s the smallest thing.”

The fact that he was not wearing a mask himself as he said this didn’t seem the least bit ironic to him, but it proved Wen’s point: there are tradeoffs. (If there were none, we’d wear masks all the time.)

The idea of tradeoffs is perhaps the most basic principle in all of economics. It’s rooted in a simple idea: in order to have or do one thing, one must sacrifice having or doing something else. All things come with opportunity costs, big and small. (A minor cost of wearing a mask is giving up the ability to breathe freely.)

For most of the pandemic, many Americans and most public health officials refused to acknowledge the reality of tradeoffs. In 2021, The New York Times described a phenomenon known as “Covid Absolutism.” It consists of two primary factors: 1. Taking every conceivable step that could reduce the spread of Covid regardless of its actual effectiveness; 2. Downplaying or ignoring the unintended consequences and tradeoffs of these policies.

Basic economics, however, teaches us the folly of this thinking.

“There are no solutions, there are only trade-offs,” Thomas Sowell famously observed.

This was the economic lesson Wen learned during the pandemic. She didn’t learn it in a classroom or in a textbook. She learned it in her personal experience when her own child began to struggle with language development (not a minor tradeoff), just like countless other children.

Writing in The Atlantic, Stephanie Murray likewise addressed the reality of tradeoffs, noting that many parents of struggling youngsters see the potential benefits of masking as a poor trade for what their children lose developmentally.

“Children with speech or language disorders offer perhaps the clearest example of these murky trade-offs,” she writes.

This is precisely why decision-making must be left to individuals, not bureaucrats. Nobody is more capable of weighing the pros and cons of a trade or action than the people who stand to lose or benefit from it (or, in this case, their parents).

Dr. Wen no doubt knows a great deal about public health, just like Anthony Fauci and Rochelle P. Walensky. But even Fauci and Walensky, I suspect, would concede that it’s Wen who knows what’s better for her child.

It must be stressed that it’s not just that Wen wants what’s best for her child. It’s that she actually knows what’s best for her child because she has infinitely more knowledge about her child than any distant bureaucrat or meddling politician could ever possess.

Nobel Prize-winning economist F.A. Hayek detailed this “local knowledge” concept in his work exploring “the knowledge problem,” and he showed why central planners seeking to engineer society through force are capable of producing little beyond “planned chaos.” This is why it’s so important that freedom of decision-making is left to those who have the most local knowledge and can most accurately assess the risks and rewards of any given action.

The good news is that Wen, to her credit, appears to have learned something throughout the tragedy of the Covid pandemic, as have so many others.

The tragedy is that for so long she overlooked tradeoffs and used her platform to advocate coercive policies that deprived individuals of the ability to choose. That tragedy is compounded by the fact that Wen now finds herself a target of cancellation for advocating a more sensible approach.

It’s an ironic twist considering that only a year ago Wen herself was a proponent of confining unvaccinated people to their homes, and not one we should celebrate.

But hopefully it can be a learning experience for Wen and others, who now recognize the danger in turning what should be individual decisions over to bureaucrats and political tribes.

AUTHOR

Jon Miltimore


RELATED ARTICLE: The CDC is Broken and Apologies Can’t Fix It: “We’ve Made Some Mistakes, But NOW You Can Trust Us!”


The Biden Administration Says U.S. Not in a Recession, but Federal Statutes Say Otherwise. Who is Right?

Is the U.S. economy in recession? The answer is, paradoxically, both easier and more complicated than you might think.


As expected, the United States posted negative growth for the second consecutive quarter, according to government data released on Thursday.

“Real gross domestic product (GDP) decreased at an annual rate of 0.9 percent in the second quarter of 2022, following a decrease of 1.6 percent in the first quarter,” the US Bureau of Economic Analysis announced.
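
The “annual rate” in the BEA release is the quarter-over-quarter change compounded over four quarters, not the raw quarterly change. A back-of-the-envelope sketch of that conversion (the sample rates below are illustrative round numbers, not BEA’s seasonally adjusted figures):

```python
def annualize(quarterly_growth: float) -> float:
    """Compound a quarter-over-quarter growth rate over four quarters."""
    return (1 + quarterly_growth) ** 4 - 1

# A 1% quarterly gain compounds to roughly a 4.06% annual rate.
print(round(annualize(0.01), 6))       # 0.040604

# A quarterly decline of roughly -0.226% corresponds to about -0.9%
# at an annual rate, in the ballpark of the headline Q2 2022 figure.
print(round(annualize(-0.00226), 4))   # -0.009
```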

The news prompted many outlets, including The Wall Street Journal, to use the R word: recession, commonly defined as an “economic decline during which trade and industrial activity are reduced, generally identified by a fall in GDP in two successive quarters.”

The White House does not agree, however, and following the release of the data, President Biden said the US economy is “on the right path.”

The comments come as little surprise. Treasury Secretary Janet Yellen had recently hinted that the White House would contend the economy wasn’t actually in a recession even if Q2 data indicated the economy had contracted for a second consecutive quarter.

“There is an organization called the National Bureau of Economic Research that looks at a broad range of data in deciding whether or not there is a recession,” Yellen said. “And most of the data that they look at right now continues to be strong. I would be amazed if they would declare this period to be a recession, even if it happens to have two quarters of negative growth.”

“We have a very strong labor market,” she continued. “When you are creating almost 400,000 jobs a month, that is not a recession.”

Yellen is not wrong that NBER, a private nonprofit economic research organization, looks at a much broader swath of data to determine if the economy is in a recession, or that many view NBER’s Business Cycle Dating Committee as the “official recession scorekeeper.”

So White House officials have a point when they say “two negative quarters of GDP growth is not the technical definition of recession,” even though it is a commonly used definition.

On the other hand, it’s worth noting that federal statutes, the Congressional Budget Office, and other governing bodies use the two consecutive quarters of negative growth as an official indication of economic recession.
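
That two-consecutive-quarters trigger is mechanical enough to express as code. A minimal sketch (the quarterly growth figures are invented for illustration, not official data):

```python
def meets_two_quarter_rule(quarterly_growth: list[float]) -> bool:
    """True if any two consecutive quarters both show negative real GDP growth."""
    return any(a < 0 and b < 0
               for a, b in zip(quarterly_growth, quarterly_growth[1:]))

# Two contractions in a row trigger the rule.
print(meets_two_quarter_rule([0.017, -0.016, -0.009]))  # True

# A single dip followed by a rebound does not.
print(meets_two_quarter_rule([0.012, -0.004, 0.008]))   # False
```

The NBER committee, by contrast, weighs a broad basket of indicators (employment, income, industrial production), which is why a mechanical rule and the “official scorekeeper” can disagree.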

Phil Magness, an author and economic historian, points out that several “trigger” provisions exist in US laws (and Canadian law) that are designed to go into effect when the economy posts negative growth in consecutive quarters.

“For reference, here is the definition used in the Gramm-Rudman-Hollings Act of 1985,” Magness wrote on Twitter, referencing a clause in the Act. “This particular clause has been subsequently retained and replicated in several trigger clauses for recessionary measures in US federal statutes.”

It’s worth noting that Magness doesn’t contend the two consecutive quarters definition is the best method of determining whether an economy is in a recession, but simply points out that claims that it’s an “informal” definition of recession are untrue.

“It may not be a perfect metric, but it has a very long history of being used to determine policy during recessions,” Magness writes.

Some readers may find it strange that so much heat, ink, and energy is being spent on something as intangible as a word, a mere abstraction with no value of its own. And some policy experts agree.

“Whether [we’re] in a technical recession is less interesting to me than the following 3 questions,” Brian Riedl, an economist at the Manhattan Institute, recently said. “1) Are jobs plentiful? (Yes – good) 2) Are real wages rising? (Falling fast – bad) 3) Is inflation hitting fixed income fams? (Yes – bad.)”

Others contend that definitions matter, and that by ignoring the legal definition of recession, the Biden White House can continue to argue that the US economy is “historically strong” even as economic growth is negative, inflation is surging, and real wages are crashing.

As Charles Lane recently pointed out in the Washington Post, words have power. He shares a colorful anecdote involving Alfred E. “Fred” Kahn, an economist in the Carter administration who was instructed never to use the words “recession” or “depression” again.

In 1978, Kahn — a Cornell University economist in charge of President Jimmy Carter’s inflation-fighting efforts — said that failure to get soaring prices under control could lead to a “deep, deep depression.” Carter’s aides, perturbed at the possible political fallout, instructed him never to say that word, or “recession,” again.

We don’t know whether this instruction stirred the wrath of Kahn, a verbal stickler notoriously disdainful of cant and euphemism; in a previous government job, he had sent around a memo telling staff not to use words like “herein.”

It did trigger his wit, though: In his next meeting with reporters, Kahn puckishly said the nation was in “danger of having the worst banana in 45 years.”

Lane’s anecdote about Kahn is instructive because it reveals something important about these debates. While they may have a certain amount of importance as far as political spin goes, they are meaningless as far as economic reality is concerned. Substituting the word “banana” for recession did not change economic conditions or the economic outlook one bit, which no doubt was precisely Lane’s point.

My colleague Peter Jacobsen made this point effectively earlier this week.

“[You] don’t need a thermometer to feel if it’s hot outside,” he wrote. “Economic issues, especially inflation, top the list of concerns for voters going into the 2022 midterms, and it isn’t particularly close. So officially defined recession or not, it doesn’t really matter.”

Moreover, Jacobsen explains, macroeconomic data like GDP have historically been the tool of politicians and bureaucrats, who use them to justify economic interventions.

“When GDP numbers fall below a certain level, politicians can use that data to try to push income back up. Or perhaps when the economy is ‘running too hot’ politicians can use fiscal and monetary policy to slow down the economy.

All of these metaphors about economies running hot or stalling are based on a central planning view of the economy. In this view, the economy is like a machine which we can adjust to bring about the proper results. Without macroeconomic statistics, central planners have fewer means by which to justify particular interventions. We can’t claim we need stimulus if we can’t point to some data indicating it’s necessary.”

The takeaway here is an important one. We don’t need “bureaucratic weathermen” telling us when the economy is good or bad any more than we need them “managing” the economy with the money supply, which is precisely how we got here in the first place.

So while the debates over the R word are likely to continue, it’s important to remember it doesn’t really matter if you call this economy a recession or a banana. The fundamentals speak for themselves.

AUTHOR

Jon Miltimore


RELATED VIDEO: GDP Report Shows Economic Plunge


Data Show California Is a Living Example of the Good Intentions Fallacy

“Concentrated power is not rendered harmless by the good intentions of those who create it.”


During a speech at Harvard several years ago, Charlie Munger related a story about a surgeon who removed “bushel baskets full of normal gallbladders” from patients. The doctor was eventually removed, but much later than he should have been.

Munger, the vice chairman of Berkshire Hathaway, wondered what motivated the doctor, so he asked a surgeon who participated in the removal of the physician.

“He thought that the gallbladder was the source of all medical evil, and if you really love your patients, you couldn’t get that organ out rapidly enough,” the physician explained.

The doctor was not motivated by profit or sadism; he very much believed he was doing right.

The anecdote is a perfect illustration of the righteousness fallacy, which Barry Brownstein noted is rampant in modern politics and a key driver of democratic socialism.

The Righteousness Fallacy (also known as the fallacy of good intentions) is described by author Dr. Bo Bennett as the idea that one is correct because one’s intentions are pure.

It recently occurred to me that California is a perfect example of this fallacy. Consider these three facts about the Golden State:

  1. California spends about $98.5 billion annually on welfare—the most in the US—but has the highest poverty rate in America.
  2. California has the highest income tax rate in the US, at 13.3 percent, but the fourth greatest income inequality of the 50 states.
  3. California has one of the most regulated housing markets in America, yet it has the highest homeless population in the nation and ranks 49th (per capita) in housing supply.

That politicians would persist with harmful policies should come as little surprise. The Nobel Prize-winning economist Milton Friedman once observed the uncanny proclivity of politicians “to judge policies and programs by their intentions rather than their results.”

In his book Capitalism and Freedom, Friedman described the danger of such thinking.

[The threat comes] … from men of good intentions and good will who wish to reform us. Impatient with the slowness of persuasion and example to achieve the great social changes they envision, they’re anxious to use the power of the state to achieve their ends and confident in their ability to do so. Yet… Concentrated power is not rendered harmless by the good intentions of those who create it.

I don’t doubt that California lawmakers, like the physician who was removing healthy gallbladders, believe they are doing the right thing. Yet they, like the physician, need to wake up to reality and realize they aren’t making people better.

AUTHOR

Jon Miltimore

Jonathan Miltimore is the Managing Editor of FEE.org. His writing/reporting has been the subject of articles in TIME magazine, The Wall Street Journal, CNN, Forbes, Fox News, and the Star Tribune. Bylines: Newsweek, The Washington Times, MSN.com, The Washington Examiner, The Daily Caller, The Federalist, the Epoch Times.

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

Will ESG Reform Capitalism—or Destroy It?

What “stakeholder capitalism” really means for the world.


Stakeholder capitalism has taken the global economy by storm in recent years. Its champions proclaim that it will save—and remake—the world. Will it live up to its hype or will it destroy capitalism in the name of reforming it?

Proponents pitch stakeholder capitalism as an antidote to the excesses of “shareholder capitalism,” which they condemn as too narrowly focused on maximizing profits (especially short-term profits) for corporate shareholders. This, they argue, is socially irresponsible and destructive, because it disregards the interests of other stakeholders, including customers, suppliers, employees, local communities, and society in general.

Stakeholder capitalism is ostensibly about incentivizing business leaders to take these wider considerations into account and thus make more “sustainable” decisions. This, it is argued, is also better in the long run for businesses’ bottom lines.

Today’s dominant strain of stakeholder capitalism is the doctrine known as ESG, which stands for “environmental, social, and corporate governance.” The label was coined in the 2004 report of Who Cares Wins, a joint initiative of elite financial institutions invited by the United Nations “to develop guidelines and recommendations on how to better integrate environmental, social and corporate governance issues in asset management, securities brokerage services and associated research functions.”

Who Cares Wins operated under the auspices of the UN’s Global Compact, which, as the report states, “is a corporate responsibility initiative launched by Secretary-General Kofi Annan in 2000 with the primary goal of implementing universal principles in business.”

Much progress has been made toward that goal. Since 2004, ESG has evolved from “guidelines and recommendations” to explicit standards that hold sway over huge swaths of the global economy.

These standards are set by ESG rating agencies like the Sustainability Accounting Standards Board (SASB) and enforced by investment firms that manage ESG funds. One such firm is BlackRock, whose CEO Larry Fink is a leading champion of both ESG and SASB.

In December, Reuters published a report titled “How 2021 became the year of ESG investing,” which stated that “ESG funds now account for 10% of worldwide fund assets.”

And in April, Bloomberg reported that ESG “by some estimates represents more than $40 trillion in assets. According to Morningstar, genuine ESG funds held about $2.7 trillion in managed assets at the end of the fourth quarter.”

To access any of that capital, it is no longer enough for a business to offer a good return on investment. It must also report “environmental” and “social” metrics that meet ESG standards.

Is that a welcome development? Will the general public as non-owning “stakeholders” of these businesses be better off thanks to the implementation of ESG standards? Is stakeholder capitalism beginning to reform shareholder capitalism by widening its perspective and curing it of its narrow-minded fixation on profit uber alles?

To answer that, some clarification is in order. First of all, “shareholder capitalism” is a misleading term for laissez faire capitalism. It is true that, as Milton Friedman wrote in his 1970 critique of the “social responsibility of business” rhetoric of the time:

“In a free‐enterprise, private‐property system, a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom.”

Since the owners of a publicly traded corporation are its shareholders, it is true that they are and ought to be the “bosses” of a corporation’s employees—including its management. It is also true that corporate executives properly have a fiduciary responsibility to maximize profits for their shareholders.

But that does not mean that shareholders reign supreme under capitalism. As the great economist Ludwig von Mises explained in his book Human Action:

“The direction of all economic affairs is in the market society a task of the entrepreneurs [which, according to Mises’s technical definition includes shareholding investors]. Theirs is the control of production. They are at the helm and steer the ship. A superficial observer would believe that they are supreme. But they are not. They are bound to obey unconditionally the captain’s orders. The captain is the consumer.”

The “sovereign consumers,” as Mises calls them, issue their orders through “their buying and their abstention from buying.” Those orders are transmitted throughout the entire economy via the price system. Entrepreneurs and investors who correctly anticipate those orders and direct production accordingly are rewarded with profits. But if one, as Mises says, “does not strictly obey the orders of the public as they are conveyed to him by the structure of market prices, he suffers losses, he goes bankrupt, and is thus removed from his eminent position at the helm. Other men who did better in satisfying the demand of the consumers replace him.”

Under laissez faire capitalism, consumers, not shareholders, are the principal stakeholders whose preferences reign supreme. And shareholder profit is a measure of—and motivating reward for—success “in adjusting the course of production activities to the most urgent demand of the consumers,” as Mises wrote in his paper “Profit and Loss.”

This is highly relevant to the “stakeholder capitalism” discussion, because it means that, to the extent that the profit-and-loss metric is discounted for the sake of competing objectives (like serving other “stakeholders”), the sovereign consumers are dethroned, disregarded, and relatively impoverished.

Now it’s at least conceivable that ESG standards are not competing, but rather complementary to the profit-and-loss metric and thus serving consumers. In fact, that’s a big part of the ESG sales pitch: that corporations who adopt and adhere to ESG standards will enjoy higher long-term profits, because breaking free of their fixation on short-term shareholder returns will enable them to embrace more “sustainable” business practices.

In a free market, whether that promise would be fulfilled or not would be for the sovereign consumers to decide, and ESG would rise or fall on its own merits.

Unfortunately, our market economy is far from free. The State has rigged capital markets for the benefit of its elite lackeys in the financial industry: like the “Who Cares Wins” fat cats who started the ESG ball rolling in 2004 under the auspices of the United Nations.

One of the prime ways the State rigs markets is through central bank policy.

The prodigious amount of newly created money that the Federal Reserve and other central banks have pumped into financial institutions in recent years has transferred vast amounts of real wealth to those institutions from the general public. As a result, those institutions—big banks and investment companies—are now much more beholden to the State and much less beholden to consumers for their wealth.

As they say, “he who pays the piper calls the tune.” So it’s no surprise that these institutions are stumbling over themselves to get on board the State’s ESG bandwagon.

And that means that non-financial corporations also have to get with the ESG program if they want access to the Fed’s money tap and thus to capital. Especially as the average consumer becomes increasingly impoverished by disastrous economic policies, the incentive for corporations to earn market profit by pleasing consumers is being progressively superseded by the incentive to gain access to the Fed’s flow of loot by meeting the State’s “social” standards.

By increasingly controlling capital flows, the State is gaining ever more control over the entire economy.

This may explain the recent willingness of so many corporations to alienate customers and sacrifice profits on the altar of “green” and “woke” politics.

It is no coincidence that Klaus Schwab, the preeminent champion of the “Great Reset,” also co-authored a book titled Stakeholder Capitalism. The upshot of stakeholder capitalism is that the State supplants the consumer as the supreme stakeholder in the economy. The sick joke of stakeholder capitalism is that it “reforms” capitalism by transforming it into a form of socialism.

AUTHOR

Dan Sanchez

Dan Sanchez is the Director of Content at the Foundation for Economic Education (FEE) and the editor-in-chief of FEE.org.

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

These Widespread Shortages Can’t Be Explained by Supply Constraints Alone

Poorer markets can still clear. So why won’t they?


All sorts of shortages are now popping up in our economy. At the head of the list is undoubtedly infant formula, but there are literally dozens of other items in short supply. There are so many of them that I feel constrained to mention them in alphabetical order, lest I inadvertently miss one or engage in double counting.

Here they are, as best I can list them: aluminum, avocado, bicycles, blood collection tubes, blood for transfusions, canned vegetables, cat food, chlorine, Christmas trees, coal, coins, commercial air tickets, computer chips, cream cheese, dye used in CT scans, eggs, fuel oil, garage doors, gasoline, girl scout cookies, hand sanitizer, home covid tests, infant formula, juice boxes, liquor, lithium, lumber, maple syrup, meat, motorcycles, natural gas, paper towels, pet food, potatoes, semiconductors, soap, soda, sunflower oil, toilet paper, tomato paste and wine. Peanut butter has not yet been mentioned in this regard but will soon, undoubtedly, be added prominently to this list.

I’m not kidding: each and every one of these items has been mentioned in this regard in the major media. What is going on here? Has the economy gone crazy, or what? According to several headlines, that is just about what is occurring. Here are a few of them: “The world is still short of everything; get used to it.” “America is running out of everything.” “Product shortages and soaring prices reveal fragility of U.S. supply chain.”

If the shortage list is long, the list of presumed causes of this economic malfunction is almost as long. For peanut butter, it is a recall due to contamination: a salmonella outbreak. But peanut butter is an input into many other products, such as fudge, chocolates and peanut butter sandwiches, which will also soon be hard to find. For many items on the list the antecedent is the Coronavirus, which has led to supply chain problems. Paying workers to stay home and earn as much as or more than their salaries, until a few months ago, also contributed. Blame was also laid on a harsh winter. Imports from abroad have been subject to sudden border closures. Ships stuck at harbors on the west coast have been vulnerable to shortages of truck drivers and to regulations. Computer chips have been susceptible to supply inelasticity; new offerings in response to higher prices take a great amount of time to become forthcoming. Consumers have been castigated for hoarding. Staffing problems have been held responsible for commercial air travel disruptions. Drought, the bird flu and the Ukraine war have been held culpable.

But we have had all of these things before: war, pestilence, disease, bad weather, ill health, government regulations. However, massive shortages of, if not everything under the sun, then pretty close to it, have never before disrupted the economy to anything like the degree we are presently experiencing (apart from the two world wars, of course).

Where is the much-vaunted free enterprise system in all of this? Nowhere, that is where. Has it succumbed to so-called “market failure?” Not a bit of it. Rather, the difficulty is that public policy has made capitalism operate with one arm tied behind its back, and it has not been able to function when hemmed in by a plethora of restrictions, limitations and regulations.

Basic introductory Economics 101 teaches us that a shortage occurs when the quantity of an item demanded exceeds the quantity supplied at the prevailing price. What invariably occurs then? Why, prices rise. When this takes place, businesses are incentivized to produce more, and buyers to purchase less. Voila, the shortage ends. Why doesn’t this occur under the Biden Administration? Why do we have so many shortages?
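That textbook adjustment can be sketched numerically. The linear demand and supply curves below are invented for illustration, but they show the mechanism: at a price held below the market-clearing level, quantity demanded exceeds quantity supplied, and letting the price rise closes the gap.

```python
# Hypothetical linear demand and supply curves illustrating the
# Econ 101 point: at a held-down price there is a shortage;
# a rising price clears the market.

def quantity_demanded(price):
    return max(0.0, 100 - 2 * price)   # buyers purchase less as price rises

def quantity_supplied(price):
    return max(0.0, 4 * price - 20)    # producers offer more as price rises

def shortage(price):
    return quantity_demanded(price) - quantity_supplied(price)

# At a price of 15, demand outstrips supply: 70 demanded vs 40 supplied.
assert shortage(15) > 0

# Let the price adjust upward until the market clears.
price = 15.0
while shortage(price) > 1e-6:
    price += 0.01

# The market clears near p = 20, where 60 units are demanded and supplied.
print(round(price, 2))
```

With these made-up curves, no decree is needed: the price itself carries the information that ends the shortage, which is the mechanism the article says is being suppressed.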

One possibility not at all in the public eye is that business firms are afraid to raise prices lest they be charged with price gouging. And why, in turn, might this be the case? The Bidenites are not exactly friends of the free enterprise system. Yes, to be sure, prices have indeed been rising. But are they increasing fast enough to quell shortages? Evidently not. Why not? Possibly for fear of being accused of gouging and of attracting antitrust attention. Wages, too, are on the incline, but likely not sufficiently to overcome the supply inelasticity difficulty. Why not? Firms may well be leery of raising them, in case wages have to be decreased later on, at which point the firms will be accused of exploiting or victimizing laborers, or some such.

Prices and wages are typically somewhat sticky; that is, they are not instantaneously and fully flexible. But an anti-business philosophy of the sort now prevailing in Washington D.C. makes them even less able to perform the tasks for which we need them, than would otherwise be the case.

AUTHOR

Walter Block

Walter Edward Block is an American economist and anarcho-capitalist theorist who holds the Harold E. Wirth Eminent Scholar Endowed Chair in Economics at the J. A. Butt School of Business at Loyola University New Orleans. He is a member of the FEE Faculty Network.

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

19 Nuggets of Wisdom from the Best Economics Writer You’ve Never Heard Of

Americans would do well to make up the deficit in their knowledge of the works of Arthur Seldon and his “life for liberty.”


May 29 marks the birth of Arthur Seldon. While too little known to American readers, he was editorial director of the London-based Institute of Economic Affairs for more than thirty years, during which, The Economist wrote, it “brought to the lay reader the ideas of all the leading free-market economists and thinkers of the day.”

Seldon produced a seven-volume set of collected works, including books, monographs, essays and articles, as well as editing hundreds of papers, monographs, and pamphlets.

In such a vast body of work, one cannot easily winnow out the best of Arthur Seldon’s insights. Therefore, consider some of the wisdom in just one of his books—Capitalism—winner of the Fisher Arts Literary Prize and celebrating its 30th anniversary this year:

  1. “Capitalism…creating high and rising living standards for the masses without sacrificing personal liberty speaks for itself. Only the deaf will not hear and the blind will not see.”
  2. “Even bad men are led by the market process to do good, but good men are induced by the political process to do harm. [So] discipline the writ of politics to the bare minimum.”
  3. “Private property is a potent working institution. Public ownership is…political power cornered by handfuls of irresponsible non-ownerships.”
  4. “Capitalism…allows individuals to take the risks of living their lives as they see best.”
  5. “The capitalist market…puts power–effective purchasing power–directly into the hands of the common man and woman for them to use where they wish…That is why the market is more essentially democratic than government.”
  6. “Changing private identifiable property into public unidentifiable property is to destroy the incentives to protect, conserve, improve and render it productive by using it profitably in making goods and services for which consumers will pay.”
  7. “Pricing is the peaceful way of resolving argument and conflict.”
  8. “It was the development and refinement of the law of private property rights that explains… modern progress.”
  9. “That in practice markets are imperfect has obscured the more fundamental truth that they are the best-known way of enabling individuals to meet for mutual benefit . . . World practice and experience…show no better, less imperfect, mechanism.”
  10. “As government has been inflated…It has undermined the instrument that could have done more for the common man.”
  11. “Capitalism embraces the self-correcting mechanisms of open discussion in free society to identify error and open competition in free markets to apply the corrections.”
  12. “The market does not require people to be good: it takes people as they are and induces them to do good by using their capabilities to provide what others want.”
  13. “Wherever it is used, government is so disappointing or worse—inefficient, unaccountable and corrupt—that it is best not to use it at all except for functions where all its faults have to be tolerated to obtain the services required…In short, the price of government is so high that it should be avoided wherever possible.”
  14. “The state has shown itself the false god of all who have looked to it.”
  15. “Capitalism…requires the eventual withdrawal by government from most of its accumulated activities.”
  16. “The inducements of capitalism compel the money-makers to do good; the inducements of socialism enable the power-holders to do harm.”
  17. “The political process…has become the master rather than the servant of the people.”
  18. “Individuals are smothered by collective decisions in the political process.”
  19. “The market of capitalism treats people as individuals; the political process of socialism herds them into categories. Capitalism makes for harmony, socialism for friction.”

Americans would do well to make up the deficit in their knowledge of the works of Arthur Seldon and his “life for liberty,” as his biographer, Colin Robinson, described it. As the IEA website put it, “Seldon highlights the improvements of mankind which came about not through some central plan or social organization but through individuals recognizing an opportunity to produce goods and services which met a need expressed by the demand in the market.”

In so doing, he advanced every individual’s potential, which is expanded by private property and voluntary market arrangements, but constricted when political power hinders the freedom and cooperation they engender.

AUTHOR

Gary M. Galles

Gary M. Galles is a Professor of Economics at Pepperdine University and a member of the Foundation for Economic Education faculty network. In addition to his new book, Pathways to Policy Failures (2020), his books include Lines of Liberty (2016), Faulty Premises, Faulty Policies (2014), and Apostle of Peace (2013).

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

There Ain’t No Such Thing as a Cost-Plus Lunch! Who’s really to blame for rising prices?

Why are restaurants adding “inflation fees” to their checks?


A group of friends had just finished a meal at Romano’s Macaroni Grill in Honolulu when one of them noticed something odd about the check. As a local television news station reported in April, a “Temporary Inflation Fee” of $2.00 was nestled inconspicuously between the $4.50 Flavored Tea and the $14.00 Spinach & Artichoke Dip.

The restaurant chain’s website explained that the charge was added to “partially offset… operational cost increases” due to unusual economic conditions including “global supply chain shortages and ever-growing pressure from inflation.” The statement said, “we believe these burdens will eventually pass,” which is why they resorted to a temporary surcharge instead of simply raising the listed prices. An alternative explanation is that surcharges that show up on the check but not the menu are a sneaky way to try to raise prices without losing customers.

The Wall Street Journal recently cited this incident as part of a general trend:

“Lightspeed, a global developer of point-of-sale software, said fee revenue nearly doubled from April 2021 to April 2022, based on a sample of 6,000 U.S. restaurants that use its platform. The number of restaurants adding service fees increased by 36.4% over the same period.”

The Journal cited industry analysts who basically agreed with Romano’s, explaining that:

“…this wave of surcharges is mostly being driven by restaurants trying to cope with the impact of rising inflation and a tight labor market on their bottom lines.” (…)

“Inflation and the pandemic posed particular challenges for the restaurant industry. The average price of supplies for a restaurant operator increased by 17.5% since last year, according to NPD Group. By comparison, consumer spending at restaurants rose 5% during that time.

The increase in surcharges is a way for businesses to recoup at least some of those costs, said David Portalatin, a food-industry adviser with the group.”

In media coverage of today’s rising prices in general, this has become a prevailing narrative: “businesses are passing their rising costs onto consumers.”

While superficially plausible, this gets the economics of prices the wrong way round.

The explanation refers to “cost-plus pricing,” which is the business practice of setting prices by starting with your costs and then adding a markup.

Of course, nothing in economics says that a business owner cannot use this method to decide on a price to quote. Surely, some do exactly that. But it is only a heuristic and it is not what fundamentally drives price changes.

Just as “there ain’t no such thing as a free lunch” (TANSTAAFL), there ain’t no such thing as a cost-plus lunch.

To explain price increases as resulting from “passing costs on to the customer” is to implicitly embrace a “cost of production” theory of value and prices, which, in a nutshell, maintains that costs determine prices.

Of course, costs are prices, too. A business’s “costs” are the prices it pays for factors of production (land, labor, and capital goods). So, in a bigger nutshell, this theory posits that “factor prices determine product prices.”

But this is the exact opposite of how an economy actually works. As Murray Rothbard wrote in his economics treatise Power and Market, “Prices, however, are never determined by costs of production, but rather the reverse is true.” In other words, anticipated product prices determine factor prices: prices determine costs, not the other way around.

This insight was one of the great discoveries that resulted from the “Marginal Revolution” of economics in the 1860s and 70s. This was a literal “revolution” in the sense that it showed the old economic paradigm to be upside-down and then turned it right-side-up.

Before the Marginal Revolution, the “classical economists” largely subscribed to Adam Smith’s cost-of-production theory of value or David Ricardo’s labor theory of value. The latter, like the former, derived the value of products from the value of factors: specifically the factor of labor. (Incidentally, Karl Marx largely based his exploitation and class war theories on Ricardo’s labor theory of value.)

For example, classical economists might have traced the high value of a bottle of fine wine to the high real estate value of the vineyard and/or the amount of labor that went into producing the wine.

But the Marginal Revolutionaries—William Stanley Jevons, Leon Walras, and Carl Menger—upended that paradigm. They and their followers (especially the Austrian school of economics, founded by Menger) explained that the value of a good is based on its “marginal utility,” which is the usefulness for want-satisfaction of an additional unit of a good. And what’s useful about a factor of production is that it can help produce useful products.

For example, the utility of a wine vineyard is that it can yield wine grapes. The same goes for the utility of a vineyard worker’s labor. And the utility of wine grapes is their contribution toward producing enjoyable wine.

So Austrian economists do the opposite of what the classical economists did. Austrians trace the real estate price of the vineyard and the wages of the vineyard worker to the anticipated value of the wine at the end of the production line.

The insights of the Marginal Revolution made it clear that prices determine costs (product prices determine factor prices), not the other way around, and that ultimately consumer preferences determine all prices.
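The direction of that imputation can be put in miniature arithmetic. All of the numbers below are invented for illustration: the point is simply that the most an entrepreneur would bid for the vineyard is computed *from* the anticipated wine revenue, not the other way around.

```python
# Hypothetical numbers illustrating marginalist "imputation": the
# value of a factor (the vineyard) is derived from the anticipated
# value of the consumer good (the wine), not vice versa.

anticipated_bottles = 10_000          # expected annual output
anticipated_price_per_bottle = 30.0   # what consumers are expected to pay
other_annual_costs = 180_000.0        # labor, barrels, bottling, etc.

# Expected annual surplus attributable to the vineyard itself:
imputed_annual_yield = (anticipated_bottles * anticipated_price_per_bottle
                        - other_annual_costs)

# Capitalize that yield at a 5% rate (multiply by 1/0.05 = 20) to get
# the most a buyer would rationally bid for the land:
max_vineyard_bid = imputed_annual_yield * 20

print(max_vineyard_bid)  # 2400000.0: the land's price flows from the wine's
```

If consumers' anticipated valuation of the wine falls, the imputed land value falls with it; in this toy calculation, as in the Austrian account, costs follow prices.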

(Note: Alfred Marshall tried to reconcile the classical cost-of-production theory with marginal utility theory in a “neoclassical synthesis” that has influenced mainstream economics to this day. See here for Murray Rothbard’s Austrian critique of that attempt.)

So the “cost passing” explanation of rising prices is a retrogression to a long-overthrown economic paradigm: the economic equivalent of forgetting the heliocentric Copernican Revolution of astronomy and explaining planetary movements using the archaic geocentric model of Ptolemy. Just as the sun does not revolve around the earth, consumer prices do not revolve around producer costs: quite the opposite.

Many on the political left blame corporations for “price gouging” in order to fatten their profits. But blaming rising prices on profit-seeking is like blaming a plane crash on gravity.

Gravity is always pulling down on planes. To explain a plane crash, you have to explain what happened to the factors that had previously counteracted that downward pull. Why did gravity yank the plane down to earth when it did and not before?

Similarly, businesses are always seeking profit and are always ready to raise prices if that is what will maximize profits. To explain precipitous price hikes, you have to explain what happened to the factors that had previously put a lid on that upward price pressure. Why did profit-seeking propel prices skyward recently and not in 2019?

This question is also tricky for those (including some on the political right) who blame rising prices on rising costs. If businesses can preserve profits by raising prices now that their costs are higher, why wouldn’t they have increased profits by raising prices before when their costs were lower?

A business’s customers don’t care about that business’s costs. They care about value. Based on the value they expect from a product, there is a limited price range they’d be willing to pay for any given amount of it. That translates into the market demand for the product: the quantity of a good that would be bought at any given price point. The value of, and demand for, a product does not fluctuate with its production costs.

Even businesses don’t (or at least shouldn’t) really care about past costs when it comes to pricing. Past costs are sunk. Whatever was spent to produce it, at any given moment a business has a given inventory. Its best interest is to price that inventory so as to maximize revenue given current demand. Based on that definite demand, raising prices past a certain point will result in less revenue, regardless of past costs. If the most revenue they can hope for is less than their past expenditure, that’s just the way things turned out. They can learn from that error and from those losses by spending less and/or differently in the future. But they cannot change the past or defy the economic reality of the present.

As economist Jonathan Newman told FEE in an interview:

“There is no change in costs that directly affects the revenue-maximizing price. If the prevailing market price is one that maximizes revenue for the firm, then it is impossible for the firm to ‘pass on’ costs to the consumer by increasing prices, because this would result in less revenue.”
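Newman's point can be checked against a toy demand schedule (all numbers hypothetical): the revenue-maximizing price is a function of demand alone, so changing a past, sunk cost leaves it untouched.

```python
# Toy demand schedule illustrating Newman's point: the revenue-
# maximizing price depends only on demand, so a change in sunk
# costs cannot be "passed on" by raising the price.

demand = {8: 100, 10: 90, 12: 80, 14: 55, 16: 35}  # price -> units buyers take

def best_price(schedule):
    # Pick the price that maximizes revenue = price * quantity sold.
    return max(schedule, key=lambda p: p * schedule[p])

p_star = best_price(demand)
print(p_star)  # 12 -> revenue 960, the most this demand allows

# Now suppose the firm's past production costs double. Demand is
# unchanged, so the revenue-maximizing price is unchanged too:
sunk_cost_before, sunk_cost_after = 400, 800
assert best_price(demand) == p_star  # costs never entered the calculation
```

Raising the price above 12 in this schedule loses more customers than the higher margin recovers, whatever the firm happened to spend in the past.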

Newman reminds us that, “factors of production are valued because they help us make consumer goods, not the other way around. What consumers are willing to pay for consumption goods determines what entrepreneurs are willing to pay for land, labor, and capital goods.” He offers an extreme example to make this point:

“Suppose that tomorrow the government decides to tax the sale of ink for ballpoint pens at $1 billion per mL. Would pen makers be able to carry on as usual and pass this increased cost on to consumers? Would consumers be willing to pay $1,000,000,000.25 for a pen? Of course not. Anticipated consumer demand is a limit on what producers will pay for inputs. More expensive inputs does not mean consumers are ready to pay a higher price for outputs.”

So if “cost passing” isn’t what’s driving up prices, what is? Newman points to monetary expansion by central banks, especially the Federal Reserve:

“I suspect that many firms will be able to get away with increased prices because of this. Even if their stated intention is to ‘pass on’ or share costs with their customers, the increased demand from the trillions of dollars that have been injected into the economy over the past couple years is what really makes their price increases both necessary and feasible.”

It is important to note that monetary price inflation is also not “passed on” from suppliers to customers, as “inflation surcharges” might lead you to believe. Again, the reality is the reverse of that. Extra money enables customers to bid up the prices charged by their suppliers, who in turn use the extra money to bid up the prices charged by their suppliers, and so on. That is how new money raises prices across the board (although, unevenly) as it circulates through the economy.

Another contributing factor to rising prices, at least in many specific industries, is today’s supply chain crisis. To an extent, Romano’s and industry analysts are right to blame rising restaurant prices on supply constraints. But they are wrong to characterize it as a matter of “passing on” or “recouping” costs. Rather, it is a matter of greater scarcity translating into a higher marginal utility of certain goods and thus higher prices.

For example, a major factor in today’s high food prices is undoubtedly the war in Ukraine. Both Ukraine and Russia were major exporters of grain. But, owing to Russia’s blockade of Ukraine and the West’s sanctions on Russia, grain exports from both countries have been throttled.

As a result, food processors have less grain to produce foodstuffs like, for example, macaroni. And as a result of that, restaurants have less macaroni to produce macaroni dishes. And when there’s less of something, its price tends to go up. That is probably one of the reasons why the Honolulu diners at Romano’s Macaroni Grill discussed above paid $11.00 for “Signature Mac & Cheese Bites.”

This phenomenon is not “passing on costs.” It is the rippling repercussions of economic destruction and impoverishment. The word “passing” implies that consumers are impoverished while producers are not. But that is not the case. Diminished production and greater scarcity impoverish everyone involved.

It is also confusing to call that “inflation,” although both academia and the media tend to lump all price increases together under that term. For any given increase in prices, part of it may be caused by monetary expansion, and another might be due to supply constraints. Personally, I think it would be clearer to call only the former, and not the latter, “inflation.” Price increases due to an increasing abundance of money should be distinguished from price increases due to a declining abundance of goods and services, although the former very frequently causes the latter (especially by creating economic bubbles and crashes).

Especially since the advent of the Covid crisis in 2020, we have suffered plenty of both. Central banks have been driving up prices with money printing sprees undertaken to finance government spending sprees. Governments have also been driving up prices by sabotaging supply chains through lockdowns, business shutdowns, wars, trade restrictions, and other policies of mass economic destruction.

As prices continue to rise and living standards continue to drop, it is important to understand how it is happening, why it is happening, and who is truly to blame.

AUTHOR
Dan Sanchez

Dan Sanchez is the Director of Content at the Foundation for Economic Education (FEE) and the editor-in-chief of FEE.org.

EDITORS NOTE: This FEE column is republished with permission. All rights reserved.

The Economic Theory That Explains Biden’s Response to the Baby Formula Shortage

In a famous lecture, economist Ludwig von Mises showed how government intervention begets more intervention.


Over the last month, President Biden invoked the Defense Production Act in an attempt to fix the formula shortage. In a statement, the White House highlighted that,

“The President is requiring suppliers to direct needed resources to infant formula manufacturers before any other customer who may have ordered that good. Directing firms to prioritize and allocate the production of key infant formula inputs will help increase production and speed up in supply chains.”

In other words, the government is now engaging in what economist Don Lavoie referred to as non-comprehensive economic planning. It’s imposing rules requiring businesses to operate in a way that bureaucrats believe will quickly resolve this crisis. But the planning seems to have failed. Since Biden invoked the DPA, the percentage of stores out of stock has increased to 70 percent, according to ABC News.

While some may be surprised that the US government can so quickly command industry, it should be no surprise at all. In fact, a basic understanding of government intervention shows that this sort of result was all but inevitable.

There have been several good articles explaining the source of this infant formula shortage. FEE’s own Jon Miltimore produced a great story on the topic. But, to keep it short, Abbott, one of the country’s largest formula producers, had a plant shut down by the FDA due to safety concerns.

But how could shutting down one plant in the whole country cause this? Well, formula production is one of the most tightly regulated industries in the US. Because of this, it’s very difficult to enter the market, so a few firms dominate the industry. As a result, when one has problems, the national supply is severely impacted.

Some of the most harmful regulations are related to the WIC and SNAP programs, which aim to provide taxpayer-subsidized formula to low-income consumers.

As reported in Time, in 1989 Congress, in a supposed attempt to limit the cost of the program, required each state to select a single company whose formula can be bought with WIC and SNAP benefits. Since up to two-thirds of formula is purchased with WIC and SNAP, the winners of these bids are able to crush competition.

Furthermore, until recently, the FDA banned importation of formula that listed ingredients in an order not prescribed by US bureaucrats. This limit on imports further restricts competition on a basis unrelated to health.

Meanwhile, Fortune highlights research showing that although European brands by and large meet safety standards, the FDA still restricts these imports on the grounds that their instructions are confusing.

Economist Alex Tabarrok highlights how price controls may be playing a role in the shortage as well.

Policy analyst Gabriella Beaumont-Smith examines the trade restrictions on baby formula, which includes tariffs of up to 17.5 percent.

In short, the industry is tangled in a web of intervention which is killing competition.

It’s this abundance of regulation that makes Biden’s use of the Defense Production Act so unsurprising.

In 1950, economist Ludwig von Mises gave a lecture titled “Middle-of-the-Road Policy Leads to Socialism.” In this lecture, Mises expounded a theory now known by many as “the dynamics of interventionism.”

Mises used the example of the dairy industry to show how intervention unfolds dynamically. Imagine the government decides that the price of milk is too high for poor people to afford. To remedy the problem, the government imposes a price control: for example, “milk cannot cost more than $2/gallon.”

But another problem arises. At this lower price, dairy farmers can no longer sell their milk at a high enough price to make a profit. Instead, they would be better off exiting the industry. But if dairy farmers exit, there will be less milk to buy. If the government wants to continue to make milk affordable and accessible, they’ll have to bail out the dairy industry. One way they could do this is by setting a price control on feed for cows.

But then producers of cattle feed will make losses. So, the interventions must occur again.
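The first link in Mises’s chain, the shortage created by the milk price ceiling, can be made concrete with a toy linear supply-and-demand model (the curves and numbers here are invented purely for illustration):

```python
# Toy linear milk market: Qd = 100 - 10P (demand), Qs = -20 + 20P (supply).
def demand(p):
    return 100 - 10 * p

def supply(p):
    return max(0, -20 + 20 * p)

# Unregulated equilibrium: 100 - 10P = -20 + 20P  =>  P = 4, Q = 60.
assert demand(4.0) == supply(4.0) == 60

# Government caps the price at $2/gallon: buyers want more, farmers supply less.
ceiling = 2.0
shortage = demand(ceiling) - supply(ceiling)
print(f"Demanded: {demand(ceiling)}, supplied: {supply(ceiling)}, shortage: {shortage}")
# prints "Demanded: 80.0, supplied: 20.0, shortage: 60.0"
```

The shortage is what pressures the government into the next intervention, such as capping the price of cattle feed, and the same arithmetic then repeats one step up the supply chain.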

Intervention begets intervention.

This dynamic is exactly what is occurring in the formula industry. FDA regulations have made it impossible for sufficient competition to arise in the industry as it currently stands.

This lack of competition, combined with FDA shutdowns, exacerbates the possibility of shortages like this one. The shortages in turn lead to the executive branch using the Defense Production Act to control the industries which supply inputs to the formula industry.

Again, intervention begets intervention.

Some may accept the argument but argue that now that we have a crisis, we need to use things like the Defense Production Act to end it.

I disagree.

Government bureaucrats have insufficient knowledge and incentives to craft regulations which actually help. The Defense Production Act won’t help, because the government does not effectively plan the economy.

Need proof? The Abbott formula plant was shut down in February. The politicians and bureaucrats in Washington had from February to May to create and carry out a plan that would prevent this crisis. They failed.

Rather than solve the problem by using the same means that created it, central planners would be wise to lay down their Excel spreadsheets and let the market solve problems.

Allowing consumers to give their money, and thus their profits, to the companies which best meet their needs is how babies get fed.

Time to clean up the web of intervention.

AUTHOR

Peter Jacobsen

Peter Jacobsen is an Assistant Professor of Economics at Ottawa University and the Gwartney Professor of Economic Education and Research at the Gwartney Institute. He received his PhD in economics from George Mason University, and obtained his BS from Southeast Missouri State University. His research interest is at the intersection of political economy, development economics, and population economics. His website can be found here.

RELATED ARTICLES:

MISINFORMATION WATCH: Baby Formula and Biden’s Misinformation Blame Game

Walgreens Starts RATIONING Baby Formula as Shortage Worsens Under Democrat Regime

MONSTERS: Biden White House Blames American Moms For Baby Formula Shortage, ‘They’re HOARDING’

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

How Socialism Discourages Work and Creates Poverty

Socialism diminishes people’s incentive to work to improve their circumstances by depriving them of the fruits of their effort.


Advocacy for “socialism,” which the Socialist Party USA defines as a “social and economic order in which workers and consumers control production,” has made a comeback in American politics in recent years. Public figures such as Vermont Senator Bernie Sanders sing its praises. But the truth is that socialism deeply undermines people’s ability (and motivation) to improve their own living conditions. The misery socialism has caused for millions of people refutes its promises—horrifically.

Socialism, advocates claim, will bring prosperity and better living conditions for everyone, a claim also made for communism, in which the government controls the means of production and the distribution of the results. British philosopher Bertrand Russell wrote that socialism is “calculated to increase the happiness, not only of proletarians, but of all except a tiny minority of the human race.” Like socialism’s advocates throughout history, the now-defunct Socialist Labor Party of America depicted it as utopian, writing: “Under socialism our farmlands would yield an abundance without great toil; the factories, mines and mills would be the safest, the most modern, the most efficient possible and productive beyond our wildest dreams—and without laborious work.” The website doesn’t specify how such magic would occur.

The website further insists that socialism would improve virtually every aspect of life, stating: “Our natural resources would be intelligently conserved. Our schools would have the finest facilities and they would be devoted to developing complete human beings, not wages [sic] slaves who are trained to hire themselves out for someone else’s profit. Our hospitals and social services would create and maintain the finest health and recreational facilities.”

But socialist policies, when enacted, have catastrophic effects on the lives of the people living under them. To enforce such policies, governments must take control of people’s property—whether by fully nationalizing businesses, mandating what and how much a company must produce, or seizing and distributing their products—thereby violating people’s right to the product of their own effort. The victims include entrepreneurs who have built or purchased businesses, landlords who maintain and manage properties, and everyone who earns a wage, from construction workers to artists.

By violating these rights, socialism diminishes people’s incentive to work to improve their circumstances by controlling or taking away the results of their effort. However hard you work, whatever you achieve, whatever value you create—it won’t be reflected in your earnings.

The novelist Ayn Rand dramatized the effects of such a doctrine in her magnum opus, Atlas Shrugged. In the novel, a small-town factory enacted Marx’s slogan “From each according to his ability, to each according to his need” as policy, so that each person’s pay depended on what managers considered to be their level of need compared to their colleagues’. They did this based on such factors as the number of children the employees supported, family members’ illnesses, and so on. People began to spend more time sharing their woes with the management than working, and many of the best employees left the company entirely. Within four years, the factory closed. One character explained the hopelessness the policy created: “What was it we were supposed to want to work for? For the love of our brothers? What brothers? For the bums, the loafers, the moochers we saw all around us? And whether they were cheating or plain incompetent, whether they were unwilling or unable—what difference did that make to us? If we were tied for life to the level of their unfitness, faked or real, how long could we care to go on?”

He explained that the company had once been a thriving one that people were proud to work for, but now hard times were the status quo: “We were beasts of burden struggling blindly in some sort of place that was half-hospital, half-stockyards—a place geared to nothing but disability, disaster, disease—beasts put there for the relief of whatever whoever chose to say was whichever’s need.”

This story, although fictional, points to an important fact about human nature: If people can’t change their situation, they won’t try to. Knowing the outcome in advance, they will feel no motivation to make Herculean efforts for minuscule or nonexistent rewards. As economist Ludwig von Mises put it:

To make a man act, uneasiness and the image of a more satisfactory state alone are not sufficient. A third condition is required: the expectation that purposeful behavior has the power to remove or at least to alleviate the felt uneasiness. In the absence of this condition no action is feasible. Man must yield to the inevitable. He must submit to destiny. [emphasis added]

Socialist policies severely restrict individuals’ ability to improve their conditions, so productivity suffers and living conditions plummet. Historical examples of socialism, as well as modern-day Venezuela and North Korea, show the misery that results.

In Soviet Russia, the government attempted to distribute the results of sixty years of steady GDP growth equally by seizing personal fortunes and dictating wages. But buying power for the average person dropped sharply, and whether a person could actually spend his or her wages was largely dependent on knowing the right people. Economist Mark Harrison explains: “The distribution of consumer goods and services was characterized by shortage and privilege. Every Soviet adult could count on an income, but income did not decide access to goods and services – that depended on political and social status.”

People who lived under the Soviet regime and now live in modern Russia appreciate that they have more opportunities to improve their lives than they used to. Back in 2007, interviewers asked Russians about their memories and opinions of life under the Soviet regime; many of them recalled that the USSR had “fewer possibilities.” One respondent explained, “Now there are so many chances. You can earn enough money even to buy an apartment. Certainly it is very, very difficult, but possible.” Another participant elaborated, “Now I can earn money and there are many ways of doing so. . . . In the Soviet Union, engineers and other technical employees of middle and high rank did not have [a] right to a second job. People who had the time and energy and wanted to provide more for their families could not do it.”

In other words, people were willing to work extremely hard to improve their conditions—but weren’t allowed to.

In Venezuela, socialism has driven a once-prosperous country into the ground. University professors juggle multiple jobs to keep food on the table. Others try to escape a desperate situation; more than six million have fled in recent years, and in 2017 the suicide rate was nearly double the global average. Venezuelans are willing to work to improve their circumstances—but the socialist regime’s oppression and economic destruction consistently frustrate their efforts.

North Korea was conceived as a communist nation following the Second World War, but formally switched to a form of “self-reliant” socialism following the Korean War. The leadership of the Workers’ Party of Korea has brought widespread misery in the form of horrific rights violations, including torture, severe censorship, forced labor, and arbitrary detention. Their policies have also led to nearly half the country suffering from inconsistent access to food and water—in stark contrast to their far more capitalist neighbor, South Korea, which has flourished in recent decades.

Advocates of socialism protest that historical examples of socialism were not “true socialism” or “the right kind of socialism.” But in all of these examples it is socialism itself, the handing of control over production to the government, that undermines people’s ability and willingness to produce and provide for themselves.

With free markets, by contrast, people are free to own private property and run businesses without the government dictating production or distribution. People are rewarded for their hard work and ability. By innovating, excelling at work, and creating more and better products or services, they can make more money, which they can use to pay for better living quarters, education, electronics, travel, or other life-improving goods or services produced by others. Hence, in mostly free and capitalistic countries, such as the US, the United Kingdom, Ireland, and Hong Kong, people have enjoyed massive economic growth, which has corresponded with a major increase in average living standards.

When human beings struggle, create, and innovate, but their efforts do not improve their own circumstances, they burn out or give up. Marx, Russell, Sanders, and other proponents of socialism and communism claim that their preferred systems are “for the people”—but the truth is that they work against the nature and needs of human beings.

AUTHOR

Angelica Walker-Werth

Angelica Walker-Werth is an Ayn Rand Fellow with FEE’s Hazlitt Project and a recent graduate of Clemson University. She is an assistant editor and writer at The Objective Standard and a fellow and research associate at Objective Standard Institute. Her hobbies include gardening and travel.

RELATED ARTICLE: Bernie Sanders Just Proposed a 95% Business Tax. Here’s Why That’s So Absurd

RELATED VIDEO: DeSantis Says That Americans Are Fleeing “Dumpster Fire States”

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

How CO2 Supply Chain Mayhem Almost Caused a Meat Shortage in Britain

In recent months, many of us have faced empty shelves, long lines, and frustrating delays as supply chains have seized up around the country, and indeed the world. Some have argued that the government should step in to fix these issues, blaming the problems on “corporate greed” and “the free market”. But while it may be tempting to blame private companies for our current woes and see the government as the savior, the reality is not that simple. Indeed, far from being the solution, government intervention in the market is arguably the primary cause of these problems in the first place.

A good case study for this issue is Great Britain. Back in September, the nation’s supply chain issues got so bad that they almost had major disruptions in their food supply. The UK government has been intervening in an attempt to fix the problems in the short run, but the situation is still extremely precarious.

So who is responsible for these issues? Well, let’s follow the supply chain link-by-link and see if it can lead us to the culprit.

The immediate problem that food producers are facing is a shortage of food-grade carbon dioxide (CO2). The meat industry is particularly affected by this shortage, since CO2 is used in many meat production processes. But aside from that, the gas also plays a key role in modified atmosphere packaging, which is used to prolong the shelf life of many food products. It’s also used in carbonated drinks (hence the name) like beer and soda, and in its solid form as dry ice it is used to keep fresh food cool during transportation.

Why is there a shortage of CO2? Well, most food-grade CO2 comes from fertilizer plants, because CO2 is a byproduct of the fertilizer manufacturing process. These plants, however, have been producing far less CO2 than normal. So to understand why there’s so little CO2, we need to investigate the fertilizer plants. This brings us to the next link in the chain.

Two of the biggest fertilizer plants in the UK are owned by a company called CF Industries. Together, they normally produce about 60 percent of the UK’s food-grade CO2. However, these plants were actually shut down for a large part of September, which drastically reduced the UK’s CO2 production.

The reason they were shut down is that natural gas, an essential input in the fertilizer process, has been very expensive in recent months. With the price of this key input so high, it was uneconomical for the plants to operate, so they decided to shut down temporarily in hopes of restarting their operations once the price of natural gas came back down. But why is natural gas suddenly so expensive? This brings us to the third link in the chain.

First, to say that natural gas prices are high in Britain is quite the understatement. According to the industry group Oil & Gas UK, wholesale gas prices in September were up 250 percent since January, and had increased 70 percent since August. As one UK energy CEO remarked, this is “the most extreme energy market in decades.”

So what’s causing the high prices? A number of factors. High global demand has played a role, especially since roughly 60 percent of the UK’s natural gas supply is imported. Lower solar and wind output have also been factors, as well as outages at some nuclear stations. The cold winter in 2020 also resulted in depleted stocks (since people use natural gas to heat their homes), and several gas platforms in the North Sea have closed to perform maintenance that was paused because of the COVID-19 lockdowns.

But one of the biggest sources of price volatility is the dearth of natural gas storage facilities in the UK.

“The UK currently has very modest amounts of storage, less than 6% of annual demand,” writes Michael Bradshaw, a Professor of Global Energy at the University of Warwick. “In Germany, France, and Italy, storage covers about 20% of annual demand,” he continues for context. Another report noted that the UK has enough storage to last for about 7 days, whereas Germany and France have roughly 90 days of storage.

While storage is far from the only factor affecting natural gas prices, it certainly plays a significant role. But why does Britain have so little storage capacity? This brings us to the final link in the chain.

One of the reasons for Britain’s low storage capacity is that a storage facility called Rough, which used to provide a significant percentage of the UK’s natural gas storage, was decommissioned in 2017 as a result of age-related deterioration.

Industry leaders were concerned about the resulting lack of storage at the time, and have been warning about the issue ever since.

“Rough makes up an impressive 70% of the UK’s storage working gas volume,” Timera Energy noted back in 2017, when permanent closure was still being deliberated. “This can be contrasted with Rough’s contribution to the UK’s daily deliverability, at around 25%. And it is the deliverability that the UK market will miss most.”

They go on to explicitly discuss the likely impact of the closure on the price of natural gas. “The loss of deliverability should boost spot price volatility as it reduces the buffer of supply flexibility available to respond to swings in daily demand…The loss of working gas volume is likely to mean that supply shocks…have a sharper and more prolonged price impact.”

The need for more storage was reiterated in 2019 by another industry leader named InfraStrata Plc. “There is more demand in the market than we can satisfy,” said John Wood, the CEO of InfraStrata. “The market in the U.K. is sending out strong economic signals for additional gas storage capacity.”

So why wasn’t more storage built? Well, as it turns out, natural gas storage is taxed and regulated very heavily in the UK, much more so than other industries. Indeed, Storengy, one of the largest gas storage operators in the country, explicitly called attention to these problems back in 2018, pointing out the “punitive” and “extortionate” tax levels that are applied to storage facilities as well as the numerous regulations that burden the industry.

As a result of these barriers, many potential storage projects have remained on the shelf, since they are prohibitively expensive in the current business environment. Thus, even though the demand is clearly there, the market has been unable to meet it, because taxes and regulations have severely crippled the industry.

This analysis is hardly exhaustive, of course. But at least with respect to the storage issue, it seems clear that government intervention in the market is the primary cause of the food supply chain disruptions.

One of the interesting things about this story is how it highlights the plethora of people, items, and systems that work together to keep our grocery shelves full. First, we discovered that food producers rely on CO2. That led us to investigate fertilizer plants and the crazy natural gas market, and then from there we explored natural gas storage and learned about the many ways that government intervention has been crippling that industry. Of course, most people wouldn’t intuitively connect gas storage regulations with food availability, but the rippling unintended consequences of these policies are very real nonetheless.

In his famous essay “I, Pencil,” Leonard Read similarly draws attention to the “innumerable antecedents” of everyday items, such as the seemingly simple lead pencil.

“Just as you cannot trace your family tree back very far, so is it impossible for me to name and explain all my antecedents,” Read wrote, speaking as the pencil. He goes on to discuss some of the many ancestors of the pencil, the people and things that went into producing it, and he points out how they all depend on one another. Indeed, you can’t mess with the trucking industry without impacting the production of pencils, just as you can’t mess with natural gas storage without impacting food supplies.

With that said, trucking and natural gas are not only ancestors of pencils and food. They are also ancestors of many other products, and this leads to an important insight. In reality, it’s actually somewhat misleading to speak of supply chains, as if the economy consisted of independent, linear processes. The economy is much more accurately characterized as one giant supply web, a multiplicity of interconnected processes that all depend on each other in various ways.

With this in mind, it quickly becomes apparent why interfering with the economy can be so dangerous. When the government breaks one part of the web, they aren’t just impacting one chain, they are creating countless unintended consequences, many of which are impossible to foresee.

If we’re lucky, those consequences will only lead to higher prices. If we’re not so lucky, empty grocery shelves await.

To address the looming crisis, the UK government ended up bailing out CF Industries, the company that owns the fertilizer plants. The deal, which was finalized on September 21, resulted in one of the two plants resuming operations, with the UK government providing “limited financial support,” which the Environment Secretary later clarified was “going to be into many millions, possibly the tens of millions [of euros].”

Since then, the government has brokered a deal between CF Industries and its CO2 buyers. Though the details are unclear, the government seems to be involved in setting the price of CO2, which would constitute even more intervention in the market.

But intervention is not the solution here. When governments intervene, they inevitably distort price signals, leading to increasingly inefficient outcomes. The real solution is for the government to stop causing the problem in the first place by removing the taxes and regulations that are standing in the way of the natural gas storage market.

Granted, it will take some time before the storage market can adjust, but even in the interim, the best way to address these problems is to let markets and prices do their thing.

COLUMN BY

Patrick Carroll

Patrick Carroll has a degree in Chemical Engineering from the University of Waterloo and is an Editorial Fellow at the Foundation for Economic Education.

EDITORS NOTE: This FEE column is republished with permission. ©All rights reserved.

The AEI Housing Center’s Critique of “How We Investigated Racial Disparities in Federal Mortgage Data”

The Rest of the Story

The American Enterprise Institute’s Housing Center released a special briefing: “The Rest of the Story: The AEI Housing Center’s Critique of ‘How We Investigated Racial Disparities in Federal Mortgage Data.’” The briefing call reports on the Housing Center’s analysis and critique of a widely circulated report by The Markup/Associated Press, “How We Investigated Racial Disparities in Federal Mortgage Data” (2018).

Key takeaways

  • A recent analysis by The Markup/Associated Press (AP) found that decline rates, after controlling for 17 independent variables, are higher for the protected classes than for Whites.
  • However, as we have pointed out for home valuations and appraiser bias*, these arguments alleging systemic racism do not hold up to close scrutiny.
  • The Markup’s analysis did not include applicant credit scores, which are highly predictive of defaults. It also ignores lending outcomes – the other half of the story.
  • We address these issues by incorporating credit scores and evaluating risk-adjusted default rates by race and ethnicity. This allows us to evaluate lending outcomes, not just lending inputs.
  • We find that risk-adjusted default rates are higher for protected classes than for Whites by a statistically significant amount.
  • Given data limitations on denials, especially lack of credit score, it is impossible to calculate risk-adjusted decline rates for protected and non-protected class applicants. However, because we found that loans to protected class borrowers have higher risk-adjusted default rates than for Whites, this indicates lenders are extending more lenient underwriting to protected class borrowers than would otherwise be justified based on risk characteristics. Thus, one may infer that risk-adjusted decline rates, if calculable, would be lower for protected class applicants than for Whites, the opposite of the finding by The Markup/AP.

If you would like to receive invitations to our monthly update calls, please subscribe here. For data on mortgage risk, please use our Mortgage Risk Index Interactive.

“Restaurant Recession” Hits NYC Following $15 Minimum Wage

This will be a rough year for full-service NYC restaurants as they try to navigate a future with significant economic headwinds and significantly higher labor costs from the city’s $15 an hour minimum wage.

An article in the New York Eater (“Restaurateurs Are Scrambling to Cut Service and Raise Prices After Minimum Wage Hike“) highlights some of the suffering New York City’s full-service restaurants are experiencing following the December 31, 2018 hike in the city’s minimum wage to $15 an hour, which is 15.4% higher than the $13 minimum wage a year earlier and 36.4% higher than the $11 an hour two years ago. For example, Rosa Mexicana operates four restaurants in Manhattan and estimates the $15 mandated wage will increase their labor costs by $600,000 this year. Here’s a slice:
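The percentage figures quoted above follow directly from the statutory wage levels, and are easy to verify:

```python
# NYC minimum wage steps cited in the article.
wage_2019, wage_2018, wage_2017 = 15.0, 13.0, 11.0

vs_2018 = (wage_2019 - wage_2018) / wage_2018 * 100
vs_2017 = (wage_2019 - wage_2017) / wage_2017 * 100

print(f"{vs_2018:.1f}% higher than $13")  # prints "15.4% higher than $13"
print(f"{vs_2017:.1f}% higher than $11")  # prints "36.4% higher than $11"
```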

Now, across the city, restaurant owners and operators are reworking their budgets and operations to come up with those extra funds. Some restaurants, like Rosa Mexicano, are changing scheduling. Other restaurateurs are cutting hours and staffers, raising menu prices, and otherwise nixing costs wherever they can.

And though the new regulations are intended to benefit employees, some restaurateurs and staffers say that take home pay ends up being less due to fewer hours — or that employees face more work because there are fewer staffers per shift. “The bottom line is, we have to reduce the number of hours we spend,” says Chris Westcott, Rosa Mexicano’s president and CEO. “And unfortunately that means that, in many cases, employees are earning less even though they’re making more.”

In a survey conducted by the New York City Hospitality Alliance late last year, about 75% of the more than 300 respondents operating full-service restaurants reported they’ll reduce employee hours this year because of the new wage increases, while 47% said they’ll eliminate jobs in 2019.

Note also that the survey reported that “76.50% of respondents report reducing employee hours and 36.30% eliminated jobs in 2018 in response to mandated wage increases.” Those staff reductions are showing up in the BLS series for NYC full-service restaurant employment (see chart above): December 2018 restaurant jobs were down by almost 3,000 (1.64%) from the previous December, and the 2.5% annual decline in March 2018 was the worst annual decline since the sharp collapse in restaurant jobs following 9/11 in 2001.
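The wage-hike percentages cited at the top of this piece (15.4% and 36.4%) are easy to verify. Here's a minimal Python sketch using the article's wage levels:

```python
def pct_increase(old, new):
    """Percentage increase from an old wage level to a new one."""
    return (new - old) / old * 100

# $13 -> $15: the one-year hike effective December 31, 2018
print(round(pct_increase(13, 15), 1))  # 15.4

# $11 -> $15: the cumulative hike over two years
print(round(pct_increase(11, 15), 1))  # 36.4
```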

As the chart shows, it usually takes an economic recession to cause year-over-year job losses at NYC’s full-service restaurants, so it’s likely that this is a “restaurant recession” tied to the annual series of minimum wage hikes that brought the city’s minimum wage to $15 an hour at the end of last year. And the NYC restaurant recession is happening even as the national economy hums along in the 117th month of the second-longest economic expansion in history and just short of the 120-month record expansion from March 1991 to March 2001.

Here’s more of the article:

“There’s a lot of concern and anxiety happening within the city’s restaurant industry,” says Andrew Rigie, executive director of the restaurant advocacy group. Most restaurant owners want to pay employees more, he says, but are challenged by “the financial realities of running a restaurant in New York City.” Merelyn Bucio, a server at a restaurant in Soho that she declined to name, says her hours were cut and her workload increased when wage rates rose. Server assistants and bussers now work fewer shifts, so she and other servers take on side work like polishing silverware and glasses. “We have large sections, and there are large groups, so it’s more difficult,” she says. “You need your server assistant in order to give guests a better experience.”

At Lalito, a small restaurant in Chinatown, they used to roster two servers on the floor, but post wage increases, there’s only one, who is armed with a handheld POS (point of sale) system, according to co-owner Mateusz Lilpop. Having fewer people working was the only way for him to reduce costs, he says. Since the hike, labor costs at Lalito have risen about 10 percent — from 30 to 35 percent to 40 to 45 percent of sales, he says.

These changes get passed on to the diner, some restaurateurs argue. Service can suffer due to fewer people on the floor, or more and more restaurateurs will explore the fast-casual format over full-service ones. Some restaurants are also raising prices for customers. According to the NYC Hospitality Alliance’s survey, close to 90 percent of respondents expect to raise menu prices this year. Lalito’s menu prices have increased by 10 to 15 percent, Lilpop says, and it’s not just the cost of paying his staff driving prices up — it’s a ripple effect from New York-based food purveyors’ own labor cost increases.

“If you have a farmer that has employees that are picking fruit, he has to increase his labor costs, which means he has to increase his fruit prices,” Lilpop says. “I have to buy that fruit from him at a higher rate, and it goes down the chain.”

A few economic lessons here.

  1. A reduction in restaurant staffing that results in a decline in customer service (e.g., longer wait times, less attentive wait staff, etc.) is equivalent to a price increase for customers.
  2. The increase in the city minimum wage to $15 an hour, in addition to directly raising labor costs for restaurants, also raises the labor costs of the companies that supply food, liquor, restaurant supplies, menus, etc., causing a ripple effect of indirectly higher operating costs throughout the entire restaurant supply chain, as described above.
  3. Even for workers who keep their jobs, a higher minimum wage per hour doesn’t necessarily translate into higher weekly earnings, if the reduction in hours is greater than the increase in hourly wages. For example, 40 hours per week at $13 an hour generates higher weekly pre-tax earnings ($520) than 33 hours per week at the higher $15 an hour ($495).
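The arithmetic in point 3 is quick to check. Here's a minimal Python sketch; the 33-hour figure is the illustrative cut used above, not survey data:

```python
def weekly_pay(wage, hours):
    """Pre-tax weekly earnings at a given hourly wage."""
    return wage * hours

before = weekly_pay(13, 40)  # full 40-hour week at the old $13 wage
after = weekly_pay(15, 33)   # hours cut to 33 at the new $15 wage

# The higher hourly wage still nets a $25 weekly pay cut
print(before, after, after - before)  # 520 495 -25
```

The break-even point is 520 / 15 ≈ 34.7 hours: any cut below that, and the raise leaves the worker with less per week than before.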

Prediction: This will be a rough year for full-service NYC restaurants as they try to navigate a future with significant economic headwinds and significantly higher labor costs from the city’s $15 an hour minimum wage.

This article was reprinted from the American Enterprise Institute.

COLUMN BY

Mark J. Perry


Mark J. Perry is a scholar at the American Enterprise Institute and a professor of economics and finance at the University of Michigan’s Flint campus.

EDITOR’S NOTE: This FEE column with images is republished with permission. Image Credit: Wikimedia Commons | CC BY 2.0

The New York Times Explains Why the Minimum Wage Should Be $0.00

The minimum wage is the Jason Voorhees of economics. It just won’t die.

No matter how many jobs the minimum wage destroys, no matter how many times you debunk it, it always comes back to wreak more havoc.

We’ve covered the issues at length at FEE, and quite effectively, if I do say so myself. But I have to admit that one of the greatest takedowns of the minimum wage you’ll ever find comes from an unlikely place: The New York Times.

There are many reasons people and politicians find the minimum wage attractive, of course. But the Times, in an editorial entitled “The Right Minimum Wage: 0.00,” skillfully rebuts each of these reasons in turn.

Noting that the federal minimum wage has been frozen for some six years, the Times admits that it’s no wonder that organized labor is pressuring politicians to increase the federal minimum wage to raise the standard of living for poorer working Americans.

“No wonder. But still a mistake,” the Times explains. “There’s a virtual consensus among economists that the minimum wage is an idea whose time has passed.”

But why has the idea “passed”? Why would raising the minimum wage not help the working poor?

“Raising the minimum wage by a substantial amount would price working poor people out of the job market,” the editors explain.

But wouldn’t the minimum wage increase the purchasing power of low-income Americans? Wouldn’t a meaningful increase allow a single breadwinner to support a family of three and actually be above the official U.S. poverty line?

Ideally, yes. But there are unseen problems, as the editors point out:

There are catches…[A higher minimum wage] would increase employers’ incentives to evade the law, expanding the underground economy. More important, it would increase unemployment: Raise the legal minimum price of labor above the productivity of the least skilled workers and fewer will be hired.

But if that’s true, why would progressives support such a law? What’s their rationale for supporting a minimum wage if it does more harm than good? Is it sheer political opportunism?

Not necessarily. The Times explains:

A higher minimum would undoubtedly raise the living standard of the majority of low-wage workers who could keep their jobs. That gain, it is argued, would justify the sacrifice of the minority who became unemployable.

There’s just one problem with this logic, the editors say:

The argument isn’t convincing. Those at greatest risk from a higher minimum would be young, poor workers, who already face formidable barriers to getting and keeping jobs. The idea of using a minimum wage to overcome poverty is old, honorable – and fundamentally flawed. It’s time to put this hoary debate behind us, and find a better way to improve the lives of people who work very hard for very little.

It’s a compelling, reasoned, and erudite argument. But it’s not exactly what one expects to see in The New York Times these days. (A naughty person might say the same about reason and erudition in general in the paper.)

So what gives? Alas, the editorial is a relic. It was written way, way back in 1987. A lot has changed since then.

We’ve had a couple of wars. The internet was introduced to the masses. There was 9/11. We elected the nation’s first black president. The Cubs and Red Sox won the World Series. There was even a female reboot of Ghostbusters.

At least one thing, however, did not change. That would be the laws of economics. They hold as fast and true in 2018 as they did in 1987.

The Times’ editorial board might have changed. The perception of the minimum wage certainly changed. Relatively recent polls show seven out of ten Americans support raising the federal minimum wage. Several cities—Seattle, New York, and Minneapolis, among them—have passed laws that raised (or will soon raise) the minimum wage to $15 an hour.

So it’s safe to say the minimum wage laws have become more popular, no doubt in part from campaigns promoting them and an education system sympathetic to them. Still, economic laws do not change based on how popular humans find them. They remain true and constant whether they are popular or not.

In fact, some have observed that economic laws are inherently unpopular.

“In economics, the majority is always wrong,” John Kenneth Galbraith once allegedly quipped.

Now, there have been a lot of complaints directed at corporate media in recent years, but I believe in giving credit where credit is due. So let’s give the Times a hand.

The paper was right in 1987. And if politicians are genuinely interested in helping the poor, they’ll stick a stake in the heart of the minimum wage once and for all.

Jon Miltimore


Jonathan Miltimore is the Managing Editor of FEE.org. Serving previously as Director of Digital Media at Intellectual Takeout, Jon was responsible for daily editorial content, web strategy, and social media operations. Before that, he was the Senior Editor of The History Channel Magazine, Managing Editor at Scout.com, and general assignment reporter for the Panama City News Herald. Jon also served as an intern in the speechwriting department under George W. Bush.

EDITOR’S NOTE: The featured image is provided by FEE and is republished with permission.