Friday, August 20, 2021

Mises Wire

Bretton Woods and the Spoliation of Europe

Posted: 19 Aug 2021 12:00 PM PDT

Having marked the quinquagenary of the destruction of the gold standard on Sunday, August 15, it is natural to be a little nostalgic for the Bretton Woods system. After all, it might not have been the classical gold standard, but at least it wasn't as bad as the fiat standard that succeeded it. As sites such as wtfhappenedin1971.com document, that year does indeed look to be a turning point in the economic history of the West. However, the suspension of the convertibility of dollars into gold was simply the logical outcome of the system. The PhD standard was not an arrangement that emerged by default in 1971, as governments tried desperately to patch up the international monetary system before its final breakdown in 1973; Bretton Woods itself embodied the original ideal of government controlling and carefully managing monetary affairs scientifically.

Bretton Woods and Managed Money

The social engineering was on full display when delegations from forty-four nations came to the resort town of Bretton Woods, New Hampshire, in July 1944 to agree on how the international monetary system should be set up after the world war. In reality, however, the system was an American diktat; while the Europeans were not completely without influence, they were all either bankrupt or still occupied by German soldiers, whereas the Americans were now creditors to the world and sitting on a pile of gold that had flowed into the country in the 1930s and during the war. As Benn Steil documents in his account of the conference, the system agreed upon was essentially that proposed by Harry Dexter White, the creature of Henry Morgenthau.1 The (more) inflationary plan proposed by John Maynard Keynes was never really a serious alternative, although Keynes's arrogant and insufferable behavior at the conference probably didn't do the British any favors.

The system was essentially one of managed, convertible currencies. The currencies of the participating countries were to be convertible into dollars at a fixed rate, and foreign central banks could redeem dollar claims into gold at the fixed rate of $35 per ounce. Thus, a gigantic inflation machine was created: the Americans could increase the supply of dollars with little restraint, since foreign central banks would then use dollar reserves as the basis for their own expansion of the domestic money supply. The watchword for all these deliberations and negotiations was "liquidity," as the emerging Keynesian orthodoxy lived in mortal fear of a lack of liquidity. Of course, what this meant in reality was that the more inflationary countries wanted someone else to finance the inevitable balance of payments deficits. The Keynes Plan's only substantial difference from the White Plan was the British wish for more liquidity to finance their balance of payments through Keynes's International Clearing Union scheme, which would have funded balance of payments deficits and issued its own paper currency, the "bancor"—in other words, a scheme to make the foreigners (i.e., the Americans) bear the burden of British inflation.
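
To make the inflation machine concrete, here is a minimal Python sketch of the pyramiding the system permitted. Only the $35-per-ounce redemption rate comes from the text above; the reserve ratio and the size of the dollar outflow are invented round numbers for illustration, not historical figures.

```python
# Stylized arithmetic of the Bretton Woods "inflation machine."
# Only the $35/oz redemption rate is from the system itself; the
# reserve ratio and dollar flow below are hypothetical round numbers.

GOLD_PRICE = 35.0        # dollars per ounce, the fixed redemption rate
RESERVE_RATIO = 0.10     # assumed reserve ratio of a foreign central bank

# Suppose the US runs a payments deficit and $1 billion flows abroad.
new_dollar_reserves = 1_000_000_000

# Foreign central banks treat those dollars as "good as gold" reserves
# and can expand their domestic money supplies on top of them.
foreign_expansion = new_dollar_reserves / RESERVE_RATIO

# The gold nominally backing it all has not grown: the same ounces now
# stand behind both the new dollars and the foreign money built on them.
gold_redeemable_oz = new_dollar_reserves / GOLD_PRICE

print(f"New dollar reserves abroad:      ${new_dollar_reserves:,.0f}")
print(f"Potential foreign money created: ${foreign_expansion:,.0f}")
print(f"Gold redeemable against them:    {gold_redeemable_oz:,.0f} oz")
```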

The Bretton Woods system began operating in 1945, when the International Monetary Fund (IMF) was set up. American officials were generally in favor of a return to freer trade and financial flows, managed through the new international institutions, of course. Yet while Bretton Woods was officially in operation, there was not much trade going on. Understandable, perhaps, when Europe was only emerging from the war in 1945, but as the forties dragged on, the revival of international commerce still seemed distant. This is partly explained by the lingering command economies in Europe. The social democrats now in power were loath to abolish the totalitarian economic controls they had inherited from the Nazi occupiers (or had often enough themselves first instituted in the 1930s), thus stifling any sound economic development. Most famously, German economic revival began in 1948 with the practically overnight abolition of price controls.

Dollar "Shortage" and the European Payments Union

Problems in international trade persisted, however, as the Europeans simply had no dollars with which to buy American goods. There was a dollar shortage across Europe, and the new professional economists wrote volumes upon volumes of commentary on this. The problem of a lack of liquidity seemed bound to persist. At least partially, the Marshall Plan was an attempt to give a boost to trade: if the Europeans didn't have the money to buy American goods, the Americans would have to lend it to them.

This was a complete misdiagnosis. In international trade as elsewhere, to appropriate a line from Jean-Baptiste Say, commodities exchange for commodities. The general poverty of Europe was not, as such, the cause of the dollar shortage; rebuilding production capabilities would simply take time. Why, then, was there a lack of dollars to carry on trade? Simply because the central planners at Bretton Woods and in the new planning bureaucracies had blundered: European exchange rates were fixed at too high a level vis-à-vis the dollar. Europe was effectively starved of dollars because European currencies were overvalued, and thus European commodities could not exchange for American commodities, as the medium of exchange was lacking. Only foreign aid remained, as there was no other way for Europeans to acquire the needed dollars.
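
A stylized numeric example may make the mechanism clearer; the exchange rates and prices below are hypothetical round figures, not historical data, chosen only to illustrate how an overvalued peg chokes off the flow of dollars.

```python
# Hypothetical illustration of an overvalued fixed exchange rate.
# All numbers are invented round figures, not historical rates.

market_rate = 500     # francs per dollar that would clear the market
official_rate = 250   # fixed rate: the franc pegged at twice its value

good_price_francs = 10_000  # price of a French export good at home

# What the good costs an American buyer at each rate.
dollar_price_official = good_price_francs / official_rate  # $40
dollar_price_market = good_price_francs / market_rate      # $20

# At the official rate, French goods cost Americans double the
# market-clearing price, so exports cannot earn the dollars needed
# to buy American goods: the "dollar shortage."
print(f"Dollar price at official rate: ${dollar_price_official:.0f}")
print(f"Dollar price at market rate:   ${dollar_price_market:.0f}")
```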

Intra-European trade revived more quickly, as the European Payments Union (EPU; 1950–58) allowed the participating countries to clear claims and counterclaims against each other. Every country kept a balance with the EPU in "European Units of Account," defined as 0.888671 grams of gold, the supposed gold content of one dollar. Every month, the EPU would clear claims and counterclaims and each country would have to settle the net claim on it with the EPU (or receive a credit to its account, as the case may have been). The monthly clearing ensured that the EPU did not devolve into the liquidity machine the Keynesians dreamed of, although modest sums could be borrowed to offset short-term fluctuations in the balance of payments. European currencies were still not convertible into dollars, however, and the Bretton Woods system was thus in reality still not in operation, but this changed in 1958. Ironically, the author of the reforms that finally made the system work was the French economist Jacques Rueff, who would soon become its harshest critic.
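
The monthly netting can be sketched in a few lines of Python. The countries and amounts below are hypothetical; only the gold definition of the unit of account (0.888671 grams) comes from the text above.

```python
# Minimal sketch of EPU-style monthly multilateral clearing.
# Countries and amounts are hypothetical; balances are kept in
# "European Units of Account" (1 unit = 0.888671 g of gold).

# claims[a][b] = amount country a owes country b at month's end
claims = {
    "France":  {"Germany": 120, "Italy": 30},
    "Germany": {"France": 50, "Italy": 80},
    "Italy":   {"France": 40, "Germany": 20},
}

# Net every bilateral claim and counterclaim down to a single
# position per country against the union itself.
net = {country: 0 for country in claims}
for debtor, creditors in claims.items():
    for creditor, amount in creditors.items():
        net[debtor] -= amount
        net[creditor] += amount

for country, position in net.items():
    if position < 0:
        print(f"{country} settles {-position} units with the EPU")
    else:
        print(f"{country} receives a credit of {position} units")
```

Because each country ends the month with a single net position against the union rather than a web of bilateral debts, the sums actually settled stay small relative to gross trade, which is why the monthly clearing did not require the vast "liquidity" the Keynesians insisted on.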

The Short Life of Bretton Woods, 1958–71

President Charles de Gaulle had charged his finance minister with balancing the budget, and the finance minister in turn set up a commission headed by Rueff that proposed a more comprehensive plan to restore France to economic health.2 The details of the Rueff Plan do not concern us here. What matters is that the French franc was devalued as part of the reforms. Since the exchange rate was fixed at too high a level, either devaluation or severe deflation was necessary; and since domestic prices reflected the true value of the franc, the only sensible policy was devaluation. With one stroke, in December 1958, Rueff removed the main remaining obstacle to convertibility (the UK had already devalued the pound in 1949, while the German mark, hard money compared to the dollar and the pound, was periodically revalued). The Bretton Woods system could finally function, a full fourteen years after the conference.

No sooner had it begun functioning than the system proved unsustainable. Dollar claims that financed American imports now began piling up in Europe. Under the classical gold standard, the net outflow of dollars to Europe would have caused a contraction in the American monetary base and in the credit superstructure, but in the Bretton Woods system, foreign central banks were supposed to consider dollars and dollar claims (i.e., US Treasurys) as good as gold. Whether by design or accident, the Bretton Woods inflation machine was now in motion.

Jacques Rueff began attacking the system in the early 1960s. American expansion was clearly unsustainable, as European central banks soon began redeeming their dollar claims. Political pressure was not enough to keep dollars in Europe. Yet the consequences of the system were not limited to a drain of American gold that could empty Fort Knox within a decade or two. Redemption was simply the last escape the Europeans had. They had been tied into what Rueff himself—a great friend and admirer of the US—called an "unprecedented system of spoliation."3 Forced to hold depreciating dollar assets, the creditor nations—France and Germany being the most prominent—were made to subsidize American asset purchases in Europe and to partially finance the prestige projects and foreign policy of the American government. The international monetary system agreed on at Bretton Woods proved to be simply a means of control and exploitation, no less real for being invisible to or misunderstood by most Europeans.

The true significance of the closing of the gold window in 1971 now emerges. Gold had been included in the plans for the system because of the veneer of solidity it gave, and because most planners, Harry White among them, could not yet conceive of a monetary order not based on gold. But now the American government and the special interests waxing fat off American inflation and the spoliation of Europe were faced with a stark choice: defend the remaining gold reserves by renouncing inflationary policy or abandon the role of gold entirely. From the point of view of shortsighted politicos controlled by special interests, that was an easy choice to make.

The end of gold did not mean the end of Bretton Woods, however: the Smithsonian Agreement of December 1971 attempted to institute a system of fixed exchange rates with no anchor in gold at all. This only lasted until February 1973, when the Bretton Woods system was abandoned de facto. Bureaucratic institutions being what they are, this was not officially ratified until January 1976, when the IMF decreed that the price of gold was allowed to float, recognizing the state of affairs existing since 1971.

Yet the dollar standard is still the basis of the international monetary system, and dollar claims are the principal component of international reserves. Far from ending the dollar standard, closing the gold window opened the way for continued spoliation. Americans can buy assets and goods from the rest of the world without offering anything real in return. They do not have to produce in order to exchange; they can simply borrow. This is true in general and partly explains the high standard of living enjoyed by Americans, but the benefits accrue principally to the financial and political elites at the center of the system. The dollar standard keeps US government debt in high demand, as it is the principal reserve asset of foreign central banks, and US financial centers prosper from the inflow of foreign capital. The much-touted "depth" of American capital markets thus has a ready explanation.

Conclusion

There is nothing inevitable or necessary about the dollar standard. Nothing stood in the way of reestablishing a proper gold standard at the end of World War II. In the 1950s, Ludwig von Mises explained that each country simply had to abstain from inflationary policy and make its currency convertible into gold at the market rate of exchange.4 This would have prevented not only supposed dollar shortages and balance of payments problems but also the system of spoliation and the social and economic deformations that go with it.5 Even the apparent beneficiaries of the system, the American people, would be better off in the long run under a sound international monetary order.

  • 1. Benn Steil, The Battle of Bretton Woods: John Maynard Keynes, Harry Dexter White, and the Making of a New World Order (Princeton, NJ: Princeton University Press, 2013).
  • 2. Christopher S. Chivvis, The Monetary Conservative: Jacques Rueff and Twentieth Century Free Market Thought (DeKalb, IL: Northern Illinois University Press, 2010).
  • 3. Jacques Rueff, The Monetary Sin of the West (New York: Macmillan, 1972), p. 191.
  • 4. Ludwig von Mises, The Theory of Money and Credit (New Haven, CT: Yale University Press, 1953), part 4, "Monetary Reconstruction." Part 4 was written in 1952 for the new edition.
  • 5. I describe some of these in my article: Kristoffer Hansen, "The Populist Case for the Gold Standard," Journal of Libertarian Studies 24, no. 2 (2020): 323–61.

Kabul's Collapse and DC's Incurable Arrogance

Posted: 19 Aug 2021 11:00 AM PDT

There is no reason to expect the Afghanistan debacle to humble Washington policymakers. Korean War fiascos were swept under the rug, paving the way for the Vietnam War. The cycle didn't end there.

Original Article: "Kabul's Collapse and DC's Incurable Arrogance"

This Audio Mises Wire is generously sponsored by Christopher Condon. Narrated by Michael Stack.

Welfare Payments and Foreign Policy Fears Are the Only Things Holding America Together

Posted: 19 Aug 2021 10:00 AM PDT

Fear of China and Iran, combined with the more practical desire for continued "free" money from the federal government, will continue to fuel opposition to any serious movement toward secession.

Original Article: "Welfare Payments and Foreign Policy Fears Are the Only Things Holding America Together"

This Audio Mises Wire is generously sponsored by Christopher Condon. Narrated by Michael Stack.

The Secret Ronald Reagan Told Me about Gold and Great Nations

Posted: 19 Aug 2021 09:15 AM PDT

Today [August 15] marks 50 years since President Richard Nixon closed the "gold window," ending the ability of foreign governments to exchange United States dollars for gold. Nixon's action severed the last link between the dollar and gold, giving the U.S. a fiat currency.

America's experiment with fiat has led to an explosion of consumer, business, and—especially—government debt. It has also caused increasing economic inequality, a boom-bubble-bust business cycle, and a continued erosion of the dollar's value.

Nixon's closure of the gold window motivated me to run for office. Having read the works of the leading Austrian economists, such as Ludwig von Mises and Murray Rothbard, I understood the dangers of abandoning gold for a fiat currency and wanted a platform to spread these ideas.

When I first entered public life, support for restoring a gold standard, much less abolishing the Fed, was limited to so-called "gold bugs" and the then tiny libertarian movement. Even many economists who normally supported free markets believed the fiat system could be made to work if the Federal Reserve were forced to follow rules.

These rules were supposed to provide the Fed with clear guidance as to when to increase or decrease the money supply. This may sound good in theory, but a "rules-based monetary system" still allows the Federal Reserve to manipulate interest rates, which are the price of money, causing artificial booms and very real busts.

The stagflation of the Carter era did increase interest in monetary policy, and the rise of the "supply-siders," who supported a limited role for gold, drew further attention to the issue.

Ronald Reagan once told me that no nation has abandoned gold and remained great. As president, he supported the creation of the Gold Commission. However, he did not stop the establishment from stacking the commission with defenders of the monetary status quo.

The commission's two pro-gold members, Lewis Lehrman and myself, produced a minority report, written with the aid of Murray Rothbard, making the case for a gold standard. The report was published as The Case for Gold; it can be downloaded at Mises.org.

By the mid-1980s, any interest among the political and financial elites in questioning the Fed's power had disappeared. This was due to acceptance of the myth that Paul Volcker tamed inflation. In the 1990s, a virtual cult of personality arose around the "Maestro" Alan Greenspan, who once told me that the Fed had learned how to "replicate" the results of a gold-backed currency.

While my warnings that the Fed was leading the American economy over a cliff were dismissed in Washington, they found a receptive audience outside the Beltway. The response to my 2008 presidential campaign led to the birth of a new liberty movement that put monetary policy front and center.

The 2008 meltdown, big bank bailouts, and the Fed's subsequent failure to reignite the economy despite unprecedented money creation fueled the growth of the new movement. My Campaign for Liberty organization mobilized the new liberty movement to make Audit the Fed a major issue in Congress.

Fifty years after Nixon closed the gold window, prices are heading toward 1970s-era increases. Yet the Fed cannot raise interest rates as long as the politicians keep piling up billions in new debt.

It is clear that America is heading toward another Federal Reserve–created economic crisis. The good news is the impending crisis gives us an opportunity to spread our message, grow our movement, and finally force Congress to audit and end the Fed.

Originally published by the New York Sun.

WaPo Editors: "Liberty" Requires Us to Implement Vaccine Passports

Posted: 19 Aug 2021 09:00 AM PDT

Requiring private and government employees to be immunized against covid-19 and mandating the use of standardized electronic passes as proof of immunization across the nation is what liberty is made of, the editors of the Washington Post argued last week.

State governors such as Florida governor Ron DeSantis (R), who are blocking or attempting to block "government agencies, local businesses or both from mandating vaccination," are engaged in "efforts that fly in the face of the values of liberty that their proponents purport to defend," the editors added. 

"The highly transmissible delta variant of the coronavirus has ushered in mask mandates in some places, but vaccination remains the key to containing the pandemic once and for all," the editors wrote. But to ensure we can all trust those who claim to be vaccinated, they added, states should be "developing a smartphone-compatible certificate that's easily downloadable and easily scannable." 

With this standardized approach to the vaccine mandate, they argued, Americans who are reluctant to get the jab would be forced to think differently. "At the least, enabling vaccine requirements will help organizations keep their spaces safer. At best, they also could inspire some holdouts to get the shot at long last."

But if "safety" is so important to these editors, shouldn't we also consider the safety of medical treatments (i.e., vaccines) themselves? Moreover, shouldn't we consider the ways that providers of vaccines can be held accountable when their vaccines do harm? 

That discussion, apparently, is not on the table. I have yet to see a proponent of covid-19 vaccine mandates who talks about the vaccine industry's immunity under federal law and how the current vaccination campaign is just a continuation of that scheme.

Ronald Reagan's Socialized Medicine 

The covid-19 vaccine isn't the first inoculation program in US history to be both financially backed by the government and immune from legal accountability.

Thanks to President Ronald Reagan's National Childhood Vaccine Injury Act (NCVIA) of 1986, vaccine makers are able to develop vaccines (many of which are produced using unethical methods, such as using cells taken from aborted fetal tissue), deliberately mislead patients and health officials with false efficacy claims, and go on doing so unabated even after countless victims come forward saying they have been injured, sometimes for life, by their products.

Due to the 1986 law, these victims don't get the chance to have their cases heard by a jury of their peers. Instead, their cases must necessarily be funneled through the National Vaccine Injury Compensation Program (VICP), which was created in 1988 after the NCVIA was signed into law.

The VICP is in place to shield manufacturers from liability related to their vaccine products, as explained by the AMA Journal of Ethics:

The act establishes a special court program for vaccine injury claims that caps damages and allows for the injured party to be compensated without having to prove that the maker committed any wrongdoing. (emphasis added)

Since its inception, the VICP has paid out about $4.6 billion in settlements. But while the VICP is funded by an excise tax on each vaccine purchased, it is run by the US government. 

Considering that pharmaceutical companies were threatening to give up on producing vaccines because of the expensive injury-related court battles prior to 1986, and that they remain unwilling to stand behind their products' safety to this day, it is clear that these manufacturers would likely not have managed to stay in business had they been made to function in a market unprotected by the federal government. It is in this context that the covid-19 vaccines exist.

Because currently the covid vaccines do not have full Food and Drug Administration (FDA) authorization, injury claims must be funneled through a different but similar program, the Countermeasures Injury Compensation Program (CICP), run by the Health and Human Services Department. But it is only a matter of time before the vaccine "courts" take over. 

With record-breaking numbers of adverse reactions reported to the Centers for Disease Control and Prevention and growing concerns regarding the covid vaccines' effectiveness, paper pushers are promising that more mandates will come once the FDA grants the vaccine manufacturers full approval. Considering that all the other vaccines currently in regular use across the nation were given the same FDA approval and yet remain immune from legal accountability, why should we trust whatever the health czars have to say?

Paranoia about American Weakness Rests on a Flawed Understanding of History

Posted: 19 Aug 2021 04:00 AM PDT

For some members of the American foreign policy establishment, the world is permanently stuck in 1938 Munich. Ever since the Munich agreement and the strategy of "appeasement" failed to prevent the outbreak of the Second World War, some scholars have concluded that the only thing standing between global peace and complete chaos is the projection of American military strength around the world. The summer edition of the Hoover Digest features an article by Hoover Institution senior fellow Victor Davis Hanson based entirely on this specious canard, which is still somehow taken seriously as the North Star of US foreign policy.

The crux of this way of thinking hinges upon a certain understanding of how wars start. Wars start, Hanson argues, because "innately aggressive cultures and governments, megalomania, and the desire for power, resources and empire prompt nations to bully or attack others." Almost as an afterthought, he tacks on that "perceptions of self-interest are not to be discounted either." But such perceptions are not to be found in the rest of Hanson's analysis. Rather, Hanson paints a picture of a world full of fundamentally "bad guys" who have little more depth than a clichéd comic-book supervillain.

Hanson cites the examples of Nazi Germany and imperial Japan in World War II as examples of dastardly powers bent on conquest that believed they could get away with it due to perceived weakness on the part of their adversaries. What else could explain powers that were dwarfed by the combined economic and military might of the US, the UK, and the USSR making a run for hegemony?

Well, actually, there are lots of things that explain why those states did what they did, and American weakness is certainly not among them.

Take Japan, for example. Hanson would have us believe that the Japanese were simply too "deluded" to listen to dissenting voices that warned that war with the United States would be futile and just plunged in thinking that "American isolationism during the 1930s" was "proof of weakness and timidity." Had America been "stronger" and not "projected weakness," then of course they wouldn't have dreamed of attacking us!

However, this way of thinking flies in the face of history. In his classic work The Tragedy of Great Power Politics, John Mearsheimer argues that the Japanese were extremely rational in their decision-making that led up to attacking the US, and that it was the subject of a great deal of debate and discussion within the Japanese government.

Hanson is ostensibly a historian, but he simply attributes the Japanese drive for regional hegemony to their being "innately aggressive." On the contrary, Japanese expansion was quite rational from Japan's point of view. Mearsheimer cites historian E.H. Norman, who concluded "that all the lessons of history 'warned the Meiji statesmen that there was to be no half-way house between the status of subject nation and that of a growing and victorious empire.'" Mearsheimer also points to the testimony of Japanese general Ishiwara Kanji, who stated during his war crimes trial that when Japan opened its doors (or rather had them forced open by US commodore Matthew C. Perry), "it learned that all those countries were a fearfully aggressive lot. And so for its own defense it took your country as its teacher and set about learning to be aggressive. You might say that we became your disciples. Why don't you subpoena Perry from the other world and try him as a war criminal?"

Hanson doesn't even get American history in the lead-up to World War II correct. Far from being "isolationist" in East Asia, the United States was very invested in containing Japan's expansion and diplomatically intervened numerous times, which culminated in an economic embargo against Japan in the summer of 1941 in an attempt to keep Japan from joining the German invasion of the USSR. In Mearsheimer's words, "the embargo left Japan with two terrible choices: cave in to American pressure and accept a significant diminution of its power, or go to war against the United States, even though an American victory was widely agreed to be the likely outcome." 

Rather than American "weakness," it was America's strength that allowed the government to corner imperial Japan, which in turn prompted Japan's risky gamble to fight it out.

This history is important to keep in mind when members of the foreign policy establishment, like Hanson, whine and complain that the US is "losing credibility" or "projecting weakness" for not continuing the forever war in Afghanistan or not being sufficiently hawkish on China or Iran. Their cartoonish conception of how foreign states act is not supported by history, and it contributes to the US government's insane defense expenditures and destructive crusades around the globe.
