Post by LWPD on Apr 29, 2012 9:23:18 GMT -5
Chris Whalen (Tangent Capital) recently penned an outstanding piece on the historic role of the U.S. dollar. The amount of outstanding dollar-denominated debt, relative to how little real capital in the world backs that debt, is astounding. Recent lessons gleaned from Europe show that nations (e.g., Greece and Spain) can reach a point where no policy action can avert being frozen out of the international credit markets. Policies that cut spending end up contracting growth, which further expands existing deficits and debts on the back end, making the attempt pointless. Policies of going further into debt in an attempt to stimulate growth expand existing debt on the front end, leading to higher immediate market interest rates (due to a lack of creditor confidence) and artificial, temporary gains that evaporate once the debt spending that supports them is removed. At that point, absent default, there is no way out of the debt trap. The only thing that keeps the music going is outside entities (e.g., the ECB and IMF) acting as lenders of last resort, even though they themselves issue debt to fund the debt of these countries.
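The debt-trap feedback loop described above can be sketched as a toy simulation. To be clear, this is not from Whalen's article, and every parameter (the deficit share, interest rate, spending cut, and fiscal multiplier) is invented purely for illustration:

```python
# Toy debt-trap model (illustrative only; all parameters are invented).
# Austerity cuts spending by `cut` of GDP, but a fiscal multiplier shrinks
# GDP in response, so the debt/GDP ratio can end up *higher* after the cuts.

def debt_to_gdp(debt, gdp, years, deficit_share, interest, cut=0.0, multiplier=1.5):
    """Return debt/GDP after `years` of a fixed fiscal stance.

    deficit_share: primary deficit as a share of GDP before any cuts
    interest:      average interest rate paid on the debt stock
    cut:           spending cut as a share of GDP
    multiplier:    GDP lost per unit of spending cut (assumed > 1)
    """
    for _ in range(years):
        gdp *= 1.0 - multiplier * cut        # austerity drags on output
        debt += (deficit_share - cut) * gdp  # smaller primary deficit...
        debt += interest * debt              # ...but interest still compounds
    return debt / gdp

no_cut = debt_to_gdp(debt=1.0, gdp=1.0, years=10,
                     deficit_share=0.05, interest=0.04)
austerity = debt_to_gdp(debt=1.0, gdp=1.0, years=10,
                        deficit_share=0.05, interest=0.04, cut=0.03)
# With these invented numbers, the austerity path ends with the worse ratio,
# because the denominator (GDP) shrinks faster than the numerator (debt).
```

Under these assumed numbers the cuts shrink GDP faster than they shrink the debt, so the debt ratio deteriorates either way, which is the "no way out" point made above.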
In fiscal year 2011, the total revenue of the U.S. government was $2.31 trillion; its total expenditures were $3.36 trillion. That is a shortfall of more than a trillion dollars. The special status of the U.S. dollar as the world's reserve currency is what enables America to spend as it does. How long this will last, and how effectively the country will adapt if it loses the advantages that allow for debt-financed consumption, remains to be seen.
Courtesy of The National Interest
U.S. Debt Culture and the Dollar's Fate
By Christopher Whalen
In our common narrative, the modern era of global finance—what we call the Old Order—begins with the Great Depression and New Deal of the 1930s. The economic model put in place by President Franklin D. Roosevelt and others at the end of World War II is seen as a political as well as economic break point. But arbitrarily selected demarcation points in any human timeline can be misleading. The purpose of narrative, after all, is to simplify the complex and, over time, to remake the past in today’s terms. As we approach any discussion of the Old Order, we must acknowledge that the image of intelligent design in public policy is largely an illusion.
There is no question that the world after 1950 was a reflection of the wants and needs of the United States, the victor in war and thus the designer of the peacetime system of commerce and finance that followed. Just as the Roman, Mongol and British empires did centuries earlier, America made the post–World War II peace in its own image. The U.S.-centric model enjoyed enormous success due to factors such as relatively low inflation, financial transactions that respect anonymity, an open court system and a relatively enlightened foreign policy—all unique attributes of the American system.
But the framework of the global financial system in the twentieth century and its U.S.-centric design were the end results of a series of terrible wars—starting, in the case of America, with the Civil War. The roots of the U.S.-centric financial order that arose at the end of World War II extend back into the nineteenth century and reflect the political response of a very young nation to acute problems of employment and economic growth—problems that remain unresolved today.
From an American perspective, the modern era of what we describe as the global financial system based upon the U.S. dollar begins with Abraham Lincoln, the great emancipator who took office in March 1861 as the American republic stood on the verge of dissolution. In those days, “money,” as understood by Americans, comprised gold and silver coins, foreign currency and notes issued by state-chartered banks that were convertible into metal, in that order of qualitative ranking.
The state-chartered banks of that era relied upon gold coin or specie as a store of value and means of exchange with other banks. Going back to Andrew Jackson’s epic campaign to extinguish the Second Bank of the United States in the 1830s, state-chartered banks large and small were suspicious of Washington and would not finance Lincoln’s war. Bankers in New York, Boston and London, for example, would have been happy to see the North and South separate without a war and the continuation of slavery so as not to disturb the cotton trade.
Lincoln tasked Secretary of the Treasury Salmon P. Chase to sell Treasury bonds and had Congress create national banks to buy the debt. The Treasury Department suspended convertibility and issued large quantities of “greenbacks” in the form of paper dollars to pay for the immediate cost of fighting the war. By the end of the Civil War, the greenback traded down to a fifth of face value when measured in gold. Yet Lincoln won the war, even in death, because the Union outspent the Confederacy using the credit afforded by paper money created by government fiat. And the Civil War set the precedent for Washington to engage in massive currency inflation in times of exigency and also to develop by fiat new platforms for creating financial leverage to meet national needs. As Wall Street speculator Ray Dalio wrote a century and a half later, “Virtually all of what they call money is credit (i.e., promises to deliver money) rather than money itself.” Lincoln used that fact to win the Civil War.
Even without a central bank, from the end of the Civil War to the start of World War I in 1914, the United States saw powerful economic growth. So strong was demand for a means of exchange that the much-abused greenback dollar traded back to par value against gold by the time President Ulysses S. Grant officially restored convertibility at the end of his term. The public remained highly skeptical of paper money or other promissory notes, with good reason. As Mark Twain immortalized in Pudd’nhead Wilson, Roxy lost the savings accumulated from eight years of labor as a riverboat servant when her bank failed. Roxy concluded that “hard work and economy” were insufficient to make her secure and independent.
By the turn of the twentieth century, many Americans had adopted a view similar to Roxy’s, which differed significantly from the rugged, self-reliant individualism of American pioneer mythology. Decades of financial crises tempered the independent, hard-money views of Americans. Growing urban populations worried about jobs and opportunity, while farmers, businesses and even conservative state-chartered banks fretted about access to credit. The solution that emerged was not the free market but increasingly the collective credit of the federal government in Washington.
By 1913, when the banking industry and progressive forces in Congress created the Federal Reserve System, America had been through several more financial and economic crashes, leaving bankers even more disposed to a government-sponsored entity (GSE) rescuing them from the harsh discipline of “market forces,” to recall the words of South Carolina’s Democratic senator Ernest Hollings. The private clearinghouse system developed in major U.S. cities during the nineteenth century was inadequate to provide liquidity in times of market upheaval. Thus twelve Federal Reserve banks were created to support the liquidity needs of banks and, in a big-picture sense, provide another layer of financial leverage on top of national banks to support the funding needs of the U.S. economy.
Yet, even after the Fed’s establishment, the U.S. economy continued to labor under the weight of deflation and slack demand. In that sense, the First World War was the first true watershed for America’s economic narrative; it forced Americans to look outward for growth and financial interaction with foreign states as a single nation. Public sentiment was split between sympathy for the British, French and Belgian forces, on the one hand, and for the Central powers led by Germany on the other. But all Americans welcomed the vast demand for goods and services as a relief from years of price deflation on the farm and slack job growth in urban markets driven by the adoption of new technology and imports from Europe. Allan Nevins and Henry Steele Commager wrote, “Economic considerations re-enforced sentimental and political ones.”
The gradual American economic mobilization to support the Allies in World War I not only marked a growing willingness of Americans to engage in foreign intervention overseas but also saw a vast transfer of wealth from Europe to the former colonies as a large portion of the Continent’s gold was sent to America to pay for the war. U.S. banks and eventually the Fed also provided financing for Allied purchases, which grew to a great torrent of raw materials and finished goods. So vast were the financial flows in the early days of World War I that J.P. Morgan could not manage the dollar-sterling transactions. At first, commercial paper issued by British banks could not be discounted with the Fed. The Federal Reserve Bank of New York stepped in, however, and effectively subsidized the British pound exchange rate when the Bank of England exhausted its gold reserves.
Victory bonds were sold widely to Americans to finance the war and manage domestic demand, thereby also socializing the idea of investing by individuals. A number of new GSEs were created to fund and manage the American war effort via the issuance of debt. Compared to the neoliberal orthodoxy of today, there was little fretting over market forces during World War I. Jobs and inflation were the top issues. Wage and price controls and other authoritarian mechanisms were employed by Washington without apology to limit cost increases because wages were constrained. The American farm sector recovered from years in the doldrums as global demand for cotton, grains and meat soared, pushing up domestic prices as well. By 1917, when the United States entered the war militarily on the Allied side, the American economy was running better than it had in many decades. The following year, however, when the debt available to the Allies dried up and exports to Europe slowed, the U.S. economy quickly faded as well. By 1919, the United States was entering a serious economic slump.
After World War I, America descended into a period of isolation and uneven economic circumstances presided over by the dominant Republican Party, which rejected U.S. involvement in world affairs and promptly raised tariffs to protect Americans from cheap imports. When Congress enacted Smoot-Hawley eight years later, the increase in tariffs was marginal compared with earlier increases in import taxes. But no amount of tariff protection could shield U.S. workers and industries from the impact of technological changes such as electricity and the growing use of machines.
The return to “normalcy” promised by President Warren Harding meant an environment where large corporations and banks prowled the economic landscape unhindered, and the federal government largely withdrew from the economy, compared with the policies of Teddy Roosevelt and Woodrow Wilson. The Fed played a relatively marginal role in the post–World War I period and did little to alleviate the economic stagnation that affected much of America’s rural areas. Urban workers had employment, but wages remained stagnant even as the concentration of wealth in the United States increased dramatically.
While a large part of the real economy suffered during the post–World War I period, speculation in real estate and on Wall Street grew through the 1920s. With it came financial fraud. The party ended, though, with the landmark 1925 Supreme Court decision written by Louis Brandeis in Benedict v. Ratner, which set a new standard for collateralized borrowing. The Brandeis decision, which ruled that the failure to specify the collateral was “fraud on its face,” arguably helped cause the great crash of 1929 because it effectively shut down the Wall Street sausage machine, cutting liquidity to the market.
The great Wall Street crash of 1929 completed the process of speculative boom and bust that made the market collapses and currency crises of the previous half century pale by comparison. John Kenneth Galbraith noted in The Great Crash of 1929 that Americans displayed “an inordinate desire to get rich quickly with a minimum of physical effort.” As a chronicler of the Great Depression, Galbraith describes the run-up to the Wall Street crash, including the real-estate mania in Florida in the mid-1920s. Few today recall that the precursor to the Great Depression was a real-estate bubble in the mid-1920s, an eerie parallel to the real-estate boom and bust of the 2000s. But in each case, it was the supply of credit in the form of debt that drove the boom and eventual bust in the economy.
IN THE wake of the financial and social catastrophe that followed the 1929 crash, the Franklin D. Roosevelt administration responded with government and more government. Whatever the laissez-faire excesses of the era of Republican rule in the 1920s, the New Deal Democrats lurched in the opposite direction. Historian Arthur M. Schlesinger Jr. noted that “whether revolution was a real possibility or not, faith in a free system was plainly waning.”
Roosevelt launched a campaign of vilification and intimidation against private business, a terrible but probably deliberate blunder that worsened the Depression and drove the formation of private debt capital in the United States to zero by the mid-1930s. Economist Irving Fisher notes in his celebrated 1933 essay, “The Debt-Deflation Theory of Great Depressions,” that FDR’s reflation efforts did help to avoid catastrophic price deflation, but he also blames Roosevelt for prolonging the Depression. The man Milton Friedman called America’s greatest economist wrote:
In fact, under President Hoover, recovery was apparently well started by the Federal Reserve open-market purchases, which revived prices and business from May to September 1932. The efforts were not kept up and recovery was stopped by various circumstances, including the political “campaign of fear.”
The Second World War and the new debt used to fund it ultimately rescued the United States from FDR’s economic mismanagement. The mobilization to meet the needs of the conflict quickly soaked up the excess workforce, either in terms of conscription or war industries, which were organized in a centralized fashion, as had been the case in World War I under production czar Herbert Hoover. The Fed played a secondary role in financing the New Deal and America’s military effort in World War II. By contrast, the Reconstruction Finance Corporation (RFC) under Jesse Jones took the lead as the government’s merchant bank and provided the financial muscle to fund government programs by issuing its own debt.
At the end of World War II, Britain was broke, and its leaders worried openly that the United States would take advantage of its parlous financial position in the postwar era. In geopolitical terms, the war was the handoff of imperial responsibility from London to Washington. During World War II, Britain liquidated $1 billion in overseas investments and took on another $3 billion in debt, much of which would be rescheduled and eventually forgiven. But when the British, Americans and other Allies met at Bretton Woods at the war’s end, the objective was to stimulate growth and thereby avoid another global war. The key decision taken at that meeting, which set the pattern for the post–World War II financial order, was equating the fiat paper dollar with gold.
When FDR confiscated public gold holdings in 1933 and devalued the dollar, the RFC and not the Fed was the instrument of government action. Jones took delight in having the Fed of New York execute open-market purchases of gold on behalf of the RFC. Together with giants like Leo Crowley—who organized the Federal Deposit Insurance Corporation (FDIC), ran the “lend-lease” operation in World War II and managed two reelection campaigns for FDR—Jones restructured the American economy and then financed the war’s industrial effort with massive amounts of debt.
Besides the RFC, many other parastatal entities were created before, during and after the Depression and war years that were modeled after the experiments of fascist European nations. These included the Federal Housing Administration, the Federal Home Loan Banks, Fannie Mae, the Export-Import Bank, the FDIC, the World Bank and the International Monetary Fund. All of these GSEs were designed to support economic growth via the issuance of debt atop a small foundation of capital—capital that was not in the form of gold but in the form of fiat greenback dollars and U.S. government debt.
Most industrial nations had backed away from gold convertibility by the 1950s, but the metal was still the symbolic unit of account in terms of national payments and private commercial transactions. By stating explicitly that the dollar was essentially interchangeable with gold, Bretton Woods vastly increased the global monetary base and created political possibilities for promoting economic growth that would not have been otherwise possible. Just as Lincoln used a vast expansion of the money supply and the issuance of debt to fund the Civil War, the cost of which approximated the U.S. gross national product of that era, the United States and Allied victors after World War II built the foundation of prosperity on old-fashioned money (gold) and debt (paper dollars). Civil War–era greenbacks originally bore interest to help make these “notes,” which were not backed by gold, more attractive to the public. But by 1945, the paper dollar had become de facto money for the entire world—one of many legacies of war.
Multilateral GSEs such as the World Bank and IMF fueled growth in the emerging world, while U.S. domestic growth in defense spending and later housing was driven by a growing number of domestic GSEs. “Created to rebuild Western Europe, the World Bank soon was eclipsed by the Marshall Plan and its appendages as West European capital markets recovered,” notes author and journalist Sol Sanders. “Looking for new fields to conquer, it turned to what then were unambiguously called undeveloped countries, entering its golden age under Eugene Black (1949–1963), a former Wall Street bond salesman.”
Carried by the demographic tsunami known as the baby boom, created when the “greatest generation” returned from the war, the U.S. economy fueled the rebuilding of European and Asian nations. The Marshall Plan supported growth in Europe while loans from the World Bank and IMF supported nations around the globe with everything from infrastructure loans to social-welfare schemes to explicit balance-of-payments financing—the latter something John Maynard Keynes would have condemned in a loud voice. Hardly a free trader, Keynes wrote in 1933:
I sympathize, therefore, with those who would minimize, rather than with those who would maximize, economic entanglement among nations. Ideas, knowledge, science, hospitality, travel—these are the things which should of their nature be international. But let goods be homespun whenever it is reasonably and conveniently possible, and, above all, let finance be primarily national.
Based on the dollar as the common currency of the free world, in the era known as the Cold War, the United States led a marathon of economic stamina against the Warsaw Pact nations. Loans to nations of all sizes and descriptions fueled global growth and also supported the geopolitical objective of blocking the military expansion of the Soviet Union. Developing nations such as Mexico, Brazil and India became clients of the World Bank and IMF through large loans, causing periodic political and economic crises and currency devaluations as the world reached the 1970s.
When the Berlin Wall fell in 1989, it was not from force of arms by the NATO alliance but the weight of spending and debt by the U.S. defense-industrial and multilateral-aid complex. As in World War II, the ability of America to outmatch the foe in terms of logistics and sheer weight of money—that is, credit—won the day over often-superior weapons and military forces. But while the United States won the Cold War in a geostrategic sense, the economic cost mounted enormously in terms of decades of debt issuance, accommodative monetary policy and extremely generous free-trade policies. Consumers felt the wasting effect of steady inflation, and the impact on American free-market values was corrosive in the extreme. Recalling the allegory in George Orwell’s Animal Farm, all the politicians in Washington, regardless of affiliation, became pigs. In the 1970s, when Washington tried to manage the economy via price controls, “this initiative was not the handiwork of left-wing liberals but of the administration of Richard Nixon,” wrote Daniel Yergin and Joseph Stanislaw, “a moderately conservative Republican who was a critic of government intervention in the economy.”
Through the 1970s and 1980s, as core industries were stripped out of the United States and moved offshore, lost jobs were replaced with domestic-oriented service industries. Chief among these was housing, a necessary and popular area of economic activity that supports employment but does not create any national wealth. The first surge in real-estate prices, which was again driven by the demographic force of the baby boom, ended with the savings-and-loan crisis of the late 1980s. Several of the largest U.S. banks tottered on the brink of failure in the early 1990s. But these crises only presaged the subprime meltdown of the 2000s.
As domestic growth slowed and inflation reared its ugly head, Americans for the first time since the years following World War II began to feel constrained by debt and a lack of opportunity. But instead of succumbing to the constraints of current income, Americans substituted ever-increasing amounts of debt in order to maintain national living standards. Through the 1990s and 2000s, the United States used a toxic combination of debt and easy-money policy to maintain growth levels while a politically cowed, “independent” central bank pushed interest rates lower and lower to pursue the twin goals of full employment and price stability. Under Chairman Alan Greenspan, the Fed kept the party going in terms of nominal growth, even if American consumers actually lost ground in terms of wages and inflation, proof that the Fed’s dual mandate to foster both employment and stable prices is impossibly conflicted.
The use of debt to bid up the prices of residential real estate from the late 1990s through 2007 is yet another example of the determinative impact of demographics on the economic narrative. Federal spending financed with debt started to grow dramatically in the 1980s, while mandates for future social-welfare benefits likewise began to soar. Domestic industries continued to lose ground to imports, which were encouraged through now-institutionalized free-trade policies to preserve the myth of low domestic inflation for consumers.
As the debt dependence of the United States grew from the 1980s onward, the rest of the world benefited from the steady demand for goods and services needed to satiate American consumers. So long as America was willing to incur debt to buy foreign goods, the global financial system functioned via a transfer of wealth from the now-developed U.S. economy to the less developed nations of the world. And to a large extent, the model worked. Today, India, Mexico and Brazil have all repaid their once-problematic foreign debts, leaving agencies such as the World Bank and IMF seemingly out of a job. The question remains how to turn the success of the new world as an export-oriented platform into a stable, competitive marketplace among global industries and nations.
IN A December 2011 comment in Project Syndicate, Mohamed El-Erian of PIMCO wrote:
A new economic order is taking shape before our eyes, and it is one that includes accelerated convergence between the old Western powers and the emerging world’s major new players. But the forces driving this convergence have little to do with what generations of economists envisaged when they pointed out the inadequacy of the old order; and these forces’ implications may be equally unsettling.
El-Erian points to a most troubling aspect of considering the state of the Old Order in global finance—namely, that much of it was a function of war, demographics and other factors far removed from the minds of today’s world leaders. Whereas after World War II there was a strong international consensus behind coordinated government planning when it came to global finance, today the resurgence of neoliberal thinking makes such concerted action unlikely. At the time of Bretton Woods, respected icons of the Old Order like Henry Morgenthau called publicly for government control of the financial markets; today, such views would be ridiculed as retrograde.
Yet even now, the blessed age of globalization—including support for free markets and free trade—may be receding after decades of torrid economic expansion around the globe driven by easy money and debt. “The aging of the baby boom will redirect spending toward domestically provided services and away from foreign supplied gadgetry,” one senior U.S. official said in comments for this article. “The same is true in other industrial countries. Export-led growth is overrated.”
Despite the subprime-mortgage crisis that began in the United States in 2007 and the subsequent descent of EU nations into financial meltdown, the dollar remains the only currency in the world that investors trust as a means of exchange, even with America’s massive public debt. Even though the Old Order built around the dollar is in the process of disintegrating, there is simply no obvious alternative to the greenback as a means of exchange in the global economy, at least for now. As my friend and mentor David Kotok of Cumberland Advisors likes to say, “Being the reserve currency is not a job you ask for. It finds you.”
In any event, asking whether the dollar will remain the global reserve currency may be the wrong question. In practical terms, neither the euro nor any other currency is large enough to handle even a small fraction of global payments. The global energy market, for example, is too large for any currency other than the dollar to handle.
Furthermore, there are strong political reasons for the dollar’s preeminence. Far more solvent but also authoritarian nations such as Russia and China just don’t have the right combination of attributes to make their currencies a globally accepted means of exchange, much less a store of value. This fact still makes America the most attractive venue in the world for global commerce—and, yes, capital flight, albeit not a long-term store of value. But in order for the dollar to retain this privileged position, a great deal depends upon the United States turning away from years of ever-expanding government, ever-expanding debt and an ever-expanding money supply.
One of the great fallacies after World War II was that government needed to continue spending and borrowing in order to save the Allies from economic disaster, an offshoot of the Keynesian line of reasoning regarding state intervention in the economy. While choosing to rebuild the productive capacity of the world’s industrial states following the war was clearly the right policy up to a point, the resulting governmental expansion in all aspects of the U.S. domestic economy has sapped the long-term prospects of the world’s greatest market—and hence the global financial system.
Now the price has come due. Keynes and the other leading thinkers of the post–World War II era championed this leading role of government in economic affairs, but all ignored the fundamental truth that production and purchasing power are two entirely different things. Keynes believed, falsely, that purchasing power had to be kept high via government spending to support real production. But, as American economist Benjamin M. Anderson noted, “The prevailing view among economists . . . has long been that purchasing power grows out of production.”
Jobs created via productive economic activity increase the overall pool of wealth, but artificially augmenting consumer activity via government spending or monetary expansion merely slices the existing economic pie into ever-smaller pieces. Governments can use fiscal and monetary policy to encourage growth on the margins, but substituting debt-fueled public-sector spending or easy-money policies for basic economic activity is dishonest politically and madness in economic terms. Yet this is precisely the path championed by Keynes and recommended by most economists today. “We do not have to keep pouring more money into the spending stream through endless Government deficits,” argued economist and writer Henry Hazlitt in a 1945 editorial in the New York Times. “That is not the way to sound prosperity, but the way to uncontrolled inflation.” After living through almost a century of Keynesian-fueled boom and bust, the admonition of Hazlitt and other members of the free-market school is one that we would do well to heed today.
But it won’t be easy. As Friedrich Hayek wrote on this subject:
I do not think it an exaggeration to say that it is wholly impossible for a central bank subject to political control, or even exposed to serious political pressure, to regulate the quantity of money in a way conducive to a smoothly functioning market order. A good money, like good law, must operate without regard to the effects that decisions of the issuer will have on known groups or individuals. A benevolent dictator might conceivably disregard these effects; no democratic government dependent on a number of special interests can possibly do so.
The reason that the dollar is the currency of choice in the free world is because of the American political system, not just economic or foreign-policy considerations. If that open system remains intact, the role of the dollar in the global financial system is unlikely to change. As I wrote in my 2010 book Inflated, if Americans gradually deal with the explosion of government in the post–New Deal era and steer the economic course back toward a more responsible fiscal formulation focused on individual rights and responsibilities, our future is quite bright. In that event, the dollar is likely to remain the center of the global financial system for some time to come.
But should America’s political leaders continue to embrace the policies of borrow and spend championed by Paul Krugman and other mainstream economists, the likelihood is that Washington will not be able to preserve the dollar’s special role. Just as nations cannot substitute inflation and debt for true production and employment without slowly destroying the underlying purchasing power of their people, America cannot continue to play a leading geopolitical role in the world if its domestic economy falters. And there seem to be few alternatives to the United States.
As America comes to accept that there are real limits on its economic and military power, the leading role of the dollar in the global economy eventually may have to end. In that event, the world will face a future with no single nation acting as the guarantor of global security and economic stability. Instead, we may see a world with many roughly equal nations competing for a finite supply of global trade and economic resources, precisely the situation that prevailed prior to World War I. The choice facing all societies going back to the Greeks, Romans, the British Empire and now America seems to lie between using inflation and debt to stimulate economic growth when real production proves inadequate and turning to war to create growth at the expense of others. Finding a way to avoid these two extremes is now the chief concern.
There is no question that the world after 1950 was a reflection of the wants and needs of the United States, the victor in war and thus the designer of the peacetime system of commerce and finance that followed. Just as the Roman, Mongol and British empires did centuries earlier, America made the post–World War II peace in its own image. The U.S.-centric model enjoyed enormous success due to factors such as relatively low inflation, financial transactions that respect anonymity, an open court system and a relatively enlightened foreign policy—all unique attributes of the American system.
But the framework of the global financial system in the twentieth century and its U.S.-centric design were the end results of a series of terrible wars—starting, in the case of America, with the Civil War. The roots of the U.S.-centric financial order that arose at the end of World War II extend back into the nineteenth century and reflect the political response of a very young nation to acute problems of employment and economic growth—problems that remain unresolved today.
From an American perspective, the modern era of what we describe as the global financial system based upon the U.S. dollar begins with Abraham Lincoln, the great emancipator who took office in March 1861 as the American republic stood on the verge of dissolution. In those days, “money,” as understood by Americans, comprised gold and silver coins, foreign currency and notes issued by state-chartered banks that were convertible into metal, in that order of qualitative ranking.
The state-chartered banks of that era relied upon gold coin or specie as a store of value and means of exchange with other banks. Going back to Andrew Jackson’s epic campaign to extinguish the Second Bank of the United States in the 1830s, state-chartered banks large and small were suspicious of Washington and would not finance Lincoln’s war. Bankers in New York, Boston and London, for example, would have been happy to see the North and South separate without a war and the continuation of slavery so as not to disturb the cotton trade.
Lincoln tasked Secretary of the Treasury Salmon P. Chase to sell Treasury bonds and had Congress create national banks to buy the debt. The Treasury Department suspended convertibility and issued large quantities of “greenbacks” in the form of paper dollars to pay for the immediate cost of fighting the war. By the end of the Civil War, the greenback traded down to a fifth of face value when measured in gold. Yet Lincoln won the war, even in death, because the Union outspent the Confederacy using the credit afforded by paper money created by government fiat. And the Civil War set the precedent for Washington to engage in massive currency inflation in times of exigency and also to develop by fiat new platforms for creating financial leverage to meet national needs. As Wall Street speculator Ray Dalio wrote a century and a half later, “Virtually all of what they call money is credit (i.e., promises to deliver money) rather than money itself.” Lincoln used that fact to win the Civil War.
Even without a central bank, from the end of the Civil War to the start of World War I in 1914, the United States saw powerful economic growth. So strong was demand for a means of exchange that the much-abused greenback dollar traded back to par value against gold by the time President Ulysses S. Grant officially restored convertibility at the end of his term. The public remained highly skeptical of paper money or other promissory notes, with good reason. As Mark Twain immortalized in Pudd’nhead Wilson, Roxy lost the savings accumulated from eight years of labor as a riverboat servant when her bank failed. Roxy concluded that “hard work and economy” were insufficient to make her secure and independent.
By the turn of the twentieth century, many Americans had adopted a view similar to Roxy’s, which differed significantly from the rugged, self-reliant individualism of American pioneer mythology. Decades of financial crises tempered the independent, hard-money views of Americans. Growing urban populations worried about jobs and opportunity, while farmers, businesses and even conservative state-chartered banks fretted about access to credit. The solution that emerged was not the free market but increasingly the collective credit of the federal government in Washington.
By 1913, when the banking industry and progressive forces in Congress created the Federal Reserve System, America had been through several more financial and economic crashes, leaving bankers even more disposed to a government-sponsored entity (GSE) rescuing them from the harsh discipline of “market forces,” to recall the words of South Carolina’s Democratic senator Ernest Hollings. The private clearinghouse system developed in major U.S. cities during the nineteenth century was inadequate to provide liquidity in times of market upheaval. Thus twelve Federal Reserve banks were created to support the liquidity needs of banks and, in a big-picture sense, provide another layer of financial leverage on top of national banks to support the funding needs of the U.S. economy.
Yet, even after the Fed’s establishment, the U.S. economy continued to labor under the weight of deflation and slack demand. In that sense, the First World War was the first true watershed for America’s economic narrative; it forced Americans to look outward for growth and financial interaction with foreign states as a single nation. Public sentiment was split between sympathy for the British, French and Belgian forces, on the one hand, and for the Central powers led by Germany on the other. But all Americans welcomed the vast demand for goods and services as a relief from years of price deflation on the farm and slack job growth in urban markets driven by the adoption of new technology and imports from Europe. Allan Nevins and Henry Steele Commager wrote, “Economic considerations re-enforced sentimental and political ones.”
The gradual American economic mobilization to support the Allies in World War I not only marked a growing willingness of Americans to engage in foreign intervention overseas but also saw a vast transfer of wealth from Europe to the former colonies as a large portion of the Continent’s gold was sent to America to pay for the war. U.S. banks and eventually the Fed also provided financing for Allied purchases, which grew to a great torrent of raw materials and finished goods. So vast were the financial flows in the early days of World War I that J.P. Morgan could not manage the dollar-sterling transactions. At first, commercial paper issued by British banks could not be discounted with the Fed. The Federal Reserve Bank of New York stepped in, however, and effectively subsidized the British pound exchange rate when the Bank of England exhausted its gold reserves.
Victory bonds were sold widely to Americans to finance the war and manage domestic demand, thereby also socializing the idea of investing by individuals. A number of new GSEs were created to fund and manage the American war effort via the issuance of debt. Compared to the neoliberal orthodoxy of today, there was little fretting over market forces during World War I. Jobs and inflation were the top issues. Wage and price controls and other authoritarian mechanisms were employed by Washington without apology to hold down prices while wages were constrained. The American farm sector recovered from years in the doldrums as global demand for cotton, grains and meat soared, pushing up domestic prices as well. By 1917, when the United States entered the war militarily on the Allied side, the American economy was running better than it had in many decades. The following year, however, when the debt available to the Allies dried up and exports to Europe slowed, the U.S. economy quickly faded as well. By 1919, the United States was entering a serious economic slump.
After World War I, America descended into a period of isolation and uneven economic circumstances presided over by the dominant Republican Party, which rejected U.S. involvement in world affairs and promptly raised tariffs (the Fordney–McCumber Act of 1922) to protect Americans from cheap imports. When Congress enacted Smoot-Hawley eight years later, the increase in tariffs was marginal compared with earlier increases in import taxes. But no amount of tariff protection could shield U.S. workers and industries from the impact of technological changes such as electricity and the growing use of machines.
The return to “normalcy” promised by President Warren Harding meant an environment where large corporations and banks prowled the economic landscape unhindered, and the federal government largely withdrew from the economy, compared with the policies of Teddy Roosevelt and Woodrow Wilson. The Fed played a relatively marginal role in the post–World War I period and did little to alleviate the economic stagnation that affected much of America’s rural areas. Urban workers had employment, but wages remained stagnant even as the concentration of wealth in the United States increased dramatically.
While a large part of the real economy suffered during the post–World War I period, speculation in real estate and on Wall Street grew through the 1920s. With it came financial fraud. The party ended, though, with the landmark 1925 Supreme Court decision written by Louis Brandeis in Benedict v. Ratner, which set a new standard for collateralized borrowing. The Brandeis decision, which ruled that the failure to specify the collateral was “fraud on its face,” arguably helped cause the great crash of 1929 because it effectively shut down the Wall Street sausage machine, cutting liquidity to the market.
The great Wall Street crash of 1929 completed the process of speculative boom and bust that made the market collapses and currency crises of the previous half century pale by comparison. John Kenneth Galbraith noted in The Great Crash, 1929 that Americans displayed “an inordinate desire to get rich quickly with a minimum of physical effort.” As a chronicler of the Great Depression, Galbraith describes the run-up to the Wall Street crash, including the real-estate mania in Florida in the mid-1920s. Few today recall that the precursor to the Great Depression was a real-estate bubble in the mid-1920s, an eerie parallel to the real-estate boom and bust of the 2000s. But in each case, it was the supply of credit in the form of debt that drove the boom and eventual bust in the economy.
IN THE wake of the financial and social catastrophe that followed the 1929 crash, the Franklin D. Roosevelt administration responded with government and more government. Whatever the laissez-faire excesses of the era of Republican rule in the 1920s, the New Deal Democrats lurched in the opposite direction. Historian Arthur M. Schlesinger Jr. noted that “whether revolution was a real possibility or not, faith in a free system was plainly waning.”
Roosevelt launched a campaign of vilification and intimidation against private business, a terrible but probably deliberate blunder that worsened the Depression and drove the formation of private debt capital in the United States to zero by the mid-1930s. Economist Irving Fisher notes in his celebrated 1933 essay, “The Debt-Deflation Theory of Great Depressions,” that FDR’s reflation efforts did help to avoid catastrophic price deflation, but he also blames Roosevelt for prolonging the Depression. The man Milton Friedman called America’s greatest economist wrote:
In fact, under President Hoover, recovery was apparently well started by the Federal Reserve open-market purchases, which revived prices and business from May to September 1932. The efforts were not kept up and recovery was stopped by various circumstances, including the political “campaign of fear.”
The Second World War and the new debt used to fund it ultimately rescued the United States from FDR’s economic mismanagement. The mobilization to meet the needs of the conflict quickly soaked up the excess workforce, either in terms of conscription or war industries, which were organized in a centralized fashion, as had been the case in World War I under production czar Herbert Hoover. The Fed played a secondary role in financing the New Deal and America’s military effort in World War II. By contrast, the Reconstruction Finance Corporation (RFC) under Jesse Jones took the lead as the government’s merchant bank and provided the financial muscle to fund government programs by issuing its own debt.
At the end of World War II, Britain was broke, and its leaders worried openly that the United States would take advantage of its parlous financial position in the postwar era. In geopolitical terms, the war was the handoff of imperial responsibility from London to Washington. During World War II, Britain liquidated $1 billion in overseas investments and took on another $3 billion in debt, much of which would be rescheduled and eventually forgiven. But when the British, Americans and other Allies met at Bretton Woods at the war’s end, the objective was to stimulate growth and thereby avoid another global war. The key decision taken at that meeting, which set the pattern for the post–World War II financial order, was equating the fiat paper dollar with gold.
When FDR confiscated public gold holdings in 1933 and devalued the dollar, the RFC and not the Fed was the instrument of government action. Jones took delight in having the Fed of New York execute open-market purchases of gold on behalf of the RFC. Together with giants like Leo Crowley—who organized the Federal Deposit Insurance Corporation (FDIC), ran the “lend-lease” operation in World War II and managed two reelection campaigns for FDR—Jones restructured the American economy and then financed the war’s industrial effort with massive amounts of debt.
Besides the RFC, many other parastatal entities were created before, during and after the Depression and war years that were modeled after the experiments of fascist European nations. These included the Federal Housing Administration, the Federal Home Loan Banks, Fannie Mae, the Export-Import Bank, the FDIC, the World Bank and the International Monetary Fund. All of these GSEs were designed to support economic growth via the issuance of debt atop a small foundation of capital—capital that was not in the form of gold but in the form of fiat greenback dollars and U.S. government debt.
Most industrial nations had backed away from gold convertibility by the 1950s, but the metal was still the symbolic unit of account in terms of national payments and private commercial transactions. By stating explicitly that the dollar was essentially interchangeable with gold, Bretton Woods vastly increased the global monetary base and created political possibilities for promoting economic growth that would not have been otherwise possible. Just as Lincoln used a vast expansion of the money supply and the issuance of debt to fund the Civil War, the cost of which approximated the U.S. gross national product of that era, the United States and Allied victors after World War II built the foundation of prosperity on old-fashioned money (gold) and debt (paper dollars). Civil War–era greenbacks originally bore interest to help make these “notes,” which were not backed by gold, more attractive to the public. But by 1945, the paper dollar had become de facto money for the entire world—one of many legacies of war.
Multilateral GSEs such as the World Bank and IMF fueled growth in the emerging world, while U.S. domestic growth in defense spending and later housing was driven by a growing number of domestic GSEs. “Created to rebuild Western Europe, the World Bank soon was eclipsed by the Marshall Plan and its appendages as West European capital markets recovered,” notes author and journalist Sol Sanders. “Looking for new fields to conquer, it turned to what then were unambiguously called undeveloped countries, entering its golden age under Eugene Black (1949–1963), a former Wall Street bond salesman.”
Carried by the demographic tsunami known as the baby boom, created when the “greatest generation” returned from the war, the U.S. economy fueled the rebuilding of European and Asian nations. The Marshall Plan supported growth in Europe while loans from the World Bank and IMF supported nations around the globe with everything from infrastructure loans to social-welfare schemes to explicit balance-of-payments financing—the latter something John Maynard Keynes would have condemned in a loud voice. Hardly a free trader, Keynes wrote in 1933:
I sympathize, therefore, with those who would minimize, rather than with those who would maximize, economic entanglement among nations. Ideas, knowledge, science, hospitality, travel—these are the things which should of their nature be international. But let goods be homespun whenever it is reasonably and conveniently possible, and, above all, let finance be primarily national.
Based on the dollar as the common currency of the free world, in the era known as the Cold War, the United States led a marathon of economic stamina against the Warsaw Pact nations. Loans to nations of all sizes and descriptions fueled global growth and also supported the geopolitical objective of blocking the military expansion of the Soviet Union. Developing nations such as Mexico, Brazil and India became clients of the World Bank and IMF through large loans, causing periodic political and economic crises and currency devaluations as the world reached the 1970s.
When the Berlin Wall fell in 1989, it was not from force of arms by the NATO alliance but the weight of spending and debt by the U.S. defense-industrial and multilateral-aid complex. As in World War II, the ability of America to outmatch the foe in terms of logistics and sheer weight of money—that is, credit—won the day over often-superior weapons and military forces. But while the United States won the Cold War in a geostrategic sense, the economic cost mounted enormously in terms of decades of debt issuance, accommodative monetary policy and extremely generous free-trade policies. Consumers felt the wasting effect of steady inflation, and the impact on American free-market values was corrosive in the extreme. Recalling the allegory in George Orwell’s Animal Farm, all the politicians in Washington, regardless of affiliation, became pigs. In the 1970s, when Washington tried to manage the economy via price controls, “this initiative was not the handiwork of left-wing liberals but of the administration of Richard Nixon,” wrote Daniel Yergin and Joseph Stanislaw, “a moderately conservative Republican who was a critic of government intervention in the economy.”
Through the 1970s and 1980s, as core industries were stripped out of the United States and moved offshore, lost jobs were replaced with domestic-oriented service industries. Chief among these was housing, a necessary and popular area of economic activity that supports employment but does not create any national wealth. The first surge in real-estate prices, which was again driven by the demographic force of the baby boom, ended with the savings-and-loan crisis of the late 1980s. Several of the largest U.S. banks tottered on the brink of failure in the early 1990s. But these crises only presaged the subprime meltdown of the 2000s.
As domestic growth slowed and inflation reared its ugly head, Americans for the first time since the years following World War II began to feel constrained by debt and a lack of opportunity. But instead of succumbing to the constraints of current income, Americans substituted ever-increasing amounts of debt in order to maintain national living standards. Through the 1990s and 2000s, the United States used a toxic combination of debt and easy-money policy to maintain growth levels while a politically cowed, “independent” central bank pushed interest rates lower and lower to pursue the twin goals of full employment and price stability. Under Chairman Alan Greenspan, the Fed kept the party going in terms of nominal growth, even if American consumers actually lost ground in terms of wages and inflation, proof that the Fed’s dual mandate to foster both employment and stable prices is impossibly conflicted.
The use of debt to bid up the prices of residential real estate from the late 1990s through 2007 is yet another example of the determinative impact of demographics on the economic narrative. Federal spending financed with debt started to grow dramatically in the 1980s, while mandates for future social-welfare benefits likewise began to soar. Domestic industries continued to lose ground to imports, which were encouraged through now-institutionalized free-trade policies to preserve the myth of low domestic inflation for consumers.
As the debt dependence of the United States grew from the 1980s onward, the rest of the world benefited from the steady demand for goods and services needed to satiate American consumers. So long as America was willing to incur debt to buy foreign goods, the global financial system functioned via a transfer of wealth from the now-developed U.S. economy to the less developed nations of the world. And to a large extent, the model worked. Today, India, Mexico and Brazil have all repaid their once-problematic foreign debts, leaving agencies such as the World Bank and IMF seemingly out of a job. The question remains how to turn the success of the new world as an export-oriented platform into a stable, competitive marketplace among global industries and nations.
IN A December 2011 comment in Project Syndicate, Mohamed El-Erian of PIMCO wrote:
A new economic order is taking shape before our eyes, and it is one that includes accelerated convergence between the old Western powers and the emerging world’s major new players. But the forces driving this convergence have little to do with what generations of economists envisaged when they pointed out the inadequacy of the old order; and these forces’ implications may be equally unsettling.
El-Erian points to a most troubling aspect of considering the state of the Old Order in global finance—namely, that much of it was a function of war, demographics and other factors far removed from the minds of today’s world leaders. Whereas after World War II there was a strong international consensus behind coordinated government planning when it came to global finance, today the resurgence of neoliberal thinking makes such concerted action unlikely. At the time of Bretton Woods, respected icons of the Old Order like Henry Morgenthau called publicly for government control of the financial markets; today, such views would be ridiculed as retrograde.
Yet even now, the blessed age of globalization—including support for free markets and free trade—may be receding after decades of torrid economic expansion around the globe driven by easy money and debt. “The aging of the baby boom will redirect spending toward domestically provided services and away from foreign supplied gadgetry,” one senior U.S. official said in comments for this article. “The same is true in other industrial countries. Export-led growth is overrated.”
With the subprime-mortgage crisis in the United States since 2007 and the subsequent collapse of the EU nations into a financial meltdown, the dollar remains the only currency in the world that investors trust as a means of exchange, despite America’s massive public debt. Even though the Old Order built around the dollar is in the process of disintegrating, there is simply no obvious alternative to the greenback as a means of exchange in the global economy, at least for now. As my friend and mentor David Kotok of Cumberland Advisors likes to say, “Being the reserve currency is not a job you ask for. It finds you.”
In any event, asking whether the dollar will remain the global reserve currency may be the wrong question. In practical terms, neither the euro nor any other currency is large enough to handle even a small fraction of global payments. The global energy market, for example, is too large for any currency other than the dollar to handle.
Furthermore, there are strong political reasons for the dollar’s preeminence. Far more solvent but also authoritarian nations such as Russia and China just don’t have the right combination of attributes to make their currencies a globally accepted means of exchange, much less a store of value. This fact still makes America the most attractive venue in the world for global commerce—and, yes, capital flight, albeit not a long-term store of value. But in order for the dollar to retain this privileged position, a great deal depends upon the United States turning away from years of ever-expanding government, ever-expanding debt and an ever-expanding money supply.
One of the great fallacies after World War II was that government needed to continue spending and borrowing in order to save the Allies from economic disaster, an offshoot of the Keynesian line of reasoning regarding state intervention in the economy. While choosing to rebuild the productive capacity of the world’s industrial states following the war was clearly the right policy up to a point, the resulting governmental expansion in all aspects of the U.S. domestic economy has sapped the long-term prospects of the world’s greatest market—and hence the global financial system.
Now the price has come due. Keynes and the other leading thinkers of the post–World War II era championed this leading role of government in economic affairs, but all ignored the fundamental truth that production and purchasing power are two entirely different things. Keynes believed, falsely, that purchasing power had to be kept high via government spending to support real production. But, as American economist Benjamin M. Anderson noted, “The prevailing view among economists . . . has long been that purchasing power grows out of production.”
Jobs created via productive economic activity increase the overall pool of wealth, but artificially augmenting consumer activity via government spending or monetary expansion merely slices the existing economic pie into ever-smaller pieces. Governments can use fiscal and monetary policy to encourage growth on the margins, but substituting debt-fueled public-sector spending or easy-money policies for basic economic activity is dishonest politically and madness in economic terms. Yet this is precisely the path championed by Keynes and recommended by most economists today. “We do not have to keep pouring more money into the spending stream through endless Government deficits,” argued economist and writer Henry Hazlitt in a 1945 editorial in the New York Times. “That is not the way to sound prosperity, but the way to uncontrolled inflation.” After living through almost a century of Keynesian-fueled boom and bust, the admonition of Hazlitt and other members of the free-market school is one that we would do well to heed today.
But it won’t be easy. As Friedrich Hayek wrote on this subject:
I do not think it an exaggeration to say that it is wholly impossible for a central bank subject to political control, or even exposed to serious political pressure, to regulate the quantity of money in a way conducive to a smoothly functioning market order. A good money, like good law, must operate without regard to the effects that decisions of the issuer will have on known groups or individuals. A benevolent dictator might conceivably disregard these effects; no democratic government dependent on a number of special interests can possibly do so.
Hayek’s observation really gets to the fundamental issue facing Americans—namely, that changing course after almost seven decades of economic indulgence following WWII will be a domestic political challenge of the first order. Limiting public spending and monetary policy may ultimately force a political change in America in much the same way that Germany is now imposing fiscal austerity on the peripheral states of the EU via entirely nondemocratic means.
IF AMERICA can restrain its libertine impulses and get its fiscal house in order, the reality of an open, free-market, democratic system will continue to make the dollar among the most desirable asset classes in the world. But perhaps the real question is whether America will remain a free, open and democratic society in an environment of lower economic growth and expectations. After seven decades of using debt and inflation to pull future jobs and growth into the present, the prospect of less opportunity raises the specter of domestic political turmoil in the United States and other nations. Internationally, the result could be instability and war. This is not merely a short-run political challenge for Washington but ultimately threatens to challenge the self-image of American society. How will Americans react to seeing their children facing declining prospects for employment and home ownership?
That in turn raises a question of whether declining living standards in the United States could eventually force a geopolitical withdrawal by Americans from the world stage. Allied nations from the UK to Israel to South Korea and Japan may soon see an end to unconditional American military and economic support.
America remains a very young, fluid country that is still trying to figure out its place in the world. While one hopes that the ethic of open borders and open markets that helped the world recover from World War II continues, Americans will be under great pressure in coming years to turn inward and may eventually revisit protectionist and interventionist policies if economic pressures become sufficiently acute. It has happened before.
But there is still plenty of room for hope and perhaps even optimism about the shape of things to come. One key component of the new international order may be a mechanism to help overly indebted, mature societies in the United States and EU manage the adjustment process in the same way that the emerging debtor nations of the 1980s have become engines of growth today. By giving the other nations of the world greater responsibility in managing the global financial system, we may be able to hasten the day when all nations trade in a global clearinghouse system based on the competitive position of each. The notion of a global currency is attractive in theory and goes back to some of the ideas of Keynes and others at Bretton Woods. But it remains to be seen if investors want to embrace an ersatz global currency that is not connected to a dominant nation.
The reason that the dollar is the currency of choice in the free world is because of the American political system, not just economic or foreign-policy considerations. If that open system remains intact, the role of the dollar in the global financial system is unlikely to change. As I wrote in my 2010 book Inflated, if Americans gradually deal with the explosion of government in the post–New Deal era and steer the economic course back toward a more responsible fiscal formulation focused on individual rights and responsibilities, our future is quite bright. In that event, the dollar is likely to remain the center of the global financial system for some time to come.
But should America’s political leaders continue to embrace the policies of borrow and spend championed by Paul Krugman and other mainstream economists, the likelihood is that Washington will not be able to preserve the dollar’s special role. Just as nations cannot substitute inflation and debt for true production and employment without slowly destroying the underlying purchasing power of their people, America cannot continue to play a leading geopolitical role in the world if its domestic economy falters. And there seem to be few alternatives to the United States.
As America comes to accept that there are real limits on its economic and military power, the leading role of the dollar in the global economy eventually may have to end. In that event, the world will face a future with no single nation acting as the guarantor of global security and economic stability. Instead, we may see a world with many roughly equal nations competing for a finite supply of global trade and economic resources, precisely the situation that prevailed prior to World War I. The choice facing all societies going back to the Greeks, Romans, the British Empire and now America seems to lie between using inflation and debt to stimulate economic growth when real production proves inadequate and turning to war to create growth at the expense of others. Finding a way to avoid these two extremes is now the chief concern.