Spain was the first global superpower. Obviously, there had been other great powers—Rome, the China of the Tang and Ming dynasties, the vast Mongol domains—but none had spanned oceans and continents the way the Spanish Empire did at its height. In the first half of the 16th century, Charles V reigned over vast swathes of Europe; his son Philip II controlled most of the Western Hemisphere as well as a sizable chunk of Asia (the Philippines were named after him). Imperial Spain’s maximum territorial reach would only be surpassed by the British Empire in the 19th century, and in the 20th by the informal American imperium, with its 750 overseas bases and network of global alliances.

But then Spain blew it. Already by the middle of the 17th century, under the crisis-ridden rule of Philip IV, the Iberian kingdom “had been left behind by the rest of Europe,” as John A. Crow wrote in his classic study, Spain: The Root and the Flower. England’s emergent sea power had dealt an early, crippling blow to Spanish naval might in the 1588 defeat of the Armada. A little more than three centuries later, the United States would effectively end Spain’s overseas empire, seizing control of its last colonies in Cuba, Puerto Rico, the Philippines, and Guam. Between these two catastrophes there intervened a long period of slow decline.

“The parallels between America and Spain are striking.”

Those contemplating the possible demise of American global hegemony most often turn for lessons to Rome or the Soviet Union, but the parallels between America and Spain are striking. Both countries, in their formative, pre-imperial periods, were defined by processes of territorial expansion across shifting frontiers: the reconquista of southern Spain from the Moors and the conquest of the American West. In both cases, the closing of the frontier—Spain’s in 1492 with the capture of Granada by Ferdinand and Isabella, America’s famously marked by the historian Frederick Jackson Turner in 1893—coincided with the initial phase of overseas territorial expansion that would lead to superpower status. Columbus arrived in the Caribbean the same year Granada fell; America’s seizure of Cuba, Puerto Rico, and the Philippines occurred in 1898, the same year Congress ratified the annexation of Hawaii.

But the most revealing parallels relate to a different expansionary dynamic—that of money. The key to so much else that happened to both countries was the appearance of what seemed like unlimited wealth but was actually access to unlimited quantities of a universal medium of exchange, craved and accepted everywhere. In the late 16th century, the Spanish elite could buy whatever it wanted wherever it wanted with the gold and silver that was pouring into Spain from the mines of Peru and Mexico. Today, the American ruling class can do the same with US dollars created at will and deposited in the memory banks of the Federal Reserve’s computers. That Midas-like power permitted elites in both countries to confuse money with what it could buy, and led to financialization, politically dangerous levels of inequality, and the wasting of wealth on endless wars aimed at remaking the world in the image, respectively, of Iberian Catholicism and American democracy.

In his 2015 book Killing the Host, economist Michael Hudson elaborates on this similarity:

Despite its vast stream of gold and silver, Spain became the most debt-ridden country in Europe—with its tax burden shifted entirely onto the least affluent, blocking development of a home market. Yet today’s neoliberal lobbyists are urging similar tax favoritism to un-tax finance and real estate, shift the tax burden onto labor and consumers, cut public infrastructure and social spending, and put rentier managers in charge of government. The main difference from Spain and other post-feudal economies is that interest to the financial sector has replaced the rent paid to feudal landlords.

The process by which the fiat currency of the United States became the specie of our time was more convoluted than Spain’s pillaging of the Aztec and Inca empires. But the story is worth telling, since the dollar’s hegemony is more than a simple natural consequence of America’s position as the world’s premier economic and military power. It might not have happened had it not been for a series of accidents—among them, an ill-conceived tax measure and an assassination attempt against a popular president—and measures taken for their own reasons by some non-Americans, including an eminent German-Jewish banker, Arabs seeking to avoid US sanctions, and portfolio managers at Japanese insurance companies.


Dollar hegemony had its origins in America’s emergence from World War I as the planet’s principal creditor and supreme economic power. But it took the Bretton Woods Conference of 1944 to establish the dollar’s central role in global finance. Bretton Woods is the New Hampshire resort where the Allied powers convened to construct a formal monetary order for the postwar world. The architects of that order, Harry Dexter White and John Maynard Keynes, sought to avoid any return to the monetary chaos of the interwar years, when Washington had refused to step up and manage global monetary matters at a time when no other government—in particular, London’s—had the power to do so anymore.

“An unforeseen series of events cemented the dollar’s place at the core of global finance.”

Bretton Woods, which became shorthand for the monetary regime that prevailed between 1945 and 1971, was indispensable to the three decades of prosperity that followed the end of World War II. But this arrangement proved too rigid to survive the onset in the late 1960s of American trade deficits, together with their flipside: Japanese and West German trade surpluses. (Keynes had attempted to insert a mechanism into the Bretton Woods system that would have required countries to take steps to reduce surpluses; he was overruled by American delegates unable to foresee the day when the United States could or would run trade deficits.) The advent of structural deficits on America’s external accounts didn’t—contrary both to expectations at the time and to conventional theory—dethrone the dollar, despite a rough decade for the American currency after Richard Nixon unilaterally reneged on Washington’s obligation to exchange gold for dollars presented by foreign central banks. Instead, an unforeseen series of events cemented the dollar’s place at the core of global finance.

The story starts back in 1963, when the Kennedy administration sought to tax interest payments from foreign companies and governments that went to New York to borrow money. Siegmund Warburg, a refugee from Hitler’s Germany who went on to establish Britain’s premier merchant bank, demonstrated that the tax could be circumvented by raising dollars in London from non-Americans via transactions managed by British and European, rather than American, banks and governed by UK law and regulatory oversight. Washington reacted with outrage, but the long-term result was the largest offshore financial market the world had ever seen—the market in eurodollars and eurobonds, which played an indispensable role in making the dollar the world’s currency. Another result was that London regained its status as the world’s financial capital, demoting New York to secondary status.

This arrangement meant, for instance, that if you were an Arab oil company in 1974, you instructed your customers to pay for your oil by depositing dollars into Arab accounts with banks in London where the dollars were beyond the reach of American regulators seeking, say, to sanction Arab countries for going to war with Israel. (The Saudis briefly considered billing their customers in some currency other than the dollar, but quickly realized that no other currency circulated in sufficient quantities to allow their customers to put their hands on enough of it to pay their oil-import bills.) Hence, countries like Brazil and Argentina went to London to borrow the dollars they needed to meet the Arab oil exporters’ demands. American trade deficits and military spending overseas may have fed the ever-growing pool of dollars circulating outside the United States, but the infrastructure of the eurodollar and eurobond markets helped ensure that path dependence would begin to take hold. If most companies were billing their foreign customers in dollars and using dollars to pay their cross-border obligations, yours was probably going to do so, too.

The path dependence that would entrench the dollar’s supremacy hadn’t yet become fully established in the mid-1970s, even if Saudi Arabia and other OPEC nations had reluctantly concluded they had no choice for the time being but to bill their customers in dollars. The 1970s were, in fact, a bad time for the American currency: It lost almost two-thirds of its purchasing power during that decade, and many were desperate for an alternative, a desperation that led the dollar’s value to plunge on foreign-exchange markets in 1978.

The next year, President Jimmy Carter, backed into a corner by that crisis and the soaring inflation that was its proximate cause, engineered the replacement of William Miller as head of the Federal Reserve with Paul Volcker. Carter barely knew who Volcker was—he was then president of the Federal Reserve Bank of New York—but the president had been told that Volcker was one of the few men capable of calming the foreign-exchange and bond markets and stopping a possible dollar collapse. This Volcker proceeded to do. He hiked interest rates to the point where traders bought dollar instruments again. The political price was a steep recession that, along with events in Iran, doomed Carter’s re-election chances.

Carter’s successor, Ronald Reagan, went on to preside over what were then the largest peacetime government deficits in American history. He justified them with the notion that cutting taxes would increase revenues, but the boom that followed demonstrated not the validity of arch-supply-sider Arthur Laffer’s infamous curve, but the truth of Keynes’s insight that the surest way to revive a moribund economy is a large government deficit, provided the deficit can be financed. That such a deficit could be financed was by no means obvious in 1981. Tip O’Neill—then speaker of the House and a savvy politician who understood that he couldn’t stand in the way of a popular president who had become even more popular in the wake of the attempted assassination in March of that year—allowed the Reagan tax cuts to go through Congress, figuring the Republicans would have to come back, hat in hand, when the bond markets tanked. Instead, the Japanese, for their own reasons, gulped down the flood of bonds issued to finance the Reagan deficits, reviving the US economy by putting money in Americans’ pockets—which they promptly spent on Japanese imports, further entrenching the dollar’s role as the world’s money.

There were further rough patches on the road to dollar supremacy, among them the devastation to whole swaths of American industry from the super-strong dollar that was the inevitable product of the combination of high interest rates and voracious demands by the US Treasury for cash; the botched attempt in 1985 to bring the dollar down (the exchange rate did come down, but that did nothing to reduce the trade deficit—when the markets digested that, we got the 1987 stock-market crash); and the US-Japan trade conflicts of the late 1980s and early ’90s, with their implied threats to the free circulation of dollars in and out of the United States. But by 1995, when Treasury Secretary Robert Rubin figured out a way to bypass GOP attempts to embarrass the Clinton administration by denying the International Monetary Fund the money it needed to bail out Mexico in the wake of the peso crisis, the dollar’s central position in global finance was completely solidified. (Had America been unwilling or unable to rescue a country that the markets saw as being in its own backyard, that might not have happened.)

Many today predict the looming end of dollar hegemony. These prophets of doom point to the rise of China, to Washington’s attempts to weaponize the dollar with its sanctions against the likes of Russia and Iran, and to the periodic debt-limit shenanigans that seem to suggest that the US government might fail to honor its contractual obligations. At the time of this writing, however, neither the foreign-exchange nor the bond markets betray the slightest hint of dollar shakiness. Today, it costs more than 147 Japanese yen to buy a dollar—a level not seen since August 1998; a euro will cost you $1.07, whereas just before the onset of the global financial crisis in 2008, it cost $1.60; a dollar today will fetch 7.32 Chinese yuan, roughly what it would have fetched in 2008.

No other currency really matters in the grand scheme of things; no one talks of the Swiss franc or the Australian dollar displacing the US dollar as the premier settlement and reserve currency. As for the bond market, the Federal Reserve seems to have managed the trick of hiking interest rates without provoking anything resembling a panic: A one-year Treasury security yields about the same rate of return (5.40 percent) today as it did before the onset of the 2008 financial crisis. The latest published numbers show that dollar-denominated securities constitute 58 percent of global reported foreign-exchange reserves. The figures for the euro, the yen, sterling, and the yuan are, respectively, 21 percent, 6 percent, 5 percent, and 3 percent. If the US dollar bond market were tanking, those numbers would be very different.


Over the past several decades, dollar hegemony has landed the United States in the same position that Spain enjoyed after the conquest of the Aztec and Inca empires. Like the Habsburg monarchs of an earlier age, those who manage the American economy are able to run “deficits without tears,” in the bon mot of the French economist and Charles de Gaulle adviser Jacques Rueff. But if Spanish history is any guide, Rueff wasn’t entirely right about the lack of tears. Regrettably, all that money tempts you to do what less well-positioned countries would hesitate to consider—because they can’t afford it: namely, wage endless wars. Spain’s example suggests that tears may indeed await countries whose elites convince themselves that they have been chosen by God to spread his faith throughout the world—or, to update the language, that theirs is the world’s “indispensable nation,” ordained by history to bring it to an end.

Michael Lind once wrote that “countries that cannot finance current account deficits cannot fight wars.” In other words, if you can’t find the money to field and equip armies, you find your capacity for military engagement, well, stymied. Consider the adage that the Confederacy lost the Civil War as much on the banking floors of London as on the battlefields of Virginia: In the end, the Confederates couldn’t borrow the money they needed to keep fighting. But there is a flipside to Lind’s remark. Running “deficits without tears,” it would seem, tempts countries—or at least, the ruling classes of such countries—into global imperial projects, driven by the fantasy that they are the vehicles of universal salvation.

In Spain: The Root and the Flower, Crow observed that when Charles V “became engaged in wars all over Europe and America … the aim that guided his life was to overcome the Protestants and establish an immense Catholic empire.” This aim should be familiar to contemporary readers. From John F. Kennedy’s summons to “pay any price, bear any burden … to assure the survival and the success of liberty,” to George W. Bush advisers David Frum and Richard Perle’s call for an “end to evil” in a book of that title, to President Biden’s portrayal of Washington’s proxy battles in Ukraine as a matter of “standing guard over freedom today, tomorrow, and forever,” those at the helm of America’s global empire have been driven by comparably grandiose visions. Indeed, once you start looking for resonances between the Spain of Philip II, III, and IV, on one hand, and the United States from JFK straight through to Biden, on the other, you start to notice them everywhere—and they aren’t limited to flights of rhetoric.

You see Lyndon Johnson and Richard Nixon, like Philip II before them, deploying vast armadas against third-rate powers and having their asses handed to them. You see the transfer of the foundation of genuine wealth—industrial capacity—away from Spain and the United States to their satellites and rivals: the Low Countries and Britain in the first instance; Japan, South Korea, and China in the second. You see that funneling of “wealth into foreign coffers” that Crow wrote of in Habsburg Spain occurring once again in the George W. Bush administration’s expenditures on the invasion of Iraq—spending that roughly equaled the spurt in purchases of US Treasury securities by China and Japan. (They earned the dollars to purchase those securities by running trade and current-account surpluses with America.)

The parallels extend from the material substrate to the cultural and ideological realm. The Protestant Reformation provoked the same kind of hysteria and outrage among Spanish elites of the 16th century that the rise of Soviet Communism did among their American counterparts four centuries later. This isn’t surprising, because the Protestant confessions and Marxism are heretical sects within overarching faiths—and civil wars, particularly religious civil wars, tend to be the most vicious. In other words, neither Protestants nor Marxists challenged the core tenets of the established faiths from which they emerged: They were simply going to be better communicants, purifying Christianity or, as the case may be, employing “scientific” means to bring about the Enlightenment vision of a society run in accordance with the dictates of reason. Nonetheless, the intention to reform or replace existing institutions—the priesthood and the sacraments; capitalist markets and bourgeois democracy—provoked a virulent reaction.

Crow wrote of the era in which Spain became the beachhead of the Counter-Reformation: “Everything Spanish and Catholic was good; everything that was non-Spanish and non-Catholic was evil.” One is reminded of how the United States, the self-anointed embodiment of Enlightenment virtues of liberty, tolerance, and progress, reacted to the appearance of a rival claimant in Moscow to the title of history’s leading actor. A self-identified Communist in the America of the 1950s faced somewhat less horrible prospects than a Protestant in the Spain of the 1590s—though at times the electric chair, as in the cases of Julius and Ethel Rosenberg, was substituted for the stake. And the effects were similar. As Crow noted, “there never arose in Spain a single Protestant Church,” and the United States, unlike Western European democracies, has never had a left political party that came anywhere close to seizing the levers of power. (Of course, these parallels have limits. Among other things, Marxism as a rival ideology to liberal capitalism has collapsed, while many Protestant sects are eating away at Rome’s dominance in the very lands in which Spain succeeded in its goal of implanting the Catholic religion.)

The collapse of Soviet Communism hasn’t prevented the American elite from continuing to portray its foreign adventures in a manner reminiscent of the way Spain justified its role in the Thirty Years War: as crusades to stamp out evil and heresy. A vast, bloated defense establishment—comprising weapons manufacturers, Pentagon functionaries, think-tank denizens, spies, and their media mouthpieces at organs such as The Atlantic and The Washington Post with their endless drum-banging for ever greater expenditures of blood and treasure—needs these wars to justify its continued existence, but can’t admit as much. Hence, garden-variety strongmen like Vladimir Putin are cast as devils of limitless malevolence and power, able to dictate the outcomes of US presidential elections by zombifying millions of Americans—much as disaffected youths in the Middle East and tribal elders in the Hindu Kush with antiquated notions of gender relations were labeled “evil” two decades ago, with the implication that no price is too great to bring about their “end.” Likewise, a rising power in East Asia, acting much as the United States did 200 years ago with the proclamation of the Monroe Doctrine, is cast as some sort of historically unprecedented monster for signaling that it intends to put an end to outside interference in its region.

“The most formidable tool of the new Holy Office is its control of the internet.”

Nor did the collapse of the USSR prevent the Washington Blob from establishing an updated Tribunal of the Holy Office of the Inquisition, or several. It has new names, the “Global Engagement Center” in the State Department being one of the most noteworthy, but the aims remain the same: ferreting out heresy or, as they now call it, “disinformation.” The origins and aims of these new incarnations of the Holy Office have been laid out in devastating detail by Tablet’s Jacob Siegel:

The United States is still in the earliest stages of a mass mobilization that aims to harness every sector of society under a singular technocratic rule. The mobilization, which began as a response to the supposedly urgent menace of Russian interference, now evolves into a regime of total information control that has arrogated to itself the mission of eradicating abstract dangers such as error, injustice, and harm—a goal worthy only of leaders who believe themselves to be infallible.

The institutional and technical apparatus that this newly reincarnated Holy Office has constructed to impose orthodoxy and eliminate error would earn the envy and admiration of its Spanish forebears. It starts with the ubiquitous diversity, equity, and inclusion offices in practically every corporation, university, government agency, and civil-society organization. The stenography that has replaced journalism in the establishment media—CNN, MSNBC, The New York Times—has also proved helpful; to quote Siegel again, “the American press, once the guardian of democracy, was hollowed out to the point that it could be worn like a hand puppet by the US security agencies and party operatives.” But the most formidable tool of the new Holy Office is its control of the internet. Here we find echoes of the draconian response of the Catholic hierarchy to translations of the Bible into the vernacular and their wide dissemination made possible by the newly invented printing press. Imagine the scandal of those without clerical collars—or Ivy League diplomas, as the case may be—making up their own minds over how to interpret holy writ or, as it is now termed, “the Science”!


The Inquisition ultimately failed—or rather, while it succeeded at stamping out heresy inside Spain, the price of that success was the loss of Spain’s preeminence. Closed off from the economic and intellectual progress that ultimately would help its European rivals surpass it, Spain devolved from the premier great power into a stultifying backwater. Which raises the question: Will the instruments of social control at the disposal of America’s ruling class ultimately work? And if so, does a similar fate await the United States? The question was originally posed by Orwell: Can a disciplined ruling class emerge with sufficient means at its disposal that its overthrow becomes unimaginable? The Spanish example suggests that it can’t: that efforts to dictate what people see, read, hear, and think contain the seeds of elite downfall, because they cause the elites themselves to become blind to reality.

“Efforts to dictate what people see, read, hear, and think contain the seeds of elite downfall.”

The American ruling class found itself compelled to spin bizarre fantasies of Russian “disinformation” to explain away the results of the 2016 election. It can no longer tell you what a woman is, and has gone to great lengths to censor and suppress stories—the Hunter Biden laptop, the Wuhan lab leak—it later acknowledged to be true or plausible. The same elite ties itself into absurd knots to deny the fact that it seeks to establish a system of racial quotas in education, government, and business. To say the least, none of this suggests the intelligence and self-mastery that Orwell saw as preconditions for the successful practice of doublethink, obligatory for ensuring the perpetuity of elite rule. On the contrary, it suggests an elite that is, as occurred in 17th-century Spain, collapsing into decadence.

Crow quoted the poet Garcilaso de la Vega, who at the peak of Spain’s Golden Age prophesied a time when “everything is gone, even the name. Of house and home and wife and memory. And what’s the use of it? A little fame? The nation’s thanks? A place in history?” Seventy years later, when it had become clear that Spain’s hubris, its fantasies of birthing a global dominion of perfect orthodoxy, had indeed received history’s comeuppance, the composer Tomás Luis de Victoria published what is arguably the supreme masterpiece of Renaissance polyphony: the Officium Defunctorum, often known as the Victoria Requiem. Victoria composed the music for the funeral of Philip II’s sister, the Dowager Empress Maria, and after revising it for publication, dedicated it to her daughter Margaret.

Musicologists often depict the piece as a kind of summing up of the previous two centuries of musical development; polyphony was already being supplanted in Italy and the rest of Europe by the Baroque style, with its single melodic line supported by harmonic accompaniment, rather than the dense interweaving of voices that had characterized European sacred music from the early 15th century until Victoria (and his English contemporary William Byrd) brought it to a close. The Requiem can also be—and has been—heard as a requiem for the whole of Spain’s Golden Age. In its soaring lines and progressions of such poignancy that one can hardly bear to hear them, one senses an infinite regret at the passing of what once was, and what could have been.

The poetry of Garcilaso and the music of Victoria—not to mention the paintings of El Greco, Murillo, and Velázquez; novels such as Lazarillo de Tormes and Don Quixote; the plays of Lope de Vega and Calderón; and the cathedrals of Seville, Toledo, and Santiago de Compostela—point to another limit of any exercise in drawing parallels between Spain’s Golden Age and the high watermark of American global hegemony. The money pumps that the Spanish and American elites stumbled onto may indeed have seduced them both into endless wars, into quixotic—to choose a term deliberately—crusades to impose a universal orthodoxy, into tolerance for a widening abyss separating an obscenely wealthy class of plutocrats from everyone else. But the Spanish elite did have taste and, as they used to say, “class.” They left behind them a nation in ruins, but they also helped bequeath to the world some of its greatest achievements in literature, painting, music, and architecture.

Could anyone contemplating American culture over the last generation—the preachy movies, the sheer ugliness of the urban landscapes, the banal formulaic music, the drips and splashes of the contemporary art scene—possibly say the same about the United States with a straight face? To ask the question, alas, is to answer it.

R. Taggart Murphy is professor emeritus of international political economy at the University of Tsukuba and author of Japan and the Shackles of the Past and The Weight of the Yen.
