Here’s an unavoidable fact: we are now in a Brexit world. We are seeing the first signs of a major fragmentation of this planet that, until recently, the cognoscenti were convinced was globalizing rapidly and headed for unifications of all sorts. If you want a single figure that catches the grim spirit of our moment, it’s 65 million. That’s the record-setting number of people that the Office of the U.N. High Commissioner for Refugees estimates were displaced in 2015 by “conflict and persecution,” one of every 113 inhabitants of the planet. That’s more than were generated in the wake of World War II at a time when significant parts of the globe had been devastated. Of the 21 million refugees among them, 51% were children (often separated from their parents and lacking any access to education). Most of the displaced of 2015 were, in fact, internal refugees, still in their own often splintered states. Almost half of those who fled across borders have come from three countries: Syria (4.9 million), Afghanistan (2.7 million), and Somalia (1.1 million).
Despite the headlines about refugees heading for Europe — approximately a million of them made it there last year (with more dying on the way) — most of the uprooted who leave their homelands end up in poor or economically mid-level neighboring lands, with Turkey at 2.5 million refugees leading the way. In this fashion, the disruption of spreading conflicts and chaos, especially across the Greater Middle East and Africa, only brings more conflict and chaos with it wherever those refugees are forced to go.
And keep in mind that, as extreme as that 65 million figure may seem, it undoubtedly represents the beginning, not the end, of a process. For one thing, it doesn’t even include the estimated 19 million people displaced last year by extreme weather events and other natural disasters. Yet in coming decades, the heating of our planet, with attendant weather extremes (like the present heat wave in the American West) and rising sea levels, will undoubtedly produce its own waves of new refugees, only adding to both the conflicts and the fragmentation.
As Patrick Cockburn points out today, we have entered “an age of disintegration.” And he should know. There may be no Western reporter who has covered the grim dawn of that age in the Greater Middle East and North Africa — from Afghanistan to Iraq, Syria to Libya — more fully or movingly than he has over this last decade and a half. His latest book, Chaos & Caliphate: Jihadis and the West in the Struggle for the Middle East, gives a vivid taste of his reporting and of a world that is at present cracking under the pressure of the conflicts he has witnessed. And imagine that so much of this began, at the bargain-basement cost of a mere $400,000 to $500,000, with 19 (mainly Saudi) fanatics, and a few hijacked airliners. Osama bin Laden must be smiling in his watery grave. Tom
The Age of Disintegration Neoliberalism, Interventionism, the Resource Curse, and a Fragmenting World
By Patrick Cockburn
We live in an age of disintegration. Nowhere is this more evident than in the Greater Middle East and Africa. Across the vast swath of territory between Pakistan and Nigeria, there are at least seven ongoing wars — in Afghanistan, Iraq, Syria, Yemen, Libya, Somalia, and South Sudan. These conflicts are extraordinarily destructive. They are tearing apart the countries in which they are taking place in ways that make it doubtful they will ever recover. Cities like Aleppo in Syria, Ramadi in Iraq, Taiz in Yemen, and Benghazi in Libya have been partly or entirely reduced to ruins. There are also at least three other serious insurgencies: in southeast Turkey, where Kurdish guerrillas are fighting the Turkish army, in Egypt’s Sinai Peninsula where a little-reported but ferocious guerrilla conflict is underway, and in northeast Nigeria and neighboring countries where Boko Haram continues to launch murderous attacks.
All of these have a number of things in common: they are endless and seem never to produce definitive winners or losers. (Afghanistan has effectively been at war since 1979, Somalia since 1991.) They involve the destruction or dismemberment of unified nations, their de facto partition amid mass population movements and upheavals — well publicized in the case of Syria and Iraq, less so in places like South Sudan where more than 2.4 million people have been displaced in recent years.
Add in one more similarity, no less crucial for being obvious: in most of these countries, where Islam is the dominant religion, extreme Salafi-Jihadi movements, including the Islamic State (IS), al-Qaeda, and the Taliban are essentially the only available vehicles for protest and rebellion. By now, they have completely replaced the socialist and nationalist movements that predominated in the twentieth century; these years have, that is, seen a remarkable reversion to religious, ethnic, and tribal identity, to movements that seek to establish their own exclusive territory by the persecution and expulsion of minorities.
In the process and under the pressure of outside military intervention, a vast region of the planet seems to be cracking open. Yet there is very little understanding of these processes in Washington. This was recently well illustrated by the protest of 51 State Department diplomats against President Obama’s Syrian policy and their suggestion that air strikes be launched targeting Syrian regime forces in the belief that President Bashar al-Assad would then abide by a ceasefire. The diplomats’ approach remains typically simpleminded in this most complex of conflicts, assuming as it does that the Syrian government’s barrel-bombing of civilians and other grim acts are the “root cause of the instability that continues to grip Syria and the broader region.”
It is as if the minds of these diplomats were still in the Cold War era, as if they were still fighting the Soviet Union and its allies. Against all the evidence of the last five years, there is an assumption that a barely extant moderate Syrian opposition would benefit from the fall of Assad, and a lack of understanding that the armed opposition in Syria is entirely dominated by the Islamic State and al-Qaeda clones.
Though the invasion of Iraq in 2003 is now widely admitted to have been a mistake (even by those who supported it at the time), no real lessons have been learned about why direct or indirect military interventions by the U.S. and its allies in the Middle East over the last quarter century have all only exacerbated violence and accelerated state failure.
A Mass Extinction of Independent States
The Islamic State, just celebrating its second anniversary, is the grotesque outcome of this era of chaos and conflict. That such a monstrous cult exists at all is a symptom of the deep dislocation societies throughout that region, ruled by corrupt and discredited elites, have suffered. Its rise — and that of various Taliban and al-Qaeda-style clones — is a measure of the weakness of its opponents.
The Iraqi army and security forces, for example, had 350,000 soldiers and 660,000 police on the books in June 2014 when a few thousand Islamic State fighters captured Mosul, the country’s second largest city, which they still hold. Today the Iraqi army, security services, and about 20,000 Shia paramilitaries backed by the massive firepower of the United States and allied air forces have fought their way into the city of Fallujah, 40 miles west of Baghdad, against the resistance of IS fighters who may have numbered as few as 900. In Afghanistan, the resurgence of the Taliban, supposedly decisively defeated in 2001, came about less because of the popularity of that movement than the contempt with which Afghans came to regard their corrupt government in Kabul.
Everywhere nation states are enfeebled or collapsing, as authoritarian leaders battle for survival in the face of mounting external and internal pressures. This is hardly the way the region was expected to develop. Countries that had escaped from colonial rule in the second half of the twentieth century were supposed to become more, not less, unified as time passed.
Between 1950 and 1975, nationalist leaders came to power in much of the previously colonized world. They promised to achieve national self-determination by creating powerful independent states through the concentration of whatever political, military, and economic resources were at hand. Instead, over the decades, many of these regimes transmuted into police states controlled by small numbers of staggeringly wealthy families and a coterie of businessmen dependent on their connections to such leaders as Hosni Mubarak in Egypt or Bashar al-Assad in Syria.
In recent years, such countries were also opened up to the economic whirlwind of neoliberalism, which destroyed any crude social contract that existed between rulers and ruled. Take Syria. There, rural towns and villages that had once supported the Baathist regime of the al-Assad family because it provided jobs and kept the prices of necessities low were, after 2000, abandoned to market forces skewed in favor of those in power. These places would become the backbone of the post-2011 uprising. At the same time, institutions like the Organization of Petroleum Exporting Countries (OPEC) that had done so much to enhance the wealth and power of regional oil producers in the 1970s have lost their capacity for united action.
The question for our moment: Why is a “mass extinction” of independent states taking place in the Middle East, North Africa, and beyond? Western politicians and media often refer to such countries as “failed states.” The implication embedded in that term is that the process is a self-destructive one. But several of the states now labeled “failed,” like Libya, only became so after Western-backed opposition movements seized power with the support and military intervention of Washington and NATO, and proved too weak to impose their own central governments and so a monopoly of violence within the national territory.
In many ways, this process began with the intervention of a U.S.-led coalition in Iraq in 2003 leading to the overthrow of Saddam Hussein, the shutting down of his Baathist Party, and the disbanding of his military. Whatever their faults, Saddam and Libya’s autocratic ruler Muammar Gaddafi were clearly demonized and blamed for all ethnic, sectarian, and regional differences in the countries they ruled, forces that were, in fact, set loose in grim ways upon their deaths.
A question remains, however: Why did the opposition to autocracy and to Western intervention take on an Islamic form and why were the Islamic movements that came to dominate the armed resistance in Iraq and Syria in particular so violent, regressive, and sectarian? Put another way, how could such groups find so many people willing to die for their causes, while their opponents found so few? When IS battle groups were sweeping through northern Iraq in the summer of 2014, soldiers who had thrown aside their uniforms and weapons and deserted that country’s northern cities would justify their flight by saying derisively: “Die for [then-Prime Minister Nouri] al-Maliki? Never!”
A common explanation for the rise of Islamic resistance movements is that the socialist, secularist, and nationalist opposition had been crushed by the old regimes’ security forces, while the Islamists were not. In countries like Libya and Syria, however, Islamists were savagely persecuted, too, and they still came to dominate the opposition. And yet, while these religious movements were strong enough to oppose governments, they generally have not proven strong enough to replace them.
Too Weak to Win, But Too Strong to Lose
Though there are clearly many reasons for the present disintegration of states and they differ somewhat from place to place, one thing is beyond question: the phenomenon itself is becoming the norm across vast reaches of the planet.
If you’re looking for the causes of state failure in our time, the place to start is undoubtedly with the end of the Cold War a quarter-century ago. Once it was over, neither the U.S. nor the new Russia that emerged from the Soviet Union’s implosion had a significant interest in continuing to prop up “failed states,” as each had for so long, fearing that the rival superpower and its local proxies would otherwise take over. Previously, national leaders in places like the Greater Middle East had been able to maintain a degree of independence for their countries by balancing between Moscow and Washington. With the break-up of the Soviet Union, this was no longer feasible.
In addition, the triumph of neoliberal free-market economics in the wake of the Soviet Union’s collapse added a critical element to the mix. It would prove far more destabilizing than it looked at the time.
Again, consider Syria. The expansion of the free market in a country where there was neither democratic accountability nor the rule of law meant one thing above all: plutocrats linked to the nation’s ruling family took anything that seemed potentially profitable. In the process, they grew staggeringly wealthy, while the denizens of Syria’s impoverished villages, country towns, and city slums, who had once looked to the state for jobs and cheap food, suffered. It should have surprised no one that those places became the strongholds of the Syrian uprising after 2011. In the capital, Damascus, as the reign of neoliberalism spread, even the lesser members of the mukhabarat, or secret police, found themselves living on only $200 to $300 a month, while the state became a machine for thievery.
This sort of thievery and the auctioning off of the nation’s patrimony spread across the region in these years. The new Egyptian ruler, General Abdel Fattah al-Sisi, merciless toward any sign of domestic dissent, was typical. In a country that once had been a standard bearer for nationalist regimes the world over, he didn’t hesitate this April to try to hand over two islands in the Red Sea to Saudi Arabia on whose funding and aid his regime is dependent. (To the surprise of everyone, an Egyptian court recently overruled Sisi’s decision.)
That gesture, deeply unpopular among increasingly impoverished Egyptians, was symbolic of a larger change in the balance of power in the Middle East: once the most powerful states in the region — Egypt, Syria, and Iraq — had been secular nationalists and a genuine counterbalance to Saudi Arabia and the Persian Gulf monarchies. As those secular autocracies weakened, however, the power and influence of the Sunni fundamentalist monarchies only increased. If 2011 saw rebellion and revolution spread across the Greater Middle East as the Arab Spring briefly blossomed, it also saw counterrevolution spread, funded by those oil-rich absolute Gulf monarchies, which were never going to tolerate democratic secular regime change in Syria or Libya.
Add in one more process at work making such states ever more fragile: the production and sale of natural resources — oil, gas, and minerals — and the kleptomania that goes with it. Such countries often suffer from what has become known as “the resource curse”: states increasingly dependent for revenues on the sale of their natural resources — enough to theoretically provide the whole population with a reasonably decent standard of living — turn instead into grotesquely corrupt dictatorships. In them, the yachts of local billionaires with crucial connections to the regime of the moment bob in harbors surrounded by slums running with raw sewage. In such nations, politics tends to focus on elites battling and maneuvering to steal state revenues and transfer them as rapidly as possible out of the country.
This has been the pattern of economic and political life in much of sub-Saharan Africa from Angola to Nigeria. In the Middle East and North Africa, however, a somewhat different system exists, one usually misunderstood by the outside world. There is similarly great inequality in Iraq or Saudi Arabia with similarly kleptocratic elites. They have, however, ruled over patronage states in which a significant part of the population is offered jobs in the public sector in return for political passivity or support for the kleptocrats.
In Iraq, with a population of 33 million people, for instance, no fewer than seven million are on the government payroll, thanks to salaries or pensions that cost the government $4 billion a month. This crude way of distributing oil revenues to the people has often been denounced by Western commentators and economists as corruption. They, in turn, generally recommend cutting the number of these jobs, but this would mean that all, rather than just part, of the state’s resource revenues would be stolen by the elite. This, in fact, is increasingly the case in such lands as oil prices bottom out and even the Saudi royals begin to cut back on state support for the populace.
Neoliberalism was once believed to be the path to secular democracy and free-market economies. In practice, it has been anything but. Instead, in conjunction with the resource curse, as well as repeated military interventions by Washington and its allies, free-market economics has profoundly destabilized the Greater Middle East. Encouraged by Washington and Brussels, twenty-first-century neoliberalism has made unequal societies ever more unequal and helped transform already corrupt regimes into looting machines. This is also, of course, a formula for the success of the Islamic State or any other radical alternative to the status quo. Such movements are bound to find support in impoverished or neglected regions like eastern Syria or eastern Libya.
Note, however, that this process of destabilization is by no means confined to the Greater Middle East and North Africa. We are indeed in the age of destabilization, a phenomenon that is on the rise globally and at present spreading into the Balkans and Eastern Europe (with the European Union ever less able to influence events there). People no longer speak of European integration, but of how to prevent the complete break-up of the European Union in the wake of the British vote to leave.
The reasons why a narrow majority of Britons voted for Brexit have parallels with the Middle East: the free-market economic policies pursued by governments since Margaret Thatcher was prime minister have widened the gap between rich and poor and between wealthy cities and much of the rest of the country. Britain might be doing well, but millions of Britons did not share in the prosperity. The referendum about continued membership in the European Union, the option almost universally advocated by the British establishment, became the catalyst for protest against the status quo. The anger of the “Leave” voters has much in common with that of Donald Trump supporters in the United States.
The U.S. remains a superpower, but is no longer as powerful as it once was. It, too, is feeling the strains of this global moment, in which it and its local allies are powerful enough to imagine they can get rid of regimes they do not like, but either they do not quite succeed, as in Syria, or succeed but cannot replace what they have destroyed, as in Libya. An Iraqi politician once said that the problem in his country was that parties and movements were “too weak to win, but too strong to lose.” This is increasingly the pattern for the whole region and is spreading elsewhere. It carries with it the possibility of an endless cycle of indecisive wars and an era of instability that has already begun.
Sunday, April 17th was the designated moment. The world’s leading oil producers were expected to bring fresh discipline to the chaotic petroleum market and spark a return to high prices. Meeting in Doha, the glittering capital of petroleum-rich Qatar, the oil ministers of the Organization of the Petroleum Exporting Countries (OPEC), along with such key non-OPEC producers as Russia and Mexico, were scheduled to ratify a draft agreement obliging them to freeze their oil output at current levels. In anticipation of such a deal, oil prices had begun to creep inexorably upward, from $30 per barrel in mid-January to $43 on the eve of the gathering. But far from restoring the old oil order, the meeting ended in discord, driving prices down again and revealing deep cracks in the ranks of global energy producers.
It is hard to overstate the significance of the Doha debacle. At the very least, it will perpetuate the low oil prices that have plagued the industry for the past two years, forcing smaller firms into bankruptcy and erasing hundreds of billions of dollars of investments in new production capacity. It may also have obliterated any future prospects for cooperation between OPEC and non-OPEC producers in regulating the market. Most of all, however, it demonstrated that the petroleum-fueled world we’ve known these last decades — with oil demand always thrusting ahead of supply, ensuring steady profits for all major producers — is no more. Replacing it is an anemic, possibly even declining, demand for oil that is likely to force suppliers to fight one another for ever-diminishing market shares.
The Road to Doha
Before the Doha gathering, the leaders of the major producing countries expressed confidence that a production freeze would finally halt the devastating slump in oil prices that began in mid-2014. Most of them are heavily dependent on petroleum exports to finance their governments and keep restiveness among their populaces at bay. Both Russia and Venezuela, for instance, rely on energy exports for approximately 50% of government income, while for Nigeria it’s more like 75%. So the plunge in prices had already cut deep into government spending around the world, causing civil unrest and even in some cases political turmoil.
No one expected the April 17th meeting to result in an immediate, dramatic price upturn, but everyone hoped that it would lay the foundation for a steady rise in the coming months. The leaders of these countries were well aware of one thing: to achieve such progress, unity was crucial. Otherwise they were not likely to overcome the various factors that had caused the price collapse in the first place. Some of these were structural and embedded deep in the way the industry had been organized; some were the product of their own feckless responses to the crisis.
On the structural side, global demand for energy had, in recent years, ceased to rise quickly enough to soak up all the crude oil pouring onto the market, thanks in part to new supplies from Iraq and especially from the expanding shale fields of the United States. This oversupply triggered the initial 2014 price drop when Brent crude — the international benchmark blend — went from a high of $115 on June 19th to $77 on November 26th, the day before a fateful OPEC meeting in Vienna. The next day, OPEC members, led by Saudi Arabia, failed to agree on either production cuts or a freeze, and the price of oil went into freefall.
The failure of that November meeting has been widely attributed to the Saudis’ desire to kill off new output elsewhere — especially shale production in the United States — and to restore their historic dominance of the global oil market. Many analysts were also convinced that Riyadh was seeking to punish regional rivals Iran and Russia for their support of the Assad regime in Syria (which the Saudis seek to topple).
The rejection, in other words, was meant to fulfill two tasks at the same time: blunt or wipe out the challenge posed by North American shale producers and undermine two economically shaky energy powers that opposed Saudi goals in the Middle East by depriving them of much needed oil revenues. Because Saudi Arabia could produce oil so much more cheaply than other countries — for as little as $3 per barrel — and because it could draw upon hundreds of billions of dollars in sovereign wealth funds to meet any budget shortfalls of its own, its leaders believed it more capable of weathering any price downturn than its rivals. Today, however, that rosy prediction is looking grimmer as the Saudi royals begin to feel the pinch of low oil prices, and find themselves cutting back on the benefits they had been passing on to an ever-growing, potentially restive population while still financing a costly, inconclusive, and increasingly disastrous war in Yemen.
Many energy analysts became convinced that Doha would prove the decisive moment when Riyadh would finally be amenable to a production freeze. Just days before the conference, participants expressed growing confidence that such a plan would indeed be adopted. After all, preliminary negotiations between Russia, Venezuela, Qatar, and Saudi Arabia had produced a draft document that most participants assumed was essentially ready for signature. The only sticking point: the nature of Iran’s participation.
The Iranians were, in fact, agreeable to such a freeze, but only after they were allowed to raise their relatively modest daily output to levels achieved in 2012 before the West imposed sanctions in an effort to force Tehran to agree to dismantle its nuclear enrichment program. Now that those sanctions were, in fact, being lifted as a result of the recently concluded nuclear deal, Tehran was determined to restore the status quo ante. On this, the Saudis balked, having no wish to see their arch-rival obtain added oil revenues. Still, most observers assumed that, in the end, Riyadh would agree to a formula allowing Iran some increase before a freeze. “There are positive indications an agreement will be reached during this meeting… an initial agreement on freezing production,” said Nawal Al-Fuzaia, Kuwait’s OPEC representative, echoing the views of other Doha participants.
But then something happened. According to people familiar with the sequence of events, Saudi Arabia’s Deputy Crown Prince and key oil strategist, Mohammed bin Salman, called the Saudi delegation in Doha at 3:00 a.m. on April 17th and instructed them to spurn a deal that provided leeway of any sort for Iran. When the Iranians — who chose not to attend the meeting — signaled that they had no intention of freezing their output to satisfy their rivals, the Saudis rejected the draft agreement they had helped negotiate and the assembly ended in disarray.
Geopolitics to the Fore
Most analysts have since suggested that the Saudi royals simply considered punishing Iran more important than raising oil prices. No matter the cost to them, in other words, they could not bring themselves to help Iran pursue its geopolitical objectives, including giving yet more support to Shiite forces in Iraq, Syria, Yemen, and Lebanon. Already feeling pressured by Tehran and ever less confident of Washington’s support, they were ready to use any means available to weaken the Iranians, whatever the danger to themselves.
“The failure to reach an agreement in Doha is a reminder that Saudi Arabia is in no mood to do Iran any favors right now and that their ongoing geopolitical conflict cannot be discounted as an element of the current Saudi oil policy,” said Jason Bordoff of the Center on Global Energy Policy at Columbia University.
Many analysts also pointed to the rising influence of Deputy Crown Prince Mohammed bin Salman, entrusted with near-total control of the economy and the military by his aging father, King Salman. As Minister of Defense, the prince has spearheaded the Saudi drive to counter the Iranians in a regional struggle for dominance. Most significantly, he is the main force behind Saudi Arabia’s ongoing intervention in Yemen, aimed at defeating the Houthi rebels, a largely Shia group with loose ties to Iran, and restoring deposed former president Abd Rabbuh Mansur Hadi. After a year of relentless U.S.-backed airstrikes (including the use of cluster bombs), the Saudi intervention has, in fact, failed to achieve its intended objectives, though it has produced thousands of civilian casualties, provoking fierce condemnation from U.N. officials, and created space for the rise of al-Qaeda in the Arabian Peninsula. Nevertheless, the prince seems determined to keep the conflict going and to counter Iranian influence across the region.
For Prince Mohammed, the oil market has evidently become just another arena for this ongoing struggle. “Under his guidance,” the Financial Times noted in April, “Saudi Arabia’s oil policy appears to be less driven by the price of crude than global politics, particularly Riyadh’s bitter rivalry with post-sanctions Tehran.” This seems to have been the backstory for Riyadh’s last-minute decision to scuttle the talks in Doha. On April 16th, for instance, Prince Mohammed couldn’t have been blunter to Bloomberg, even if he didn’t mention the Iranians by name: “If all major producers don’t freeze production, we will not freeze production.”
With the proposed agreement in tatters, Saudi Arabia is now expected to boost its own output, ensuring that prices will remain bargain-basement low and so deprive Iran of any windfall from its expected increase in exports. The kingdom, Prince Mohammed told Bloomberg, was prepared to immediately raise production from its current 10.2 million barrels per day to 11.5 million barrels and could add another million barrels “if we wanted to” in the next six to nine months. With Iranian and Iraqi oil heading for market in larger quantities, that’s the definition of oversupply. It would certainly ensure Saudi Arabia’s continued dominance of the market, but it might also wound the kingdom in a major way, if not fatally.
A New Global Reality
No doubt geopolitics played a significant role in the Saudi decision, but that’s hardly the whole story. Overshadowing discussions about a possible production freeze was a new fact of life for the oil industry: the past would be no predictor of the future when it came to global oil demand. Whatever the Saudis think of the Iranians or vice versa, their industry is being fundamentally transformed, altering relationships among the major producers and eroding their inclination to cooperate.
Until very recently, it was assumed that the demand for oil would continue to expand indefinitely, creating space for multiple producers to enter the market, and for ones already in it to increase their output. Even when supply outran demand and drove prices down, as has periodically occurred, producers could always take solace in the knowledge that, as in the past, demand would eventually rebound, jacking prices up again. Under such circumstances and at such a moment, it was just good sense for individual producers to cooperate in lowering output, knowing that everyone would benefit sooner or later from the inevitable price increase.
But what happens if confidence in the eventual resurgence of demand begins to wither? Then the incentives to cooperate begin to evaporate, too, and it’s every producer for itself in a mad scramble to protect market share. This new reality — a world in which “peak oil demand,” rather than “peak oil,” will shape the consciousness of major players — is what the Doha catastrophe foreshadowed.
At the beginning of this century, many energy analysts were convinced that “peak oil” was imminent: a peak, that is, in the output of petroleum, with planetary reserves exhausted long before the demand for oil disappeared, triggering a global economic crisis. As a result of advances in drilling technology, however, the supply of oil has continued to grow, while demand has unexpectedly begun to stall. This can be traced both to slowing economic growth globally and to an accelerating “green revolution” in which the planet will be transitioning to non-carbon fuel sources. With most nations now committed to measures aimed at reducing emissions of greenhouse gases under the just-signed Paris climate accord, the demand for oil is likely to experience significant declines in the years ahead. In other words, global oil demand will peak long before supplies begin to run low, creating a monumental challenge for the oil-producing countries.
This is no theoretical construct. It’s reality itself. Net consumption of oil in the advanced industrialized nations has already dropped from 50 million barrels per day in 2005 to 45 million barrels in 2014. Further declines are in store as strict fuel efficiency standards for the production of new vehicles and other climate-related measures take effect, the price of solar and wind power continues to fall, and other alternative energy sources come on line. While the demand for oil does continue to rise in the developing world, even there it’s not climbing at rates previously taken for granted. With such countries also beginning to impose tougher constraints on carbon emissions, global consumption is expected to reach a peak and begin an inexorable decline. According to experts Thijs Van de Graaf and Aviel Verbruggen, overall world peak demand could be reached as early as 2020.
In such a world, high-cost oil producers will be driven out of the market and the advantage — such as it is — will lie with the lowest-cost ones. Countries that depend on petroleum exports for a large share of their revenues will come under increasing pressure to move away from excessive reliance on oil. This may have been another consideration in the Saudi decision at Doha. In the months leading up to the April meeting, senior Saudi officials dropped hints that they were beginning to plan for a post-petroleum era and that Deputy Crown Prince bin Salman would play a key role in overseeing the transition.
On April 1st, the prince himself indicated that steps were underway to begin this process. As part of the effort, he announced, he was planning an initial public offering of shares in state-owned Saudi Aramco, the world’s number one oil producer, and would transfer the proceeds, an estimated $2 trillion, to its Public Investment Fund (PIF). “IPOing Aramco and transferring its shares to PIF will technically make investments the source of Saudi government revenue, not oil,” the prince pointed out. “What is left now is to diversify investments. So within 20 years, we will be an economy or state that doesn’t depend mainly on oil.”
For a country that more than any other has rested its claim to wealth and power on the production and sale of petroleum, this is a revolutionary statement. If Saudi Arabia says it is ready to begin a move away from reliance on petroleum, we are indeed entering a new world in which, among other things, the titans of oil production will no longer hold sway over our lives as they have in the past.
This, in fact, appears to be the outlook adopted by Prince Mohammed in the wake of the Doha debacle. In announcing the kingdom’s new economic blueprint on April 25th, he vowed to liberate the country from its “addiction” to oil. This will not, of course, be easy to achieve, given the kingdom’s heavy reliance on oil revenues and lack of plausible alternatives. The 30-year-old prince could also face opposition from within the royal family to his audacious moves (as well as his blundering ones in Yemen and possibly elsewhere). Whatever the fate of the Saudi royals, however, if predictions of a future peak in world oil demand prove accurate, the debacle in Doha will be seen as marking the beginning of the end of the old oil order.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
A crowning achievement of the historic March on Washington, where Dr. Martin Luther King gave his “I have a dream” speech, was pushing through the landmark Voting Rights Act of 1965. Recognizing the history of racist attempts to prevent Black people from voting, that federal law forced a number of southern states and districts to adhere to federal guidelines allowing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like driver’s licenses—to vote even though there’s no evidence of the voter fraud this is supposed to prevent. But there’s plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, has practically closed driver’s license offices in counties with large percentages of black voters. Wisconsin requires a government-issued photo ID but hasn’t provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control in Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver’s licenses, allowing at least 2 weeks of early voting, and taking districting away from the politicians and putting it under independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
When you press Democrats on their uninspiring deeds — their lousy free trade deals, for example, or their flaccid response to Wall Street misbehavior — when you press them on any of these things, they automatically reply that this is the best anyone could have done. After all, they had to deal with those awful Republicans, and those awful Republicans wouldn’t let the really good stuff get through. They filibustered in the Senate. They gerrymandered the congressional districts. And besides, change takes a long time. Surely you don’t think the tepid-to-lukewarm things Bill Clinton and Barack Obama have done in Washington really represent the fiery Democratic soul.
So let’s go to a place that does. Let’s choose a locale where Democratic rule is virtually unopposed, a place where Republican obstruction and sabotage can’t taint the experiment.
Let’s go to Boston, Massachusetts, the spiritual homeland of the professional class and a place where the ideology of modern liberalism has been permitted to grow and flourish without challenge or restraint. As the seat of American higher learning, Boston unsurprisingly anchors one of the most Democratic of states, a place where elected Republicans (like the new governor) are highly unusual. This is the city that virtually invented the blue-state economic model, in which prosperity arises from higher education and the knowledge-based industries that surround it.
The coming of post-industrial society has treated this most ancient of American cities extremely well. Massachusetts routinely occupies the number one spot on the State New Economy Index, a measure of how “knowledge-based, globalized, entrepreneurial, IT-driven, and innovation-based” a place happens to be. Boston ranks high on many of Richard Florida’s statistical indices of approbation — in 2003, it was number one on the “creative class index,” number three in innovation and in high tech — and his many books marvel at the city’s concentration of venture capital, its allure to young people, or the time it enticed some firm away from some unenlightened locale in the hinterlands.
Boston’s knowledge economy is the best, and it is the oldest. Boston’s metro area encompasses some 85 private colleges and universities, the greatest concentration of higher-ed institutions in the country — probably in the world. The region has all the ancillary advantages to show for this: a highly educated population, an unusually large number of patents, and more Nobel laureates than any other city in the country.
The city’s Route 128 corridor was the original model for a suburban tech district, lined ever since it was built with defense contractors and computer manufacturers. The suburbs situated along this golden thoroughfare are among the wealthiest municipalities in the nation, populated by engineers, lawyers, and aerospace workers. Their public schools are excellent, their downtowns are cute, and back in the seventies their socially enlightened residents were the prototype for the figure of the “suburban liberal.”
Another prototype: the Massachusetts Institute of Technology, situated in Cambridge, is where our modern conception of the university as an incubator for business enterprises began. According to a report on MIT’s achievements in this category, the school’s alumni have started nearly 26,000 companies over the years, including Intel, Hewlett Packard, and Qualcomm. If you were to take those 26,000 companies as a separate nation, the report tells us, its economy would be one of the most productive in the world.
Then there are Boston’s many biotech and pharmaceutical concerns, grouped together in what is known as the “life sciences super cluster,” which, properly understood, is part of an “ecosystem” in which PhDs can “partner” with venture capitalists and in which big pharmaceutical firms can acquire small ones. While other industries shrivel, the Boston super cluster grows, with the life-sciences professionals of the world lighting out for the Athens of America and the massive new “innovation centers” shoehorning themselves one after the other into the crowded academic suburb of Cambridge.
To think about it slightly more critically, Boston is the headquarters for two industries that are steadily bankrupting middle America: big learning and big medicine, both of them imposing costs that everyone else is basically required to pay and which increase at a far more rapid pace than wages or inflation. A thousand dollars a pill, 30 grand a semester: the debts that are gradually choking the life out of people where you live are what has made this city so very rich.
Perhaps it makes sense, then, that another category in which Massachusetts ranks highly is inequality. Once the visitor leaves the brainy bustle of Boston, he discovers that this state is filled with wreckage — with former manufacturing towns in which workers watch their way of life draining away, and with cities that are little more than warehouses for people on Medicare. According to one survey, Massachusetts has the eighth-worst rate of income inequality among the states; by another metric it ranks fourth. However you choose to measure the diverging fortunes of the country’s top 10% and the rest, Massachusetts always seems to finish among the nation’s most unequal places.
Seething City on a Cliff
You can see what I mean when you visit Fall River, an old mill town 50 miles south of Boston. Median household income in that city is $33,000, among the lowest in the state; unemployment is among the highest, 15% in March 2014, nearly five years after the recession ended. Twenty-three percent of Fall River’s inhabitants live in poverty. The city lost its many fabric-making concerns decades ago and with them it lost its reason for being. People have been deserting the place for decades.
Many of the empty factories in which their ancestors worked are still standing, however. Solid nineteenth-century structures of granite or brick, these huge boxes dominate the city visually — there always seems to be one or two of them in the vista, contrasting painfully with whatever colorful plastic fast-food joint has been slapped up next door.
Most of the old factories are boarded up, unmistakable emblems of hopelessness right up to the roof. But the ones that have been successfully repurposed are in some ways even worse, filled as they often are with enterprises offering cheap suits or help with drug addiction. A clinic in the hulk of one abandoned mill has a sign on the window reading simply “Cancer & Blood.”
The effect of all this is to remind you with every prospect that this is a place and a way of life from which the politicians have withdrawn their blessing. Like so many other American scenes, this one is the product of decades of deindustrialization, engineered by Republicans and rationalized by Democrats. This is a place where affluence never returns — not because affluence for Fall River is impossible or unimaginable, but because our country’s leaders have blandly accepted a social order that constantly bids down the wages of people like these while bidding up the rewards for innovators, creatives, and professionals.
Even the city’s one real hope for new employment opportunities — an Amazon warehouse that is now in the planning stages — will serve to lock in this relationship. If all goes according to plan, and if Amazon sticks to the practices it has pioneered elsewhere, people from Fall River will one day get to do exhausting work with few benefits while being electronically monitored for efficiency, in order to save the affluent customers of nearby Boston a few pennies when they buy books or electronics.
But that is all in the future. These days, the local newspaper publishes an endless stream of stories about drug arrests, shootings, drunk-driving crashes, the stupidity of local politicians, and the lamentable surplus of “affordable housing.” The town is up to its eyeballs in wrathful bitterness against public workers. As in: Why do they deserve a decent life when the rest of us have no chance at all? It’s every man for himself here in a “competition for crumbs,” as a Fall River friend puts it.
The Great Entrepreneurial Awakening
If Fall River is pocked with empty mills, the streets of Boston are dotted with facilities intended to make innovation and entrepreneurship easy and convenient. I was surprised to discover, during the time I spent exploring the city’s political landscape, that Boston boasts a full-blown Innovation District, a disused industrial neighborhood that has actually been zoned creative — a projection of the post-industrial blue-state ideal onto the urban grid itself. The heart of the neighborhood is a building called “District Hall” — “Boston’s New Home for Innovation” — which appeared to me to be a glorified multipurpose room, enclosed in a sharply angular façade, and sharing a roof with a restaurant that offers “inventive cuisine for innovative people.” The Wi-Fi was free, the screens on the walls displayed famous quotations about creativity, and the walls themselves were covered with a high-gloss finish meant to be written on with dry-erase markers; but otherwise it was not much different from an ordinary public library. Aside from not having anything to read, that is.
This was my introduction to the innovation infrastructure of the city, much of it built up by entrepreneurs shrewdly angling to grab a piece of the entrepreneur craze. There are “co-working” spaces, shared offices for startups that can’t afford the real thing. There are startup “incubators” and startup “accelerators,” which aim to ease the innovator’s eternal struggle with an uncaring public: the Startup Institute, for example, and the famous MassChallenge, the “World’s Largest Startup Accelerator,” which runs an annual competition for new companies and hands out prizes at the end.
And then there are the innovation Democrats, led by former Governor Deval Patrick, who presided over the Massachusetts government from 2007 to 2015. He is typical of liberal-class leaders; you might even say he is their most successful exemplar. Everyone seems to like him, even his opponents. He is a witty and affable public speaker as well as a man of competence, a highly educated technocrat who is comfortable in corporate surroundings. Thanks to his upbringing in a Chicago housing project, he also understands the plight of the poor, and (perhaps best of all) he is an honest politician in a state accustomed to wide-open corruption. Patrick was also the first black governor of Massachusetts and, in some ways, an ideal Democrat for the era of Barack Obama — who, as it happens, is one of his closest political allies.
As governor, Patrick became a kind of missionary for the innovation cult. “The Massachusetts economy is an innovation economy,” he liked to declare, and he made similar comments countless times, slightly varying the order of the optimistic keywords: “Innovation is a centerpiece of the Massachusetts economy,” et cetera. The governor opened “innovation schools,” a species of ramped-up charter school. He signed the “Social Innovation Compact,” which had something to do with meeting “the private sector’s need for skilled entry-level professional talent.” In a 2009 speech called “The Innovation Economy,” Patrick elaborated the political theory of innovation in greater detail, telling an audience of corporate types in Silicon Valley about Massachusetts’s “high concentration of brainpower” and “world-class” universities, and how “we in government are actively partnering with the private sector and the universities, to strengthen our innovation industries.”
What did all of this inno-talk mean? Much of the time, it was pure applesauce — standard-issue platitudes to be rolled out every time some pharmaceutical company opened an office building somewhere in the state.
On some occasions, Patrick’s favorite buzzword came with a gigantic price tag, like the billion dollars in subsidies and tax breaks that the governor authorized in 2008 to encourage pharmaceutical and biotech companies to do business in Massachusetts. On still other occasions, favoring inno has meant bulldozing the people in its path — for instance, the taxi drivers whose livelihoods are being usurped by ridesharing apps like Uber. When these workers staged a variety of protests in the Boston area, Patrick intervened decisively on the side of the distant software company. Apparently convenience for the people who ride in taxis was more important than good pay for people who drive those taxis. It probably didn’t hurt that Uber had hired a former Patrick aide as a lobbyist, but the real point was, of course, innovation: Uber was the future, the taxi drivers were the past, and the path for Massachusetts was obvious.
A short while later, Patrick became something of an innovator himself. After his time as governor came to an end last year, he won a job as a managing director of Bain Capital, the private equity firm that was founded by his predecessor Mitt Romney — and that had been so powerfully denounced by Democrats during the 2012 election. Patrick spoke about the job like it was just another startup: “It was a happy and timely coincidence I was interested in building a business that Bain was also interested in building,” he told the Wall Street Journal. Romney reportedly phoned him with congratulations.
At a 2014 celebration of Governor Patrick’s innovation leadership, Google’s Eric Schmidt announced that “if you want to solve the economic problems of the U.S., create more entrepreneurs.” That sort of sums up the ideology in this corporate commonwealth: Entrepreneurs first. But how has such a doctrine become holy writ in a party dedicated to the welfare of the common man? And how has all this come to pass in the liberal state of Massachusetts?
The answer is that I’ve got the wrong liberalism. The kind of liberalism that has dominated Massachusetts for the last few decades isn’t the stuff of Franklin Roosevelt or the United Auto Workers; it’s the Route 128/suburban-professionals variety. (Senator Elizabeth Warren is the great exception to this rule.) Professional-class liberals aren’t really alarmed by oversized rewards for society’s winners. On the contrary, this seems natural to them — because they are society’s winners. The liberalism of professionals just does not extend to matters of inequality; this is the area where soft hearts abruptly turn hard.
Innovation liberalism is “a liberalism of the rich,” to use the straightforward phrase of local labor leader Harris Gruman. This doctrine has no patience with the idea that everyone should share in society’s wealth. What Massachusetts liberals pine for, by and large, is a more perfect meritocracy — a system where the essential thing is to ensure that the truly talented get into the right schools and then get to rise through the ranks of society. Unfortunately, however, as the blue-state model makes painfully clear, there is no solidarity in a meritocracy. The ideology of educational achievement conveniently negates any esteem we might feel for the poorly graduated.
This is a curious phenomenon, is it not? A blue state where the Democrats maintain transparent connections to high finance and big pharma; where they have deliberately chosen distant software barons over working-class members of their own society; and where their chief economic proposals have to do with promoting “innovation,” a grand and promising idea that remains suspiciously vague. Nor can these innovation Democrats claim that their hands were forced by Republicans. They came up with this program all on their own.
The other week, feeling sick, I spent a day on my couch with the TV on and was reminded of an odd fact of American life. More than seven months before Election Day, you can watch the 2016 campaign for the presidency at any moment of your choosing, and that’s been true since at least late last year. There is essentially never a time when some network or news channel isn’t reporting on, discussing, debating, analyzing, speculating about, or simply drooling over some aspect of the primary campaign, of Hillary, Bernie, Ted, and above all — a million times above all — The Donald (from the violence at his rallies to the size of his hands). In case you’re young and think this is more or less the American norm, it isn’t. Or wasn’t.
Truly, there is something new under the sun. Of course, in 1994 with O.J. Simpson’s white Ford Bronco chase (95 million viewers!), the 24/7 media event arrived full blown in American life and something changed when it came to the way we focused on our world and the media focused on us. But you can be sure of one thing: never in the history of television, or any other form of media, has a single figure garnered the amount of attention — hour after hour, day after day, week after week — as Donald Trump. If he’s the O.J. Simpson of twenty-first-century American politics and his run for the presidency is the eternal white Ford Bronco chase of our moment, then we’re in a truly strange world.
Or let me put it another way: this is not an election. I know the word “election” is being used every five seconds and somewhere along the line significant numbers of Americans (particularly, this season, Republicans) continue to enter voting booths or, in the case of primary caucuses, school gyms and the like, to choose among various candidates, so it’s all still election-like. But take my word for it as a 71-year-old guy who’s been watching our politics for decades: this is not an election of the kind the textbooks once taught us was so crucial to American democracy. If, however, you’re sitting there waiting for me to tell you what it is, take a breath and don’t be too disappointed. I have no idea, though it’s certainly part bread-and-circuses spectacle, part celebrity obsession, and part media money machine.
Actually, before we go further, let me hedge my bets on the idea that Donald Trump is a twenty-first-century O.J. Simpson. It’s certainly a reasonable enough comparison, but I’ve begun to wonder about the usefulness of just about any comparison in our present situation. Even the most nightmarish of them — Donald Trump is Adolf Hitler, Benito Mussolini, or any past extreme demagogue of your choice — may actually prove to be covert gestures of consolation, reassurance, and comfort. Yes, what’s happening in our world is increasingly extreme and could hardly be weirder, we seem to have the urge to say, but it’s still recognizable. It’s something we’ve encountered before, something we’ve made sense of in the past and, in the process, overcome.
Round Up the Usual Suspects

But what if that’s not true? In some ways, the most frightening, least acceptable thing to say about our American world right now — even if Donald Trump’s overwhelming presence all but begs us to say it — is that we’ve entered uncharted territory and, under the circumstances, comparisons might actually impair our ability to come to grips with our new reality. My own suspicion: Donald Trump is only the most obvious instance of this, the example no one can miss.

In these first years of the twenty-first century, we may be witnessing a new world being born inside the hollowed-out shell of the American system. As yet, though we live with this reality every day, we evidently just can’t bear to recognize it for what it might be. When we survey the landscape, what we tend to focus on is that shell — the usual elections (in somewhat heightened form), the usual governmental bodies (a little tarnished) with the usual governmental powers (a little diminished or redistributed), including the usual checks and balances (a little out of whack), and the same old Constitution (much praised in its absence), and yes, we know that none of this is working particularly well, or sometimes at all, but it still feels comfortable to view what we have as a reduced, shabbier, and more dysfunctional version of the known.
Perhaps, however, it’s increasingly a version of the unknown. We say, for instance, that Congress is “paralyzed,” and that little can be done in a country where politics has become so “polarized,” and we wait for something to shake us loose from that “paralysis,” to return us to a Washington closer to what we remember and recognize. But maybe this is it. Maybe even if the Republicans somehow lost control of the House of Representatives and the Senate, we would still be in a situation something like what we’re now labeling paralysis. Maybe in our new American reality, Congress is actually some kind of glorified, well-lobbied, and well-financed version of a peanut gallery.
Of course, I don’t want to deny that much of what is “new” in our world has a long history. The present yawning inequality gap between the 1% and ordinary Americans first began to widen in the 1970s and — as Thomas Frank explains so brilliantly in his new book, Listen, Liberal — was already a powerful and much-discussed reality in the early 1990s, when Bill Clinton ran for president. Yes, that gap is now more like an abyss and looks ever more permanently embedded in the American system, but it has a genuine history, as for instance do 1% elections and the rise and self-organization of the “billionaire class,” even if no one, until this second, imagined that government of the billionaires, by the billionaires, and for the billionaires might devolve into government of the billionaire, by the billionaire, and for the billionaire — that is, just one of them.
Indeed, much of our shape-shifting world can be written about as a set of comparisons and in terms of historical reference points. Inequality has a history. The military-industrial complex and the all-volunteer military, like the warrior corporation, weren’t born yesterday; neither was our state of perpetual war, nor the national security state that now looms over Washington, nor its surveilling urge, the desire to know far too much about the private lives of Americans. (A little bow of remembrance to FBI Director J. Edgar Hoover is in order here.)
And yet, true as all that may be, Washington increasingly seems like a new land, sporting something like a new system in the midst of our much-described polarized and paralyzed politics. The national security state doesn’t seem faintly paralyzed or polarized to me. Nor does the Pentagon. On certain days when I catch the news, I can’t believe how strange and yet humdrum this uncharted new territory is. Remind me, for instance, where in the Constitution the Founding Fathers wrote about that national security state? And yet there it is in all its glory, all its powers, an ever more independent force in our nation’s capital. In what way, for instance, did those men of the revolutionary era prepare the ground for the Pentagon to loose its spy drones from our distant war zones over the United States? And yet, so it has. And no one even seems disturbed by the development. The news, barely noticed or noted, was instantly absorbed into what’s becoming the new normal.
Graduation Ceremonies in the Imperium
Let me mention here the almost random piece of news that recently made me wonder just what planet I was actually on. And I know you won’t believe it, but it had absolutely nothing to do with Donald Trump.
Given the carnage of America’s wars and conflicts across the Greater Middle East and Africa, which I’ve been following closely these last years, I’m unsure why this particular moment even got to me. Best guess? Maybe that, of all the once-obscure places — from Afghanistan to Yemen to Libya — in which the U.S. has been fighting recently, Somalia, where this particular little slaughter took place, seems to me like the most obscure of all. Yes, I’ve been half-attending to events there from the 1993 Blackhawk Down moment to the disastrous U.S.-backed Ethiopian invasion of 2006 to the hardly less disastrous invasion of that country by Kenyan and other African forces. Still, Somalia?
Recently, U.S. Reaper drones and manned aircraft launched a set of strikes against what the Pentagon claimed was a graduation ceremony for “low-level” foot soldiers in the Somali terror group al-Shabab. It was proudly announced that more than 150 Somalis had died in this attack. In a country where, in recent years, U.S. drones and special ops forces had carried out a modest number of strikes against individual al-Shabab leaders, this might be thought of as a distinct escalation of Washington’s endless low-level conflict there (with a raid involving U.S. special ops forces following soon after).
Now, let me try to put this in some personal context. Since I was a kid, I’ve always liked globes and maps. I have a reasonable sense of where most countries on this planet are. Still, Somalia? I have to stop and give that one some thought to truly locate it on a mental map of eastern Africa. Most Americans? Honestly, I doubt they’d have a clue. So the other day, when this news came out, I stopped a moment to take it in. If accurate, we killed 150 more or less nobodies (except to those who knew them) and maybe even a top leader or two in a country most Americans couldn’t locate on a map.
I mean, don’t you find that just a little odd, no matter how horrible the organization they were preparing to fight for? 150 Somalis? Blam!
Remind me: On just what basis was this modest massacre carried out? After all, the U.S. isn’t at war with Somalia or with al-Shabab. Of course, Congress no longer plays any real role in decisions about American war making. It no longer declares war on any group or country we fight. (Paralysis!) War is now purely a matter of executive power or, in reality, the collective power of the national security state and the White House. The essential explanation offered for the Somali strike, for instance, is that the U.S. had a small set of advisers stationed with African Union forces in that country and it was just faintly possible that those guerrilla graduates might soon prepare to attack some of those forces (and hence U.S. military personnel). It seems that if the U.S. puts advisers in place anywhere on the planet — and any day of any year they are now in scores of countries — that’s excuse enough to validate acts of war based on the “imminent” threat of their attack.
Or just think of it this way: a new, informal constitution is being written in these years in Washington. No need for a convention or a new bill of rights. It’s a constitution focused on the use of power, especially military power, and it’s being written in blood.
These days, our government (the unparalyzed one) acts regularly on the basis of that informal constitution-in-the-making, committing Somalia-like acts across significant swathes of the planet. In these years, we’ve been marrying the latest in wonder technology, our Hellfire-missile-armed drones, to executive power and slaughtering people we don’t much like in majority Muslim countries with a certain alacrity. By now, it’s simply accepted that any commander-in-chief is also our assassin-in-chief, and that all of this is part of a wartime-that-isn’t-wartime system, spreading the principle of chaos and dissolution to whole areas of the planet, leaving failed states and terror movements in its wake.
When was it, by the way, that “the people” agreed that the president could appoint himself assassin-in-chief, muster his legal beagles to write new “law” that covered any future acts of his (including the killing of American citizens), and year after year dispatch what essentially is his own private fleet of killer drones to knock off thousands of people across the Greater Middle East and parts of Africa? Weirdly enough, after almost 14 years of this sort of behavior, with ample evidence that such strikes don’t suppress the movements Washington loathes (and often only fan the flames of resentment and revenge that help them spread), neither the current president and his top officials, nor any of the candidates for his office have the slightest intention of ever grounding those drones.
And when exactly did the people say that, within the country’s vast standing military, which now garrisons much of the planet, a force of nearly 70,000 Special Operations personnel should be birthed, or that it should conduct covert missions globally, essentially accountable only to the president (if him)? And what I find strangest of all is that few in our world find such developments strange at all.
A Planet in Decline?
In some way, all of this could be said to work. At the very least, it is a functioning new system-in-the-making that we have yet to truly come to grips with, just as we haven’t come to grips with a national security state that surveils the world in a way that even science fiction writers (to say nothing of totalitarian rulers) of a previous era could never have imagined, or the strange version of media overkill that we still call an election. All of this is by now both old news and mind-bogglingly new.
Do I understand it? Not for a second.
This is not war as we knew it, nor government as we once understood it, nor are these elections as we once imagined them, nor is this democracy as it used to be conceived of, nor is this journalism of a kind ever taught in a journalism school. This is the definition of uncharted territory. It’s a genuine American terra incognita and yet in some fashion that unknown landscape is already part of our sense of ourselves and our world. In this “election” season, many remain shocked that a leading candidate for the presidency is a demagogue with a visible authoritarian side and what looks like an autocratic bent. All such labels are pinned on Donald Trump, but the new American system that’s been emerging from its chrysalis in these years already has just those tendencies. So don’t blame it all on Donald Trump. He should be far less of a shock to this country than he continues to be. After all, a Trumpian world-in-formation has paved the way for him.
Who knows? Perhaps what we’re watching is the new iteration of a very old story: a twenty-first-century version of an ancient tale of a great imperial power, perhaps the greatest ever — the “lone superpower” — sinking into decline. It’s a tale humanity has experienced often enough in the course of our long history. But lest you think once again that there’s nothing new under the sun, the context for all of this, for everything now happening in our world, is so new as to be quite literally outside of thousands of years of human experience. As the latest heat records indicate, we are, for the first time, on a planet in decline. And if that isn’t uncharted territory, what is?
National wildlife refuges such as the one at Malheur near Burns, Oregon, have importance far beyond the current furor over who manages our public lands. Such refuges are becoming increasingly critical habitat for migratory birds because 95 percent of the wetlands along the Pacific Flyway have already been lost to development.
In some years, 25 million birds visit Malheur, and if the refuge were drained and converted to intensive cattle grazing – which is something the “occupiers” threatened to do – entire populations of ducks, sandhill cranes, and shorebirds would suffer. With their long-distance flights and distinctive songs, the migratory birds visiting Malheur’s wetlands now help to tie the continent together.
This was not always the case. By the 1930s, three decades of drainage, reclamation, and drought had decimated high-desert wetlands and the birds that depended upon them. Out of the hundreds of thousands of egrets that once nested on Malheur Lake, only 121 remained. The American population of the birds had dropped by 95 percent. It took the federal government to restore Malheur’s wetlands and recover waterbird populations, bringing back healthy populations of egrets and many other species.
Yet despite the importance of wildlife refuges to America’s birds, not everyone appreciates them. At one recent news conference, Ammon Bundy called the creation of Malheur National Wildlife Refuge “an unconstitutional act” that removed ranchers from their lands and plunged the county into an economic depression. This is not a new complaint. Since the Sagebrush Rebellion of the 1980s, rural communities in the West have blamed their poverty on the 640 million acres of federal public lands, which make up 52 percent of the land in Western states.
Rural Western communities are indeed suffering, but the cause is not the wildlife refuge system. Conservation of bird habitat did not lead to economic devastation, nor were refuge lands “stolen” from ranchers. If any group has prior claims to Malheur refuge, it is the Paiute Indian Tribe.
For at least 6,000 years, Malheur was the Paiutes’ home. It took a brutal Army campaign to force the people from their reservation, marching them through the snow to the state of Washington in 1879. Homesteaders and cattle barons then moved onto Paiute lands, squeezing as much livestock as possible onto dwindling pastures, and warring with each other over whose land was whose. Scars from this era persist more than a century later.
In 1908, President Theodore Roosevelt established the Malheur Lake Bird Reservation on the lands of the former Malheur Indian Reservation. But the refuge included only the lake itself, not the rivers that fed into it. Deprived of water, the lake shrank during droughts, and squatters moved onto the drying lakebed. Conservationists, realizing they needed to protect the Blitzen River that fed the lake, began a campaign to expand the refuge.
But the federal government never forced the ranchers to sell, as the occupiers at Malheur claimed, and the sale did not impoverish the community. In fact, it was just the opposite: During the Depression years of the 1930s, the federal government paid the Swift Corp. $675,000 for ruined grazing lands. Impoverished homesteaders who had squatted on refuge lands eventually received payments substantial enough to set them up as cattle ranchers nearby.
John Scharff, Malheur’s manager from 1935 to 1971, sought to transform local suspicion into acceptance by allowing local ranchers to graze cattle on the refuge. Yet some tension persisted. In the 1970s, when concern about overgrazing reduced – but did not eliminate – refuge grazing, violence erupted again. Some environmentalists denounced ranchers as parasites who destroyed wildlife habitat. A few ranchers responded with death threats against environmentalists and federal employees.
But violence is not the basin’s most important historical legacy. Through the decades, community members have come together to negotiate a better future. In the 1920s, poor homesteaders worked with conservationists to save the refuge from irrigation drainage. In the 1990s, Paiute tribal members, ranchers, environmentalists and federal agencies collaborated on innovative grazing plans to restore bird habitat while also giving ranchers more flexibility. In 2013, such efforts resulted in a landmark collaborative conservation plan for the refuge, and it offers great hope for the local economy and for wildlife.
The poet Gary Snyder wrote, “We must learn to know, love, and join our place even more than we love our own ideas. People who can agree that they share a commitment to the landscape – even if they are otherwise locked in struggle with each other – have at least one deep thing to share.”
Collaborative processes are difficult and time-consuming. Yet they have proven that they have the potential to peacefully sustain both human and wildlife communities.
Nancy Langston is a contributor to Writers on the Range, the opinion service of High Country News. She is a professor of environmental history at Michigan Technological University, and the author of a history of Malheur Refuge, Where Land and Water Meet: A Western Landscape Transformed.
Reprinted from the New Economic Perspectives blog at the University of Missouri-Kansas City
Editor’s Note: William K. Black, author of “The Best Way to Rob a Bank is to Own One,” is Associate Professor of Law and Economics at the University of Missouri-Kansas City, where — according to James Galbraith — “the best economics is now being done.”
In the latest example of the New York Times’ reporters’ inability to read Paul Krugman, we have an article claiming that the “Growing Imbalance Between Germany and France Strains Their Relationship.” The article begins with Merkel’s major myth accepted as if it were unquestionable reality.
“It was a clear illustration of the dysfunction of the French-German partnership, the axis that for decades kept Europe on a united and dynamic track.
In Berlin this month, Chancellor Angela Merkel, riding high after nine years in power, delivered a strident defense in Parliament of austerity, which she has been pushing on Europe ever since a debt crisis broke out in 2009.”
No, not true on multiple grounds. First, the so-called “debt crisis” was a symptom rather than a cause. The reader will note that the year 2008, when the Great Recession became terrifying, has somehow been removed from the narrative because it would expose the misapprehension in Merkel’s myth. Prior to 2008, only Greece, given its abandonment of a sovereign currency, had debt levels that posed a material risk. The EU nations had unusually low budgetary deficits leading into the Great Recession. Indeed, that along with the extremely low budgetary deficits of the Clinton administration (the budget went into surplus near the end of his term) is likely one of the triggers for the Great Recession.
The Great Recession caused sharp increases in deficits – as we have long known will happen as part of the “automatic stabilizers.” This is normal and speeds recovery. The eurozone and the U.S. began to come out of the Great Recession in 2009. The U.S. recovery accelerated with the addition of stimulus. In the eurozone, however, the abandonment of sovereign currencies and adoption of the euro exposed the periphery to recurrent attacks by the “bond vigilantes.” The ECB could have stopped these attacks at any time, but it was very late intervening – largely because of German resistance. Instead, Merkel used the leverage provided by the bond vigilantes and the refusal of the ECB to act to end their attacks to force increasing austerity upon the eurozone and demands for severe cuts in workers’ wages in the periphery.
Merkel’s actions in forcing austerity and efforts to force sharp drops in workers’ wages in the periphery were not required to stop any “debt crisis.” The ECB had the ability to end the bond vigilantes’ attacks and reestablish the ability of the periphery to borrow at low cost, as it demonstrated. Merkel’s austerity demands and demands that (largely) left governments in the periphery slash workers’ wages promptly threw the entire eurozone back into a second Great Recession – and much of the periphery into a Second Great Depression. It had the desired effect of discrediting the governing parties of the left, particularly in Spain, Portugal, and Greece, that gave in to Merkel’s mandates that they adopt masochistic macroeconomic policies.
It is also false that Merkel began demanding that the eurozone inflict austerity only in 2009. Merkel wanted to inflict austerity and her war on the workers and the parties they primarily supported long before 2009. What changed in 2009 was that the ECB, the Great Recession, and the bond vigilantes gave her the leverage to successfully extort the members of the eurozone who opposed austerity and her war on workers and the parties of the left.
But it is what is left out of the quoted passage above that is most amazing. The fact that Merkel’s orders that the eurozone leaders bleed their economies through austerity and the war on workers’ wages led to a gratuitous Second Great Recession in the eurozone – and Great Depression levels of unemployment in much of the periphery – disappears. The fact that inflicting austerity and wage cuts in response to a Great Recession is economically illiterate and cruel disappears. The fact that the overall eurozone – six years after the financial crisis of 2008 and eight years after the financial bubbles popped in 2006 – has stagnated and caused tens of trillions of dollars in lost GDP and well over 10 million lost jobs is treated by the NYT article as if it were unrelated to Merkel’s infliction of austerity.
“But the French economy has grown stagnant, with unemployment stubbornly stuck near 11 percent and an unpopular government pledging to cut tens of billions in taxes on business, which many French fear will unravel their prized welfare state.”
No, the eurozone economy “has grown stagnant” and produced a Second Great Depression in much of the periphery. If France had a sovereign currency, or if the EU were to make the euro into a true sovereign currency, France could simultaneously “cut tens of billions in taxes on business” while preserving the social safety net and speeding the recovery. The same is true of the rest of the eurozone – including Germany, where Merkel’s policies have made the wealthy far wealthier and deepened the economic crisis in other eurozone nations by cutting German workers’ wages. The NYT article is disingenuous about both aspects of the German economy, noting only that “the German economy has shown signs of slowing down.” German growth was actually negative in the last quarter, and the treatment of its workers weakens the German and overall eurozone recovery.
It continues to be obvious that it is a condition of employment for NYT reporters covering the eurozone’s economic policies that they never read Paul Krugman (or most any other American economist). Consider this claim in the article:
“[Prime Minister Manuel Valls] and Mr. Hollande have alienated many members of the Socialist Party by taking a more centrist approach to economic policy, stoking suspicions that the government is favoring business at the expense of the welfare state.”
I will take this part very slowly. By my count Krugman has written at least six columns in the NYT explaining that there actually is a powerful consensus among economists. The “centrist approach” is that austerity in response to a Great Recession is self-destructive. We have known this for at least 75 years. Modern Republicans, when they hold the presidency, always respond to a recession with a stimulus package. Valls and Hollande are moving away from a “centrist approach to economic policy.” They are doing so despite observing first-hand the self-destructive nature of austerity (and proclaiming that it is self-destructive). They do so despite the demonstrated success of stimulus in responding to the financial crisis. They do so despite the fact that the result of the faux left parties adopting these economically illiterate neo-liberal economic policies is the destruction of the parties that betray their principles and the workers. Valls and Hollande are spectacularly unpopular in France because of these betrayals. It is clear why Valls and Hollande wish to avoid reading Krugman’s critique of their betrayals, but the NYT reporters have no excuse.
The reporters do not simply ignore the insanity of austerity and the plight of the eurozone’s workers – they assert that it is obvious that Merkel is correct and that the French reluctance to slash workers’ wages is obviously economically illiterate.
“Just over a decade ago, as Ms. Merkel is fond of noting, Germany was Europe’s sick economy. It recovered partly because of changes to labor laws and social welfare. Mr. Hollande now faces a similar task in an era of low or no growth.”
No. These two sentences propound multiple Merkel myths and assume (1) that France’s (and the rest of the eurozone’s) problems are the same as Germany’s issues “just over a decade ago,” (2) that Germany “recovered” due to slashing workers’ wages and social programs, and (3) that the German “solutions” would work for the eurozone as a whole.
Germany’s “reforms,” which included increasing financial deregulation, have proven disastrous. German banks finished third in the regulatory “race to the bottom” (“behind” Wall Street and the worst of the worst – the City of London). The officers that controlled Deutsche Bank and various state-owned German banks were among the leading causes of the financial crisis. German workers had lost ground even before the financial crisis and have lost even more ground since the crisis began. Inequality has also become increasingly more extreme in Germany.
The current problem in the eurozone is a critical shortage of demand exacerbated by the insanity of austerity and Merkel’s war on workers’ wages. The word “demand” and the concept, the centerpiece of the macroeconomics of recession, never appear in the article. An individual nation in which the wealthy have the political power to lower workers’ wages can increase its exports and employ more of its citizens. This obviously does not prove that the workers were overpaid. Merkel and the NYT ignore the “fallacy of composition,” which is particularly embarrassing because they are neo-mercantilists pushing the universal goal of being a net exporter. As Adam Smith emphasized, we can’t all be net exporters. A strategy that can work (for the elites) of one nation cannot logically be assumed to work for large numbers of nations.
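The fallacy of composition here can be made concrete with a little arithmetic. The numbers below are invented for illustration, not actual trade data, but the logic is general: within any closed group of trading partners, one country’s export surplus is necessarily another’s deficit, so the balances must sum to zero and everyone cannot run a surplus at once.

```python
# Hypothetical bilateral exports (billions; invented figures for illustration).
# exports[i][j] = what country i sells to country j.
exports = {
    "Germany": {"France": 100, "Italy": 80},
    "France":  {"Germany": 60, "Italy": 50},
    "Italy":   {"Germany": 70, "France": 40},
}

def trade_balance(country):
    """Exports minus imports for one country within this closed system."""
    sold = sum(exports[country].values())
    bought = sum(exports[other][country]
                 for other in exports if other != country)
    return sold - bought

balances = {c: trade_balance(c) for c in exports}
print(balances)                # Germany runs a surplus; the others run deficits
print(sum(balances.values()))  # the balances necessarily sum to 0
```

However the wage-cutting contest among these hypothetical countries shuffles the individual balances, the total stays zero: Germany’s +50 is exactly offset by France’s −30 and Italy’s −20. A strategy that works for one nation cannot work for all of them simultaneously.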
The last thing a society should want in a recession is rapidly falling wages and prices that can create deflation (another word expunged from the NYT article because it would refute their ode to Merkel, austerity, and her war on the worker). If France were to slash workers’ wages to try to take exports from Ireland, while Ireland slashed workers’ wages to try to take exports from Spain, which did the same to take exports from Italy, the result would be deflation, a massive increase in inequality, the political destruction of any (allegedly) progressive political party that joined in the war on the worker, and a “race to Bangladesh” dynamic.
Germany’s “success” in being a very large net exporter makes it far more difficult – not easier – for any other eurozone nation to copy its export strategy successfully. As a group, the strategy cannot work for the eurozone. The strategy has, of course, not simply “not succeeded.” It has failed catastrophically. Merkel’s eurozone policies have caused trillions of dollars in extra losses in productivity, the gratuitous loss of over 10 million jobs, increased inequality, and the loss through emigration of many of the best educated young citizens of the periphery.
Hollande does not face “a similar task” to Merkel. He faces different problems and Merkel’s “solutions” are the chief causes of France’s economic stagnation rather than the answers to France’s problems.
I repeat my twin suggestions to the NYT reporters that cover the eurozone’s economy. The paper’s management should host a seminar in which Krugman educates his colleagues. Alternatively, come to UMKC and we’ll provide that seminar without charge. None of us can afford the cost of the reporters’ continuing willful ignorance of economics and their indifference to the victims of austerity and Merkel’s war on workers.
Editor’s Note: Rebecca Solnit is one of the best writers in America because she’s one of the most original thinkers. Here she reminds us of the revolutionary power of hope, and how hope overturns old regimes from the bottom up.
There have undoubtedly been stable periods in human history, but you and your parents, grandparents, and great-grandparents never lived through one, and neither will any children or grandchildren you may have or come to have. Everything has been changing continuously, profoundly — from the role of women to the nature of agriculture. For the past couple of hundred years, change has been accelerating in both magnificent and nightmarish ways.
Yet when we argue for change, notably changing our ways in response to climate change, we’re arguing against people who claim we’re disrupting a stable system. They insist that we’re rocking the boat unnecessarily.
I say: rock that boat. It’s a lifeboat; maybe the people in it will wake up and start rowing. Those who think they’re hanging onto a stable order are actually clinging to the wreckage of the old order, a ship already sinking, that we need to leave behind.
As you probably know, the actual oceans are rising — almost eight inches since 1880, and that’s only going to accelerate. They’re also acidifying, because they’re absorbing significant amounts of the carbon we continue to pump into the atmosphere at record levels. The ice that covers the polar seas is shrinking, while the ice shields that cover Antarctica and Greenland are melting. The water locked up in all the polar ice, as it’s unlocked by heat, is going to raise sea levels staggeringly, possibly by as much as 200 feet at some point in the future, how distant we do not know. In the temperate latitudes, warming seas breed fiercer hurricanes.
The oceans are changing fast, and for the worse. Fish stocks are dying off, as are shellfish. In many acidified oceanic regions, their shells are actually dissolving or failing to form, which is one of the scariest, most nightmarish things I’ve ever heard. So don’t tell me that we’re rocking a stable boat on calm seas. The glorious 10,000-year period of stable climate in which humanity flourished and then exploded to overrun the Earth and all its ecosystems is over.
But responding to these current cataclysmic changes means taking on people who believe, or at least assert, that those of us who want to react and act are gratuitously disrupting a stable system that’s working fine. It isn’t stable. It is working fine — in the short term and the most limited sense — for oil companies and the people who profit from them and for some of us in the particularly cushy parts of the world who haven’t been impacted yet by weather events like, say, the recent torrential floods in Japan or southern Nevada and Arizona, or the monsoon versions of the same that have devastated parts of India and Pakistan, or the drought that has mummified my beloved California, or the wildfires of Australia.
The problem, of course, is that the people who most benefit from the current arrangements have effectively purchased a lot of politicians, and that a great many of the rest of them are either hopelessly dim or amazingly timid. Most of the Democrats recognize the reality of climate change but not the urgency of doing something about it. Many of the Republicans used to — John McCain has done an amazing about-face from being a sane voice on climate to a shrill denier — and they present a horrific obstacle to any international treaties.
Put it this way: in one country, one party holding 45 out of 100 seats in one legislative house, while serving a minority of the very rich, can basically block what quite a lot of the other seven billion people on Earth want and need, because a two-thirds majority in the Senate must consent to any international treaty the U.S. signs. Which is not to say much for the president, whose drill-baby-drill administration only looks good compared to the petroleum servants he faces, when he bothers to face them and isn’t just one of them. History will despise them all and much of the world does now, but as my mother would have said, they know which side their bread is buttered on.
As it happens, the butter is melting and the bread is getting more expensive. Global grain production is already down several percent thanks to climate change, says a terrifying new United Nations report. Declining crops cause food shortages and rising food prices, creating hunger and even famine for the poorest on Earth, and also sometimes cause massive unrest. Rising bread prices were one factor that helped spark the Arab Spring in 2011. Anyone who argues that doing something about global warming will be too expensive is dodging just how expensive unmitigated climate change is already proving to be.
It’s only a question of whether the very wealthy or the very poor will pay. Putting it that way, however, devalues all the nonmonetary things at stake, from the survival of myriad species to our confidence in the future. And yeah, climate change is here, now. We’ve already lost a lot and we’re going to lose more, but there’s a difference between terrible and apocalyptic. We still have some control over how extreme it gets. That’s not a great choice, but it’s the choice we have. There’s still a window open for action, but it’s closing. As the Secretary-General of the World Meteorological Organization, Michel Jarraud, bluntly put it recently, “We are running out of time.”
New and Renewable Energies
The future is not yet written. Look at the world we’re in at this very moment. The Keystone XL tar sands pipeline was supposed to be built years ago, but activists catalyzed by the rural and indigenous communities across whose land it would go have stopped it so far, and made what was supposed to be a done deal a contentious issue. Activists changed the outcome.
Fracking has been challenged on the state level, and banned in townships and counties from upstate New York to central California. (It has also been banned in two Canadian provinces, France, and Bulgaria.) The fossil-fuel divestment movement has achieved a number of remarkable victories in its few bare years of existence and more are on the way. The actual divestments and commitments to divest fossil fuel stocks by various institutions ranging from the city of Seattle to the British Medical Association are striking. But the real power of the movement lies in the way it has called into question the wisdom of investing in fossil fuel corporations. Even mainstream voices like the British Parliament’s Environmental Audit Committee and publications like Forbes are now beginning to question whether they are safe places to put money. That’s a sea change.
Renewable energy has become more efficient, technologically sophisticated, and cheaper — the price of solar power in relation to the energy it generates has plummeted astonishingly over the past three decades and wind technology keeps getting better. While Americans overall are not yet curtailing their fossil-fuel habits, many individuals and communities are choosing other options, and those options are becoming increasingly viable. A Stanford University scientist has proposed a plan to allow each of the 50 states to run on 100% renewable energy by 2050.
Since, according to the latest report of the U.N.’s Intergovernmental Panel on Climate Change, fossil fuel reserves still in the ground are “at least four times larger than could safely be burned if global warming is to be kept to a tolerable level,” it couldn’t be more important to reach global agreements to do things differently on a planetary scale. Notably, most of those carbon reserves must be left untapped, and the modest steps already taken locally and ad hoc show that such changes are indeed possible and that an encouraging number of us want to pursue them.
We can do it. And we is the key word here. The world is not going to be saved by individual acts of virtue; it’s going to be saved, if it is to be saved, by collective acts of social and political change. That’s why I’m marching this Sunday with tens or maybe hundreds of thousands of others in New York City — to pressure the United Nations as it meets to address climate change. That’s why people who care about the future state of our planet will also be marching and demonstrating in New Delhi, Rio de Janeiro, Paris, Berlin, Melbourne, Kathmandu, Dublin, Manila, Seoul, Mumbai, Istanbul, and so many smaller places.
Mass movements work. Unarmed citizens have changed the course of history countless times in the modern era. When we come together as civil society, we have the capacity to transform policies, change old ways of doing things, and sometimes even topple regimes. And it is about governments. Like it or not, the global treaties, compacts, and agreements we need can only be made by governments, and governments will make those agreements when the pressure to do so is greater than the pressure not to. We can and must be that pressure.
The Long View from One Window
I lived in the same apartment for 25 years, moving into a poor but thriving black community in 1981 and out of the far more affluent, paler, and less neighborly place it had become in 2006. A lot of people moved in and out in that period, many of them staying only a year or two. Those transients always seemed to believe that the neighborhood they were passing through was a stable one. You had to be slower than change and stick around to see it. I saw it and it helped me learn how to take a historical view of things.
It’s crazy that anyone speaks as if our world is not undergoing rapid change, when the view from the window called history shows nothing but transformation, both incremental and dramatic. Exactly 25 years ago this month, Eastern Europe was astir. Remember that back then there was still a Soviet bloc, and a Soviet Union, and an Iron Curtain, and a Berlin Wall, and a Cold War. Most people thought those were permanent fixtures, but in the summer of 1989, Hungary decided to let East Germans (who were permitted to travel freely to that communist country) stream over to the West.
Thousands of people, tired of life in the totalitarian east, fled. Poland, Czechoslovakia, and Hungary, as well as East Germany, were already electrified by a resurgent civil society and activist communities that had dared to organize in the face of repression. At the time, politicians and pundits in the West were making careers out of explaining, among so many other things, why German reunification wasn’t going to happen in anyone’s lifetime. And they probably would have been proven right if people had stayed home and done nothing, if they hadn’t begun to hope and acted on that hope.
The bureaucrats on both sides of the Berlin Wall were still talking about the possibility of demilitarizing it when citizens showed up en masse and the guards began abandoning their posts. On that epochal night of November 9, 1989, the people made whole what had been broken. The lesson: showing up is half the battle.
British Prime Minister Margaret Thatcher had been so unnerved by developments in the Soviet Union’s Eastern European holdings that she went to Moscow, two months before the fall of the wall, to implore Soviet leader Mikhail Gorbachev to prevent any such thing. That was early September 1989. “No dramatic change in the situation in Czechoslovakia can be expected,” predicted a Czech official two months before a glorious popular uprising, remembered as the Velvet Revolution, erupted and abolished the government in which he was an official.
There are three things to note about those changes in 1989. First, most people in power dismissed the possibility that such extraordinary change could happen or deplored what it might bring. They were comfortable enough with things as they were, even though the status quo was several kinds of scary and awful. In other words, the status quo likes the status quo and dislikes change. Second, everything changed despite them, thanks to grassroots organizing and civil society, forces that — we are now regularly assured — are pointless and irrelevant. Third, the world that existed then has been largely swept away: the Soviet Union, the global alignments of that time, the idea of a binary world of communism and capitalism, and the policies that had kept us on the brink of nuclear annihilation for decades. We live in a very different world now (though nuclear weapons are still a terrible problem). Things do change.
Maybe, in fact, there’s a fourth point to note as well. That, important as they were, the front-page stories about the liberation of Eastern Europe weren’t what mattered most all those years ago. After all, hidden away deep inside the New York Times that autumn, you can find a dozen or so articles about global warming, as the newly recognized phenomenon was then called. And small as they were, anyone reading them now can see that, even then, the essential problem and peril to our world was already clear.
The thought of what might have been accomplished, had a people’s movement arisen then to face global warming, could break your heart. That, after all, was still a time when the Earth’s atmosphere held just above 350 parts per million of carbon dioxide, the maximum safe level for a sustainable survivable planet, not the 400 parts per million of the present moment (“142% of the pre-industrial era” level of carbon, the World Meteorological Organization notes). In other words, we’ve been steadily filling the atmosphere with greenhouse gases and so imperiling the planet and humanity since we knew what we were doing.
The Great Smog and the Big Wind
In that fall a quarter of a century ago, the world changed profoundly right before our eyes. Then we settled back into the short-term, ahistorical view that things are really pretty stable, that ordinary people have no power, and that the world can’t be changed. With that in mind, it’s worth looking at Germany today. Maybe because Germans know better than we do that things can change fast, for the worse or the better, that the world is not a stable and settled place, and that we do shape it, they have been willing to change.
At one point last spring, cold, cloudy Germany managed to get almost 75% of its electricity from renewable sources. Scotland — cold, gray, oil-rich Scotland! — is on track to achieve 100% renewable electrical generation by 2020 and has already hit the 40% mark. Spain now generates about half its electricity through clean and renewable sources. Other European countries have similar accomplishments. In fact, many of the changes that we in the United States will be marching for this Sunday have already begun happening, sometimes on a significant scale, elsewhere.
To remember how radical this new Europe is, recall that most of these places were burning coal not just in power plants or factories but in homes, too, not so many decades ago. Everyone deplores the horrific air of Beijing and other Chinese cities now, but few remember that many European cities were similarly foul with smoke and smog from the industrial revolution into the postwar era. In December 1952, for instance, the “Great Smog” of London reduced daytime visibility to a few yards and killed an estimated 4,000 people in under a week.
A decade before that, in response to the war Germany started, North Americans radically reduced their use of private vehicles and gasoline and planted more than 20 million victory gardens, producing vast quantities of food by non-industrial means. We have done that; we could (and must) do it again.
At least, we don’t burn coal in our homes anymore, and in the U.S. we’ve retired 178 coal-fired power plants, are phasing out many more, and have prevented many new ones from being built. The renewable energy sources that were, people insisted, too minor or unreliable or expensive or new are now beginning to work well, and the price to produce energy in such a fashion is dropping rapidly. UBS, the European investment giant, recently counseled that power plants and centralized power generation are no longer good investments, since decentralized renewables are likely to replace them.
Of course, Germany and Britain are still burning coal, and Poland remains a giant coal mine. Europe is not a perfect renewable energy paradise, just a part of the world that demonstrates the viability of changing how we produce and consume energy. We are already changing, even if not fast enough, not by a long shot, at least not yet. The same goes for divesting from fossil-fuel investments, even though dozens of universities, cities, religious institutions, and foundations have already committed to doing so, and some have by now actually purged their portfolios. The excuse that change is impossible is no longer available, because many places and entities have already changed.
If you want to know how potentially powerful you are, ask your enemies. The misogynists who attack feminism and try to intimidate feminists into silence only demonstrate in a roundabout way that feminism really is changing the world; they are the furious backlash and so the proof that something meaningful is at stake. The climate movement is similarly upsetting a lot of powerful people and institutions; to grasp that, you just have to look at the tsunamis of money spent opposing specific measures and misinforming the public. The carbon barons are demonstrating that we could change the world and that they don’t want us to.
We are powerful and need to become more so in the next year as a major conference in Paris approaches in December 2015 where the climate agreements we need could be hammered out. Or not. This is, after all, a sequel to the Copenhagen conference of 2009, where representatives of many smaller and more vulnerable nations, as well as citizens’ groups, were eager for a treaty that took on climate change in significant ways, only to have their hopes crushed by the recalcitrant governments of the United States and China.
Right now, we are in a churning sea of change, of climate change, of subtle changes in everyday life, of powerful efforts by elites to serve themselves and damn the rest of us, and of increasingly powerful activist and social-movement campaigns to make a world that benefits more beings, human and otherwise, in the longer term. Every choice you make aligns you with one set of these forces or another. That includes doing nothing, which means aligning yourself with the worst of the status quo dragging us down in that ocean of carbon and consumption.
To make personal changes is to do too little. Only great movements, only collective action can save us now. Only is a scary word, but when the ship is sinking, it can be an encouraging one as well. It can hold out hope. The world has changed again and again in ways that, until they happened, would have been considered improbable by just about everyone on the planet. It is changing now and the direction is up to us.
There will be another story to be told about what we did a quarter century after civil society toppled the East Bloc regimes, what we did in the pivotal years of 2014 and 2015. All we know now is that it is not yet written, and that we who live at this very moment have the power to write it with our lives and acts.
In his novel The Plague, Albert Camus describes how death comes to an ugly French port in Algeria.
Thanks to an infestation of rats and the fleas they carry, the bubonic plague descends upon the city in the spring and intensifies during the hot summer. After a short period of denial, the residents panic, then sink into despondency and alcoholism. The port is put under quarantine. Undeterred by the apathy of the population and the danger of exposure, a small number of courageous individuals mobilize to fight the epidemic and eventually beat back the invader.
Camus took great care to detail the symptoms of the disease. But for all his medical exactitude, the French writer was not primarily interested in epidemiology. His inspiration was a different kind of infection. The novel is set some time in the 1940s. The plague is Nazism, and those who fight the disease stand in for the heroes of the French Resistance. It is a supremely apt allegory, for did not the Nazis claim that their victims were vermin? Camus surely must have enjoyed reincarnating the German fascists as the lowest of the low: bloodsucking fleas and desperate rats.
The twin plagues of Nazism and bubonic plague, except for some isolated cases, are behind us. But now it seems that a different pair of plagues is in our midst.
Today’s headlines are filled with similar stories of the spread of death and destruction in the Middle East and Africa. American commentators worry that these plagues will burst their borders and somehow spread to these shores. And, as in Camus’s novel, these diseases point to something larger, not the imposition of a new malignant system but the breakdown of the existing order.
In West Africa, the plague is Ebola, a terrifying fever that ends in massive hemorrhaging. The mortality rate, if untreated, is as high as that of bubonic plague. But at least with the modern version of the Black Death, treatment brings the mortality rate down to 15 percent. Ebola, by contrast, resists treatment. There are no vaccines for this hemorrhagic fever—though there’s promising news out of Canada—and the few treatments that have been used remain highly experimental. Doctors and officials establish quarantines and hope the disease will burn itself out. With airlines shutting down service to the infected region, hampering efforts to deliver medical supplies, the disease continues to rage.
Ebola has so far claimed around 1,500 lives. This is terrible, of course, but it pales in comparison to how many children succumb to diarrhea in Africa. According to a 2010 report, 2,000 African children die every day of a disease that can be prevented through relatively cheap methods: safe water and hygiene. But diarrhea is not a communicable disease in the same sense as the plague or Ebola. And no one in the United States worries that a summit of African leaders or the repatriation of infected patients will spread an epidemic of diarrhea stateside. Ebola monopolizes the headlines because what grabs attention is fear (along with the usual colonial images of Africans as dirty and irresponsible).
The panic is, of course, more acute in the areas hardest hit by Ebola. Consider the case of Kandeh Kamara, a brave 21-year-old who volunteered to help fight the disease in Sierra Leone. He was promptly drafted to become a “burial boy” responsible for dealing with the corpses of the infected. “In doing their jobs, the burial boys have been cast out of their communities because of fear that they will bring the virus home with them,” write Adam Nossiter and Ben Solomon in a powerful piece in The New York Times. Talk about thankless tasks. Kandeh Kamara initially received no payment for his work and had to beg for food on the street. He now gets $6 a day and hopes to rent an apartment, though landlords often refuse to lease to the burial boys.
Ebola is bad news, but it hasn’t generated the same kind of fury as that other fast-spreading scourge, namely the Islamic State (IS). The recent beheading of U.S. journalist James Foley has ratcheted up the outrage of U.S. observers.
It’s certainly not the first beheading that IS has carried out. The group specializes in meting out barbarous punishments—decapitation, crucifixion, amputations. But just as Ebola’s impact became real for Americans when it infected people “like us”—two U.S. missionaries in Liberia—the United States was prompted to act against IS when it began killing non-Muslims, first the stranded Yazidis and then the abducted journalist.
IS has spread quickly, and so has the panic that has accompanied its territorial acquisition. There have been the inevitable analogies to Nazism. But even those who don’t invoke Hitler are quick to use Manichean language to describe the IS challenge.
“We can see evil through the eye slits of the ski mask worn by Foley’s killer,” writes David Ignatius in a Washington Post commentary entitled “The New Battle Against Evil.” “But stopping that evil is a harder task.”
The IS killers are a nasty piece of work, and their ideology is thoroughly malign. But I hesitate to use the language of good and evil. Such moralistic terminology presumes that they, the beheaders, are a Satanic force that can only be exorcised with whatever version of holy water our angelic forces dispense—air strikes, boots on the ground, military aid to the Kurdish peshmerga, efforts in the community to dissuade angry young men from taking the next flight to Mosul.
We, on the other hand, are good. We would never behead anyone. Those we execute “deserve” their punishment (though the occasional innocent person might inadvertently fall through the cracks). And the civilian casualties from our military offensives, because we are by definition good, are simply mistakes. After all, we don’t publicly celebrate the deaths of Afghan civilians from our drone strikes (45 in 2013 alone) or the deaths of over 400 children in Gaza. But our protestations of innocence are little consolation to the families of the victims.
At what point do mistakes aggregate into something evil? At the very least, do they prevent us from claiming the mantle of good? And, of course, it’s not just the mistakes that are problematic but also the deliberate policies that, for instance, align Washington with dictators and other murderous actors. U.S. disgust with IS may already have prompted intelligence sharing with the regime in Damascus, though the Obama administration has denied such deals.
Camus had some choice words for those who are reluctant to call evil by its name. “Our townsfolk were like everybody else, wrapped up in themselves; in other words they were humanists: they disbelieved in pestilences,” he wrote in The Plague. “A pestilence isn’t a thing made to man’s measure; therefore we tell ourselves that pestilence is a mere bogy of the mind, a bad dream that will pass away. But it doesn’t always pass away and, from one bad dream to another, it is men who pass away, and the humanists first of all, because they haven’t taken their precautions.”
Humanists perhaps disbelieve in pestilences. “I used to not believe in evil,” confesses Richard Cohen this week in a Washington Post column declaring a “return of evil” with ISIS. Once a liberal humanist, Cohen long ago remade himself into a liberal hawk.
I still consider myself a humanist. But my brand of humanism sees pestilence everywhere. Indeed, I tend to see pestilence not only in the acts of individuals but in the structures within which the plague takes root and spreads. And this is where the two plagues intersect, Ebola and IS. They both prosper where the immune system is weak.
When it comes to medical infrastructure, Africa definitely has a compromised immune system. The continent has been hit hard by HIV/AIDS (70 percent of those living with HIV are in Africa), cholera (major outbreaks took place recently in Senegal, Zimbabwe, and Sierra Leone), and malaria (an African child dies every minute from this disease). Ebola has spread rapidly because of critical shortages in medical staff and supplies.
But the deeper reason is environmental: the clear-cutting of forests that have served as a traditional barrier to pathogens. West Africa has one of the fastest rates of deforestation in the world, losing nearly a million hectares a year. The forests are Africa’s natural defenses, and Ebola is a sign that these defenses have been fatally weakened. What used to stay in remote villages now spreads quickly to urban areas.
The recent victories of IS in Syria and Iraq, meanwhile, suggest not a breakdown in the environmental system but in the political one. IS is not simply a band of serial killers. They have a distinct ideology and set of political motives. Nor does it matter whether they are operating in a formally dictatorial or democratic environment. IS thrives both where Assad rules with an iron fist and where Saddam is long gone.
The common denominator is chaos. IS has ruthlessly expanded in the grey areas beyond the reach of the rule of law. In Syria, it has prospered in regions that already broke loose from the country during the uprising. In Iraq, it took advantage of a paralyzing conflict between Shi’a and Sunni that left the northern reaches of the country tenuously connected to the central government.
Local governance, whether it’s democratic or authoritarian, serves the same function as the forests of West Africa. Such governance holds society together. When it deteriorates, the very cellular structure breaks down. In Ebola, the cell walls fray and the patient bleeds out. With a virus like IS, the fibers of the social fabric fray and large sections of the country bleed out.
There are, of course, many differences between a pestilence like Ebola and a movement like IS. But they are both the result of systemic breakdown. They are opportunistic infections.
In both cases there are no magic pills. Even if we come up with an antidote to this version of Ebola, as long as we continue to cut down the forests of Africa, more potent versions will continue to appear and spread. And if we attempt to obliterate IS only with bombs or boots on the ground, it will simply pop up somewhere else where the conditions favor such desperate efforts to create a totalitarian order. Instead we should focus on the conditions that give rise to these phenomena—and our role in helping to perpetuate these conditions.
Camus recommended vigilance. Pestilence, he concluded, “bides its time in bedrooms, cellars, trunks, and bookshelves and…perhaps the day would come when, for the bane and the enlightening of men, it would rouse up its rats again and send them forth to die in a happy city.” The current plagues have certainly been a bane. Whether they also help to enlighten us remains to be seen.
John Feffer is the director of Foreign Policy In Focus.
Here are some tough words about the Obama presidency from Cornel West, who argues persuasively that the fetish for the middle ground in politics often makes for poor leadership.
In the interview Thomas Frank asks West, “What on earth ails the man? Why can’t he fight the Republicans? Why does he need to seek a grand bargain?”
“I think Obama, his modus operandi going all the way back to when he was head of the [Harvard] Law Review, first editor of the Law Review and didn’t have a piece in the Law Review. He was chosen because he always occupied the middle ground. He doesn’t realize that a great leader, a statesperson, doesn’t just occupy middle ground. They occupy higher ground or the moral ground or even sometimes the holy ground. But the middle ground is not the place to go if you’re going to show courage and vision. And I think that’s his modus operandi. He always moves to the middle ground. It turned out that historically, this was not a moment for a middle-ground politician. We needed a high-ground statesperson and it’s clear now he’s not the one.”
West also says:
“He posed as a progressive and turned out to be counterfeit. We ended up with a Wall Street presidency, a drone presidency, a national security presidency. The torturers go free. The Wall Street executives go free. The war crimes in the Middle East, especially now in Gaza, the war criminals go free. And yet, you know, he acted as if he was both a progressive and as if he was concerned about the issues of serious injustice and inequality and it turned out that he’s just another neoliberal centrist with a smile and with a nice rhetorical flair. And that’s a very sad moment in the history of the nation because we are—we’re an empire in decline. Our culture is in increasing decay. Our school systems are in deep trouble. Our political system is dysfunctional. Our leaders are more and more bought off with legalized bribery and normalized corruption in Congress and too much of our civil life. You would think that we needed somebody—a Lincoln-like figure who could revive some democratic spirit and democratic possibility.”