By Michael Klare
Reprinted with permission from TomDispatch.com
Here’s the good news: wind power, solar power, and other renewable forms of energy are expanding far more quickly than anyone expected, ensuring that these systems will provide an ever-increasing share of our future energy supply. According to the most recent projections from the Energy Information Administration (EIA) of the U.S. Department of Energy, global consumption of wind, solar, hydropower, and other renewables will double between now and 2040, jumping from 64 to 131 quadrillion British thermal units (BTUs).
And here’s the bad news: the consumption of oil, coal, and natural gas is also growing, making it likely that, whatever the advances of renewable energy, fossil fuels will continue to dominate the global landscape for decades to come, accelerating the pace of global warming and ensuring the intensification of climate-change catastrophes.
The rapid growth of renewable energy has given us much to cheer about. Not so long ago, energy analysts were reporting that wind and solar systems were too costly to compete with oil, coal, and natural gas in the global marketplace. Renewables would, it was then assumed, require pricey subsidies that might not always be available. That was then and this is now. Today, remarkably enough, wind and solar are already competitive with fossil fuels for many uses and in many markets.
If that wasn’t predicted, however, neither was this: despite such advances, the allure of fossil fuels hasn’t dissipated. Individuals, governments, whole societies continue to opt for such fuels even when they gain no significant economic advantage from that choice and risk causing severe planetary harm. Clearly, something irrational is at play. Think of it as the fossil-fuel equivalent of an addictive inclination writ large.
The contradictory and troubling nature of the energy landscape is on clear display in the 2016 edition of the International Energy Outlook, the annual assessment of global trends released by the EIA this May. The good news about renewables gets prominent attention in the report, which includes projections of global energy use through 2040. “Renewables are the world’s fastest-growing energy source over the projection period,” it concludes. Wind and solar are expected to demonstrate particular vigor in the years to come, their growth outpacing every other form of energy. But because renewables start from such a small base — representing just 12% of all energy used in 2012 — they will continue to be overshadowed in the decades ahead, explosive growth or not. In 2040, according to the report’s projections, fossil fuels will still have a grip on a staggering 78% of the world energy market, and — if you don’t mind getting thoroughly depressed — oil, coal, and natural gas will each still command larger shares of the market than all renewables combined.
Keep in mind that total energy consumption is expected to be much greater in 2040 than at present. At that time, humanity will be using an estimated 815 quadrillion BTUs (compared to approximately 600 quadrillion today). In other words, though fossil fuels will lose some of their market share to renewables, they will still experience striking growth in absolute terms. Oil consumption, for example, is expected to increase by 34% from 90 million to 121 million barrels per day by 2040. Despite all the negative publicity it’s been getting lately, coal, too, should experience substantial growth, rising from 153 to 180 quadrillion BTUs in “delivered energy” over this period. And natural gas will be the fossil-fuel champ, with global demand for it jumping by 70%. Put it all together and the consumption of fossil fuels is projected to increase by 177 quadrillion BTUs, or 38%, over the period the report surveys.
Anyone with even the most rudimentary knowledge of climate science has to shudder at such projections. After all, emissions from the combustion of fossil fuels account for approximately three-quarters of the greenhouse gases humans are putting into the atmosphere. An increase in their consumption of such magnitude will have a corresponding impact on the greenhouse effect that is accelerating the rise in global temperatures.
At the United Nations Climate Summit in Paris last December, delegates from more than 190 countries adopted a plan aimed at preventing global warming from exceeding 2 degrees Celsius (about 3.6 degrees Fahrenheit) above the pre-industrial level. This target was chosen because most scientists believe that any warming beyond that will result in catastrophic and irreversible climate effects, including the melting of the Greenland and Antarctic ice caps (and a resulting sea-level rise of 10-20 feet). Under the Paris Agreement, the participating nations signed onto a plan to take immediate steps to halt the growth of greenhouse gas emissions and then move to actual reductions. Although the agreement doesn’t specify what measures should be taken to satisfy this requirement — each country is obliged to devise its own “intended nationally determined contributions” to the overall goal — the only practical approach for most countries would be to reduce fossil fuel consumption.
As the 2016 EIA report makes eye-poppingly clear, however, the endorsers of the Paris Agreement aren’t on track to reduce their consumption of oil, coal, and natural gas. In fact, greenhouse gas emissions are expected to rise by an estimated 34% between 2012 and 2040 (from 32.3 billion to 43.2 billion metric tons). That net increase of 10.9 billion metric tons is equal to the total carbon emissions of the United States, Canada, and Europe in 2012. If such projections prove accurate, global temperatures will rise, possibly significantly above that 2 degree mark, with the destructive effects of climate change we are already witnessing today — the fires, heat waves, floods, droughts, storms, and sea level rise — only intensifying.
Exploring the Roots of Addiction
How to explain the world’s tenacious reliance on fossil fuels, despite all that we know about their role in global warming and those lofty promises made in Paris?
To some degree, it is undoubtedly the product of built-in momentum: our existing urban, industrial, and transportation infrastructure was largely constructed around fossil fuel-powered energy systems, and it will take a long time to replace or reconfigure them for a post-carbon future. Most of our electricity, for example, is provided by coal- and gas-fired power plants that will continue to operate for years to come. Even with the rapid growth of renewables, coal and natural gas are projected to supply 56% of the fuel for the world’s electrical power generation in 2040 (a drop of only 5% from today). Likewise, the overwhelming majority of cars and trucks on the road are now fueled by gasoline and diesel. Even if the number of new ones running on electricity were to spike, it would still be many years before oil-powered vehicles lost their commanding position. As history tells us, transitions from one form of energy to another take time.
Then there’s the problem — and what a problem it is! — of vested interests. Energy is the largest and most lucrative business in the world, and the giant fossil fuel companies have long enjoyed a privileged and highly profitable status. Oil corporations like Chevron and ExxonMobil, along with their state-owned counterparts like Gazprom of Russia and Saudi Aramco, are consistently ranked among the world’s most valuable enterprises. These companies — and the governments they’re associated with — are not inclined to surrender the massive profits they generate year after year for the future wellbeing of the planet.
As a result, it’s a guarantee that they will employ any means at their disposal (including well-established, well-funded ties to friendly politicians and political parties) to slow the transition to renewables. In the United States, for example, the politicians of coal-producing states are now at work on plans to block the Obama administration’s “clean power” drive, which might indeed lead to a sharp reduction in coal consumption. Similarly, Exxon has recruited friendly Republican officials to impede the efforts of some state attorneys general to investigate that company’s past suppression of information on the links between fossil fuel use and climate change. And that’s just to scratch the surface of corporate efforts to mislead the public, which have included the funding of the Heartland Institute and other climate-change-denying think tanks.
Of course, nowhere is the determination to sustain fossil fuels fiercer than in the “petro-states” that rely on their production for government revenues, provide energy subsidies to their citizens, and sometimes sell their products at below-market rates to encourage their use. According to the International Energy Agency (IEA), in 2014 fossil fuel subsidies of various sorts added up to a staggering $493 billion worldwide — far more than those for the development of renewable forms of energy. The G-20 group of leading industrial powers agreed in 2009 to phase out such subsidies, but a meeting of G-20 energy ministers in Beijing in June failed to adopt a timeline to complete the phase-out process, suggesting that little progress will be made when the heads of state of those countries meet in Hangzhou, China, this September.
None of this should surprise anyone, given the global economy’s institutionalized dependence on fossil fuels and the amounts of money at stake. What it doesn’t explain, however, is the projected growth in global fossil fuel consumption. A gradual decline, accelerating over time, would be consistent with a broad-scale but slow transition from carbon-based fuels to renewables. That the opposite seems to be happening, that their use is actually expanding in most parts of the world, suggests that another factor is in play: addiction.
We all know that smoking tobacco, snorting cocaine, or consuming too much alcohol is bad for us, but many of us persist in doing so anyway, finding the resulting thrill, the relief, or the dulling of the pain of everyday life simply too great to resist. In the same way, much of the world now seems to find it easier to fill up the car with the usual tankful of gasoline or flip the switch and receive electricity from coal or natural gas than to begin to shake our addiction to fossil fuels. As in everyday life, so at a global level, the power of addiction seems regularly to trump the obvious desirability of embarking on another, far healthier path.
On a Fossil Fuel Bridge to Nowhere
Without acknowledging any of this, the 2016 EIA report indicates just how widespread and prevalent our fossil-fuel addiction remains. In explaining the rising demand for oil, for example, it notes that “in the transportation sector, liquid fuels [predominantly petroleum] continue to provide most of the energy consumed.” Even though “advances in nonliquids-based [electrical] transportation technologies are anticipated,” they will not prove sufficient “to offset the rising demand for transportation services worldwide,” and so the demand for gasoline and diesel will continue to grow.
Most of the increase in demand for petroleum-based fuels is expected to occur in the developing world, where hundreds of millions of people are entering the middle class, buying their first gas-powered cars, and about to be hooked on an energy way of life that should be, but isn’t, dying. Oil use is expected to grow in China by 57% between 2012 and 2040, and at a faster rate (131%!) in India. Even in the United States, however, a growing preference for sport utility vehicles and pickup trucks continues to mean higher petroleum use. In 2016, according to Edmunds.com, a car shopping and research site, nearly 75% of the people who traded in a hybrid or electric car to a dealer replaced it with an all-gas car, typically a larger vehicle like an SUV or a pickup.
The rising demand for coal follows a depressingly similar pattern. Although it remains a major source of the greenhouse gases responsible for climate change, many developing nations, especially in Asia, continue to favor it when adding electricity capacity because of its low cost and familiar technology. Although the demand for coal in China — long the leading consumer of that fuel — is slowing, that country is still expected to increase its usage by 12% by 2035. The big story here, however, is India: according to the EIA, its coal consumption will grow by 62% in the years surveyed, eventually making it, not the United States, the world’s second largest consumer. Most of that extra coal will go for electricity generation, once again to satisfy an “expanding middle class using more electricity-consuming appliances.”
And then there’s the mammoth expected increase in the demand for natural gas. According to the latest EIA projections, its consumption will rise faster than any fuel except renewables. Given the small base from which renewables start, however, gas will experience the biggest absolute increase of any fuel, 87 quadrillion BTUs between 2012 and 2040. (In contrast, renewables are expected to grow by 68 quadrillion and oil by 62 quadrillion BTUs during this period.)
At present, natural gas appears to enjoy an enormous advantage in the global energy marketplace. “In the power sector, natural gas is an attractive choice for new generating plants given its moderate capital cost and attractive pricing in many regions as well as the relatively high fuel efficiency and moderate capital cost of gas-fired plants,” the EIA notes. It is also said to benefit from its “clean” reputation (compared to coal) in generating electricity. “As more governments begin implementing national or regional plans to reduce carbon dioxide emissions, natural gas may displace consumption of the more carbon-intensive coal and liquid fuels.”
Unfortunately, despite that reputation, natural gas remains a carbon-based fossil fuel, and its expanded consumption will result in a significant increase in global greenhouse gas emissions. In fact, the EIA claims that it will generate a larger increase in such emissions over the next quarter-century than either coal or oil — a disturbing note for those who contend that natural gas provides a “bridge” to a green energy future.
If you were to read through the EIA’s latest report as I did, you, too, might end up depressed by humanity’s addictive need for its daily fossil fuel hit. While the EIA’s analysts add the usual caveats, including the possibility that a more sweeping than expected follow-up climate agreement or strict enforcement of the one adopted last December could alter their projections, they detect no signs of the beginning of a determined move away from the reliance on fossil fuels.
If, indeed, addiction is a big part of the problem, any strategies undertaken to address climate change must incorporate a treatment component. Simply saying that global warming is bad for the planet, and that prudence and morality oblige us to prevent the worst climate-related disasters, will no more suffice than would telling addicts that tobacco and hard drugs are bad for them. Success in any global drive to avert climate catastrophe will involve tackling addictive behavior at its roots and promoting lasting changes in lifestyle. To do that, it will be necessary to learn from the anti-drug and anti-tobacco communities about best practices, and apply them to fossil fuels.
Consider, for example, the case of anti-smoking efforts. It was the medical community that first took up the struggle against tobacco and began by banning smoking in hospitals and other medical facilities. This effort was later extended to public facilities — schools, government buildings, airports, and so on — until vast areas of the public sphere became smoke-free. Anti-smoking activists also campaigned to have warning labels displayed in tobacco advertising and cigarette packaging.
Such approaches helped reduce tobacco consumption around the world and can be adapted to the anti-carbon struggle. College campuses and town centers could, for instance, be declared car-free — a strategy already embraced by London’s newly elected mayor, Sadiq Khan. Express lanes on major streets and highways can be reserved for hybrids, electric cars, and other alternative vehicles. Gas station pumps and oil advertising can be made to incorporate warning signs saying something like, “Notice: consumption of this product increases your exposure to asthma, heat waves, sea level rise, and other threats to public health.” Once such an approach began to be seriously considered, there would undoubtedly be a host of other ideas for how to begin to put limits on our fossil fuel addiction.
Such measures would have to be complemented by major moves to combat the excessive influence of the fossil fuel companies and energy states when it comes to setting both local and global policy. In the U.S., for instance, severely restricting the scope of private donations in campaign financing, as Senator Bernie Sanders advocated in his presidential campaign, would be a way to start down this path. Another would be to step up legal efforts to hold giant energy companies like ExxonMobil accountable for malfeasance in suppressing information about the links between fossil fuel combustion and global warming, just as, decades ago, anti-smoking activists tried to expose tobacco company criminality in suppressing information on the links between smoking and cancer.
Without similar efforts of every sort on a global level, one thing seems certain: the future projected by the EIA will indeed come to pass and human suffering of a previously unimaginable sort will be the order of the day.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Nick Turse’s Next Time They’ll Come to Count the Dead, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Michael T. Klare
By David Morris
Reprinted with permission from the Institute for Local Self-Reliance
[Editor’s Comment: Note in the article that follows the discussion of the Bank of North Dakota, which has received particular attention in the last few years because of its beneficial role in helping North Dakota weather the storms of the Great Recession. Note especially in this regard the writings of Ellen Brown, President of the Public Banking Institute and author of Web of Debt and The Public Bank Solution. See also her article, “NORTH DAKOTA’S ECONOMIC ‘MIRACLE’—IT’S NOT OIL.”]
On June 14th, North Dakotans voted to overrule their government’s decision to allow corporate ownership of farms. That they had the power to do so was a result of a political revolution that occurred almost exactly a century before, a revolution that may hold lessons for those like Bernie Sanders’ supporters who seek to establish a bottom-up political movement in the face of hostile political parties today.
Here’s the story. In the early 1900s North Dakota was effectively an economic colony of Minneapolis/Saint Paul. A Saint Paul-based railroad tycoon controlled its freight prices. Minnesota companies owned many of the grain elevators that sat next to the rail lines and often cheated farmers by giving their wheat a lower grade than it deserved. Since the flour mills were in Minneapolis, shipping costs reduced the price wheat farmers received. Minneapolis banks held farmers’ mortgages, and the operating loans they made to farmers carried higher interest rates than those they charged at home.
Farmers, who represented a majority of the population, tried to free themselves from bondage by making the political system more responsive. In 1913 they gained an important victory when the legislature gave them the right, by petition, to initiate a law or constitutional amendment as well as to overturn a law passed by the legislature.
But this was a limited victory, for while the people could enable, they could not compel.
In 1914, for example, after a 30-year effort, voters authorized the legislature to build a state-owned grain elevator and mill. But in January 1915 a state legislative committee concluded it “would be a waste of the people’s money as well as a humiliating disappointment to the people of the state.” The legislature refused funding.
A few weeks later, two former candidates on the Socialist Party ticket, Arthur C. Townley and Albert Bowen, launched a new political organization, the Nonpartisan League (NPL). The name conveyed their strategy: to rely more on program-based politics than party-based politics. According to the NPL, its program intended to end the “utterly unendurable” situation in which “the people of this state have always been dependent for their existence on industries, banks, markets, storage and transportation facilities either existing altogether outside of the state or controlled by great private interests outside the state.”
The NPL’s platform contained concrete and specific measures: state ownership of elevators, flour mills, packing houses and cold storage plants; state inspection of grain grading and dockage; state hail insurance; rural credit banks operating at cost; exemption of farm improvements from taxation.
In his recent book, Insurgent Democracy, Michael Lansing explains that “small-property holders anxious to use government to create a more equitable form of capitalism cannot be easily categorized in contemporary political terms.” The NPL “reminded Americans that corporate capitalism was not the only way forward.” Supporters of the NPL wanted state-sponsored market fairness but not state control. They wanted public options, not public monopolies.
In the language of our 2016 political campaigns, it would not be much of a stretch to characterize the NPL as a movement for an American-style decentralized, anti-corporate, democratic socialism.
The NPL was, as one contemporary observer, Thorstein Veblen, described it, “large, loose, animated and untidy, but sure of itself in its settled disallowance of the Vested Interests…”
The movement was membership-based. Members were kept informed through a regular newsletter. This was part of a massive popular education effort. Membership fees allowed the NPL to hire organizers and lecturers who traveled throughout the state. Townley, the founder and leader of the NPL, proved an entertaining and charismatic speaker. Sometimes thousands would gather to hear him speak. Speeches themselves were community affairs.
The goal was to convince farmers that collectively they could significantly influence the decisions that would affect their personal and business lives.
To gain power the NPL relied on a political tool born of the Progressive movement: the political primary. To make government more responsive and transparent, Progressives urged states to bypass political conventions, political bosses and backroom deals and adopt direct primaries. By 1916, 25 of the then 48 U.S. states had adopted the primary as the vehicle for nominating political candidates.
The primary system gave people the power to elect candidates of their political party, but the key to the remarkable political revolution that swept through North Dakota was its adoption, in 1908, of an “open primary” law that allowed anyone to vote in a party’s primary even if unaffiliated with that party.
On March 29, 1916 the NPL took advantage of that law by convening its first convention. Attendees endorsed candidates who swore allegiance to its platform. These candidates ran in the June Republican primary, a primary targeted by the NPL because then (as now) the Republican Party dominated North Dakota.
In June 1916 the NPL effectively took over the Republican Party. In November 1916 NPL-endorsed candidates won every statewide office except one and gained a majority in the state Assembly, although not in the Senate. By that time the NPL boasted 40,000 members, an astonishing number given the state population of only 620,000.
In the succeeding legislative session the NPL was able to implement parts of its platform: a grain grading system, a 9-hour workday for women, regulation of railroad shipping rates and increased state aid to rural schools. But the Senate narrowly defeated the key to implementing NPL’s broad vision: a constitutional amendment to allow for state-owned businesses.
In 1918, the NPL gained a majority in the state Senate. That year North Dakotans voted on 10 constitutional amendments. They approved every one. One, endorsed by a resounding margin of 59-41, gave state, county and local governments permission “to engage in industry, enterprises or businesses.” Another allowed the state to guarantee $2 million in bonds and established voting requirements for future bonding. Another created state hail insurance.
Other amendments expanded the possibility of direct democracy by reducing the number of signatures required to put an initiative on the ballot, and by allowing constitutional amendments to be passed by a simple majority of the voters.
In June 1919, voters approved 6 of 7 legislatively referred statutes, including the establishment of a state bank, the latter by a vote of 56-44. The one ballot initiative North Dakotans rejected—giving the Governor the authority to appoint every county school superintendent—was itself revealing. North Dakotans wanted a state that could stand up to big out-of-state corporations, but they preferred local control to state control.
The Bank of North Dakota (BND) was the centerpiece of the NPL’s effort to take back control of the state’s economy. It was intended to strengthen, not undercut, local banks. It established no branches, nor did it accept independent deposits or accounts. The Bank “strongly recommended” that borrowers seek mortgages by working through local institutions. Banks across the state used the BND as a clearinghouse for various financial transactions.
Farmers immediately benefited as their interest rates on loans dropped to about 6 percent from the prevailing 8.7 percent.
In November 1920 voters strengthened the BND by narrowly approving an initiative requiring all state, county, township, municipal and school district funds be deposited there.
In March 1920 the NPL legislature referred to the people a constitutional amendment allowing them to petition for the recall of any elected official. That unprecedented extension of direct democracy proved the NPL’s undoing, for in late 1918, at the peak of the NPL’s power, political opposition had coalesced into a new organization, the Independent Voters Association (IVA). As the NPL battled internal divisions and a growing unease that it had begun to pursue measures beyond its mandate, the IVA gained support.
The IVA used the political tools the NPL had created. In 1921 its members successfully petitioned for recall elections for the three state officers who constituted the membership of the Industrial Commission that oversaw state enterprises: the governor, attorney general and commissioner of agriculture. The IVA slate won by a whisker. It was the first time a U.S. governor had been successfully recalled, and it remained the only such recall until California’s in 2003.
The IVA immediately set about to undo the NPL program by putting nine provisions on the ballot, including one to abolish the state Bank. Another intended to shrink the capacity of state government by reducing the amount of state bonded debt. Another would have undermined the open primary by requiring separate party ballots for primaries.
Every ballot measure lost, albeit by very narrow margins.
In November 1922 the IVA achieved what the NPL had four years before: control of all three branches of state government. The NPL’s abrupt disintegration resulted from a number of factors. In 1921 the price of wheat dropped about 60 percent. The resulting economic pain would have reduced support for any sitting government. The Russian Revolution ushered in a nationwide “Red Scare.” The opposition labeled the NPL’s leaders Communists and Bolsheviks and launched a new magazine called Red Flame. Townley himself was jailed under a Minnesota sedition law for opposing U.S. involvement in WWI. Meanwhile, internal divisions continued to beset the NPL.
The Legacy of the NPL
As the Nation magazine observed in 1923, “…although the visible machinery largely melted away, a sentiment and a point of view had been established in the minds of hundreds of thousands of farmers and ranchers.” Looking back in 1955, Robert L. Morlan, author of the classic Political Prairie Fire, commented that the NPL helped to develop “some of the most independently minded electorates in the country.”
Those independently minded electorates and their anti-corporate, pro-cooperative and independent business sentiment continued to inform and often guide policymakers in the decades to come.
The North Dakota Mill and Elevator Association began operation in a modern facility in 1922. Today it consists of 7 milling units, an elevator and flour mill, and a packing warehouse to prepare bagged products for shipment. It is the largest flour mill in the U.S. and the only state-owned milling facility.
In 1932, North Dakotans voted 57 to 43 to ban corporations from owning or leasing farmland.
In 1963 the legislature enacted a law that required that pharmacies be owned by a state-registered pharmacist. The effect was to ban chains, except those operating at the time the law was passed.
In 1980 North Dakotans voted to establish a State Housing Finance Agency to provide mortgages to low income households.
In recent years several of these laws protecting independent farmers and businesses have come under attack by big corporations. After several attempts by Big Pharmacy to convince the legislature to repeal the Pharmacy Ownership Law failed, Walmart spent $9.3 million to finance a ballot initiative. In November 2014 the initiative lost by a vote of 59-41.
In 2015 big corporations did convince the legislature to overturn the 1932 anti-corporate farming law. This June, as noted at the beginning of this article, by a resounding margin of 76-24 North Dakotans voted to reinstate the old law.
Today the economic structure of North Dakota reflects its focus on independent and cooperative businesses.
The Pharmacy Ownership law, for example, has markedly benefited North Dakota. A report by the Institute for Local Self-Reliance (ILSR) found that on every key measure of pharmacy care, including quality and the price of drugs, North Dakota’s independent pharmacies outperform those of neighboring states and the U.S. as a whole. Unsurprisingly, North Dakota also has more pharmacies per capita than other states, and its rural residents are more likely to have a nearby pharmacist.
North Dakota’s banking system reflects a similar community-based structure. An analysis by ILSR found that, on a per capita basis, the state boasts almost six times as many locally owned financial institutions as the rest of the nation (89 small and mid-sized community banks and 38 credit unions). These institutions control 83 percent of the deposits in the state. North Dakota’s community banks have made 400 percent more small-business loans than the national average. Student loan rates are among the lowest in the country.
As Stacy Mitchell, Director of ILSR’s Community-Scaled Economy Initiative observes, “While the publicly owned BND might well be characterized as a socialist institution, it has had the effect of enabling North Dakota’s local banks to be very successful capitalists.” In recent years local banks in North Dakota have earned a return on capital nearly twice that of the nation’s largest 20 banks.
In the last two decades the BND has generated almost $1 billion in “profit” and returned almost half of that to the state’s general fund.
Recall that in 1919 voters had approved the Bank of North Dakota by the very slim margin of 51-49. A switch of 2,000 votes would have killed the Bank in its infancy. Today no party would dare propose its destruction.
North Dakota’s impressive 21st century telecommunications infrastructure is also a testament to its historic focus on local and independent ownership. The state ranks 47th in population density. That means it has one of the highest costs per household for installing state-of-the-art high-speed fiber networks. Nevertheless it boasts the highest percentage of people with access to such networks in the country. Why? One reason is its abundance of rural cooperatives and small telecom companies, 41 providers in all, including 17 cooperatives.
North Dakota is also home to the Dakota Carrier Network. Owned by 15 independent rural telecommunications companies, the DCN crisscrosses the state with more than 1,400 miles of fiber backbone. In the last five years independently owned companies have invested more than $100 million per year to bring fiber to the home. They now serve more than 164,000 customers in 250 communities.
What Should Bernie’s Brethren Do?
Certainly the road to political power faces many more obstacles now than the NPL faced a century ago. North Dakota was a largely agricultural state. The key to the NPL’s organizing effort was access to a car and gas money — not an easy get in those days, but a far lower barrier than the amount of money now needed to mount a political campaign.
Most new movements will be unable to take advantage of the open primary. After the NPL gained power in more than half a dozen states, the existing parties fought back. Nevertheless, 11 states still have pure open primaries; about a dozen more have hybrid systems.
Recently the courts have not been sympathetic to the open primary. Not long ago the Supreme Court invented a new “right of association” and bestowed that right on political parties. In 2000, for example, by a 7-2 vote, the Court overturned a California form of open primary approved by the voters by a 60-40 vote. Writing for the majority, Justice Antonin Scalia objected that the California law “forces political parties to associate with—to have their nominees, and hence their positions, determined by—those who, at best, have refused to affiliate with the party, and, at worst, have expressly affiliated with a rival.”
After the California decision the voters of Washington, by a similar 60-40 vote, adopted an open primary system similar to California’s but with a key difference: The candidate would have to declare a “party preference” that would appear next to his or her name on the ballot. In 2008, the Supreme Court, again by a 7-2 vote, this time upheld that law, a ruling that might allow for a variant of the NPL strategy.
Before we develop a strategy for winning office we need to take a page from the NPL playbook and develop a platform, one consisting of specific, concrete policies, not a laundry list of all desirable policies.
Bernie Sanders and his followers currently are working to write a platform for the Democratic Party convention. That is important and useful, but that platform by its nature will have a national focus and speak to the exercise of power by the federal government. We also need platforms that focus on states and cities and counties and school districts and offer concrete measures they have the authority to enact.
Those platforms will provide the basis for endorsing candidates, regardless of their political affiliation or whether they run in a closed or open primary state. In those states that permit it, we may be able to enact various planks of the platform through initiative and referendum. At this point 27 states have the initiative and 24 have the referendum; nineteen allow constitutional amendments by initiative.
The Nonpartisan League’s tenure in power was brief, but its policies, the public institutions it built and perhaps most important, the public sentiment it nurtured and brought to maturity, endure to this day: A true example of a political revolution from the bottom up.
The Age of Disintegration: Neoliberalism, Interventionism, the Resource Curse, and a Fragmenting World
By Patrick Cockburn
Reprinted with permission from TomDispatch.com
Introduction by Tom Engelhardt:
Here’s an unavoidable fact: we are now in a Brexit world. We are seeing the first signs of a major fragmentation of this planet that, until recently, the cognoscenti were convinced was globalizing rapidly and headed for unifications of all sorts. If you want a single figure that catches the grim spirit of our moment, it’s 65 million. That’s the record-setting number of people that the Office of the U.N. High Commissioner for Refugees estimates were displaced in 2015 by “conflict and persecution,” one of every 113 inhabitants of the planet. That’s more than were generated in the wake of World War II at a time when significant parts of the globe had been devastated. Of the 21 million refugees among them, 51% were children (often separated from their parents and lacking any access to education). Most of the displaced of 2015 were, in fact, internal refugees, still in their own often splintered states. Almost half of those who fled across borders have come from three countries: Syria (4.9 million), Afghanistan (2.7 million), and Somalia (1.1 million).
Despite the headlines about refugees heading for Europe — approximately a million of them made it there last year (with more dying on the way) — most of the uprooted who leave their homelands end up in poor or economically mid-level neighboring lands, with Turkey at 2.5 million refugees leading the way. In this fashion, the disruption of spreading conflicts and chaos, especially across the Greater Middle East and Africa, only brings more conflict and chaos with it wherever those refugees are forced to go.
And keep in mind that, as extreme as that 65 million figure may seem, it undoubtedly represents the beginning, not the end, of a process. For one thing, it doesn’t even include the estimated 19 million people displaced last year by extreme weather events and other natural disasters. Yet in coming decades, the heating of our planet, with attendant weather extremes (like the present heat wave in the American West) and rising sea levels, will undoubtedly produce its own waves of new refugees, only adding to both the conflicts and the fragmentation.
As Patrick Cockburn points out today, we have entered “an age of disintegration.” And he should know. There may be no Western reporter who has covered the grim dawn of that age in the Greater Middle East and North Africa — from Afghanistan to Iraq, Syria to Libya — more fully or movingly than he has over this last decade and a half. His latest book, Chaos & Caliphate: Jihadis and the West in the Struggle for the Middle East, gives a vivid taste of his reporting and of a world that is at present cracking under the pressure of the conflicts he has witnessed. And imagine that so much of this began, at the bargain-basement cost of a mere $400,000 to $500,000, with 19 (mainly Saudi) fanatics, and a few hijacked airliners. Osama bin Laden must be smiling in his watery grave. Tom
The Age of Disintegration
Neoliberalism, Interventionism, the Resource Curse, and a Fragmenting World
By Patrick Cockburn
We live in an age of disintegration. Nowhere is this more evident than in the Greater Middle East and Africa. Across the vast swath of territory between Pakistan and Nigeria, there are at least seven ongoing wars — in Afghanistan, Iraq, Syria, Yemen, Libya, Somalia, and South Sudan. These conflicts are extraordinarily destructive. They are tearing apart the countries in which they are taking place in ways that make it doubtful they will ever recover. Cities like Aleppo in Syria, Ramadi in Iraq, Taiz in Yemen, and Benghazi in Libya have been partly or entirely reduced to ruins. There are also at least three other serious insurgencies: in southeast Turkey, where Kurdish guerrillas are fighting the Turkish army, in Egypt’s Sinai Peninsula where a little-reported but ferocious guerrilla conflict is underway, and in northeast Nigeria and neighboring countries where Boko Haram continues to launch murderous attacks.
All of these have a number of things in common: they are endless and seem never to produce definitive winners or losers. (Afghanistan has effectively been at war since 1979, Somalia since 1991.) They involve the destruction or dismemberment of unified nations, their de facto partition amid mass population movements and upheavals — well publicized in the case of Syria and Iraq, less so in places like South Sudan where more than 2.4 million people have been displaced in recent years.
Add in one more similarity, no less crucial for being obvious: in most of these countries, where Islam is the dominant religion, extreme Salafi-Jihadi movements, including the Islamic State (IS), al-Qaeda, and the Taliban are essentially the only available vehicles for protest and rebellion. By now, they have completely replaced the socialist and nationalist movements that predominated in the twentieth century; these years have, that is, seen a remarkable reversion to religious, ethnic, and tribal identity, to movements that seek to establish their own exclusive territory by the persecution and expulsion of minorities.
In the process and under the pressure of outside military intervention, a vast region of the planet seems to be cracking open. Yet there is very little understanding of these processes in Washington. This was recently well illustrated by the protest of 51 State Department diplomats against President Obama’s Syrian policy and their suggestion that air strikes be launched targeting Syrian regime forces in the belief that President Bashar al-Assad would then abide by a ceasefire. The diplomats’ approach remains typically simpleminded in this most complex of conflicts, assuming as it does that the Syrian government’s barrel-bombing of civilians and other grim acts are the “root cause of the instability that continues to grip Syria and the broader region.”
It is as if the minds of these diplomats were still in the Cold War era, as if they were still fighting the Soviet Union and its allies. Against all the evidence of the last five years, there is an assumption that a barely extant moderate Syrian opposition would benefit from the fall of Assad, and a lack of understanding that the armed opposition in Syria is entirely dominated by the Islamic State and al-Qaeda clones.
Though the invasion of Iraq in 2003 is now widely admitted to have been a mistake (even by those who supported it at the time), no real lessons have been learned about why direct or indirect military interventions by the U.S. and its allies in the Middle East over the last quarter century have all only exacerbated violence and accelerated state failure.
A Mass Extinction of Independent States
The Islamic State, just celebrating its second anniversary, is the grotesque outcome of this era of chaos and conflict. That such a monstrous cult exists at all is a symptom of the deep dislocation societies throughout that region, ruled by corrupt and discredited elites, have suffered. Its rise — and that of various Taliban and al-Qaeda-style clones — is a measure of the weakness of its opponents.
The Iraqi army and security forces, for example, had 350,000 soldiers and 660,000 police on the books in June 2014 when a few thousand Islamic State fighters captured Mosul, the country’s second largest city, which they still hold. Today the Iraqi army, security services, and about 20,000 Shia paramilitaries backed by the massive firepower of the United States and allied air forces have fought their way into the city of Fallujah, 40 miles west of Baghdad, against the resistance of IS fighters who may have numbered as few as 900. In Afghanistan, the resurgence of the Taliban, supposedly decisively defeated in 2001, came about less because of the popularity of that movement than the contempt with which Afghans came to regard their corrupt government in Kabul.
Everywhere nation states are enfeebled or collapsing, as authoritarian leaders battle for survival in the face of mounting external and internal pressures. This is hardly the way the region was expected to develop. Countries that had escaped from colonial rule in the second half of the twentieth century were supposed to become more, not less, unified as time passed.
Between 1950 and 1975, nationalist leaders came to power in much of the previously colonized world. They promised to achieve national self-determination by creating powerful independent states through the concentration of whatever political, military, and economic resources were at hand. Instead, over the decades, many of these regimes transmuted into police states controlled by small numbers of staggeringly wealthy families and a coterie of businessmen dependent on their connections to such leaders as Hosni Mubarak in Egypt or Bashar al-Assad in Syria.
In recent years, such countries were also opened up to the economic whirlwind of neoliberalism, which destroyed any crude social contract that existed between rulers and ruled. Take Syria. There, rural towns and villages that had once supported the Baathist regime of the al-Assad family because it provided jobs and kept the prices of necessities low were, after 2000, abandoned to market forces skewed in favor of those in power. These places would become the backbone of the post-2011 uprising. At the same time, institutions like the Organization of Petroleum Exporting Countries (OPEC) that had done so much to enhance the wealth and power of regional oil producers in the 1970s have lost their capacity for united action.
The question for our moment: Why is a “mass extinction” of independent states taking place in the Middle East, North Africa, and beyond? Western politicians and media often refer to such countries as “failed states.” The implication embedded in that term is that the process is a self-destructive one. But several of the states now labeled “failed,” like Libya, only became so after Western-backed opposition movements seized power with the support and military intervention of Washington and NATO, and proved too weak to impose their own central governments and so establish a monopoly of violence within the national territory.
In many ways, this process began with the intervention of a U.S.-led coalition in Iraq in 2003 leading to the overthrow of Saddam Hussein, the shutting down of his Baathist Party, and the disbanding of his military. Whatever their faults, Saddam and Libya’s autocratic ruler Muammar Gaddafi were clearly demonized and blamed for all ethnic, sectarian, and regional differences in the countries they ruled, forces that were, in fact, set loose in grim ways upon their deaths.
A question remains, however: Why did the opposition to autocracy and to Western intervention take on an Islamic form and why were the Islamic movements that came to dominate the armed resistance in Iraq and Syria in particular so violent, regressive, and sectarian? Put another way, how could such groups find so many people willing to die for their causes, while their opponents found so few? When IS battle groups were sweeping through northern Iraq in the summer of 2014, soldiers who had thrown aside their uniforms and weapons and deserted that country’s northern cities would justify their flight by saying derisively: “Die for [then-Prime Minister Nouri] al-Maliki? Never!”
A common explanation for the rise of Islamic resistance movements is that the socialist, secularist, and nationalist opposition had been crushed by the old regimes’ security forces, while the Islamists were not. In countries like Libya and Syria, however, Islamists were savagely persecuted, too, and they still came to dominate the opposition. And yet, while these religious movements were strong enough to oppose governments, they generally have not proven strong enough to replace them.
Too Weak to Win, But Too Strong to Lose
Though there are clearly many reasons for the present disintegration of states and they differ somewhat from place to place, one thing is beyond question: the phenomenon itself is becoming the norm across vast reaches of the planet.
If you’re looking for the causes of state failure in our time, the place to start is undoubtedly with the end of the Cold War a quarter-century ago. Once it was over, neither the U.S. nor the new Russia that emerged from the Soviet Union’s implosion had a significant interest in continuing to prop up “failed states,” as each had for so long, fearing that the rival superpower and its local proxies would otherwise take over. Previously, national leaders in places like the Greater Middle East had been able to maintain a degree of independence for their countries by balancing between Moscow and Washington. With the break-up of the Soviet Union, this was no longer feasible.
In addition, the triumph of neoliberal free-market economics in the wake of the Soviet Union’s collapse added a critical element to the mix. It would prove far more destabilizing than it looked at the time.
Again, consider Syria. The expansion of the free market in a country where there was neither democratic accountability nor the rule of law meant one thing above all: plutocrats linked to the nation’s ruling family took anything that seemed potentially profitable. In the process, they grew staggeringly wealthy, while the denizens of Syria’s impoverished villages, country towns, and city slums, who had once looked to the state for jobs and cheap food, suffered. It should have surprised no one that those places became the strongholds of the Syrian uprising after 2011. In the capital, Damascus, as the reign of neoliberalism spread, even the lesser members of the mukhabarat, or secret police, found themselves living on only $200 to $300 a month, while the state became a machine for thievery.
This sort of thievery and the auctioning off of the nation’s patrimony spread across the region in these years. The new Egyptian ruler, General Abdel Fattah al-Sisi, merciless toward any sign of domestic dissent, was typical. In a country that once had been a standard bearer for nationalist regimes the world over, he didn’t hesitate this April to try to hand over two islands in the Red Sea to Saudi Arabia on whose funding and aid his regime is dependent. (To the surprise of everyone, an Egyptian court recently overruled Sisi’s decision.)
That gesture, deeply unpopular among increasingly impoverished Egyptians, was symbolic of a larger change in the balance of power in the Middle East: once the most powerful states in the region — Egypt, Syria, and Iraq — had been secular nationalists and a genuine counterbalance to Saudi Arabia and the Persian Gulf monarchies. As those secular autocracies weakened, however, the power and influence of the Sunni fundamentalist monarchies only increased. If 2011 saw rebellion and revolution spread across the Greater Middle East as the Arab Spring briefly blossomed, it also saw counterrevolution spread, funded by those oil-rich absolute Gulf monarchies, which were never going to tolerate democratic secular regime change in Syria or Libya.
Add in one more process at work making such states ever more fragile: the production and sale of natural resources — oil, gas, and minerals — and the kleptomania that goes with it. Such countries often suffer from what has become known as “the resource curse”: states increasingly dependent for revenues on the sale of their natural resources — enough to theoretically provide the whole population with a reasonably decent standard of living — turn instead into grotesquely corrupt dictatorships. In them, the yachts of local billionaires with crucial connections to the regime of the moment bob in harbors surrounded by slums running with raw sewage. In such nations, politics tends to focus on elites battling and maneuvering to steal state revenues and transfer them as rapidly as possible out of the country.
This has been the pattern of economic and political life in much of sub-Saharan Africa from Angola to Nigeria. In the Middle East and North Africa, however, a somewhat different system exists, one usually misunderstood by the outside world. There is similarly great inequality in Iraq or Saudi Arabia with similarly kleptocratic elites. They have, however, ruled over patronage states in which a significant part of the population is offered jobs in the public sector in return for political passivity or support for the kleptocrats.
In Iraq, with a population of 33 million people, for instance, no fewer than seven million are on the government payroll, thanks to salaries or pensions that cost the government $4 billion a month. This crude way of distributing oil revenues to the people has often been denounced by Western commentators and economists as corruption. They, in turn, generally recommend cutting the number of these jobs, but this would mean that all, rather than just part, of the state’s resource revenues would be stolen by the elite. This, in fact, is increasingly the case in such lands as oil prices bottom out and even the Saudi royals begin to cut back on state support for the populace.
Neoliberalism was once believed to be the path to secular democracy and free-market economies. In practice, it has been anything but. Instead, in conjunction with the resource curse, as well as repeated military interventions by Washington and its allies, free-market economics has profoundly destabilized the Greater Middle East. Encouraged by Washington and Brussels, twenty-first-century neoliberalism has made unequal societies ever more unequal and helped transform already corrupt regimes into looting machines. This is also, of course, a formula for the success of the Islamic State or any other radical alternative to the status quo. Such movements are bound to find support in impoverished or neglected regions like eastern Syria or eastern Libya.
Note, however, that this process of destabilization is by no means confined to the Greater Middle East and North Africa. We are indeed in the age of destabilization, a phenomenon that is on the rise globally and at present spreading into the Balkans and Eastern Europe (with the European Union ever less able to influence events there). People no longer speak of European integration, but of how to prevent the complete break-up of the European Union in the wake of the British vote to leave.
The reasons why a narrow majority of Britons voted for Brexit have parallels with the Middle East: the free-market economic policies pursued by governments since Margaret Thatcher was prime minister have widened the gap between rich and poor and between wealthy cities and much of the rest of the country. Britain might be doing well, but millions of Britons did not share in the prosperity. The referendum about continued membership in the European Union, the option almost universally advocated by the British establishment, became the catalyst for protest against the status quo. The anger of the “Leave” voters has much in common with that of Donald Trump supporters in the United States.
The U.S. remains a superpower, but is no longer as powerful as it once was. It, too, is feeling the strains of this global moment, in which it and its local allies are powerful enough to imagine they can get rid of regimes they do not like, but either they do not quite succeed, as in Syria, or succeed but cannot replace what they have destroyed, as in Libya. An Iraqi politician once said that the problem in his country was that parties and movements were “too weak to win, but too strong to lose.” This is increasingly the pattern for the whole region and is spreading elsewhere. It carries with it the possibility of an endless cycle of indecisive wars and an era of instability that has already begun.
Patrick Cockburn is a Middle East correspondent for the Independent of London and the author of five books on the Middle East, the latest of which is Chaos and Caliphate: Jihadis and the West in the Struggle for the Middle East (OR Books).
Copyright 2016 Patrick Cockburn
Sunday, April 17th was the designated moment. The world’s leading oil producers were expected to bring fresh discipline to the chaotic petroleum market and spark a return to high prices. Meeting in Doha, the glittering capital of petroleum-rich Qatar, the oil ministers of the Organization of the Petroleum Exporting Countries (OPEC), along with such key non-OPEC producers as Russia and Mexico, were scheduled to ratify a draft agreement obliging them to freeze their oil output at current levels. In anticipation of such a deal, oil prices had begun to creep inexorably upward, from $30 per barrel in mid-January to $43 on the eve of the gathering. But far from restoring the old oil order, the meeting ended in discord, driving prices down again and revealing deep cracks in the ranks of global energy producers.
It is hard to overstate the significance of the Doha debacle. At the very least, it will perpetuate the low oil prices that have plagued the industry for the past two years, forcing smaller firms into bankruptcy and erasing hundreds of billions of dollars of investments in new production capacity. It may also have obliterated any future prospects for cooperation between OPEC and non-OPEC producers in regulating the market. Most of all, however, it demonstrated that the petroleum-fueled world we’ve known these last decades — with oil demand always thrusting ahead of supply, ensuring steady profits for all major producers — is no more. Replacing it is an anemic, possibly even declining, demand for oil that is likely to force suppliers to fight one another for ever-diminishing market shares.
The Road to Doha
Before the Doha gathering, the leaders of the major producing countries expressed confidence that a production freeze would finally halt the devastating slump in oil prices that began in mid-2014. Most of them are heavily dependent on petroleum exports to finance their governments and keep restiveness among their populaces at bay. Both Russia and Venezuela, for instance, rely on energy exports for approximately 50% of government income, while for Nigeria it’s more like 75%. So the plunge in prices had already cut deep into government spending around the world, causing civil unrest and even in some cases political turmoil.
No one expected the April 17th meeting to result in an immediate, dramatic price upturn, but everyone hoped that it would lay the foundation for a steady rise in the coming months. The leaders of these countries were well aware of one thing: to achieve such progress, unity was crucial. Otherwise they were not likely to overcome the various factors that had caused the price collapse in the first place. Some of these were structural and embedded deep in the way the industry had been organized; some were the product of their own feckless responses to the crisis.
On the structural side, global demand for energy had, in recent years, ceased to rise quickly enough to soak up all the crude oil pouring onto the market, thanks in part to new supplies from Iraq and especially from the expanding shale fields of the United States. This oversupply triggered the initial 2014 price drop when Brent crude — the international benchmark blend — went from a high of $115 on June 19th to $77 on November 26th, the day before a fateful OPEC meeting in Vienna. The next day, OPEC members, led by Saudi Arabia, failed to agree on either production cuts or a freeze, and the price of oil went into freefall.
The failure of that November meeting has been widely attributed to the Saudis’ desire to kill off new output elsewhere — especially shale production in the United States — and to restore their historic dominance of the global oil market. Many analysts were also convinced that Riyadh was seeking to punish regional rivals Iran and Russia for their support of the Assad regime in Syria (which the Saudis seek to topple).
The rejection, in other words, was meant to fulfill two tasks at the same time: blunt or wipe out the challenge posed by North American shale producers and undermine two economically shaky energy powers that opposed Saudi goals in the Middle East by depriving them of much needed oil revenues. Because Saudi Arabia could produce oil so much more cheaply than other countries — for as little as $3 per barrel — and because it could draw upon hundreds of billions of dollars in sovereign wealth funds to meet any budget shortfalls of its own, its leaders believed it more capable of weathering any price downturn than its rivals. Today, however, that rosy prediction is looking grimmer as the Saudi royals begin to feel the pinch of low oil prices, and find themselves cutting back on the benefits they had been passing on to an ever-growing, potentially restive population while still financing a costly, inconclusive, and increasingly disastrous war in Yemen.
Many energy analysts became convinced that Doha would prove the decisive moment when Riyadh would finally be amenable to a production freeze. Just days before the conference, participants expressed growing confidence that such a plan would indeed be adopted. After all, preliminary negotiations between Russia, Venezuela, Qatar, and Saudi Arabia had produced a draft document that most participants assumed was essentially ready for signature. The only sticking point: the nature of Iran’s participation.
The Iranians were, in fact, agreeable to such a freeze, but only after they were allowed to raise their relatively modest daily output to levels achieved in 2012 before the West imposed sanctions in an effort to force Tehran to agree to dismantle its nuclear enrichment program. Now that those sanctions were, in fact, being lifted as a result of the recently concluded nuclear deal, Tehran was determined to restore the status quo ante. On this, the Saudis balked, having no wish to see their arch-rival obtain added oil revenues. Still, most observers assumed that, in the end, Riyadh would agree to a formula allowing Iran some increase before a freeze. “There are positive indications an agreement will be reached during this meeting… an initial agreement on freezing production,” said Nawal Al-Fuzaia, Kuwait’s OPEC representative, echoing the views of other Doha participants.
But then something happened. According to people familiar with the sequence of events, Saudi Arabia’s Deputy Crown Prince and key oil strategist, Mohammed bin Salman, called the Saudi delegation in Doha at 3:00 a.m. on April 17th and instructed them to spurn a deal that provided leeway of any sort for Iran. When the Iranians — who chose not to attend the meeting — signaled that they had no intention of freezing their output to satisfy their rivals, the Saudis rejected the draft agreement they had helped negotiate and the assembly ended in disarray.
Geopolitics to the Fore
Most analysts have since suggested that the Saudi royals simply considered punishing Iran more important than raising oil prices. No matter the cost to them, in other words, they could not bring themselves to help Iran pursue its geopolitical objectives, including giving yet more support to Shiite forces in Iraq, Syria, Yemen, and Lebanon. Already feeling pressured by Tehran and ever less confident of Washington’s support, they were ready to use any means available to weaken the Iranians, whatever the danger to themselves.
“The failure to reach an agreement in Doha is a reminder that Saudi Arabia is in no mood to do Iran any favors right now and that their ongoing geopolitical conflict cannot be discounted as an element of the current Saudi oil policy,” said Jason Bordoff of the Center on Global Energy Policy at Columbia University.
Many analysts also pointed to the rising influence of Deputy Crown Prince Mohammed bin Salman, entrusted with near-total control of the economy and the military by his aging father, King Salman. As Minister of Defense, the prince has spearheaded the Saudi drive to counter the Iranians in a regional struggle for dominance. Most significantly, he is the main force behind Saudi Arabia’s ongoing intervention in Yemen, aimed at defeating the Houthi rebels, a largely Shia group with loose ties to Iran, and restoring deposed former president Abd Rabbuh Mansur Hadi. After a year of relentless U.S.-backed airstrikes (including the use of cluster bombs), the Saudi intervention has, in fact, failed to achieve its intended objectives, though it has produced thousands of civilian casualties, provoking fierce condemnation from U.N. officials, and created space for the rise of al-Qaeda in the Arabian Peninsula. Nevertheless, the prince seems determined to keep the conflict going and to counter Iranian influence across the region.
For Prince Mohammed, the oil market has evidently become just another arena for this ongoing struggle. “Under his guidance,” the Financial Times noted in April, “Saudi Arabia’s oil policy appears to be less driven by the price of crude than global politics, particularly Riyadh’s bitter rivalry with post-sanctions Tehran.” This seems to have been the backstory for Riyadh’s last-minute decision to scuttle the talks in Doha. On April 16th, for instance, Prince Mohammed couldn’t have been blunter to Bloomberg, even if he didn’t mention the Iranians by name: “If all major producers don’t freeze production, we will not freeze production.”
With the proposed agreement in tatters, Saudi Arabia is now expected to boost its own output, ensuring that prices will remain bargain-basement low and so deprive Iran of any windfall from its expected increase in exports. The kingdom, Prince Mohammed told Bloomberg, was prepared to immediately raise production from its current 10.2 million barrels per day to 11.5 million barrels and could add another million barrels “if we wanted to” in the next six to nine months. With Iranian and Iraqi oil heading for market in larger quantities, that’s the definition of oversupply. It would certainly ensure Saudi Arabia’s continued dominance of the market, but it might also wound the kingdom in a major way, if not fatally.
A New Global Reality
No doubt geopolitics played a significant role in the Saudi decision, but that’s hardly the whole story. Overshadowing discussions about a possible production freeze was a new fact of life for the oil industry: the past would be no predictor of the future when it came to global oil demand. Whatever the Saudis think of the Iranians or vice versa, their industry is being fundamentally transformed, altering relationships among the major producers and eroding their inclination to cooperate.
Until very recently, it was assumed that the demand for oil would continue to expand indefinitely, creating space for multiple producers to enter the market, and for ones already in it to increase their output. Even when supply outran demand and drove prices down, as has periodically occurred, producers could always take solace in the knowledge that, as in the past, demand would eventually rebound, jacking prices up again. Under such circumstances and at such a moment, it was just good sense for individual producers to cooperate in lowering output, knowing that everyone would benefit sooner or later from the inevitable price increase.
But what happens if confidence in the eventual resurgence of demand begins to wither? Then the incentives to cooperate begin to evaporate, too, and it’s every producer for itself in a mad scramble to protect market share. This new reality — a world in which “peak oil demand,” rather than “peak oil,” will shape the consciousness of major players — is what the Doha catastrophe foreshadowed.
At the beginning of this century, many energy analysts were convinced that we were at the edge of the arrival of “peak oil”: a peak, that is, in the output of petroleum, with planetary reserves exhausted long before the demand for oil disappeared, triggering a global economic crisis. As a result of advances in drilling technology, however, the supply of oil has continued to grow, while demand has unexpectedly begun to stall. This can be traced both to slowing economic growth globally and to an accelerating “green revolution” in which the planet is transitioning to non-carbon fuel sources. With most nations now committed to measures aimed at reducing emissions of greenhouse gases under the just-signed Paris climate accord, the demand for oil is likely to experience significant declines in the years ahead. In other words, global oil demand will peak long before supplies begin to run low, creating a monumental challenge for the oil-producing countries.
This is no theoretical construct. It’s reality itself. Net consumption of oil in the advanced industrialized nations has already dropped from 50 million barrels per day in 2005 to 45 million barrels in 2014. Further declines are in store as strict fuel efficiency standards for the production of new vehicles and other climate-related measures take effect, the price of solar and wind power continues to fall, and other alternative energy sources come on line. While the demand for oil does continue to rise in the developing world, even there it’s not climbing at rates previously taken for granted. With such countries also beginning to impose tougher constraints on carbon emissions, global consumption is expected to reach a peak and begin an inexorable decline. According to experts Thijs Van de Graaf and Aviel Verbruggen, overall world peak demand could be reached as early as 2020.
In such a world, high-cost oil producers will be driven out of the market and the advantage — such as it is — will lie with the lowest-cost ones. Countries that depend on petroleum exports for a large share of their revenues will come under increasing pressure to move away from excessive reliance on oil. This may have been another consideration in the Saudi decision at Doha. In the months leading up to the April meeting, senior Saudi officials dropped hints that they were beginning to plan for a post-petroleum era and that Deputy Crown Prince bin Salman would play a key role in overseeing the transition.
On April 1st, the prince himself indicated that steps were underway to begin this process. As part of the effort, he announced, he was planning an initial public offering of shares in state-owned Saudi Aramco, the world’s number one oil producer, and would transfer the proceeds, an estimated $2 trillion, to its Public Investment Fund (PIF). “IPOing Aramco and transferring its shares to PIF will technically make investments the source of Saudi government revenue, not oil,” the prince pointed out. “What is left now is to diversify investments. So within 20 years, we will be an economy or state that doesn’t depend mainly on oil.”
For a country that more than any other has rested its claim to wealth and power on the production and sale of petroleum, this is a revolutionary statement. If Saudi Arabia says it is ready to begin a move away from reliance on petroleum, we are indeed entering a new world in which, among other things, the titans of oil production will no longer hold sway over our lives as they have in the past.
This, in fact, appears to be the outlook adopted by Prince Mohammed in the wake of the Doha debacle. In announcing the kingdom’s new economic blueprint on April 25th, he vowed to liberate the country from its “addiction” to oil. This will not, of course, be easy to achieve, given the kingdom’s heavy reliance on oil revenues and lack of plausible alternatives. The 30-year-old prince could also face opposition from within the royal family to his audacious moves (as well as his blundering ones in Yemen and possibly elsewhere). Whatever the fate of the Saudi royals, however, if predictions of a future peak in world oil demand prove accurate, the debacle in Doha will be seen as marking the beginning of the end of the old oil order.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Nick Turse’s Tomorrow’s Battlefield: U.S. Proxy Wars and Secret Ops in Africa, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Michael T. Klare
By Robert Reich
Reprinted from Robert Reich’s blog at robertreich.org
A crowning achievement of the historic March on Washington, where Dr. Martin Luther King gave his “I have a dream” speech, was pushing through the landmark Voting Rights Act of 1965. Recognizing the history of racist attempts to prevent Black people from voting, that federal law forced a number of southern states and districts to adhere to federal guidelines allowing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like drivers licenses—to vote even though there’s no evidence of the voter fraud this is supposed to prevent. But there’s plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, has practically closed driver’s license offices in counties with large percentages of black voters. Wisconsin requires a government-issued photo ID but hasn’t provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control in Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver’s licenses, allowing at least 2 weeks of early voting, and taking districting away from the politicians and putting it under independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
By Thomas Frank, author of the just-published Listen, Liberal, or What Ever Happened to the Party of the People? (Metropolitan Books) from which this essay is adapted. He has also written Pity the Billionaire, The Wrecking Crew, and What’s the Matter With Kansas? among other works. He is the founding editor of The Baffler. Reprinted with permission from TomDispatch.com
When you press Democrats on their uninspiring deeds — their lousy free trade deals, for example, or their flaccid response to Wall Street misbehavior — when you press them on any of these things, they automatically reply that this is the best anyone could have done. After all, they had to deal with those awful Republicans, and those awful Republicans wouldn’t let the really good stuff get through. They filibustered in the Senate. They gerrymandered the congressional districts. And besides, change takes a long time. Surely you don’t think the tepid-to-lukewarm things Bill Clinton and Barack Obama have done in Washington really represent the fiery Democratic soul.
So let’s go to a place that does. Let’s choose a locale where Democratic rule is virtually unopposed, a place where Republican obstruction and sabotage can’t taint the experiment.
Let’s go to Boston, Massachusetts, the spiritual homeland of the professional class and a place where the ideology of modern liberalism has been permitted to grow and flourish without challenge or restraint. As the seat of American higher learning, Boston unsurprisingly anchors one of the most Democratic of states, a place where elected Republicans (like the new governor) are highly unusual. This is the city that virtually invented the blue-state economic model, in which prosperity arises from higher education and the knowledge-based industries that surround it.
The coming of post-industrial society has treated this most ancient of American cities extremely well. Massachusetts routinely occupies the number one spot on the State New Economy Index, a measure of how “knowledge-based, globalized, entrepreneurial, IT-driven, and innovation-based” a place happens to be. Boston ranks high on many of Richard Florida’s statistical indices of approbation — in 2003, it was number one on the “creative class index,” number three in innovation and in high tech — and his many books marvel at the city’s concentration of venture capital, its allure to young people, or the time it enticed some firm away from some unenlightened locale in the hinterlands.
Boston’s knowledge economy is the best, and it is the oldest. Boston’s metro area encompasses some 85 private colleges and universities, the greatest concentration of higher-ed institutions in the country — probably in the world. The region has all the ancillary advantages to show for this: a highly educated population, an unusually large number of patents, and more Nobel laureates than any other city in the country.
The city’s Route 128 corridor was the original model for a suburban tech district, lined ever since it was built with defense contractors and computer manufacturers. The suburbs situated along this golden thoroughfare are among the wealthiest municipalities in the nation, populated by engineers, lawyers, and aerospace workers. Their public schools are excellent, their downtowns are cute, and back in the seventies their socially enlightened residents were the prototype for the figure of the “suburban liberal.”
Another prototype: the Massachusetts Institute of Technology, situated in Cambridge, is where our modern conception of the university as an incubator for business enterprises began. According to a report on MIT’s achievements in this category, the school’s alumni have started nearly 26,000 companies over the years, including Intel, Hewlett Packard, and Qualcomm. If you were to take those 26,000 companies as a separate nation, the report tells us, its economy would be one of the most productive in the world.
Then there are Boston’s many biotech and pharmaceutical concerns, grouped together in what is known as the “life sciences super cluster,” which, properly understood, is part of an “ecosystem” in which PhDs can “partner” with venture capitalists and in which big pharmaceutical firms can acquire small ones. While other industries shrivel, the Boston super cluster grows, with the life-sciences professionals of the world lighting out for the Athens of America and the massive new “innovation centers” shoehorning themselves one after the other into the crowded academic suburb of Cambridge.
To think about it slightly more critically, Boston is the headquarters for two industries that are steadily bankrupting middle America: big learning and big medicine, both of them imposing costs that everyone else is basically required to pay and which increase at a far more rapid pace than wages or inflation. A thousand dollars a pill, 30 grand a semester: the debts that are gradually choking the life out of people where you live are what has made this city so very rich.
Perhaps it makes sense, then, that another category in which Massachusetts ranks highly is inequality. Once the visitor leaves the brainy bustle of Boston, he discovers that this state is filled with wreckage — with former manufacturing towns in which workers watch their way of life draining away, and with cities that are little more than warehouses for people on Medicare. According to one survey, Massachusetts has the eighth-worst rate of income inequality among the states; by another metric it ranks fourth. However you choose to measure the diverging fortunes of the country’s top 10% and the rest, Massachusetts always seems to finish among the nation’s most unequal places.
Seething City on a Cliff
You can see what I mean when you visit Fall River, an old mill town 50 miles south of Boston. Median household income in that city is $33,000, among the lowest in the state; unemployment is among the highest, 15% in March 2014, nearly five years after the recession ended. Twenty-three percent of Fall River’s inhabitants live in poverty. The city lost its many fabric-making concerns decades ago and with them it lost its reason for being. People have been deserting the place for decades.
Many of the empty factories in which their ancestors worked are still standing, however. Solid nineteenth-century structures of granite or brick, these huge boxes dominate the city visually — there always seems to be one or two of them in the vista, contrasting painfully with whatever colorful plastic fast-food joint has been slapped up next door.
Most of the old factories are boarded up, unmistakable emblems of hopelessness right up to the roof. But the ones that have been successfully repurposed are in some ways even worse, filled as they often are with enterprises offering cheap suits or help with drug addiction. A clinic in the hulk of one abandoned mill has a sign on the window reading simply “Cancer & Blood.”
The effect of all this is to remind you with every prospect that this is a place and a way of life from which the politicians have withdrawn their blessing. Like so many other American scenes, this one is the product of decades of deindustrialization, engineered by Republicans and rationalized by Democrats. This is a place where affluence never returns — not because affluence for Fall River is impossible or unimaginable, but because our country’s leaders have blandly accepted a social order that constantly bids down the wages of people like these while bidding up the rewards for innovators, creatives, and professionals.
Even the city’s one real hope for new employment opportunities — an Amazon warehouse that is now in the planning stages — will serve to lock in this relationship. If all goes according to plan, and if Amazon sticks to the practices it has pioneered elsewhere, people from Fall River will one day get to do exhausting work with few benefits while being electronically monitored for efficiency, in order to save the affluent customers of nearby Boston a few pennies when they buy books or electronics.
But that is all in the future. These days, the local newspaper publishes an endless stream of stories about drug arrests, shootings, drunk-driving crashes, the stupidity of local politicians, and the lamentable surplus of “affordable housing.” The town is up to its eyeballs in wrathful bitterness against public workers. As in: Why do they deserve a decent life when the rest of us have no chance at all? It’s every man for himself here in a “competition for crumbs,” as a Fall River friend puts it.
The Great Entrepreneurial Awakening
If Fall River is pocked with empty mills, the streets of Boston are dotted with facilities intended to make innovation and entrepreneurship easy and convenient. I was surprised to discover, during the time I spent exploring the city’s political landscape, that Boston boasts a full-blown Innovation District, a disused industrial neighborhood that has actually been zoned creative — a projection of the post-industrial blue-state ideal onto the urban grid itself. The heart of the neighborhood is a building called “District Hall” — “Boston’s New Home for Innovation” — which appeared to me to be a glorified multipurpose room, enclosed in a sharply angular façade, and sharing a roof with a restaurant that offers “inventive cuisine for innovative people.” The Wi-Fi was free, the screens on the walls displayed famous quotations about creativity, and the walls themselves were covered with a high-gloss finish meant to be written on with dry-erase markers; but otherwise it was not much different from an ordinary public library. Aside from not having anything to read, that is.
This was my introduction to the innovation infrastructure of the city, much of it built up by entrepreneurs shrewdly angling to grab a piece of the entrepreneur craze. There are “co-working” spaces, shared offices for startups that can’t afford the real thing. There are startup “incubators” and startup “accelerators,” which aim to ease the innovator’s eternal struggle with an uncaring public: the Startup Institute, for example, and the famous MassChallenge, the “World’s Largest Startup Accelerator,” which runs an annual competition for new companies and hands out prizes at the end.
And then there are the innovation Democrats, led by former Governor Deval Patrick, who presided over the Massachusetts government from 2007 to 2015. He is typical of liberal-class leaders; you might even say he is their most successful exemplar. Everyone seems to like him, even his opponents. He is a witty and affable public speaker as well as a man of competence, a highly educated technocrat who is comfortable in corporate surroundings. Thanks to his upbringing in a Chicago housing project, he also understands the plight of the poor, and (perhaps best of all) he is an honest politician in a state accustomed to wide-open corruption. Patrick was also the first black governor of Massachusetts and, in some ways, an ideal Democrat for the era of Barack Obama — who, as it happens, is one of his closest political allies.
As governor, Patrick became a kind of missionary for the innovation cult. “The Massachusetts economy is an innovation economy,” he liked to declare, and he made similar comments countless times, slightly varying the order of the optimistic keywords: “Innovation is a centerpiece of the Massachusetts economy,” et cetera. The governor opened “innovation schools,” a species of ramped-up charter school. He signed the “Social Innovation Compact,” which had something to do with meeting “the private sector’s need for skilled entry-level professional talent.” In a 2009 speech called “The Innovation Economy,” Patrick elaborated the political theory of innovation in greater detail, telling an audience of corporate types in Silicon Valley about Massachusetts’s “high concentration of brainpower” and “world-class” universities, and how “we in government are actively partnering with the private sector and the universities, to strengthen our innovation industries.”
What did all of this inno-talk mean? Much of the time, it was pure applesauce — standard-issue platitudes to be rolled out every time some pharmaceutical company opened an office building somewhere in the state.
On some occasions, Patrick’s favorite buzzword came with a gigantic price tag, like the billion dollars in subsidies and tax breaks that the governor authorized in 2008 to encourage pharmaceutical and biotech companies to do business in Massachusetts. On still other occasions, favoring inno has meant bulldozing the people in its path — for instance, the taxi drivers whose livelihoods are being usurped by ridesharing apps like Uber. When these workers staged a variety of protests in the Boston area, Patrick intervened decisively on the side of the distant software company. Apparently convenience for the people who ride in taxis was more important than good pay for people who drive those taxis. It probably didn’t hurt that Uber had hired a former Patrick aide as a lobbyist, but the real point was, of course, innovation: Uber was the future, the taxi drivers were the past, and the path for Massachusetts was obvious.
A short while later, Patrick became something of an innovator himself. After his time as governor came to an end last year, he won a job as a managing director of Bain Capital, the private equity firm that was founded by his predecessor Mitt Romney — and that had been so powerfully denounced by Democrats during the 2012 election. Patrick spoke about the job like it was just another startup: “It was a happy and timely coincidence I was interested in building a business that Bain was also interested in building,” he told the Wall Street Journal. Romney reportedly phoned him with congratulations.
At a 2014 celebration of Governor Patrick’s innovation leadership, Google’s Eric Schmidt announced that “if you want to solve the economic problems of the U.S., create more entrepreneurs.” That sort of sums up the ideology in this corporate commonwealth: Entrepreneurs first. But how has such a doctrine become holy writ in a party dedicated to the welfare of the common man? And how has all this come to pass in the liberal state of Massachusetts?
The answer is that I’ve got the wrong liberalism. The kind of liberalism that has dominated Massachusetts for the last few decades isn’t the stuff of Franklin Roosevelt or the United Auto Workers; it’s the Route 128/suburban-professionals variety. (Senator Elizabeth Warren is the great exception to this rule.) Professional-class liberals aren’t really alarmed by oversized rewards for society’s winners. On the contrary, this seems natural to them — because they are society’s winners. The liberalism of professionals just does not extend to matters of inequality; this is the area where soft hearts abruptly turn hard.
Innovation liberalism is “a liberalism of the rich,” to use the straightforward phrase of local labor leader Harris Gruman. This doctrine has no patience with the idea that everyone should share in society’s wealth. What Massachusetts liberals pine for, by and large, is a more perfect meritocracy — a system where the essential thing is to ensure that the truly talented get into the right schools and then get to rise through the ranks of society. Unfortunately, however, as the blue-state model makes painfully clear, there is no solidarity in a meritocracy. The ideology of educational achievement conveniently negates any esteem we might feel for the poorly graduated.
This is a curious phenomenon, is it not? A blue state where the Democrats maintain transparent connections to high finance and big pharma; where they have deliberately chosen distant software barons over working-class members of their own society; and where their chief economic proposals have to do with promoting “innovation,” a grand and promising idea that remains suspiciously vague. Nor can these innovation Democrats claim that their hands were forced by Republicans. They came up with this program all on their own.
Thomas Frank is the author of the just-published Listen, Liberal, or What Ever Happened to the Party of the People? (Metropolitan Books) from which this essay is adapted. He has also written Pity the Billionaire, The Wrecking Crew, and What’s the Matter With Kansas? among other works. He is the founding editor of The Baffler.
Copyright 2016 Thomas Frank
The other week, feeling sick, I spent a day on my couch with the TV on and was reminded of an odd fact of American life. More than seven months before Election Day, you can watch the 2016 campaign for the presidency at any moment of your choosing, and that’s been true since at least late last year. There is essentially never a time when some network or news channel isn’t reporting on, discussing, debating, analyzing, speculating about, or simply drooling over some aspect of the primary campaign, of Hillary, Bernie, Ted, and above all — a million times above all — The Donald (from the violence at his rallies to the size of his hands). In case you’re young and think this is more or less the American norm, it isn’t. Or wasn’t.
Truly, there is something new under the sun. Of course, in 1994 with O.J. Simpson’s white Ford Bronco chase (95 million viewers!), the 24/7 media event arrived full blown in American life and something changed when it came to the way we focused on our world and the media focused on us. But you can be sure of one thing: never in the history of television, or any other form of media, has a single figure garnered the amount of attention — hour after hour, day after day, week after week — as Donald Trump. If he’s the O.J. Simpson of twenty-first-century American politics and his run for the presidency is the eternal white Ford Bronco chase of our moment, then we’re in a truly strange world.
Or let me put it another way: this is not an election. I know the word “election” is being used every five seconds and somewhere along the line significant numbers of Americans (particularly, this season, Republicans) continue to enter voting booths or in the case of primary caucuses, school gyms and the like, to choose among various candidates, so it’s all still election-like. But take my word for it as a 71-year-old guy who’s been watching our politics for decades: this is not an election of the kind the textbooks once taught us was so crucial to American democracy. If, however, you’re sitting there waiting for me to tell you what it is, take a breath and don’t be too disappointed. I have no idea, though it’s certainly part bread-and-circuses spectacle, part celebrity obsession, and part media money machine.
Actually, before we go further, let me hedge my bets on the idea that Donald Trump is a twenty-first-century O.J. Simpson. It’s certainly a reasonable enough comparison, but I’ve begun to wonder about the usefulness of just about any comparison in our present situation. Even the most nightmarish of them — Donald Trump is Adolf Hitler, Benito Mussolini, or any past extreme demagogue of your choice — may actually prove to be covert gestures of consolation, reassurance, and comfort. Yes, what’s happening in our world is increasingly extreme and could hardly be weirder, we seem to have the urge to say, but it’s still recognizable. It’s something we’ve encountered before, something we’ve made sense of in the past and, in the process, overcome.
Round Up the Usual Suspects
But what if that’s not true? In some ways, the most frightening, least acceptable thing to say about our American world right now — even if Donald Trump’s overwhelming presence all but begs us to say it — is that we’ve entered uncharted territory and, under the circumstances, comparisons might actually impair our ability to come to grips with our new reality. My own suspicion: Donald Trump is only the most obvious instance of this, the example no one can miss.
In these first years of the twenty-first century, we may be witnessing a new world being born inside the hollowed-out shell of the American system. As yet, though we live with this reality every day, we evidently just can’t bear to recognize it for what it might be. When we survey the landscape, what we tend to focus on is that shell — the usual elections (in somewhat heightened form), the usual governmental bodies (a little tarnished) with the usual governmental powers (a little diminished or redistributed), including the usual checks and balances (a little out of whack), and the same old Constitution (much praised in its absence), and yes, we know that none of this is working particularly well, or sometimes at all, but it still feels comfortable to view what we have as a reduced, shabbier, and more dysfunctional version of the known.
Perhaps, however, it’s increasingly a version of the unknown. We say, for instance, that Congress is “paralyzed,” and that little can be done in a country where politics has become so “polarized,” and we wait for something to shake us loose from that “paralysis,” to return us to a Washington closer to what we remember and recognize. But maybe this is it. Maybe even if the Republicans somehow lost control of the House of Representatives and the Senate, we would still be in a situation something like what we’re now labeling paralysis. Maybe in our new American reality, Congress is actually some kind of glorified, well-lobbied, and well-financed version of a peanut gallery.
Of course, I don’t want to deny that much of what is “new” in our world has a long history. The present yawning inequality gap between the 1% and ordinary Americans first began to widen in the 1970s and — as Thomas Frank explains so brilliantly in his new book, Listen, Liberal — was already a powerful and much-discussed reality in the early 1990s, when Bill Clinton ran for president. Yes, that gap is now more like an abyss and looks ever more permanently embedded in the American system, but it has a genuine history, as for instance do 1% elections and the rise and self-organization of the “billionaire class,” even if no one, until this second, imagined that government of the billionaires, by the billionaires, and for the billionaires might devolve into government of the billionaire, by the billionaire, and for the billionaire — that is, just one of them.
Indeed, much of our shape-shifting world can be written about as a set of comparisons and in terms of historical reference points. Inequality has a history. The military-industrial complex and the all-volunteer military, like the warrior corporation, weren’t born yesterday; neither was our state of perpetual war, nor the national security state that now looms over Washington, nor its surveilling urge, the desire to know far too much about the private lives of Americans. (A little bow of remembrance to FBI Director J. Edgar Hoover is in order here.)
And yet, true as all that may be, Washington increasingly seems like a new land, sporting something like a new system in the midst of our much-described polarized and paralyzed politics. The national security state doesn’t seem faintly paralyzed or polarized to me. Nor does the Pentagon. On certain days when I catch the news, I can’t believe how strange and yet humdrum this uncharted new territory is. Remind me, for instance, where in the Constitution the Founding Fathers wrote about that national security state? And yet there it is in all its glory, all its powers, an ever more independent force in our nation’s capital. In what way, for instance, did those men of the revolutionary era prepare the ground for the Pentagon to loose its spy drones from our distant war zones over the United States? And yet, so it has. And no one even seems disturbed by the development. The news, barely noticed or noted, was instantly absorbed into what’s becoming the new normal.
Graduation Ceremonies in the Imperium
Let me mention here the almost random piece of news that recently made me wonder just what planet I was actually on. And I know you won’t believe it, but it had absolutely nothing to do with Donald Trump.
Given the carnage of America’s wars and conflicts across the Greater Middle East and Africa, which I’ve been following closely these last years, I’m unsure why this particular moment even got to me. Best guess? Maybe that, of all the once-obscure places — from Afghanistan to Yemen to Libya — in which the U.S. has been fighting recently, Somalia, where this particular little slaughter took place, seems to me like the most obscure of all. Yes, I’ve been half-attending to events there from the 1993 Black Hawk Down moment to the disastrous U.S.-backed Ethiopian invasion of 2006 to the hardly less disastrous invasion of that country by Kenyan and other African forces. Still, Somalia?
Recently, U.S. Reaper drones and manned aircraft launched a set of strikes against what the Pentagon claimed was a graduation ceremony for “low-level” foot soldiers in the Somali terror group al-Shabab. It was proudly announced that more than 150 Somalis had died in this attack. In a country where, in recent years, U.S. drones and special ops forces had carried out a modest number of strikes against individual al-Shabab leaders, this might be thought of as a distinct escalation of Washington’s endless low-level conflict there (with a raid involving U.S. special ops forces following soon after).
Now, let me try to put this in some personal context. Since I was a kid, I’ve always liked globes and maps. I have a reasonable sense of where most countries on this planet are. Still, Somalia? I have to stop and give that one some thought to truly locate it on a mental map of eastern Africa. Most Americans? Honestly, I doubt they’d have a clue. So the other day, when this news came out, I stopped a moment to take it in. If accurate, we killed 150 more or less nobodies (except to those who knew them) and maybe even a top leader or two in a country most Americans couldn’t locate on a map.
I mean, don’t you find that just a little odd, no matter how horrible the organization they were preparing to fight for? 150 Somalis? Blam!
Remind me: On just what basis was this modest massacre carried out? After all, the U.S. isn’t at war with Somalia or with al-Shabab. Of course, Congress no longer plays any real role in decisions about American war making. It no longer declares war on any group or country we fight. (Paralysis!) War is now purely a matter of executive power or, in reality, the collective power of the national security state and the White House. The essential explanation offered for the Somali strike, for instance, is that the U.S. had a small set of advisers stationed with African Union forces in that country and it was just faintly possible that those guerrilla graduates might soon prepare to attack some of those forces (and hence U.S. military personnel). It seems that if the U.S. puts advisers in place anywhere on the planet — and any day of any year they are now in scores of countries — that’s excuse enough to validate acts of war based on the “imminent” threat of their attack.
Or just think of it this way: a new, informal constitution is being written in these years in Washington. No need for a convention or a new bill of rights. It’s a constitution focused on the use of power, especially military power, and it’s being written in blood.
These days, our government (the unparalyzed one) acts regularly on the basis of that informal constitution-in-the-making, committing Somalia-like acts across significant swathes of the planet. In these years, we’ve been marrying the latest in wonder technology, our Hellfire-missile-armed drones, to executive power and slaughtering people we don’t much like in majority Muslim countries with a certain alacrity. By now, it’s simply accepted that any commander-in-chief is also our assassin-in-chief, and that all of this is part of a wartime-that-isn’t-wartime system, spreading the principle of chaos and dissolution to whole areas of the planet, leaving failed states and terror movements in its wake.
When was it, by the way, that “the people” agreed that the president could appoint himself assassin-in-chief, muster his legal beagles to write new “law” that covered any future acts of his (including the killing of American citizens), and year after year dispatch what essentially is his own private fleet of killer drones to knock off thousands of people across the Greater Middle East and parts of Africa? Weirdly enough, after almost 14 years of this sort of behavior, with ample evidence that such strikes don’t suppress the movements Washington loathes (and often only fan the flames of resentment and revenge that help them spread), neither the current president and his top officials, nor any of the candidates for his office have the slightest intention of ever grounding those drones.
And when exactly did the people say that, within the country’s vast standing military, which now garrisons much of the planet, a force of nearly 70,000 Special Operations personnel should be birthed, or that it should conduct covert missions globally, essentially accountable only to the president (if him)? And what I find strangest of all is that few in our world find such developments strange at all.
A Planet in Decline?
In some way, all of this could be said to work. At the very least, it is a functioning new system-in-the-making that we have yet to truly come to grips with, just as we haven’t come to grips with a national security state that surveils the world in a way that even science fiction writers (no less totalitarian rulers) of a previous era could never have imagined, or the strange version of media overkill that we still call an election. All of this is by now both old news and mind-bogglingly new.
Do I understand it? Not for a second.
This is not war as we knew it, nor government as we once understood it, nor are these elections as we once imagined them, nor is this democracy as it used to be conceived of, nor is this journalism of a kind ever taught in a journalism school. This is the definition of uncharted territory. It’s a genuine American terra incognita and yet in some fashion that unknown landscape is already part of our sense of ourselves and our world. In this “election” season, many remain shocked that a leading candidate for the presidency is a demagogue with a visible authoritarian side and what looks like an autocratic bent. All such labels are pinned on Donald Trump, but the new American system that’s been emerging from its chrysalis in these years already has just those tendencies. So don’t blame it all on Donald Trump. He should be far less of a shock to this country than he continues to be. After all, a Trumpian world-in-formation has paved the way for him.
Who knows? Perhaps what we’re watching is the new iteration of a very old story: a twenty-first-century version of an ancient tale of a great imperial power, perhaps the greatest ever — the “lone superpower” — sinking into decline. It’s a tale humanity has experienced often enough in the course of our long history. But lest you think once again that there’s nothing new under the sun, the context for all of this, for everything now happening in our world, is so new as to be quite literally outside of thousands of years of human experience. As the latest heat records indicate, we are, for the first time, on a planet in decline. And if that isn’t uncharted territory, what is?
Tom Engelhardt is a co-founder of the American Empire Project and the author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture. He is a fellow of the Nation Institute and runs TomDispatch.com. His latest book is Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Follow TomDispatch on Twitter and join him on Facebook. Check out the newest Dispatch Book, Nick Turse’s Tomorrow’s Battlefield: U.S. Proxy Wars and Secret Ops in Africa, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Tom Engelhardt
The refuge’s creation helped support nearby ranchers.
Nancy Langston Feb. 2, 2016
Reprinted with permission from High Country News
National wildlife refuges such as the one at Malheur near Burns, Oregon, have importance far beyond the current furor over who manages our public lands. Such refuges are becoming increasingly critical habitat for migratory birds because 95 percent of the wetlands along the Pacific Flyway have already been lost to development.
In some years, 25 million birds visit Malheur, and if the refuge were drained and converted to intensive cattle grazing – which is something the “occupiers” threatened to do – entire populations of ducks, sandhill cranes, and shorebirds would suffer. With their long-distance flights and distinctive songs, the migratory birds visiting Malheur’s wetlands now help to tie the continent together.
This was not always the case. By the 1930s, three decades of drainage, reclamation, and drought had decimated high-desert wetlands and the birds that depended upon them. Out of the hundreds of thousands of egrets that once nested on Malheur Lake, only 121 remained. The American population of the birds had dropped by 95 percent. It took the federal government to restore Malheur’s wetlands and recover waterbird populations, bringing back healthy populations of egrets and many other species.
Yet despite the importance of wildlife refuges to America’s birds, not everyone appreciates them. At one recent news conference, Ammon Bundy called the creation of Malheur National Wildlife Refuge “an unconstitutional act” that removed ranchers from their lands and plunged the county into an economic depression. This is not a new complaint. Since the Sagebrush Rebellion of the 1980s, rural communities in the West have blamed their poverty on the 640 million acres of federal public lands, which make up 52 percent of the land in Western states.
Rural Western communities are indeed suffering, but the cause is not the wildlife refuge system. Conservation of bird habitat did not lead to economic devastation, nor were refuge lands “stolen” from ranchers. If any group has prior claims to Malheur refuge, it is the Paiute Indian Tribe.
For at least 6,000 years, Malheur was the Paiutes’ home. It took a brutal Army campaign to force the people from their reservation, marching them through the snow to the state of Washington in 1879. Homesteaders and cattle barons then moved onto Paiute lands, squeezing as much livestock as possible onto dwindling pastures, and warring with each other over whose land was whose. Scars from this era persist more than a century later.
In 1908, President Theodore Roosevelt established the Malheur Lake Bird Reservation on the lands of the former Malheur Indian Reservation. But the refuge included only the lake itself, not the rivers that fed into it. Deprived of water, the lake shrank during droughts, and squatters moved onto the drying lakebed. Conservationists, realizing they needed to protect the Blitzen River that fed the lake, began a campaign to expand the refuge.
But the federal government never forced the ranchers to sell, as the occupiers at Malheur claimed, and the sale did not impoverish the community. In fact, it was just the opposite: During the Depression years of the 1930s, the federal government paid the Swift Corp. $675,000 for ruined grazing lands. Impoverished homesteaders who had squatted on refuge lands eventually received payments substantial enough to set them up as cattle ranchers nearby.
John Scharff, Malheur’s manager from 1935 to 1971, sought to transform local suspicion into acceptance by allowing local ranchers to graze cattle on the refuge. Yet some tension persisted. In the 1970s, when concern about overgrazing reduced – but did not eliminate – refuge grazing, violence erupted again. Some environmentalists denounced ranchers as parasites who destroyed wildlife habitat. A few ranchers responded with death threats against environmentalists and federal employees.
But violence is not the basin’s most important historical legacy. Through the decades, community members have come together to negotiate a better future. In the 1920s, poor homesteaders worked with conservationists to save the refuge from irrigation drainage. In the 1990s, Paiute tribal members, ranchers, environmentalists and federal agencies collaborated on innovative grazing plans to restore bird habitat while also giving ranchers more flexibility. In 2013, such efforts resulted in a landmark collaborative conservation plan for the refuge, and it offers great hope for the local economy and for wildlife.
The poet Gary Snyder wrote, “We must learn to know, love, and join our place even more than we love our own ideas. People who can agree that they share a commitment to the landscape – even if they are otherwise locked in struggle with each other – have at least one deep thing to share.”
Collaborative processes are difficult and time-consuming. Yet they have proven that they have the potential to peacefully sustain both human and wildlife communities.
Nancy Langston is a contributor to Writers on the Range, the opinion service of High Country News. She is a professor of environmental history at Michigan Technological University, and the author of a history of Malheur Refuge, Where Land and Water Meet: A Western Landscape Transformed.
Reprinted from the New Economic Perspectives blog at the University of Missouri-Kansas City
Editor’s Note: William K. Black, author of “The Best Way to Rob a Bank is to Own One,” is Associate Professor of Law and Economics at the University of Missouri-Kansas City, where — according to James Galbraith — “the best economics is now being done.”
In the latest example of the New York Times’ reporters’ inability to read Paul Krugman, we have an article claiming that the “Growing Imbalance Between Germany and France Strains Their Relationship.” The article begins with Merkel’s major myth accepted as if it were unquestionable reality.
“It was a clear illustration of the dysfunction of the French-German partnership, the axis that for decades kept Europe on a united and dynamic track.
In Berlin this month, Chancellor Angela Merkel, riding high after nine years in power, delivered a strident defense in Parliament of austerity, which she has been pushing on Europe ever since a debt crisis broke out in 2009.”
No, not true on multiple grounds. First, the so-called “debt crisis” was a symptom rather than a cause. The reader will note that the year 2008, when the Great Recession became terrifying, has somehow been removed from the narrative because it would expose the misapprehension in Merkel’s myth. Prior to 2008, only Greece, given its abandonment of a sovereign currency, had debt levels that posed a material risk. The EU nations had unusually low budgetary deficits leading into the Great Recession. Indeed, that, along with the extremely low budgetary deficits of the Clinton administration (the budget went into surplus near the end of his term), is likely one of the triggers for the Great Recession.
The Great Recession caused sharp increases in deficits – as we have long known will happen as part of the “automatic stabilizers.” This is normal and speeds recovery. The eurozone and the U.S. began to come out of the Great Recession in 2009. The U.S. recovery accelerated with the addition of stimulus. In the eurozone, however, the abandonment of sovereign currencies and adoption of the euro exposed the periphery to recurrent attacks by the “bond vigilantes.” The ECB could have stopped these attacks at any time, but it was very late intervening – largely because of German resistance. Instead, Merkel used the leverage provided by the bond vigilantes and the refusal of the ECB to act to end their attacks to force increasing austerity upon the eurozone and demands for severe cuts in workers’ wages in the periphery.
Merkel’s actions in forcing austerity and efforts to force sharp drops in workers’ wages in the periphery were not required to stop any “debt crisis.” The ECB had the ability to end the bond vigilantes’ attacks and reestablish the ability of the periphery to borrow at low cost, as it demonstrated. Merkel’s austerity demands and demands that (largely) left governments in the periphery slash workers’ wages promptly threw the entire eurozone back into a second Great Recession – and much of the periphery into a Second Great Depression. It had the desired purpose of discrediting the governing parties of the left, particularly in Spain, Portugal, and Greece, that gave in to Merkel’s mandates that they adopt masochistic macroeconomic policies.
It is also false that Merkel began demanding that the eurozone inflict austerity only in 2009. Merkel wanted to inflict austerity and her war on the workers and the parties they primarily supported long before 2009. What changed in 2009 was that the ECB, the Great Recession, and the bond vigilantes gave her the leverage to successfully extort the members of the eurozone who opposed austerity and her war on workers and the parties of the left.
But it is what is left out of the quoted passage above that is most amazing. The fact that Merkel’s orders that the eurozone leaders bleed their economies through austerity and the war on workers’ wages led to a gratuitous Second Great Recession in the eurozone – and Great Depression levels of unemployment in much of the periphery – disappears. The fact that inflicting austerity and wage cuts in response to a Great Recession is economically illiterate and cruel disappears. The fact that the overall eurozone – six years after the financial crisis of 2008 and eight years after the financial bubbles popped in 2006 – has stagnated and caused tens of trillions of dollars in lost GDP and well over 10 million lost jobs is treated by the NYT article as if it were unrelated to Merkel’s infliction of austerity.
“But the French economy has grown stagnant, with unemployment stubbornly stuck near 11 percent and an unpopular government pledging to cut tens of billions in taxes on business, which many French fear will unravel their prized welfare state.”
No, the eurozone economy “has grown stagnant” and produced a Second Great Depression in much of the periphery. If France had a sovereign currency, or if the EU were to make the euro into a true sovereign currency, France could simultaneously “cut tens of billions in taxes on business” while preserving the social safety net and speeding the recovery. The same is true of the rest of the eurozone – including Germany, where Merkel’s policies have made the wealthy far wealthier and deepened the economic crisis in other eurozone nations by cutting German workers’ wages. The NYT article is disingenuous about both aspects of the German economy, noting only that “the German economy has shown signs of slowing down.” German growth was actually negative in the last quarter, and the treatment of its workers weakens the German and overall eurozone recovery.
It continues to be obvious that it is a condition of employment for NYT reporters covering the eurozone’s economic policies that they never read Paul Krugman (or most any other American economist). Consider this claim in the article:
“[Prime Minister Manuel Valls] and Mr. Hollande have alienated many members of the Socialist Party by taking a more centrist approach to economic policy, stoking suspicions that the government is favoring business at the expense of the welfare state.”
I will take this part very slowly. By my count, Krugman has written at least six columns in the NYT explaining that there actually is a powerful consensus among economists. The “centrist approach” is that austerity in response to a Great Recession is self-destructive. We have known this for at least 75 years. Modern Republicans, when they hold the presidency, always respond to a recession with a stimulus package. Valls and Hollande are moving away from a “centrist approach to economic policy.” They are doing so despite observing first-hand the self-destructive nature of austerity (and proclaiming that it is self-destructive). They do so despite the demonstrated success of stimulus in responding to the financial crisis. They do so despite the fact that the result of the faux left parties adopting these economically illiterate neo-liberal economic policies is the destruction of the parties that betray their principles and the workers. Valls and Hollande are spectacularly unpopular in France because of these betrayals. It is clear why Valls and Hollande wish to avoid reading Krugman’s critique of their betrayals, but the NYT reporters have no excuse.
The reporters do not simply ignore the insanity of austerity and the plight of the eurozone’s workers – they assert that it is obvious that Merkel is correct and that the French reluctance to slash workers’ wages is obviously economically illiterate.
“Just over a decade ago, as Ms. Merkel is fond of noting, Germany was Europe’s sick economy. It recovered partly because of changes to labor laws and social welfare. Mr. Hollande now faces a similar task in an era of low or no growth.”
No. These two sentences propound multiple Merkel myths and assume (1) that France’s (and the rest of the eurozone’s) problems are the same as Germany’s issues “just over a decade ago,” (2) that Germany “recovered” due to slashing workers’ wages and social programs, and (3) that the German “solutions” would work for the eurozone as a whole.
Germany’s “reforms,” which included increasing financial deregulation, have proven disastrous. German banks finished third in the regulatory “race to the bottom” (“behind” Wall Street and the worst of the worst – the City of London). The officers that controlled Deutsche Bank and various state-owned German banks were among the leading causes of the financial crisis. German workers had lost ground even before the financial crisis and have lost even more ground since the crisis began. Inequality has also become increasingly more extreme in Germany.
The current problem in the eurozone is a critical shortage of demand exacerbated by the insanity of austerity and Merkel’s war on workers’ wages. The word “demand” and the concept, the centerpiece of the macroeconomics of recession, never appear in the article. An individual nation in which the wealthy have the political power to lower workers’ wages can increase its exports and employ more of its citizens. This obviously does not prove that the workers were overpaid. Merkel and the NYT ignore the “fallacy of composition,” which is particularly embarrassing because they are neo-mercantilists pushing the universal goal of being a net exporter. As Adam Smith emphasized, we can’t all be net exporters. A strategy that can work (for the elites) of one nation cannot logically be assumed to work for large numbers of nations.
The last thing a society should want in a recession is rapidly falling wages and prices that can create deflation (another word expunged from the NYT article because it would refute their ode to Merkel, austerity, and her war on the worker). If France were to slash workers’ wages to try to take exports from Ireland, while Ireland slashed workers’ wages to try to take exports from Spain, which did the same to take exports from Italy, the result would be deflation, a massive increase in inequality, the political destruction of any (allegedly) progressive political party that joined in the war on the worker, and a “race to Bangladesh” dynamic.
Germany’s “success” in being a very large net exporter makes it far more difficult – not easier – for any other eurozone nation to copy its export strategy successfully. As a group, the strategy cannot work for the eurozone. The strategy has, of course, not simply “not succeeded.” It has failed catastrophically. Merkel’s eurozone policies have caused trillions of dollars in extra losses in productivity, the gratuitous loss of over 10 million jobs, increased inequality, and the loss through emigration of many of the best educated young citizens of the periphery.
Hollande does not face “a similar task” to Merkel. He faces different problems and Merkel’s “solutions” are the chief causes of France’s economic stagnation rather than the answers to France’s problems.
I repeat my twin suggestions to the NYT reporters that cover the eurozone’s economy. The paper’s management should host a seminar in which Krugman educates his colleagues. Alternatively, come to UMKC and we’ll provide that seminar without charge. None of us can afford the cost of the reporters’ continuing willful ignorance of economics and their indifference to the victims of austerity and Merkel’s war on workers.
Reprinted from TomDispatch.com
Editor’s Note: Rebecca Solnit is one of the best writers in America because she’s one of the most original thinkers. Here she reminds us of the revolutionary power of hope, and how hope overturns old regimes from the bottom up.
There have undoubtedly been stable periods in human history, but you and your parents, grandparents, and great-grandparents never lived through one, and neither will any children or grandchildren you may have or come to have. Everything has been changing continuously, profoundly — from the role of women to the nature of agriculture. For the past couple of hundred years, change has been accelerating in both magnificent and nightmarish ways.
Yet when we argue for change, notably changing our ways in response to climate change, we’re arguing against people who claim we’re disrupting a stable system. They insist that we’re rocking the boat unnecessarily.
I say: rock that boat. It’s a lifeboat; maybe the people in it will wake up and start rowing. Those who think they’re hanging onto a stable order are actually clinging to the wreckage of the old order, a ship already sinking, that we need to leave behind.
The oceans are changing fast, and for the worse. Fish stocks are dying off, as are shellfish. In many acidified oceanic regions, their shells are actually dissolving or failing to form, which is one of the scariest, most nightmarish things I’ve ever heard. So don’t tell me that we’re rocking a stable boat on calm seas. The glorious 10,000-year period of stable climate in which humanity flourished and then exploded to overrun the Earth and all its ecosystems is over.
But responding to these current cataclysmic changes means taking on people who believe, or at least assert, that those of us who want to react and act are gratuitously disrupting a stable system that’s working fine. It isn’t stable. It is working fine — in the short term and the most limited sense — for oil companies and the people who profit from them and for some of us in the particularly cushy parts of the world who haven’t been impacted yet by weather events like, say, the recent torrential floods in Japan or southern Nevada and Arizona, or the monsoon versions of the same that have devastated parts of India and Pakistan, or the drought that has mummified my beloved California, or the wildfires of Australia.
The problem, of course, is that the people who most benefit from the current arrangements have effectively purchased a lot of politicians, and that a great many of the rest of them are either hopelessly dim or amazingly timid. Most of the Democrats recognize the reality of climate change but not the urgency of doing something about it. Many of the Republicans used to — John McCain has done an amazing about-face from being a sane voice on climate to a shrill denier — and they present a horrific obstacle to any international treaties.
Put it this way: in one country, one party holding 45 out of 100 seats in one legislative house, while serving a minority of the very rich, can basically block what quite a lot of the other seven billion people on Earth want and need, because a two-thirds majority in the Senate must consent to any international treaty the U.S. signs. Which is not to say much for the president, whose drill-baby-drill administration only looks good compared to the petroleum servants he faces, when he bothers to face them and isn’t just one of them. History will despise them all and much of the world does now, but as my mother would have said, they know which side their bread is buttered on.
As it happens, the butter is melting and the bread is getting more expensive. Global grain production is already down several percent thanks to climate change, says a terrifying new United Nations report. Declining crops cause food shortages and rising food prices, creating hunger and even famine for the poorest on Earth, and also sometimes cause massive unrest. Rising bread prices were one factor that helped spark the Arab Spring in 2011. Anyone who argues that doing something about global warming will be too expensive is dodging just how expensive unmitigated climate change is already proving to be.
It’s only a question of whether the very wealthy or the very poor will pay. Putting it that way, however, devalues all the nonmonetary things at stake, from the survival of myriad species to our confidence in the future. And yeah, climate change is here, now. We’ve already lost a lot and we’re going to lose more, but there’s a difference between terrible and apocalyptic. We still have some control over how extreme it gets. That’s not a great choice, but it’s the choice we have. There’s still a window open for action, but it’s closing. As the Secretary-General of the World Meteorological Organization, Michel Jarraud, bluntly put it recently, “We are running out of time.”
New and Renewable Energies
The future is not yet written. Look at the world we’re in at this very moment. The Keystone XL tar sands pipeline was supposed to be built years ago, but activists catalyzed by the rural and indigenous communities across whose land it would go have stopped it so far, and made what was supposed to be a done deal a contentious issue. Activists changed the outcome.
Fracking has been challenged on the state level, and banned in townships and counties from upstate New York to central California. (It has also been banned in two Canadian provinces, France, and Bulgaria.) The fossil-fuel divestment movement has achieved a number of remarkable victories in its few bare years of existence and more are on the way. The actual divestments and commitments to divest fossil fuel stocks by various institutions ranging from the city of Seattle to the British Medical Association are striking. But the real power of the movement lies in the way it has called into question the wisdom of investing in fossil fuel corporations. Even mainstream voices like the British Parliament’s Environmental Audit Committee and publications like Forbes are now beginning to question whether they are safe places to put money. That’s a sea change.
Renewable energy has become more efficient, technologically sophisticated, and cheaper — the price of solar power in relation to the energy it generates has plummeted astonishingly over the past three decades and wind technology keeps getting better. While Americans overall are not yet curtailing their fossil-fuel habits, many individuals and communities are choosing other options, and those options are becoming increasingly viable. A Stanford University scientist has proposed a plan to allow each of the 50 states to run on 100% renewable energy by 2050.
Since, according to the latest report of the U.N.’s Intergovernmental Panel on Climate Change, fossil fuel reserves still in the ground are “at least four times larger than could safely be burned if global warming is to be kept to a tolerable level,” it couldn’t be more important to reach global agreements to do things differently on a planetary scale. Notably, most of those carbon reserves must be left untapped, and the modest steps already taken locally and ad hoc show that such changes are indeed possible and that an encouraging number of us want to pursue them.
We can do it. And we is the key word here. The world is not going to be saved by individual acts of virtue; it’s going to be saved, if it is to be saved, by collective acts of social and political change. That’s why I’m marching this Sunday with tens or maybe hundreds of thousands of others in New York City — to pressure the United Nations as it meets to address climate change. That’s why people who care about the future state of our planet will also be marching and demonstrating in New Delhi, Rio de Janeiro, Paris, Berlin, Melbourne, Kathmandu, Dublin, Manila, Seoul, Mumbai, Istanbul, and so many smaller places.
Mass movements work. Unarmed citizens have changed the course of history countless times in the modern era. When we come together as civil society, we have the capacity to transform policies, change old ways of doing things, and sometimes even topple regimes. And it is about governments. Like it or not, the global treaties, compacts, and agreements we need can only be made by governments, and governments will make those agreements when the pressure to do so is greater than the pressure not to. We can and must be that pressure.
The Long View from One Window
I lived in the same apartment for 25 years, moving into a poor but thriving black community in 1981 and out of the far more affluent, paler, and less neighborly place it had become in 2006. A lot of people moved in and out in that period, many of them staying only a year or two. Those transients always seemed to believe that the neighborhood they were passing through was a stable one. You had to be slower than change and stick around to see it. I saw it and it helped me learn how to take a historical view of things.
It’s crazy that anyone speaks as if our world is not undergoing rapid change, when the view from the window called history shows nothing but transformation, both incremental and dramatic. Exactly 25 years ago this month, Eastern Europe was astir. Remember that back then there was still a Soviet bloc, and a Soviet Union, and an Iron Curtain, and a Berlin Wall, and a Cold War. Most people thought those were permanent fixtures, but in the summer of 1989, Hungary decided to let East Germans (who were permitted to travel freely to that communist country) stream over to the West.
Thousands of people, tired of life in the totalitarian east, fled. Poland, Czechoslovakia, and Hungary, as well as East Germany, were already electrified by a resurgent civil society and activist communities that had dared to organize in the face of repression. At the time, politicians and pundits in the West were making careers out of explaining, among so many other things, why German reunification wasn’t going to happen in anyone’s lifetime. And they probably would have been proven right if people had stayed home and done nothing, if they hadn’t begun to hope and acted on that hope.
The bureaucrats on both sides of the Berlin Wall were still talking about the possibility of demilitarizing it when citizens showed up en masse and the guards began abandoning their posts. On that epochal night of November 9, 1989, the people made whole what had been broken. The lesson: showing up is half the battle.
British Prime Minister Margaret Thatcher had been so unnerved by developments in the Soviet Union’s Eastern European holdings that she went to Moscow, two months before the fall of the wall, to implore Soviet leader Mikhail Gorbachev to prevent any such thing. That was early September 1989. “No dramatic change in the situation in Czechoslovakia can be expected,” predicted a Czech official two months before a glorious popular uprising, remembered as the Velvet Revolution, erupted and abolished the government in which he was an official.
There are three things to note about those changes in 1989. First, most people in power dismissed the possibility that such extraordinary change could happen or deplored what it might bring. They were comfortable enough with things as they were, even though the status quo was several kinds of scary and awful. In other words, the status quo likes the status quo and dislikes change. Second, everything changed despite them, thanks to grassroots organizing and civil society, forces that — we are now regularly assured — are pointless and irrelevant. Third, the world that existed then has been largely swept away: the Soviet Union, the global alignments of that time, the idea of a binary world of communism and capitalism, and the policies that had kept us on the brink of nuclear annihilation for decades. We live in a very different world now (though nuclear weapons are still a terrible problem). Things do change.
Maybe, in fact, there’s a fourth point to note as well. That, important as they were, the front-page stories about the liberation of Eastern Europe weren’t what mattered most all those years ago. After all, hidden away deep inside the New York Times that autumn, you can find a dozen or so articles about global warming, as the newly recognized phenomenon was then called. And small as they were, anyone reading them now can see that, even so long ago, the essential problem and peril to our world was already clear.
The thought of what might have been accomplished, had a people’s movement arisen then to face global warming, could break your heart. That, after all, was still a time when the Earth’s atmosphere held just above 350 parts per million of carbon dioxide, the maximum safe level for a sustainable, survivable planet, not the 400 parts per million of the present moment (“142% of the pre-industrial era” level of carbon, the World Meteorological Organization notes). In other words, we’ve been steadily filling the atmosphere with greenhouse gases and so imperiling the planet and humanity since we knew what we were doing.
The Great Smog and the Big Wind
In that fall a quarter of a century ago, the world changed profoundly right before our eyes. Then we settled back into the short-term, ahistorical view that things are really pretty stable, that ordinary people have no power, and that the world can’t be changed. With that in mind, it’s worth looking at Germany today. Maybe because Germans know better than us that things can change for the worse or the better fast, that the world is not a stable and settled place, and that we do shape it, they have been willing to change.
At one point last spring, cold, cloudy Germany managed to get almost 75% of its electricity from renewable sources. Scotland — cold, gray, oil-rich Scotland! — is on track to achieve 100% renewable electrical generation by 2020 and has already hit the 40% mark. Spain now generates about half its electricity through clean and renewable sources. Other European countries have similar accomplishments. In fact, many of the changes that we in the United States will be marching for this Sunday have already begun happening, sometimes on a significant scale, elsewhere.
To remember how radical this new Europe is, recall that most of these places were burning coal not just in power plants or factories but in homes, too, not so many decades ago. Everyone deplores the horrific air of Beijing and other Chinese cities now, but few remember that many European cities were similarly foul with smoke and smog from the industrial revolution into the postwar era. In December 1952, for instance, the “Great Smog” of London reduced daytime visibility to a few yards and killed about 4,000 people in three days.
A decade before that, in response to the war Germany started, North Americans radically reduced their use of private vehicles and gasoline and planted more than 20 million victory gardens, producing vast quantities of food by non-industrial means. We have done that; we could (and must) do it again.
At least, we don’t burn coal in our homes any more, and in the U.S. we’ve retired 178 coal-fired power plants, are phasing out many more, and have prevented many new ones from being built. The renewable energy sources that were, people insisted, too minor or unreliable or expensive or new are now beginning to work well, and the price to produce energy in such a fashion is dropping rapidly. UBS, the European investment giant, recently counseled that power plants and centralized power generation are no longer good investments, since decentralized renewables are likely to replace them.
Of course, Germany and Britain are still burning coal, and Poland remains a giant coal mine. Europe is not a perfect renewable energy paradise, just a part of the world that demonstrates the viability of changing how we produce and consume energy. We are already changing, even if not fast enough, not by a long shot, at least not yet. The same goes for divesting from fossil-fuel investments, even though dozens of universities, cities, religious institutions, and foundations have already committed to doing so, and some have by now actually purged their portfolios. The excuse that change is impossible is no longer available, because many places and entities have already changed.
If you want to know how potentially powerful you are, ask your enemies. The misogynists who attack feminism and try to intimidate feminists into silence only demonstrate in a roundabout way that feminism really is changing the world; they are the furious backlash and so the proof that something meaningful is at stake. The climate movement is similarly upsetting a lot of powerful people and institutions; to grasp that, you just have to look at the tsunamis of money spent opposing specific measures and misinforming the public. The carbon barons are demonstrating that we could change the world and that they don’t want us to.
We are powerful and need to become more so in the next year as a major conference in Paris approaches in December 2015 where the climate agreements we need could be hammered out. Or not. This is, after all, a sequel to the Copenhagen conference of 2009, where representatives of many smaller and more vulnerable nations, as well as citizens’ groups, were eager for a treaty that took on climate change in significant ways, only to have their hopes crushed by the recalcitrant governments of the United States and China.
Right now, we are in a churning sea of change, of climate change, of subtle changes in everyday life, of powerful efforts by elites to serve themselves and damn the rest of us, and of increasingly powerful activist and social-movement campaigns to make a world that benefits more beings, human and otherwise, in the longer term. Every choice you make aligns you with one set of these forces or another. That includes doing nothing, which means aligning yourself with the worst of the status quo dragging us down in that ocean of carbon and consumption.
To make personal changes is to do too little. Only great movements, only collective action can save us now. Only is a scary word, but when the ship is sinking, it can be an encouraging one as well. It can hold out hope. The world has changed again and again in ways that, until they happened, would have been considered improbable by just about everyone on the planet. It is changing now and the direction is up to us.
There will be another story to be told about what we did a quarter century after civil society toppled the East Bloc regimes, what we did in the pivotal years of 2014 and 2015. All we know now is that it is not yet written, and that we who live at this very moment have the power to write it with our lives and acts.
Copyright 2014 Rebecca Solnit