By Bill Moyers
Reprinted with permission from TomDispatch.com
Sixty-six years ago this summer, on my 16th birthday, I went to work for the daily newspaper in the small East Texas town of Marshall where I grew up. It was a good place to be a cub reporter — small enough to navigate but big enough to keep me busy and learning something every day. I soon had a stroke of luck. Some of the paper’s old hands were on vacation or out sick and I was assigned to help cover what came to be known across the country as “the housewives’ rebellion.”
Fifteen women in my hometown decided not to pay the social security withholding tax for their domestic workers. Those housewives were white, their housekeepers black. Almost half of all employed black women in the country then were in domestic service. Because they tended to earn lower wages, accumulate less savings, and be stuck in those jobs all their lives, social security was their only insurance against poverty in old age. Yet their plight did not move their employers.
The housewives argued that social security was unconstitutional and imposing it was taxation without representation. They even equated it with slavery. They also claimed that “requiring us to collect [the tax] is no different from requiring us to collect the garbage.” So they hired a high-powered lawyer — a notorious former congressman from Texas who had once chaired the House Un-American Activities Committee — and took their case to court. They lost, and eventually wound up holding their noses and paying the tax, but not before their rebellion had become national news.
The stories I helped report for the local paper were picked up and carried across the country by the Associated Press. One day, the managing editor called me over and pointed to the AP Teletype machine beside his desk. Moving across the wire was a notice citing our paper and its reporters for our coverage of the housewives’ rebellion.
I was hooked, and in one way or another I’ve continued to engage the issues of money and power, equality and democracy over a lifetime spent at the intersection between politics and journalism. It took me a while to put the housewives’ rebellion into perspective. Race played a role, of course. Marshall was a segregated, antebellum town of 20,000, half of whom were white, the other half black. White ruled, but more than race was at work. Those 15 housewives were respectable townsfolk, good neighbors, regulars at church (some of them at my church). Their children were my friends; many of them were active in community affairs; and their husbands were pillars of the town’s business and professional class.
So what brought on that spasm of rebellion? They simply couldn’t see beyond their own prerogatives. Fiercely loyal to their families, their clubs, their charities, and their congregations — fiercely loyal, that is, to their own kind — they narrowly defined membership in democracy to include only people like themselves. They expected to be comfortable and secure in their old age, but the women who washed and ironed their laundry, wiped their children’s bottoms, made their husbands’ beds, and cooked their family’s meals would also grow old and frail, sick and decrepit, lose their husbands and face the ravages of time alone, with nothing to show from their years of labor but the crease in their brow and the knots on their knuckles.
In one way or another, this is the oldest story in our country’s history: the struggle to determine whether “we, the people” is a metaphysical reality — one nation, indivisible — or merely a charade masquerading as piety and manipulated by the powerful and privileged to sustain their own way of life at the expense of others.
“I Contain Multitudes”
There is a vast difference between a society whose arrangements roughly serve all its citizens and one whose institutions have been converted into a stupendous fraud, a democracy in name only. I have no doubt about what the United States of America was meant to be. It’s spelled out right there in the 52 most revolutionary words in our founding documents, the preamble to our Constitution, proclaiming the sovereignty of the people as the moral base of government:
“We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.”
What do those words mean, if not that we are all in the business of nation-building together?
Now, I recognize that we’ve never been a country of angels guided by a presidium of saints. Early America was a moral morass. One in five people in the new nation was enslaved. Justice for the poor meant stocks and stockades. Women suffered virtual peonage. Heretics were driven into exile, or worse. Native people — the Indians — would be forcibly removed from their land, their fate a “trail of tears” and broken treaties.
No, I’m not a romantic about our history and I harbor no idealized notions of politics and democracy. Remember, I worked for President Lyndon Johnson. I heard him often repeat the story of the Texas poker shark who leaned across the table and said to his mark: “Play the cards fair, Reuben. I know what I dealt you.” LBJ knew politics.
Nor do I romanticize “the people.” When I began reporting on the state legislature while a student at the University of Texas, a wily old state senator offered to acquaint me with how the place worked. We stood at the back of the Senate floor as he pointed to his colleagues spread out around the chamber — playing cards, napping, nipping, winking at pretty young visitors in the gallery — and he said to me, “If you think these guys are bad, you should see the people who sent them there.”
And yet, despite the flaws and contradictions of human nature — or perhaps because of them — something took hold here. The American people forged a civilization: that thin veneer of civility stretched across the passions of the human heart. Because it can snap at any moment, or slowly weaken from abuse and neglect until it fades away, civilization requires a commitment to the notion (contrary to what those Marshall housewives believed) that we are all in this together.
American democracy grew a soul, as it were — given voice by one of our greatest poets, Walt Whitman, with his all-inclusive embrace in Song of Myself:
“Whoever degrades another degrades me,
and whatever is done or said returns at last to me…
I speak the pass-word primeval — I give the sign of democracy;
By God! I will accept nothing which all cannot have their counterpart of on the same terms…
(I am large — I contain multitudes.)”
Author Kathleen Kennedy Townsend has vividly described Whitman seeing himself in whomever he met in America. As he wrote in I Sing the Body Electric:
“– the horseman in his saddle,
Girls, mothers, house-keepers, in all their performances,
The group of laborers seated at noon-time with their open dinner-kettles and their wives waiting,
The female soothing a child — the farmer’s daughter in the garden or cow-yard,
The young fellow hoeing corn –”
Whitman’s words celebrate what Americans shared at a time when they were less dependent on each other than we are today. As Townsend put it, “Many more people lived on farms in the nineteenth century, and so they could be a lot more self-reliant; growing their own food, sewing their clothes, building their homes. But rather than applauding what each American could do in isolation, Whitman celebrated the vast chorus: ‘I hear America singing.’” The chorus he heard was of multitudinous voices, a mighty choir of humanity.
Whitman saw something else in the soul of the country: Americans at work, the laboring people whose toil and sweat built this nation. Townsend contrasts his attitude with the way politicians and the media today — in their endless debates about wealth creation, capital gains reduction, and high corporate taxes — seem to have forgotten working people. “But Whitman wouldn’t have forgotten them.” She writes, “He celebrates a nation where everyone is worthy, not where a few do well.”
President Franklin Delano Roosevelt understood the soul of democracy, too. He expressed it politically, although his words often ring like poetry. Paradoxically, to this scion of the American aristocracy, the soul of democracy meant political equality. “Inside the polling booth,” he said, “every American man and woman stands as the equal of every other American man and woman. There they have no superiors. There they have no masters save their own minds and consciences.”
God knows it took us a long time to get there. Every claim of political equality in our history has been met by fierce resistance from those who relished for themselves what they would deny others. After President Abraham Lincoln signed the Emancipation Proclamation it took a century before Lyndon Johnson signed the Voting Rights Act of 1965 — a hundred years of Jim Crow law and Jim Crow lynchings, of forced labor and coerced segregation, of beatings and bombings, of public humiliation and degradation, of courageous but costly protests and demonstrations. Think of it: another hundred years before the freedom won on the bloody battlefields of the Civil War was finally secured in the law of the land.
And here’s something else to think about: Only one of the women present at the first women’s rights convention in Seneca Falls in 1848 — only one, Charlotte Woodward — lived long enough to see women actually get to vote.
“We Pick That Rabbit Out of the Hat”
So it was, in the face of constant resistance, that many heroes — sung and unsung — sacrificed, suffered, and died so that all Americans could gain an equal footing inside that voting booth on a level playing field on the ground floor of democracy. And yet today money has become the great unequalizer, the usurper of our democratic soul.
No one saw this more clearly than that conservative icon Barry Goldwater, longtime Republican senator from Arizona and one-time Republican nominee for the presidency. Here are his words from almost 30 years ago:
“The fact that liberty depended on honest elections was of the utmost importance to the patriots who founded our nation and wrote the Constitution. They knew that corruption destroyed the prime requisite of constitutional liberty: an independent legislature free from any influence other than that of the people. Applying these principles to modern times, we can make the following conclusions: To be successful, representative government assumes that elections will be controlled by the citizenry at large, not by those who give the most money. Electors must believe that their vote counts. Elected officials must owe their allegiance to the people, not to their own wealth or to the wealth of interest groups that speak only for the selfish fringes of the whole community.”
About the time Senator Goldwater was writing those words, Oliver Stone released his movie Wall Street. Remember it? Michael Douglas played the high roller Gordon Gekko, who used inside information obtained by his ambitious young protégé, Bud Fox, to manipulate the stock of a company that he intended to sell off for a huge personal windfall, while throwing its workers, including Bud’s own blue-collar father, overboard. The younger man is aghast and repentant at having participated in such duplicity and chicanery, and he storms into Gekko’s office to protest, asking, “How much is enough, Gordon?”
“The richest one percent of this country owns half our country’s wealth, five trillion dollars… You got ninety percent of the American public out there with little or no net worth. I create nothing. I own. We make the rules, pal. The news, war, peace, famine, upheaval, the price per paper clip. We pick that rabbit out of the hat while everybody sits out there wondering how the hell we did it. Now, you’re not naïve enough to think we’re living in a democracy, are you, Buddy? It’s the free market. And you’re part of it.”
That was in the high-flying 1980s, the dawn of today’s new gilded age. The Greek historian Plutarch is said to have warned that “an imbalance between rich and poor is the oldest and most fatal ailment of a Republic.” Yet as the Washington Post pointed out recently, income inequality may be higher at this moment than at any time in the American past.
When I was a young man in Washington in the 1960s, most of the country’s growth accrued to the bottom 90% of households. From the end of World War II until the early 1970s, in fact, income grew at a slightly faster rate at the bottom and middle of American society than at the top. In 2009, economists Thomas Piketty and Emmanuel Saez explored decades of tax data and found that from 1950 through 1980 the average income of the bottom 90% of Americans had grown, from $17,719 to $30,941. That represented a 75% increase in 2008 dollars.
Since 1980, the economy has continued to grow impressively, but most of the benefits have migrated to the top. In these years, workers were more productive but received less of the wealth they were helping to create. In the late 1970s, the richest 1% received 9% of total income and held 19% of the nation’s wealth. The share of total income going to that 1% would then rise to more than 23% by 2007, while their share of total wealth would grow to 35%. And that was all before the economic meltdown of 2007-2008.
Even though everyone took a hit during the recession that followed, the top 10% now hold more than three-quarters of the country’s total family wealth.
I know, I know: statistics have a way of causing eyes to glaze over, but these statistics highlight an ugly truth about America: inequality matters. It slows economic growth, undermines health, erodes social cohesion and solidarity, and starves education. In their study The Spirit Level: Why Greater Equality Makes Societies Stronger, epidemiologists Richard Wilkinson and Kate Pickett found that the most consistent predictor of mental illness, infant mortality, low educational achievement, teenage births, homicides, and incarceration was economic inequality.
So bear with me as I keep the statistics flowing. The Pew Research Center recently released a new study indicating that, between 2000 and 2014, the middle class shrank in virtually all parts of the country. Nine out of ten metropolitan areas showed a decline in middle-class neighborhoods. And remember, we aren’t even talking about the over 45 million people who are living in poverty. Meanwhile, between 2009 and 2013, that top 1% captured 85% of all income growth. Even after the economy improved in 2015, they still took in more than half of the income growth and by 2013 held nearly half of all the stock and mutual fund assets Americans owned.
Now, concentrations of wealth would be far less of an issue if the rest of society were benefitting proportionally. But that isn’t the case.
Once upon a time, according to Isabel Sawhill and Sara McLanahan in their 2006 report Opportunity in America, the American ideal was one in which all children had “a roughly equal chance of success regardless of the economic status of the family into which they were born.”
Almost 10 years ago, economist Jeffrey Madrick wrote that, as recently as the 1980s, economists thought that “in the land of Horatio Alger only 20 percent of one’s future income was determined by one’s father’s income.” He then cited research showing that, by 2007, “60 percent of a son’s income [was] determined by the level of income of the father. For women, it [was] roughly the same.” It may be even higher today, but clearly a child’s chance of success in life is greatly improved if he’s born on third base and his father has been tipping the umpire.
This raises an old question, one highlighted by the British critic and public intellectual Terry Eagleton in an article in the Chronicle of Higher Education:
”Why is it that the capitalist West has accumulated more resources than human history has ever witnessed, yet appears powerless to overcome poverty, starvation, exploitation, and inequality?… Why does private wealth seem to go hand in hand with public squalor? Is it… plausible to maintain that there is something in the nature of capitalism itself which generates deprivation and inequality?”
The answer, to me, is self-evident. Capitalism produces winners and losers big time. The winners use their wealth to gain political power, often through campaign contributions and lobbying. In this way, they only increase their influence over the choices made by the politicians indebted to them. While there are certainly differences between Democrats and Republicans on economic and social issues, both parties cater to wealthy individuals and interests seeking to enrich their bottom lines with the help of the policies of the state (loopholes, subsidies, tax breaks, deregulation). No matter which party is in power, the interests of big business are largely heeded.
More on that later, but first, a confession. The legendary broadcast journalist Edward R. Murrow told his generation of journalists that bias is okay as long as you don’t try to hide it. Here’s mine: plutocracy and democracy don’t mix. As the late (and great) Supreme Court Justice Louis Brandeis said, “We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.” Of course the rich can buy more homes, cars, vacations, gadgets, and gizmos than anyone else, but they should not be able to buy more democracy. That they can and do is a despicable blot on American politics that is now spreading like a giant oil spill.
In May, President Obama and I both spoke at the Rutgers University commencement ceremony. He was at his inspirational best as 50,000 people leaned into every word. He lifted the hearts of those young men and women heading out into our troubled world, but I cringed when he said, “Contrary to what we hear sometimes from both the left as well as the right, the system isn’t as rigged as you think…”
Wrong, Mr. President, just plain wrong. The people are way ahead of you on this. In a recent poll, 71% of Americans across lines of ethnicity, class, age, and gender said they believe the U.S. economy is rigged. People reported that they are working harder for financial security. One quarter of the respondents had not taken a vacation in more than five years. Seventy-one percent said that they are afraid of unexpected medical bills; 53% feared not being able to make a mortgage payment; and, among renters, 60% worried that they might not make the monthly rent.
Millions of Americans, in other words, are living on the edge. Yet the country has not confronted the question of how we will continue to prosper without a workforce that can pay for its goods and services.
You didn’t have to read Das Kapital to see this coming or to realize that the United States was being transformed into one of the harshest, most unforgiving societies among the industrial democracies. You could instead have read the Economist, arguably the most influential business-friendly magazine in the English-speaking world. I keep in my files a warning published in that magazine a dozen years ago, on the eve of George W. Bush’s second term. The editors concluded back then that, with income inequality in the U.S. reaching levels not seen since the first Gilded Age and social mobility diminishing, “the United States risks calcifying into a European-style class-based society.”
And mind you, that was before the financial meltdown of 2007-2008, before the bailout of Wall Street, before the recession that only widened the gap between the super-rich and everyone else. Ever since then, the great sucking sound we’ve been hearing is wealth heading upwards. The United States now has a level of income inequality unprecedented in our history and so dramatic it’s almost impossible to wrap one’s mind around.
Contrary to what the president said at Rutgers, this is not the way the world works; it’s the way the world is made to work by those with the money and power. The movers and shakers — the big winners — keep repeating the mantra that this inequality was inevitable, the result of the globalization of finance and advances in technology in an increasingly complex world. Those are part of the story, but only part. As G.K. Chesterton wrote a century ago, “In every serious doctrine of the destiny of men, there is some trace of the doctrine of the equality of men. But the capitalist really depends on some religion of inequality.”
Exactly. In our case, a religion of invention, not revelation, politically engineered over the last 40 years. Yes, politically engineered. On this development, you can’t do better than read Winner Take All Politics: How Washington Made the Rich Richer and Turned Its Back on the Middle Class by Jacob Hacker and Paul Pierson, the Sherlock Holmes and Dr. Watson of political science.
They were mystified by what had happened to the post-World War II notion of “shared prosperity”; puzzled by the ways in which ever more wealth has gone to the rich and super rich; vexed that hedge-fund managers pull in billions of dollars, yet pay taxes at lower rates than their secretaries; curious about why politicians kept slashing taxes on the very rich and handing huge tax breaks and subsidies to corporations that are downsizing their work forces; troubled that the heart of the American Dream — upward mobility — seemed to have stopped beating; and dumbfounded that all of this could happen in a democracy whose politicians were supposed to serve the greatest good for the greatest number. So Hacker and Pierson set out to find out “how our economy stopped working to provide prosperity and security for the broad middle class.”
In other words, they wanted to know: “Who dunnit?” They found the culprit. With convincing documentation they concluded, “Step by step and debate by debate, America’s public officials have rewritten the rules of American politics and the American economy in ways that have benefitted the few at the expense of the many.”
There you have it: the winners bought off the gatekeepers, then gamed the system. And when the fix was in they turned our economy into a feast for the predators, “saddling Americans with greater debt, tearing new holes in the safety net, and imposing broad financial risks on Americans as workers, investors, and taxpayers.” The end result, Hacker and Pierson conclude, is that the United States is looking more and more like the capitalist oligarchies of Brazil, Mexico, and Russia, where most of the wealth is concentrated at the top while the bottom grows larger and larger with everyone in between just barely getting by.
Bruce Springsteen sings of “the country we carry in our hearts.” This isn’t it.
Looking back, you have to wonder how we could have ignored the warning signs. In the 1970s, Big Business began to refine its ability to act as a class and gang up on Congress. Even before the Supreme Court’s Citizens United decision, political action committees deluged politics with dollars. Foundations, corporations, and rich individuals funded think tanks that churned out study after study with results skewed to their ideology and interests. Political strategists made alliances with the religious right, with Jerry Falwell’s Moral Majority and Pat Robertson’s Christian Coalition, to zealously wage a cultural holy war that would camouflage the economic assault on working people and the middle class.
To help cover up this heist of the economy, an appealing intellectual gloss was needed. So public intellectuals were recruited and subsidized to turn “globalization,” “neo-liberalism,” and “the Washington Consensus” into a theological belief system. The “dismal science of economics” became a miracle of faith. Wall Street glistened as the new Promised Land, while few noticed that those angels dancing on the head of a pin were really witchdoctors with MBAs brewing voodoo magic. The greed of the Gordon Gekkos — once considered a vice — was transformed into a virtue. One of the high priests of this faith, Lloyd Blankfein, CEO of Goldman Sachs, looking in wonder on all that his company had wrought, pronounced it “God’s work.”
A prominent neoconservative religious philosopher even articulated a “theology of the corporation.” I kid you not. And its devotees lifted their voices in hymns of praise to wealth creation as participation in the Kingdom of Heaven here on Earth. Self-interest became the Gospel of the Gilded Age.
No one today articulates this winner-take-all philosophy more candidly than Ray Dalio. Think of him as the King Midas of hedge funds, with a personal worth estimated at almost $16 billion and a company, Bridgewater Associates, reportedly managing as much as $154 billion in assets.
Dalio fancies himself a philosopher and has written a book of maxims explaining his philosophy. It boils down to: “Be a hyena. Attack the wildebeest.” (Wildebeests, antelopes native to southern Africa — as I learned when we once filmed a documentary there — are no match for the flesh-eating dog-like spotted hyenas that gorge on them.) Here’s what Dalio wrote about being a Wall Street hyena:
“…when a pack of hyenas takes down a young wildebeest, is this good or bad? At face value, this seems terrible; the poor wildebeest suffers and dies. Some people might even say that the hyenas are evil. Yet this type of apparently evil behavior exists throughout nature through all species… like death itself, this behavior is integral to the enormously complex and efficient system that has worked for as long as there has been life… [It] is good for both the hyenas, who are operating in their self-interest, and the interests of the greater system, which includes the wildebeest, because killing and eating the wildebeest fosters evolution, i.e., the natural process of improvement… Like the hyenas attacking the wildebeest, successful people might not even know if or how their pursuit of self-interest helps evolution, but it typically does.”
He concludes: “How much money people have earned is a rough measure of how much they gave society what it wanted…”
Not this time, Ray. This time, the free market for hyenas became a slaughterhouse for the wildebeest. Collapsing shares and house prices destroyed more than a quarter of the wealth of the average household. Many people have yet to recover from the crash and recession that followed. They are still saddled with burdensome debt; their retirement accounts are still anemic. All of this was, by the hyena’s accounting, a social good, “an improvement in the natural process,” as Dalio puts it. Nonsense. Bull. Human beings have struggled long and hard to build civilization; his doctrine of “progress” is taking us back to the jungle.
And by the way, there’s a footnote to the Dalio story. Early this year, the founder of the world’s largest hedge fund, and by many accounts the richest man in Connecticut where it is headquartered, threatened to take his firm elsewhere if he didn’t get concessions from the state. You might have thought that the governor, a Democrat, would have thrown him out of his office for the implicit threat involved. But no, he buckled and Dalio got the $22 million in aid — a $5 million grant and a $17 million loan — that he was demanding to expand his operations. It’s a loan that may be forgiven if he keeps jobs in Connecticut and creates new ones. No doubt he left the governor’s office grinning like a hyena, his shoes tracking wildebeest blood across the carpet.
Our founders warned against the power of privileged factions to capture the machinery of democracies. James Madison, who studied history through a tragic lens, saw that the life cycle of previous republics had degenerated into anarchy, monarchy, or oligarchy. Like many of his colleagues, he was well aware that the republic they were creating could go the same way. Distrusting, even detesting concentrated private power, the founders attempted to erect safeguards to prevent private interests from subverting the moral and political compact that begins, “We, the people.” For a while, they succeeded.
When the brilliant young French aristocrat Alexis de Tocqueville toured America in the 1830s, he was excited by the democratic fervor he witnessed. Perhaps that excitement caused him to exaggerate the equality he celebrated. Close readers of de Tocqueville will notice, however, that he did warn of the staying power of the aristocracy, even in this new country. He feared what he called, in the second volume of his masterwork, Democracy in America, an “aristocracy created by business.” He described it as already among “the harshest that ever existed in the world” and suggested that, “if ever a permanent inequality of conditions and aristocracy again penetrate the world, it may be predicted that this is the gate by which they will enter.”
And so it did. Half a century later, the Gilded Age arrived with a new aristocratic hierarchy of industrialists, robber barons, and Wall Street tycoons in the vanguard. They had their own apologist in the person of William Graham Sumner, an Episcopal minister turned professor of political economy at Yale University. He famously explained that “competition… is a law of nature” and that nature “grants her rewards to the fittest, therefore, without regard to other considerations of any kind.”
From Sumner’s essays to the ravenous excesses of Wall Street in the 1920s to the ravings of Rush Limbaugh, Glenn Beck, and Fox News, to the business press’s wide-eyed awe of hyena-like CEOs; from the Republican war on government to the Democratic Party’s shameless obeisance to big corporations and contributors, this “law of nature” has served to legitimate the yawning inequality of income and wealth, even as it has protected networks of privilege and monopolies in major industries like the media, the tech sector, and the airlines.
A plethora of studies conclude that America’s political system has already been transformed from a democracy into an oligarchy (the rule of a wealthy elite). Martin Gilens and Benjamin Page, for instance, studied data from 1,800 different policy initiatives launched between 1981 and 2002. They found that “economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy while mass-based interest groups and average citizens have little or no independent influence.” Whether Republican or Democratic, they concluded, the government more often follows the preferences of major lobbying or business groups than it does those of ordinary citizens.
We can only be amazed that a privileged faction in a fervent culture of politically protected greed brought us to the brink of a second Great Depression, then blamed government and a “dependent” 47% of the population for our problems, and ended up richer and more powerful than ever.
The Truth of Your Life
Which brings us back to those Marshall housewives — to all those who simply can’t see beyond their own prerogatives and so narrowly define membership in democracy to include only people like themselves.
How would I help them recoup their sanity, come home to democracy, and help build the sort of moral compact embodied in the preamble to the Constitution, that declaration of America’s intent and identity?
First, I’d do my best to remind them that societies can die of too much inequality.
Second, I’d give them copies of anthropologist Jared Diamond’s book Collapse: How Societies Choose to Fail or Succeed to remind them that we are not immune. Diamond, who won the Pulitzer Prize for Guns, Germs, and Steel, describes in Collapse how the damage humans have inflicted on their environment has historically led to the decline of civilizations. In the process, he vividly depicts how elites repeatedly isolate and delude themselves until it’s too late. How, extracting wealth from commoners, they remain well fed while everyone else is slowly starving until, in the end, even they (or their offspring) become casualties of their own privilege. Any society, it turns out, contains a built-in blueprint for failure if elites insulate themselves endlessly from the consequences of their decisions.
Third, I’d discuss the real meaning of “sacrifice and bliss” with them. That was the title of the fourth episode of my PBS series Joseph Campbell and the Power of Myth. In that episode, Campbell and I discussed the influence on him of the German philosopher Arthur Schopenhauer, who believed that the will to live is the fundamental reality of human nature. So he puzzled about why some people override it and give up their lives for others.
“Can this happen?” Campbell asked. “That what we normally think of as the first law of nature, namely self-preservation, is suddenly dissolved. What creates that breakthrough when we put another’s well-being ahead of our own?” He then told me of an incident that took place near his home in Hawaii, up in the heights where the trade winds from the north come rushing through a great ridge of mountains. People go there to experience the force of nature, to let their hair be blown in the winds — and sometimes to commit suicide.
One day, two policemen were driving up that road when, just beyond the railing, they saw a young man about to jump. One of the policemen bolted from the car and grabbed the fellow just as he was stepping off the ledge. His momentum threatened to carry both of them over the cliff, but the policeman refused to let go. Somehow he held on long enough for his partner to arrive and pull the two of them to safety. When a newspaper reporter asked, “Why didn’t you let go? You would have been killed,” he answered: “I couldn’t… I couldn’t let go. If I had, I couldn’t have lived another day of my life.”
Campbell then added: “Do you realize what had suddenly happened to that policeman? He had given himself over to death to save a stranger. Everything else in his life dropped off. His duty to his family, his duty to his job, his duty to his own career, all of his wishes and hopes for life, just disappeared.” What mattered was saving that young man, even at the cost of his own life.
How can this be? Campbell asked. Schopenhauer’s answer, he said, was that such a psychological crisis represents the breakthrough of a metaphysical reality: you and the other are two aspects of one life, and your apparent separateness is but an effect of the way we experience forms under the conditions of space and time. Our true reality is our identity and unity with all life.
Sometimes, however instinctively or consciously, our actions affirm that reality through some unselfish gesture or personal sacrifice. It happens in marriage, in parenting, in our relations with the people immediately around us, and in our participation in building a society based on reciprocity.
The truth of our country isn’t actually so complicated. It’s in the moral compact implicit in the preamble to our Constitution: we’re all in this together. We are all one another’s first responders. As the writer Alberto Rios once put it, “I am in your family tree and you are in mine.”
I realize that the command to love our neighbor is one of the hardest of all religious concepts, but I also recognize that our connection to others goes to the core of life’s mystery and to the survival of democracy. When we claim this as the truth of our lives — when we live as if it’s so — we are threading ourselves into the long train of history and the fabric of civilization; we are becoming “we, the people.”
The religion of inequality — of money and power — has failed us; its gods are false gods. There is something more essential — more profound — in the American experience than the hyena’s appetite. Once we recognize and nurture this, once we honor it, we can reboot democracy and get on with the work of liberating the country we carry in our hearts.
Bill Moyers has been an organizer of the Peace Corps, a top White House aide, a publisher, and a prolific broadcast journalist whose work earned 37 Emmy Awards and nine Peabody Awards. He is president of the Schumann Media Center, which supports independent journalism. This essay is adapted from remarks he prepared for delivery this past summer at the Chautauqua Institution’s week-long focus on money and power. He is grateful to his colleagues Karen Kimball and Gail Ablow for their research and fact checking.
Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Nick Turse’s Next Time They’ll Come to Count the Dead, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Bill Moyers
By Michael Klare
Reprinted with permission from TomDispatch.com
Here’s the good news: wind power, solar power, and other renewable forms of energy are expanding far more quickly than anyone expected, ensuring that these systems will provide an ever-increasing share of our future energy supply. According to the most recent projections from the Energy Information Administration (EIA) of the U.S. Department of Energy, global consumption of wind, solar, hydropower, and other renewables will double between now and 2040, jumping from 64 to 131 quadrillion British thermal units (BTUs).
And here’s the bad news: the consumption of oil, coal, and natural gas is also growing, making it likely that, whatever the advances of renewable energy, fossil fuels will continue to dominate the global landscape for decades to come, accelerating the pace of global warming and ensuring the intensification of climate-change catastrophes.
The rapid growth of renewable energy has given us much to cheer about. Not so long ago, energy analysts were reporting that wind and solar systems were too costly to compete with oil, coal, and natural gas in the global marketplace. Renewables would, it was then assumed, require pricey subsidies that might not always be available. That was then and this is now. Today, remarkably enough, wind and solar are already competitive with fossil fuels for many uses and in many markets.
If that wasn’t predicted, however, neither was this: despite such advances, the allure of fossil fuels hasn’t dissipated. Individuals, governments, whole societies continue to opt for such fuels even when they gain no significant economic advantage from that choice and risk causing severe planetary harm. Clearly, something irrational is at play. Think of it as the fossil-fuel equivalent of an addictive inclination writ large.
The contradictory and troubling nature of the energy landscape is on clear display in the 2016 edition of the International Energy Outlook, the annual assessment of global trends released by the EIA this May. The good news about renewables gets prominent attention in the report, which includes projections of global energy use through 2040. “Renewables are the world’s fastest-growing energy source over the projection period,” it concludes. Wind and solar are expected to demonstrate particular vigor in the years to come, their growth outpacing every other form of energy. But because renewables start from such a small base — representing just 12% of all energy used in 2012 — they will continue to be overshadowed in the decades ahead, explosive growth or not. In 2040, according to the report’s projections, fossil fuels will still have a grip on a staggering 78% of the world energy market, and — if you don’t mind getting thoroughly depressed — oil, coal, and natural gas will each still command larger shares of the market than all renewables combined.
Keep in mind that total energy consumption is expected to be much greater in 2040 than at present. At that time, humanity will be using an estimated 815 quadrillion BTUs (compared to approximately 600 quadrillion today). In other words, though fossil fuels will lose some of their market share to renewables, they will still experience striking growth in absolute terms. Oil consumption, for example, is expected to increase by 34% from 90 million to 121 million barrels per day by 2040. Despite all the negative publicity it’s been getting lately, coal, too, should experience substantial growth, rising from 153 to 180 quadrillion BTUs in “delivered energy” over this period. And natural gas will be the fossil-fuel champ, with global demand for it jumping by 70%. Put it all together and the consumption of fossil fuels is projected to increase by 177 quadrillion BTUs, or 38%, over the period the report surveys.
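For readers who want to check the arithmetic, the percentages above follow directly from the report’s absolute figures. A minimal sketch in Python (the numbers are the EIA’s as quoted in this paragraph; the variable names are ours):

```python
# Sanity-check the growth figures quoted above from the 2016 EIA report.
oil_now, oil_2040 = 90, 121        # million barrels per day
coal_now, coal_2040 = 153, 180     # quadrillion BTUs of "delivered energy"
total_now, total_2040 = 600, 815   # quadrillion BTUs, all energy sources

print(f"Oil:   +{(oil_2040 - oil_now) / oil_now:.0%}")        # ~34%, as stated
print(f"Coal:  +{(coal_2040 - coal_now) / coal_now:.0%}")     # ~18%
print(f"Total: +{(total_2040 - total_now) / total_now:.0%}")  # ~36%
```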
Anyone with even the most rudimentary knowledge of climate science has to shudder at such projections. After all, emissions from the combustion of fossil fuels account for approximately three-quarters of the greenhouse gases humans are putting into the atmosphere. An increase in their consumption of such magnitude will have a corresponding impact on the greenhouse effect that is accelerating the rise in global temperatures.
At the United Nations Climate Summit in Paris last December, delegates from more than 190 countries adopted a plan aimed at preventing global warming from exceeding 2 degrees Celsius (about 3.6 degrees Fahrenheit) above the pre-industrial level. This target was chosen because most scientists believe that any warming beyond that will result in catastrophic and irreversible climate effects, including the melting of the Greenland and Antarctic ice caps (and a resulting sea-level rise of 10-20 feet). Under the Paris Agreement, the participating nations signed onto a plan to take immediate steps to halt the growth of greenhouse gas emissions and then move to actual reductions. Although the agreement doesn’t specify what measures should be taken to satisfy this requirement — each country is obliged to devise its own “intended nationally determined contributions” to the overall goal — the only practical approach for most countries would be to reduce fossil fuel consumption.
As the 2016 EIA report makes eye-poppingly clear, however, the endorsers of the Paris Agreement aren’t on track to reduce their consumption of oil, coal, and natural gas. In fact, greenhouse gas emissions are expected to rise by an estimated 34% between 2012 and 2040 (from 32.3 billion to 43.2 billion metric tons). That net increase of 10.9 billion metric tons is equal to the total carbon emissions of the United States, Canada, and Europe in 2012. If such projections prove accurate, global temperatures will rise, possibly significantly above that 2 degree mark, with the destructive effects of climate change we are already witnessing today — the fires, heat waves, floods, droughts, storms, and sea level rise — only intensifying.
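The emissions projection in that paragraph can be checked the same way; a quick sketch using only the figures quoted above (variable names are ours):

```python
# Cross-check the projected rise in greenhouse gas emissions, 2012-2040.
emissions_2012 = 32.3  # billion metric tons
emissions_2040 = 43.2

increase = emissions_2040 - emissions_2012
print(f"Net increase:  {increase:.1f} billion metric tons")  # 10.9, as stated
print(f"Relative rise: {increase / emissions_2012:.0%}")     # ~34%, as stated
```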
Exploring the Roots of Addiction
How to explain the world’s tenacious reliance on fossil fuels, despite all that we know about their role in global warming and those lofty promises made in Paris?
To some degree, it is undoubtedly the product of built-in momentum: our existing urban, industrial, and transportation infrastructure was largely constructed around fossil fuel-powered energy systems, and it will take a long time to replace or reconfigure them for a post-carbon future. Most of our electricity, for example, is provided by coal- and gas-fired power plants that will continue to operate for years to come. Even with the rapid growth of renewables, coal and natural gas are projected to supply 56% of the fuel for the world’s electrical power generation in 2040 (a drop of only 5% from today). Likewise, the overwhelming majority of cars and trucks on the road are now fueled by gasoline and diesel. Even if the number of new ones running on electricity were to spike, it would still be many years before oil-powered vehicles lost their commanding position. As history tells us, transitions from one form of energy to another take time.
Then there’s the problem — and what a problem it is! — of vested interests. Energy is the largest and most lucrative business in the world, and the giant fossil fuel companies have long enjoyed a privileged and highly profitable status. Oil corporations like Chevron and ExxonMobil, along with their state-owned counterparts like Gazprom of Russia and Saudi Aramco, are consistently ranked among the world’s most valuable enterprises. These companies — and the governments they’re associated with — are not inclined to surrender the massive profits they generate year after year for the future wellbeing of the planet.
As a result, it’s a guarantee that they will employ any means at their disposal (including well-established, well-funded ties to friendly politicians and political parties) to slow the transition to renewables. In the United States, for example, the politicians of coal-producing states are now at work on plans to block the Obama administration’s “clean power” drive, which might indeed lead to a sharp reduction in coal consumption. Similarly, Exxon has recruited friendly Republican officials to impede the efforts of some state attorneys general to investigate that company’s past suppression of information on the links between fossil fuel use and climate change. And that’s just to scratch the surface of corporate efforts to mislead the public, which have included the funding of the Heartland Institute and other climate-change-denying think tanks.
Of course, nowhere is the determination to sustain fossil fuels fiercer than in the “petro-states” that rely on their production for government revenues, provide energy subsidies to their citizens, and sometimes sell their products at below-market rates to encourage their use. According to the International Energy Agency (IEA), in 2014 fossil fuel subsidies of various sorts added up to a staggering $493 billion worldwide — far more than those for the development of renewable forms of energy. The G-20 group of leading industrial powers agreed in 2009 to phase out such subsidies, but a meeting of G-20 energy ministers in Beijing in June failed to adopt a timeline to complete the phase-out process, suggesting that little progress will be made when the heads of state of those countries meet in Hangzhou, China, this September.
None of this should surprise anyone, given the global economy’s institutionalized dependence on fossil fuels and the amounts of money at stake. What it doesn’t explain, however, is the projected growth in global fossil fuel consumption. A gradual decline, accelerating over time, would be consistent with a broad-scale but slow transition from carbon-based fuels to renewables. That the opposite seems to be happening, that their use is actually expanding in most parts of the world, suggests that another factor is in play: addiction.
We all know that smoking tobacco, snorting cocaine, or consuming too much alcohol is bad for us, but many of us persist in doing so anyway, finding the resulting thrill, the relief, or the dulling of the pain of everyday life simply too great to resist. In the same way, much of the world now seems to find it easier to fill up the car with the usual tankful of gasoline or flip the switch and receive electricity from coal or natural gas than to begin to shake our addiction to fossil fuels. As in everyday life, so at a global level, the power of addiction seems regularly to trump the obvious desirability of embarking on another, far healthier path.
On a Fossil Fuel Bridge to Nowhere
Without acknowledging any of this, the 2016 EIA report indicates just how widespread and prevalent our fossil-fuel addiction remains. In explaining the rising demand for oil, for example, it notes that “in the transportation sector, liquid fuels [predominantly petroleum] continue to provide most of the energy consumed.” Even though “advances in nonliquids-based [electrical] transportation technologies are anticipated,” they will not prove sufficient “to offset the rising demand for transportation services worldwide,” and so the demand for gasoline and diesel will continue to grow.
Most of the increase in demand for petroleum-based fuels is expected to occur in the developing world, where hundreds of millions of people are entering the middle class, buying their first gas-powered cars, and about to be hooked on an energy way of life that should be, but isn’t, dying. Oil use is expected to grow in China by 57% between 2012 and 2040, and at a faster rate (131%!) in India. Even in the United States, however, a growing preference for sport utility vehicles and pickup trucks continues to mean higher petroleum use. In 2016, according to Edmunds.com, a car shopping and research site, nearly 75% of the people who traded in a hybrid or electric car to a dealer replaced it with an all-gas car, typically a larger vehicle like an SUV or a pickup.
The rising demand for coal follows a depressingly similar pattern. Although it remains a major source of the greenhouse gases responsible for climate change, many developing nations, especially in Asia, continue to favor it when adding electricity capacity because of its low cost and familiar technology. Although the demand for coal in China — long the leading consumer of that fuel — is slowing, that country is still expected to increase its usage by 12% by 2035. The big story here, however, is India: according to the EIA, its coal consumption will grow by 62% in the years surveyed, eventually making it, not the United States, the world’s second largest consumer. Most of that extra coal will go for electricity generation, once again to satisfy an “expanding middle class using more electricity-consuming appliances.”
And then there’s the mammoth expected increase in the demand for natural gas. According to the latest EIA projections, its consumption will rise faster than any fuel except renewables. Given the small base from which renewables start, however, gas will experience the biggest absolute increase of any fuel, 87 quadrillion BTUs between 2012 and 2040. (In contrast, renewables are expected to grow by 68 quadrillion and oil by 62 quadrillion BTUs during this period.)
At present, natural gas appears to enjoy an enormous advantage in the global energy marketplace. “In the power sector, natural gas is an attractive choice for new generating plants given its moderate capital cost and attractive pricing in many regions as well as the relatively high fuel efficiency and moderate capital cost of gas-fired plants,” the EIA notes. It is also said to benefit from its “clean” reputation (compared to coal) in generating electricity. “As more governments begin implementing national or regional plans to reduce carbon dioxide emissions, natural gas may displace consumption of the more carbon-intensive coal and liquid fuels.”
Unfortunately, despite that reputation, natural gas remains a carbon-based fossil fuel, and its expanded consumption will result in a significant increase in global greenhouse gas emissions. In fact, the EIA claims that it will generate a larger increase in such emissions over the next quarter-century than either coal or oil — a disturbing note for those who contend that natural gas provides a “bridge” to a green energy future.
If you were to read through the EIA’s latest report as I did, you, too, might end up depressed by humanity’s addictive need for its daily fossil fuel hit. While the EIA’s analysts add the usual caveats, including the possibility that a more sweeping than expected follow-up climate agreement or strict enforcement of the one adopted last December could alter their projections, they detect no signs of the beginning of a determined move away from the reliance on fossil fuels.
If, indeed, addiction is a big part of the problem, any strategies undertaken to address climate change must incorporate a treatment component. Simply saying that global warming is bad for the planet, and that prudence and morality oblige us to prevent the worst climate-related disasters, will no more suffice than would telling addicts that tobacco and hard drugs are bad for them. Success in any global drive to avert climate catastrophe will involve tackling addictive behavior at its roots and promoting lasting changes in lifestyle. To do that, it will be necessary to learn from the anti-drug and anti-tobacco communities about best practices, and apply them to fossil fuels.
Consider, for example, the case of anti-smoking efforts. It was the medical community that first took up the struggle against tobacco and began by banning smoking in hospitals and other medical facilities. This effort was later extended to public facilities — schools, government buildings, airports, and so on — until vast areas of the public sphere became smoke-free. Anti-smoking activists also campaigned to have warning labels displayed in tobacco advertising and cigarette packaging.
Such approaches helped reduce tobacco consumption around the world and can be adapted to the anti-carbon struggle. College campuses and town centers could, for instance, be declared car-free — a strategy already embraced by London’s newly elected mayor, Sadiq Khan. Express lanes on major streets and highways can be reserved for hybrids, electric cars, and other alternative vehicles. Gas station pumps and oil advertising can be made to incorporate warning signs saying something like, “Notice: consumption of this product increases your exposure to asthma, heat waves, sea level rise, and other threats to public health.” Once such an approach began to be seriously considered, there would undoubtedly be a host of other ideas for how to begin to put limits on our fossil fuel addiction.
Such measures would have to be complemented by major moves to combat the excessive influence of the fossil fuel companies and energy states when it comes to setting both local and global policy. In the U.S., for instance, severely restricting the scope of private donations in campaign financing, as Senator Bernie Sanders advocated in his presidential campaign, would be a way to start down this path. Another would be to step up legal efforts to hold giant energy companies like ExxonMobil accountable for malfeasance in suppressing information about the links between fossil fuel combustion and global warming, just as, decades ago, anti-smoking activists tried to expose tobacco company criminality in suppressing information on the links between smoking and cancer.
Without similar efforts of every sort on a global level, one thing seems certain: the future projected by the EIA will indeed come to pass and human suffering of a previously unimaginable sort will be the order of the day.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
Copyright 2016 Michael T. Klare
By David Morris
Reprinted with permission from the Institute for Local Self-Reliance
[Editor’s Comment: Note in the article that follows the discussion of the Bank of North Dakota, which has received particular attention in the last few years because of its beneficial role in helping North Dakota weather the storms of the Great Recession. Note especially in this regard the writings of Ellen Brown, president of the Public Banking Institute and author of Web of Debt and The Public Bank Solution. See also her article, “North Dakota’s Economic ‘Miracle’—It’s Not Oil.”]
On June 14th, North Dakotans voted to overrule their government’s decision to allow corporate ownership of farms. That they had the power to do so was a result of a political revolution that occurred almost exactly a century before, a revolution that may hold lessons for those like Bernie Sanders’ supporters who seek to establish a bottom-up political movement in the face of hostile political parties today.
Here’s the story. In the early 1900s North Dakota was effectively an economic colony of Minneapolis/Saint Paul. A Saint Paul-based railroad tycoon controlled its freight prices. Minnesota companies owned many of the grain elevators that sat next to the rail lines and often cheated farmers by giving their wheat a lower grade than it deserved. Since the flour mills were in Minneapolis, shipping costs reduced the price wheat farmers received. Minneapolis banks held farmers’ mortgages, and the operating loans they made to farmers carried higher interest rates than those they charged at home.
Farmers, who represented a majority of the population, tried to free themselves from bondage by making the political system more responsive. In 1913 they gained an important victory when the legislature gave them the right, by petition, to initiate a law or constitutional amendment as well as to overturn a law passed by the legislature.
But this was a limited victory, for while the people could enable, they could not compel.
In 1914, for example, after a 30-year effort, voters authorized the legislature to build a state-owned grain elevator and mill. But in January 1915 a state legislative committee concluded it “would be a waste of the people’s money as well as a humiliating disappointment to the people of the state.” The legislature refused funding.
A few weeks later, two former candidates on the Socialist Party ticket, Arthur C. Townley and Albert Bowen, launched a new political organization, the Nonpartisan League (NPL). The name conveyed their strategy: to rely more on program-based politics than party-based politics. According to the NPL, its program intended to end the “utterly unendurable” situation in which “the people of this state have always been dependent for their existence on industries, banks, markets, storage and transportation facilities either existing altogether outside of the state or controlled by great private interests outside the state.”
The NPL’s platform contained concrete and specific measures: state ownership of elevators, flour mills, packing houses and cold storage plants; state inspection of grain grading and dockage; state hail insurance; rural credit banks operating at cost; exemption of farm improvements from taxation.
In his recent book Insurgent Democracy, Michael Lansing explains that “small-property holders anxious to use government to create a more equitable form of capitalism cannot be easily categorized in contemporary political terms.” The NPL “reminded Americans that corporate capitalism was not the only way forward.” Supporters of the NPL wanted state-sponsored market fairness but not state control. They wanted public options, not public monopolies.
In the language of our 2016 political campaigns, it would not be much of a stretch to characterize the NPL as a movement for an American-style decentralized, anti-corporate, democratic socialism.
The NPL was, as one contemporary observer, Thorstein Veblen, described it, “large, loose, animated and untidy, but sure of itself in its settled disallowance of the Vested Interests…”
The movement was membership-based. Members were kept informed through a regular newsletter. This was part of a massive popular education effort. Membership fees allowed the NPL to hire organizers and lecturers who traveled throughout the state. Townley, the founder and leader of the NPL, proved an entertaining and charismatic speaker. Sometimes thousands would gather to hear him speak. Speeches themselves were community affairs.
The goal was to convince farmers that collectively they could significantly influence the decisions that would affect their personal and business lives.
To gain power the NPL relied on a political tool born of the Progressive movement: the political primary. To make government more responsive and transparent, Progressives urged states to bypass political conventions, political bosses and backroom deals and adopt direct primaries. By 1916, 25 of the then 48 U.S. states had adopted the primary as the vehicle for nominating political candidates.
The primary system gave people the power to elect candidates of their political party, but the key to the remarkable political revolution that swept through North Dakota was its adoption, in 1908, of an “open primary” law that allowed anyone to vote in a party’s primary even if unaffiliated with that party.
On March 29, 1916 the NPL took advantage of that law by convening its first convention. Attendees endorsed candidates who swore allegiance to its platform. These candidates ran in the June Republican primary, a primary targeted by the NPL because then (as now) the Republican Party dominated North Dakota.
In June 1916 the NPL effectively took over the Republican Party. In November 1916 NPL-endorsed candidates won every statewide office except one and gained a majority in the state Assembly, although not in the Senate. By that time the NPL boasted 40,000 members, an astonishing number given the state population of only 620,000.
In the succeeding legislative session the NPL was able to implement parts of its platform: a grain grading system, a 9-hour workday for women, regulation of railroad shipping rates and increased state aid to rural schools. But the Senate narrowly defeated the key to implementing NPL’s broad vision: a constitutional amendment to allow for state-owned businesses.
In 1918, the NPL gained a majority in the state Senate. That year North Dakotans voted on 10 constitutional amendments. They approved every one. One, endorsed by a resounding margin of 59-41, gave state, county and local governments permission “to engage in industry, enterprises or businesses.” Another allowed the state to guarantee $2 million in bonds and established voting requirements for future bonding. Another created state hail insurance.
Other amendments expanded the possibility of direct democracy by reducing the number of signatures required to put an initiative on the ballot, and by allowing constitutional amendments to be passed by a simple majority of the voters.
In June 1919, voters approved 6 of 7 legislatively referred statutes, including the establishment of a state bank, the latter by a vote of 56-44. The one ballot initiative North Dakotans rejected—giving the Governor the authority to appoint every county school superintendent—was itself revealing. North Dakotans wanted a state that could stand up to big out-of-state corporations, but they preferred local control to state control.
The Bank of North Dakota (BND) was the centerpiece of the NPL’s effort to take back control of their economy. It was intended to strengthen, not undercut local banks. It established no branches, nor did it accept independent deposits or accounts. The Bank “strongly recommended” that borrowers seek mortgages by working through local institutions. Banks across the state used the BND as a clearinghouse for various financial transactions.
Farmers immediately benefited as their interest rates on loans dropped to about 6 percent from the prevailing 8.7 percent.
In November 1920 voters strengthened the BND by narrowly approving an initiative requiring all state, county, township, municipal and school district funds be deposited there.
In March 1920 the NPL legislature referred to the people a constitutional amendment allowing them to petition for the recall of any elected officials. That unprecedented extension of direct democracy proved its undoing, for in late 1918, at the peak of the NPL’s power, political opposition had coalesced into a new organization, the Independent Voters Association (IVA). As the NPL battled internal divisions and a growing unease that it had begun to pursue measures beyond its mandate, the IVA gained support.
The IVA used the political tools the NPL had created. In 1921 its members successfully petitioned for recall elections for the three state officers who constituted the membership of the Industrial Commission that oversaw state enterprises: the governor, attorney general and commissioner of agriculture. The IVA slate won by a whisker. It was the first time a U.S. governor had been successfully recalled.
The IVA immediately set about to undo the NPL program by putting nine provisions on the ballot, including one to abolish the state Bank. Another was intended to shrink the capacity of state government by reducing the amount of state bonded debt. A third would have undermined the open primary by requiring separate party ballots for primaries.
Every ballot measure lost, albeit by very narrow margins.
In November 1922 the IVA achieved what the NPL had four years before: Control of all three branches of state government. The NPL’s abrupt disintegration resulted from a number of factors. In 1921 the price of wheat dropped about 60 percent. The resulting economic pain would have reduced the support for any sitting government. The Russian Revolution ushered in a nationwide “Red Scare.” The opposition labeled the NPL’s leaders Communists and Bolsheviks and launched a new magazine called Red Flame. Townley himself was jailed under a Minnesota sedition law for opposing the U.S. involvement in WWI. Meanwhile, internal divisions continued to beset the NPL.
The Legacy of the NPL
As the Nation magazine observed in 1923, “…although the visible machinery largely melted away, a sentiment and a point of view had been established in the minds of hundreds of thousands of farmers and ranchers.” Looking back in 1955, Robert L. Morlan, author of the classic Political Prairie Fire, commented that the NPL helped to develop “some of the most independently minded electorates in the country.”
Those independently minded electorates and their anti-corporate, pro-cooperative and independent business sentiment continued to inform and often guide policymakers in the decades to come.
The North Dakota Mill and Elevator Association began operation in a modern facility in 1922. Today it consists of seven milling units, an elevator and flour mill, and a packing warehouse that prepares bagged products for shipment. It is the largest flour mill in the U.S. and the only state-owned milling facility.
In 1932, North Dakotans voted 57 to 43 to ban corporations from owning or leasing farmland.
In 1963 the legislature enacted a law requiring that pharmacies be owned by a state-registered pharmacist. The effect was to ban chains, except those already operating when the law was passed.
In 1980 North Dakotans voted to establish a State Housing Finance Agency to provide mortgages to low income households.
In recent years several of these laws protecting independent farmers and businesses have come under attack by big corporations. After several attempts by Big Pharmacy to convince the legislature to repeal the Pharmacy Ownership Law failed, Walmart spent $9.3 million to finance a ballot initiative. In November 2014 the initiative lost by a vote of 59-41.
In 2015 big corporations did convince the legislature to overturn the 1932 anti-corporate farming law. This June, as noted at the beginning of this article, by a resounding margin of 76-24 North Dakotans voted to reinstate the old law.
Today the economic structure of North Dakota reflects its focus on independent and cooperative businesses.
The Pharmacy Ownership law, for example, has markedly benefited North Dakota. A report by the Institute for Local Self-Reliance (ILSR) found that on every key measure of pharmacy care, including quality and the price of drugs, North Dakota’s independent pharmacies outperform those of neighboring states and the U.S. as a whole. Unsurprisingly North Dakota also has more pharmacies per capita than other states. Its rural residents are more likely to have a nearby pharmacist.
North Dakota’s banking system reflects a similar community-based structure. An analysis by ILSR found that, on a per capita basis, the state boasts almost six times as many locally owned financial institutions as the rest of the nation (89 small and mid-sized community banks and 38 credit unions). These institutions control 83 percent of the state’s deposits. North Dakota’s community banks have made 400 percent more small-business loans than the national average. Student loan rates there are among the lowest in the country.
As Stacy Mitchell, director of ILSR’s Community-Scaled Economy Initiative, observes, “While the publicly owned BND might well be characterized as a socialist institution, it has had the effect of enabling North Dakota’s local banks to be very successful capitalists.” In recent years local banks in North Dakota have earned a return on capital nearly twice that of the nation’s 20 largest banks.
In the last two decades the BND has generated almost $1 billion in “profit” and returned almost half of that to the state’s general fund.
Recall that in 1919 voters had approved the Bank of North Dakota by the very slim margin of 51-49. A switch of 2,000 votes would have killed the Bank in its infancy. Today no party would dare propose its destruction.
North Dakota’s impressive 21st century telecommunications infrastructure is also a testament to its historic focus on local and independent ownership. The state ranks 47th in population density. That means it has one of the highest costs per household for installing state-of-the-art high-speed fiber networks. Nevertheless it boasts the highest percentage of people with access to such networks in the country. Why? One reason is its abundance of rural cooperatives and small telecom companies, 41 providers in all, including 17 cooperatives.
North Dakota is also home to the Dakota Carrier Network. Owned by 15 independent rural telecommunications companies, the DCN crisscrosses the state with more than 1,400 miles of fiber backbone. In the last five years independently owned companies have invested more than $100 million per year to bring fiber to the home. They now serve more than 164,000 customers in 250 communities.
What Should Bernie’s Brethren Do?
Certainly the road to political power holds many more obstacles now than it did for the NPL a century ago. North Dakota was a largely agricultural state. The key to the NPL’s organizing effort was access to a car and gas money, not easy to come by in those days, but far less than the sums now needed to mount a political campaign.
Most new movements will be unable to take advantage of the open primary. After the NPL gained power in more than half a dozen states, the existing parties fought back. Nevertheless, 11 states still have pure open primaries; about a dozen more have hybrid systems.
Recently the courts have not been sympathetic to the open primary. Not long ago the Supreme Court invented a new “right of association” and bestowed that right on political parties. In 2000, for example, by a 7-2 vote, the Court overturned a California form of open primary approved by the voters by a 60-40 vote. Writing for the majority, Justice Antonin Scalia objected that the California law “forces political parties to associate with—to have their nominees, and hence their positions, determined by—those who, at best, have refused to affiliate with the party, and, at worst, have expressly affiliated with a rival.”
After the California decision the voters of Washington, by a similar 60-40 vote, adopted an open primary system similar to California’s but with a key difference: The candidate would have to declare a “party preference” that would appear next to his or her name on the ballot. In 2008, the Supreme Court, again by a 7-2 vote, this time upheld that law, a ruling that might allow for a variant of the NPL strategy.
Before we develop a strategy for winning office we need to take a page from the NPL playbook and develop a platform, one consisting of specific, concrete policies, not a laundry list of all desirable policies.
Bernie Sanders and his followers currently are working to write a platform for the Democratic Party convention. That is important and useful, but that platform by its nature will have a national focus and speak to the exercise of power by the federal government. We also need platforms that focus on states and cities and counties and school districts and offer concrete measures they have the authority to enact.
Those platforms will provide the basis for endorsing candidates, regardless of their political affiliation or whether they run in a closed or open primary state. In those states that permit it, we may be able to enact various planks of the platform through initiative and referendum. At this point 27 states have the initiative and 24 have the referendum. Nineteen allow constitutional amendments by initiative.
The Nonpartisan League’s tenure in power was brief, but its policies, the public institutions it built and perhaps most important, the public sentiment it nurtured and brought to maturity, endure to this day: A true example of a political revolution from the bottom up.
The Age of Disintegration: Neoliberalism, Interventionism, the Resource Curse, and a Fragmenting World
By Patrick Cockburn
Reprinted with permission from TomDispatch.com
Introduction by Tom Engelhardt:
Here’s an unavoidable fact: we are now in a Brexit world. We are seeing the first signs of a major fragmentation of this planet that, until recently, the cognoscenti were convinced was globalizing rapidly and headed for unifications of all sorts. If you want a single figure that catches the grim spirit of our moment, it’s 65 million. That’s the record-setting number of people that the Office of the U.N. High Commissioner for Refugees estimates were displaced in 2015 by “conflict and persecution,” one of every 113 inhabitants of the planet. That’s more than were generated in the wake of World War II at a time when significant parts of the globe had been devastated. Of the 21 million refugees among them, 51% were children (often separated from their parents and lacking any access to education). Most of the displaced of 2015 were, in fact, internal refugees, still in their own often splintered states. Almost half of those who fled across borders have come from three countries: Syria (4.9 million), Afghanistan (2.7 million), and Somalia (1.1 million).
Despite the headlines about refugees heading for Europe — approximately a million of them made it there last year (with more dying on the way) — most of the uprooted who leave their homelands end up in poor or economically mid-level neighboring lands, with Turkey at 2.5 million refugees leading the way. In this fashion, the disruption of spreading conflicts and chaos, especially across the Greater Middle East and Africa, only brings more conflict and chaos with it wherever those refugees are forced to go.
And keep in mind that, as extreme as that 65 million figure may seem, it undoubtedly represents the beginning, not the end, of a process. For one thing, it doesn’t even include the estimated 19 million people displaced last year by extreme weather events and other natural disasters. Yet in coming decades, the heating of our planet, with attendant weather extremes (like the present heat wave in the American West) and rising sea levels, will undoubtedly produce its own waves of new refugees, only adding to both the conflicts and the fragmentation.
As Patrick Cockburn points out today, we have entered “an age of disintegration.” And he should know. There may be no Western reporter who has covered the grim dawn of that age in the Greater Middle East and North Africa — from Afghanistan to Iraq, Syria to Libya — more fully or movingly than he has over this last decade and a half. His latest book, Chaos & Caliphate: Jihadis and the West in the Struggle for the Middle East, gives a vivid taste of his reporting and of a world that is at present cracking under the pressure of the conflicts he has witnessed. And imagine that so much of this began, at the bargain-basement cost of a mere $400,000 to $500,000, with 19 (mainly Saudi) fanatics, and a few hijacked airliners. Osama bin Laden must be smiling in his watery grave. Tom
The Age of Disintegration
Neoliberalism, Interventionism, the Resource Curse, and a Fragmenting World
By Patrick Cockburn
We live in an age of disintegration. Nowhere is this more evident than in the Greater Middle East and Africa. Across the vast swath of territory between Pakistan and Nigeria, there are at least seven ongoing wars — in Afghanistan, Iraq, Syria, Yemen, Libya, Somalia, and South Sudan. These conflicts are extraordinarily destructive. They are tearing apart the countries in which they are taking place in ways that make it doubtful they will ever recover. Cities like Aleppo in Syria, Ramadi in Iraq, Taiz in Yemen, and Benghazi in Libya have been partly or entirely reduced to ruins. There are also at least three other serious insurgencies: in southeast Turkey, where Kurdish guerrillas are fighting the Turkish army, in Egypt’s Sinai Peninsula where a little-reported but ferocious guerrilla conflict is underway, and in northeast Nigeria and neighboring countries where Boko Haram continues to launch murderous attacks.
All of these have a number of things in common: they are endless and seem never to produce definitive winners or losers. (Afghanistan has effectively been at war since 1979, Somalia since 1991.) They involve the destruction or dismemberment of unified nations, their de facto partition amid mass population movements and upheavals — well publicized in the case of Syria and Iraq, less so in places like South Sudan where more than 2.4 million people have been displaced in recent years.
Add in one more similarity, no less crucial for being obvious: in most of these countries, where Islam is the dominant religion, extreme Salafi-Jihadi movements, including the Islamic State (IS), al-Qaeda, and the Taliban are essentially the only available vehicles for protest and rebellion. By now, they have completely replaced the socialist and nationalist movements that predominated in the twentieth century; these years have, that is, seen a remarkable reversion to religious, ethnic, and tribal identity, to movements that seek to establish their own exclusive territory by the persecution and expulsion of minorities.
In the process and under the pressure of outside military intervention, a vast region of the planet seems to be cracking open. Yet there is very little understanding of these processes in Washington. This was recently well illustrated by the protest of 51 State Department diplomats against President Obama’s Syrian policy and their suggestion that air strikes be launched targeting Syrian regime forces in the belief that President Bashar al-Assad would then abide by a ceasefire. The diplomats’ approach remains typically simpleminded in this most complex of conflicts, assuming as it does that the Syrian government’s barrel-bombing of civilians and other grim acts are the “root cause of the instability that continues to grip Syria and the broader region.”
It is as if the minds of these diplomats were still in the Cold War era, as if they were still fighting the Soviet Union and its allies. Against all the evidence of the last five years, there is an assumption that a barely extant moderate Syrian opposition would benefit from the fall of Assad, and a lack of understanding that the armed opposition in Syria is entirely dominated by the Islamic State and al-Qaeda clones.
Though the invasion of Iraq in 2003 is now widely admitted to have been a mistake (even by those who supported it at the time), no real lessons have been learned about why direct or indirect military interventions by the U.S. and its allies in the Middle East over the last quarter century have all only exacerbated violence and accelerated state failure.
A Mass Extinction of Independent States
The Islamic State, just celebrating its second anniversary, is the grotesque outcome of this era of chaos and conflict. That such a monstrous cult exists at all is a symptom of the deep dislocation societies throughout that region, ruled by corrupt and discredited elites, have suffered. Its rise — and that of various Taliban and al-Qaeda-style clones — is a measure of the weakness of its opponents.
The Iraqi army and security forces, for example, had 350,000 soldiers and 660,000 police on the books in June 2014 when a few thousand Islamic State fighters captured Mosul, the country’s second largest city, which they still hold. Today the Iraqi army, security services, and about 20,000 Shia paramilitaries backed by the massive firepower of the United States and allied air forces have fought their way into the city of Fallujah, 40 miles west of Baghdad, against the resistance of IS fighters who may have numbered as few as 900. In Afghanistan, the resurgence of the Taliban, supposedly decisively defeated in 2001, came about less because of the popularity of that movement than the contempt with which Afghans came to regard their corrupt government in Kabul.
Everywhere nation states are enfeebled or collapsing, as authoritarian leaders battle for survival in the face of mounting external and internal pressures. This is hardly the way the region was expected to develop. Countries that had escaped from colonial rule in the second half of the twentieth century were supposed to become more, not less, unified as time passed.
Between 1950 and 1975, nationalist leaders came to power in much of the previously colonized world. They promised to achieve national self-determination by creating powerful independent states through the concentration of whatever political, military, and economic resources were at hand. Instead, over the decades, many of these regimes transmuted into police states controlled by small numbers of staggeringly wealthy families and a coterie of businessmen dependent on their connections to such leaders as Hosni Mubarak in Egypt or Bashar al-Assad in Syria.
In recent years, such countries were also opened up to the economic whirlwind of neoliberalism, which destroyed any crude social contract that existed between rulers and ruled. Take Syria. There, rural towns and villages that had once supported the Baathist regime of the al-Assad family because it provided jobs and kept the prices of necessities low were, after 2000, abandoned to market forces skewed in favor of those in power. These places would become the backbone of the post-2011 uprising. At the same time, institutions like the Organization of Petroleum Exporting Countries (OPEC) that had done so much to enhance the wealth and power of regional oil producers in the 1970s have lost their capacity for united action.
The question for our moment: Why is a “mass extinction” of independent states taking place in the Middle East, North Africa, and beyond? Western politicians and media often refer to such countries as “failed states.” The implication embedded in that term is that the process is a self-destructive one. But several of the states now labeled “failed” like Libya only became so after Western-backed opposition movements seized power with the support and military intervention of Washington and NATO, and proved too weak to impose their own central governments and so a monopoly of violence within the national territory.
In many ways, this process began with the intervention of a U.S.-led coalition in Iraq in 2003 leading to the overthrow of Saddam Hussein, the shutting down of his Baathist Party, and the disbanding of his military. Whatever their faults, Saddam and Libya’s autocratic ruler Muammar Gaddafi were clearly demonized and blamed for all ethnic, sectarian, and regional differences in the countries they ruled, forces that were, in fact, set loose in grim ways upon their deaths.
A question remains, however: Why did the opposition to autocracy and to Western intervention take on an Islamic form and why were the Islamic movements that came to dominate the armed resistance in Iraq and Syria in particular so violent, regressive, and sectarian? Put another way, how could such groups find so many people willing to die for their causes, while their opponents found so few? When IS battle groups were sweeping through northern Iraq in the summer of 2014, soldiers who had thrown aside their uniforms and weapons and deserted that country’s northern cities would justify their flight by saying derisively: “Die for [then-Prime Minister Nouri] al-Maliki? Never!”
A common explanation for the rise of Islamic resistance movements is that the socialist, secularist, and nationalist opposition had been crushed by the old regimes’ security forces, while the Islamists were not. In countries like Libya and Syria, however, Islamists were savagely persecuted, too, and they still came to dominate the opposition. And yet, while these religious movements were strong enough to oppose governments, they generally have not proven strong enough to replace them.
Too Weak to Win, But Too Strong to Lose
Though there are clearly many reasons for the present disintegration of states and they differ somewhat from place to place, one thing is beyond question: the phenomenon itself is becoming the norm across vast reaches of the planet.
If you’re looking for the causes of state failure in our time, the place to start is undoubtedly with the end of the Cold War a quarter-century ago. Once it was over, neither the U.S. nor the new Russia that emerged from the Soviet Union’s implosion had a significant interest in continuing to prop up “failed states,” as each had for so long, fearing that the rival superpower and its local proxies would otherwise take over. Previously, national leaders in places like the Greater Middle East had been able to maintain a degree of independence for their countries by balancing between Moscow and Washington. With the break-up of the Soviet Union, this was no longer feasible.
In addition, the triumph of neoliberal free-market economics in the wake of the Soviet Union’s collapse added a critical element to the mix. It would prove far more destabilizing than it looked at the time.
Again, consider Syria. The expansion of the free market in a country where there was neither democratic accountability nor the rule of law meant one thing above all: plutocrats linked to the nation’s ruling family took anything that seemed potentially profitable. In the process, they grew staggeringly wealthy, while the denizens of Syria’s impoverished villages, country towns, and city slums, who had once looked to the state for jobs and cheap food, suffered. It should have surprised no one that those places became the strongholds of the Syrian uprising after 2011. In the capital, Damascus, as the reign of neoliberalism spread, even the lesser members of the mukhabarat, or secret police, found themselves living on only $200 to $300 a month, while the state became a machine for thievery.
This sort of thievery and the auctioning off of the nation’s patrimony spread across the region in these years. The new Egyptian ruler, General Abdel Fattah al-Sisi, merciless toward any sign of domestic dissent, was typical. In a country that once had been a standard bearer for nationalist regimes the world over, he didn’t hesitate this April to try to hand over two islands in the Red Sea to Saudi Arabia on whose funding and aid his regime is dependent. (To the surprise of everyone, an Egyptian court recently overruled Sisi’s decision.)
That gesture, deeply unpopular among increasingly impoverished Egyptians, was symbolic of a larger change in the balance of power in the Middle East: once the most powerful states in the region — Egypt, Syria, and Iraq — had been secular nationalists and a genuine counterbalance to Saudi Arabia and the Persian Gulf monarchies. As those secular autocracies weakened, however, the power and influence of the Sunni fundamentalist monarchies only increased. If 2011 saw rebellion and revolution spread across the Greater Middle East as the Arab Spring briefly blossomed, it also saw counterrevolution spread, funded by those oil-rich absolute Gulf monarchies, which were never going to tolerate democratic secular regime change in Syria or Libya.
Add in one more process at work making such states ever more fragile: the production and sale of natural resources — oil, gas, and minerals — and the kleptomania that goes with it. Such countries often suffer from what has become known as “the resource curse”: states increasingly dependent for revenues on the sale of their natural resources — enough to theoretically provide the whole population with a reasonably decent standard of living — turn instead into grotesquely corrupt dictatorships. In them, the yachts of local billionaires with crucial connections to the regime of the moment bob in harbors surrounded by slums running with raw sewage. In such nations, politics tends to focus on elites battling and maneuvering to steal state revenues and transfer them as rapidly as possible out of the country.
This has been the pattern of economic and political life in much of sub-Saharan Africa from Angola to Nigeria. In the Middle East and North Africa, however, a somewhat different system exists, one usually misunderstood by the outside world. There is similarly great inequality in Iraq or Saudi Arabia with similarly kleptocratic elites. They have, however, ruled over patronage states in which a significant part of the population is offered jobs in the public sector in return for political passivity or support for the kleptocrats.
In Iraq with a population of 33 million people, for instance, no less than seven million of them are on the government payroll, thanks to salaries or pensions that cost the government $4 billion a month. This crude way of distributing oil revenues to the people has often been denounced by Western commentators and economists as corruption. They, in turn, generally recommend cutting the number of these jobs, but this would mean that all, rather than just part, of the state’s resource revenues would be stolen by the elite. This, in fact, is increasingly the case in such lands as oil prices bottom out and even the Saudi royals begin to cut back on state support for the populace.
Neoliberalism was once believed to be the path to secular democracy and free-market economies. In practice, it has been anything but. Instead, in conjunction with the resource curse, as well as repeated military interventions by Washington and its allies, free-market economics has profoundly destabilized the Greater Middle East. Encouraged by Washington and Brussels, twenty-first-century neoliberalism has made unequal societies ever more unequal and helped transform already corrupt regimes into looting machines. This is also, of course, a formula for the success of the Islamic State or any other radical alternative to the status quo. Such movements are bound to find support in impoverished or neglected regions like eastern Syria or eastern Libya.
Note, however, that this process of destabilization is by no means confined to the Greater Middle East and North Africa. We are indeed in the age of destabilization, a phenomenon that is on the rise globally and at present spreading into the Balkans and Eastern Europe (with the European Union ever less able to influence events there). People no longer speak of European integration, but of how to prevent the complete break-up of the European Union in the wake of the British vote to leave.
The reasons why a narrow majority of Britons voted for Brexit have parallels with the Middle East: the free-market economic policies pursued by governments since Margaret Thatcher was prime minister have widened the gap between rich and poor and between wealthy cities and much of the rest of the country. Britain might be doing well, but millions of Britons did not share in the prosperity. The referendum about continued membership in the European Union, the option almost universally advocated by the British establishment, became the catalyst for protest against the status quo. The anger of the “Leave” voters has much in common with that of Donald Trump supporters in the United States.
The U.S. remains a superpower, but is no longer as powerful as it once was. It, too, is feeling the strains of this global moment, in which it and its local allies are powerful enough to imagine they can get rid of regimes they do not like, but either they do not quite succeed, as in Syria, or succeed but cannot replace what they have destroyed, as in Libya. An Iraqi politician once said that the problem in his country was that parties and movements were “too weak to win, but too strong to lose.” This is increasingly the pattern for the whole region and is spreading elsewhere. It carries with it the possibility of an endless cycle of indecisive wars and an era of instability that has already begun.
Patrick Cockburn is a Middle East correspondent for the Independent of London and the author of five books on the Middle East, the latest of which is Chaos and Caliphate: Jihadis and the West in the Struggle for the Middle East (OR Books).
Copyright 2016 Patrick Cockburn
Sunday, April 17th was the designated moment. The world’s leading oil producers were expected to bring fresh discipline to the chaotic petroleum market and spark a return to high prices. Meeting in Doha, the glittering capital of petroleum-rich Qatar, the oil ministers of the Organization of the Petroleum Exporting Countries (OPEC), along with such key non-OPEC producers as Russia and Mexico, were scheduled to ratify a draft agreement obliging them to freeze their oil output at current levels. In anticipation of such a deal, oil prices had begun to creep inexorably upward, from $30 per barrel in mid-January to $43 on the eve of the gathering. But far from restoring the old oil order, the meeting ended in discord, driving prices down again and revealing deep cracks in the ranks of global energy producers.
It is hard to overstate the significance of the Doha debacle. At the very least, it will perpetuate the low oil prices that have plagued the industry for the past two years, forcing smaller firms into bankruptcy and erasing hundreds of billions of dollars of investments in new production capacity. It may also have obliterated any future prospects for cooperation between OPEC and non-OPEC producers in regulating the market. Most of all, however, it demonstrated that the petroleum-fueled world we’ve known these last decades — with oil demand always thrusting ahead of supply, ensuring steady profits for all major producers — is no more. Replacing it is an anemic, possibly even declining, demand for oil that is likely to force suppliers to fight one another for ever-diminishing market shares.
The Road to Doha
Before the Doha gathering, the leaders of the major producing countries expressed confidence that a production freeze would finally halt the devastating slump in oil prices that began in mid-2014. Most of them are heavily dependent on petroleum exports to finance their governments and keep restiveness among their populaces at bay. Both Russia and Venezuela, for instance, rely on energy exports for approximately 50% of government income, while for Nigeria it’s more like 75%. So the plunge in prices had already cut deep into government spending around the world, causing civil unrest and even in some cases political turmoil.
No one expected the April 17th meeting to result in an immediate, dramatic price upturn, but everyone hoped that it would lay the foundation for a steady rise in the coming months. The leaders of these countries were well aware of one thing: to achieve such progress, unity was crucial. Otherwise they were not likely to overcome the various factors that had caused the price collapse in the first place. Some of these were structural and embedded deep in the way the industry had been organized; some were the product of their own feckless responses to the crisis.
On the structural side, global demand for energy had, in recent years, ceased to rise quickly enough to soak up all the crude oil pouring onto the market, thanks in part to new supplies from Iraq and especially from the expanding shale fields of the United States. This oversupply triggered the initial 2014 price drop when Brent crude — the international benchmark blend — went from a high of $115 on June 19th to $77 on November 26th, the day before a fateful OPEC meeting in Vienna. The next day, OPEC members, led by Saudi Arabia, failed to agree on either production cuts or a freeze, and the price of oil went into freefall.
The failure of that November meeting has been widely attributed to the Saudis’ desire to kill off new output elsewhere — especially shale production in the United States — and to restore their historic dominance of the global oil market. Many analysts were also convinced that Riyadh was seeking to punish regional rivals Iran and Russia for their support of the Assad regime in Syria (which the Saudis seek to topple).
The rejection, in other words, was meant to fulfill two tasks at the same time: blunt or wipe out the challenge posed by North American shale producers and undermine two economically shaky energy powers that opposed Saudi goals in the Middle East by depriving them of much needed oil revenues. Because Saudi Arabia could produce oil so much more cheaply than other countries — for as little as $3 per barrel — and because it could draw upon hundreds of billions of dollars in sovereign wealth funds to meet any budget shortfalls of its own, its leaders believed it more capable of weathering any price downturn than its rivals. Today, however, that rosy prediction is looking grimmer as the Saudi royals begin to feel the pinch of low oil prices, and find themselves cutting back on the benefits they had been passing on to an ever-growing, potentially restive population while still financing a costly, inconclusive, and increasingly disastrous war in Yemen.
Many energy analysts became convinced that Doha would prove the decisive moment when Riyadh would finally be amenable to a production freeze. Just days before the conference, participants expressed growing confidence that such a plan would indeed be adopted. After all, preliminary negotiations between Russia, Venezuela, Qatar, and Saudi Arabia had produced a draft document that most participants assumed was essentially ready for signature. The only sticking point: the nature of Iran’s participation.
The Iranians were, in fact, agreeable to such a freeze, but only after they were allowed to raise their relatively modest daily output to levels achieved in 2012 before the West imposed sanctions in an effort to force Tehran to agree to dismantle its nuclear enrichment program. Now that those sanctions were, in fact, being lifted as a result of the recently concluded nuclear deal, Tehran was determined to restore the status quo ante. On this, the Saudis balked, having no wish to see their arch-rival obtain added oil revenues. Still, most observers assumed that, in the end, Riyadh would agree to a formula allowing Iran some increase before a freeze. “There are positive indications an agreement will be reached during this meeting… an initial agreement on freezing production,” said Nawal Al-Fuzaia, Kuwait’s OPEC representative, echoing the views of other Doha participants.
But then something happened. According to people familiar with the sequence of events, Saudi Arabia’s Deputy Crown Prince and key oil strategist, Mohammed bin Salman, called the Saudi delegation in Doha at 3:00 a.m. on April 17th and instructed them to spurn a deal that provided leeway of any sort for Iran. When the Iranians — who chose not to attend the meeting — signaled that they had no intention of freezing their output to satisfy their rivals, the Saudis rejected the draft agreement they had helped negotiate and the assembly ended in disarray.
Geopolitics to the Fore
Most analysts have since suggested that the Saudi royals simply considered punishing Iran more important than raising oil prices. No matter the cost to them, in other words, they could not bring themselves to help Iran pursue its geopolitical objectives, including giving yet more support to Shiite forces in Iraq, Syria, Yemen, and Lebanon. Already feeling pressured by Tehran and ever less confident of Washington’s support, they were ready to use any means available to weaken the Iranians, whatever the danger to themselves.
“The failure to reach an agreement in Doha is a reminder that Saudi Arabia is in no mood to do Iran any favors right now and that their ongoing geopolitical conflict cannot be discounted as an element of the current Saudi oil policy,” said Jason Bordoff of the Center on Global Energy Policy at Columbia University.
Many analysts also pointed to the rising influence of Deputy Crown Prince Mohammed bin Salman, entrusted with near-total control of the economy and the military by his aging father, King Salman. As Minister of Defense, the prince has spearheaded the Saudi drive to counter the Iranians in a regional struggle for dominance. Most significantly, he is the main force behind Saudi Arabia’s ongoing intervention in Yemen, aimed at defeating the Houthi rebels, a largely Shia group with loose ties to Iran, and restoring deposed former president Abd Rabbuh Mansur Hadi. After a year of relentless U.S.-backed airstrikes (including the use of cluster bombs), the Saudi intervention has, in fact, failed to achieve its intended objectives, though it has produced thousands of civilian casualties, provoking fierce condemnation from U.N. officials, and created space for the rise of al-Qaeda in the Arabian Peninsula. Nevertheless, the prince seems determined to keep the conflict going and to counter Iranian influence across the region.
For Prince Mohammed, the oil market has evidently become just another arena for this ongoing struggle. “Under his guidance,” the Financial Times noted in April, “Saudi Arabia’s oil policy appears to be less driven by the price of crude than global politics, particularly Riyadh’s bitter rivalry with post-sanctions Tehran.” This seems to have been the backstory for Riyadh’s last-minute decision to scuttle the talks in Doha. On April 16th, for instance, Prince Mohammed couldn’t have been blunter to Bloomberg, even if he didn’t mention the Iranians by name: “If all major producers don’t freeze production, we will not freeze production.”
With the proposed agreement in tatters, Saudi Arabia is now expected to boost its own output, ensuring that prices will remain bargain-basement low and so deprive Iran of any windfall from its expected increase in exports. The kingdom, Prince Mohammed told Bloomberg, was prepared to immediately raise production from its current 10.2 million barrels per day to 11.5 million barrels and could add another million barrels “if we wanted to” in the next six to nine months. With Iranian and Iraqi oil heading for market in larger quantities, that’s the definition of oversupply. It would certainly ensure Saudi Arabia’s continued dominance of the market, but it might also wound the kingdom in a major way, if not fatally.
A New Global Reality
No doubt geopolitics played a significant role in the Saudi decision, but that’s hardly the whole story. Overshadowing discussions about a possible production freeze was a new fact of life for the oil industry: the past would be no predictor of the future when it came to global oil demand. Whatever the Saudis think of the Iranians or vice versa, their industry is being fundamentally transformed, altering relationships among the major producers and eroding their inclination to cooperate.
Until very recently, it was assumed that the demand for oil would continue to expand indefinitely, creating space for multiple producers to enter the market, and for ones already in it to increase their output. Even when supply outran demand and drove prices down, as has periodically occurred, producers could always take solace in the knowledge that, as in the past, demand would eventually rebound, jacking prices up again. Under such circumstances and at such a moment, it was just good sense for individual producers to cooperate in lowering output, knowing that everyone would benefit sooner or later from the inevitable price increase.
But what happens if confidence in the eventual resurgence of demand begins to wither? Then the incentives to cooperate begin to evaporate, too, and it’s every producer for itself in a mad scramble to protect market share. This new reality — a world in which “peak oil demand,” rather than “peak oil,” will shape the consciousness of major players — is what the Doha catastrophe foreshadowed.
At the beginning of this century, many energy analysts were convinced that we were at the edge of the arrival of “peak oil”; a peak, that is, in the output of petroleum in which planetary reserves would be exhausted long before the demand for oil disappeared, triggering a global economic crisis. As a result of advances in drilling technology, however, the supply of oil has continued to grow, while demand has unexpectedly begun to stall. This can be traced both to slowing economic growth globally and to an accelerating “green revolution” in which the planet will be transitioning to non-carbon fuel sources. With most nations now committed to measures aimed at reducing emissions of greenhouse gases under the just-signed Paris climate accord, the demand for oil is likely to experience significant declines in the years ahead. In other words, global oil demand will peak long before supplies begin to run low, creating a monumental challenge for the oil-producing countries.
This is no theoretical construct. It’s reality itself. Net consumption of oil in the advanced industrialized nations has already dropped from 50 million barrels per day in 2005 to 45 million barrels in 2014. Further declines are in store as strict fuel efficiency standards for the production of new vehicles and other climate-related measures take effect, the price of solar and wind power continues to fall, and other alternative energy sources come on line. While the demand for oil does continue to rise in the developing world, even there it’s not climbing at rates previously taken for granted. With such countries also beginning to impose tougher constraints on carbon emissions, global consumption is expected to reach a peak and begin an inexorable decline. According to experts Thijs Van de Graaf and Aviel Verbruggen, overall world peak demand could be reached as early as 2020.
In such a world, high-cost oil producers will be driven out of the market and the advantage — such as it is — will lie with the lowest-cost ones. Countries that depend on petroleum exports for a large share of their revenues will come under increasing pressure to move away from excessive reliance on oil. This may have been another consideration in the Saudi decision at Doha. In the months leading up to the April meeting, senior Saudi officials dropped hints that they were beginning to plan for a post-petroleum era and that Deputy Crown Prince bin Salman would play a key role in overseeing the transition.
On April 1st, the prince himself indicated that steps were underway to begin this process. As part of the effort, he announced, he was planning an initial public offering of shares in state-owned Saudi Aramco, the world’s number one oil producer, and would transfer the proceeds, an estimated $2 trillion, to its Public Investment Fund (PIF). “IPOing Aramco and transferring its shares to PIF will technically make investments the source of Saudi government revenue, not oil,” the prince pointed out. “What is left now is to diversify investments. So within 20 years, we will be an economy or state that doesn’t depend mainly on oil.”
For a country that more than any other has rested its claim to wealth and power on the production and sale of petroleum, this is a revolutionary statement. If Saudi Arabia says it is ready to begin a move away from reliance on petroleum, we are indeed entering a new world in which, among other things, the titans of oil production will no longer hold sway over our lives as they have in the past.
This, in fact, appears to be the outlook adopted by Prince Mohammed in the wake of the Doha debacle. In announcing the kingdom’s new economic blueprint on April 25th, he vowed to liberate the country from its “addiction” to oil. This will not, of course, be easy to achieve, given the kingdom’s heavy reliance on oil revenues and lack of plausible alternatives. The 30-year-old prince could also face opposition from within the royal family to his audacious moves (as well as his blundering ones in Yemen and possibly elsewhere). Whatever the fate of the Saudi royals, however, if predictions of a future peak in world oil demand prove accurate, the debacle in Doha will be seen as marking the beginning of the end of the old oil order.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Nick Turse’s Tomorrow’s Battlefield: U.S. Proxy Wars and Secret Ops in Africa, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Michael T. Klare
By Robert Reich
Reprinted from Robert Reich’s blog at robertreich.org
A crowning achievement of the historic March on Washington, where Dr. Martin Luther King gave his “I have a dream” speech, was pushing through the landmark Voting Rights Act of 1965. Recognizing the history of racist attempts to prevent Black people from voting, that federal law forced a number of southern states and districts to adhere to federal guidelines allowing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like driver’s licenses—to vote even though there’s no evidence of the voter fraud this is supposed to prevent. But there’s plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, has practically closed driver’s license offices in counties with large percentages of black voters. Wisconsin requires a government-issued photo ID but hasn’t provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control in Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver’s licenses, allowing at least 2 weeks of early voting, and taking districting away from the politicians and putting it under independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
By Thomas Frank, author of the just-published Listen, Liberal, or What Ever Happened to the Party of the People? (Metropolitan Books) from which this essay is adapted. He has also written Pity the Billionaire, The Wrecking Crew, and What’s the Matter With Kansas? among other works. He is the founding editor of The Baffler. Reprinted with permission from TomDispatch.com
When you press Democrats on their uninspiring deeds — their lousy free trade deals, for example, or their flaccid response to Wall Street misbehavior — when you press them on any of these things, they automatically reply that this is the best anyone could have done. After all, they had to deal with those awful Republicans, and those awful Republicans wouldn’t let the really good stuff get through. They filibustered in the Senate. They gerrymandered the congressional districts. And besides, change takes a long time. Surely you don’t think the tepid-to-lukewarm things Bill Clinton and Barack Obama have done in Washington really represent the fiery Democratic soul.
So let’s go to a place that does. Let’s choose a locale where Democratic rule is virtually unopposed, a place where Republican obstruction and sabotage can’t taint the experiment.
Let’s go to Boston, Massachusetts, the spiritual homeland of the professional class and a place where the ideology of modern liberalism has been permitted to grow and flourish without challenge or restraint. As the seat of American higher learning, it seems unsurprising that Boston should anchor one of the most Democratic of states, a place where elected Republicans (like the new governor) are highly unusual. This is the city that virtually invented the blue-state economic model, in which prosperity arises from higher education and the knowledge-based industries that surround it.
The coming of post-industrial society has treated this most ancient of American cities extremely well. Massachusetts routinely occupies the number one spot on the State New Economy Index, a measure of how “knowledge-based, globalized, entrepreneurial, IT-driven, and innovation-based” a place happens to be. Boston ranks high on many of Richard Florida’s statistical indices of approbation — in 2003, it was number one on the “creative class index,” number three in innovation and in high tech — and his many books marvel at the city’s concentration of venture capital, its allure to young people, or the time it enticed some firm away from some unenlightened locale in the hinterlands.
Boston’s knowledge economy is the best, and it is the oldest. Boston’s metro area encompasses some 85 private colleges and universities, the greatest concentration of higher-ed institutions in the country — probably in the world. The region has all the ancillary advantages to show for this: a highly educated population, an unusually large number of patents, and more Nobel laureates than any other city in the country.
The city’s Route 128 corridor was the original model for a suburban tech district, lined ever since it was built with defense contractors and computer manufacturers. The suburbs situated along this golden thoroughfare are among the wealthiest municipalities in the nation, populated by engineers, lawyers, and aerospace workers. Their public schools are excellent, their downtowns are cute, and back in the seventies their socially enlightened residents were the prototype for the figure of the “suburban liberal.”
Another prototype: the Massachusetts Institute of Technology, situated in Cambridge, is where our modern conception of the university as an incubator for business enterprises began. According to a report on MIT’s achievements in this category, the school’s alumni have started nearly 26,000 companies over the years, including Intel, Hewlett Packard, and Qualcomm. If you were to take those 26,000 companies as a separate nation, the report tells us, its economy would be one of the most productive in the world.
Then there are Boston’s many biotech and pharmaceutical concerns, grouped together in what is known as the “life sciences super cluster,” which, properly understood, is part of an “ecosystem” in which PhDs can “partner” with venture capitalists and in which big pharmaceutical firms can acquire small ones. While other industries shrivel, the Boston super cluster grows, with the life-sciences professionals of the world lighting out for the Athens of America and the massive new “innovation centers” shoehorning themselves one after the other into the crowded academic suburb of Cambridge.
To think about it slightly more critically, Boston is the headquarters for two industries that are steadily bankrupting middle America: big learning and big medicine, both of them imposing costs that everyone else is basically required to pay and which increase at a far more rapid pace than wages or inflation. A thousand dollars a pill, 30 grand a semester: the debts that are gradually choking the life out of people where you live are what has made this city so very rich.
Perhaps it makes sense, then, that another category in which Massachusetts ranks highly is inequality. Once the visitor leaves the brainy bustle of Boston, he discovers that this state is filled with wreckage — with former manufacturing towns in which workers watch their way of life draining away, and with cities that are little more than warehouses for people on Medicare. According to one survey, Massachusetts has the eighth-worst rate of income inequality among the states; by another metric it ranks fourth. However you choose to measure the diverging fortunes of the country’s top 10% and the rest, Massachusetts always seems to finish among the nation’s most unequal places.
Seething City on a Cliff
You can see what I mean when you visit Fall River, an old mill town 50 miles south of Boston. Median household income in that city is $33,000, among the lowest in the state; unemployment is among the highest, 15% in March 2014, nearly five years after the recession ended. Twenty-three percent of Fall River’s inhabitants live in poverty. The city lost its many fabric-making concerns decades ago and with them it lost its reason for being. People have been deserting the place for decades.
Many of the empty factories in which their ancestors worked are still standing, however. Solid nineteenth-century structures of granite or brick, these huge boxes dominate the city visually — there always seems to be one or two of them in the vista, contrasting painfully with whatever colorful plastic fast-food joint has been slapped up next door.
Most of the old factories are boarded up, unmistakable emblems of hopelessness right up to the roof. But the ones that have been successfully repurposed are in some ways even worse, filled as they often are with enterprises offering cheap suits or help with drug addiction. A clinic in the hulk of one abandoned mill has a sign on the window reading simply “Cancer & Blood.”
The effect of all this is to remind you with every prospect that this is a place and a way of life from which the politicians have withdrawn their blessing. Like so many other American scenes, this one is the product of decades of deindustrialization, engineered by Republicans and rationalized by Democrats. This is a place where affluence never returns — not because affluence for Fall River is impossible or unimaginable, but because our country’s leaders have blandly accepted a social order that constantly bids down the wages of people like these while bidding up the rewards for innovators, creatives, and professionals.
Even the city’s one real hope for new employment opportunities — an Amazon warehouse that is now in the planning stages — will serve to lock in this relationship. If all goes according to plan, and if Amazon sticks to the practices it has pioneered elsewhere, people from Fall River will one day get to do exhausting work with few benefits while being electronically monitored for efficiency, in order to save the affluent customers of nearby Boston a few pennies when they buy books or electronics.
But that is all in the future. These days, the local newspaper publishes an endless stream of stories about drug arrests, shootings, drunk-driving crashes, the stupidity of local politicians, and the lamentable surplus of “affordable housing.” The town is up to its eyeballs in wrathful bitterness against public workers. As in: Why do they deserve a decent life when the rest of us have no chance at all? It’s every man for himself here in a “competition for crumbs,” as a Fall River friend puts it.
The Great Entrepreneurial Awakening
If Fall River is pocked with empty mills, the streets of Boston are dotted with facilities intended to make innovation and entrepreneurship easy and convenient. I was surprised to discover, during the time I spent exploring the city’s political landscape, that Boston boasts a full-blown Innovation District, a disused industrial neighborhood that has actually been zoned creative — a projection of the post-industrial blue-state ideal onto the urban grid itself. The heart of the neighborhood is a building called “District Hall” — “Boston’s New Home for Innovation” — which appeared to me to be a glorified multipurpose room, enclosed in a sharply angular façade, and sharing a roof with a restaurant that offers “inventive cuisine for innovative people.” The Wi-Fi was free, the screens on the walls displayed famous quotations about creativity, and the walls themselves were covered with a high-gloss finish meant to be written on with dry-erase markers; but otherwise it was not much different from an ordinary public library. Aside from not having anything to read, that is.
This was my introduction to the innovation infrastructure of the city, much of it built up by entrepreneurs shrewdly angling to grab a piece of the entrepreneur craze. There are “co-working” spaces, shared offices for startups that can’t afford the real thing. There are startup “incubators” and startup “accelerators,” which aim to ease the innovator’s eternal struggle with an uncaring public: the Startup Institute, for example, and the famous MassChallenge, the “World’s Largest Startup Accelerator,” which runs an annual competition for new companies and hands out prizes at the end.
And then there are the innovation Democrats, led by former Governor Deval Patrick, who presided over the Massachusetts government from 2007 to 2015. He is typical of liberal-class leaders; you might even say he is their most successful exemplar. Everyone seems to like him, even his opponents. He is a witty and affable public speaker as well as a man of competence, a highly educated technocrat who is comfortable in corporate surroundings. Thanks to his upbringing in a Chicago housing project, he also understands the plight of the poor, and (perhaps best of all) he is an honest politician in a state accustomed to wide-open corruption. Patrick was also the first black governor of Massachusetts and, in some ways, an ideal Democrat for the era of Barack Obama — who, as it happens, is one of his closest political allies.
As governor, Patrick became a kind of missionary for the innovation cult. “The Massachusetts economy is an innovation economy,” he liked to declare, and he made similar comments countless times, slightly varying the order of the optimistic keywords: “Innovation is a centerpiece of the Massachusetts economy,” et cetera. The governor opened “innovation schools,” a species of ramped-up charter school. He signed the “Social Innovation Compact,” which had something to do with meeting “the private sector’s need for skilled entry-level professional talent.” In a 2009 speech called “The Innovation Economy,” Patrick elaborated the political theory of innovation in greater detail, telling an audience of corporate types in Silicon Valley about Massachusetts’s “high concentration of brainpower” and “world-class” universities, and how “we in government are actively partnering with the private sector and the universities, to strengthen our innovation industries.”
What did all of this inno-talk mean? Much of the time, it was pure applesauce — standard-issue platitudes to be rolled out every time some pharmaceutical company opened an office building somewhere in the state.
On some occasions, Patrick’s favorite buzzword came with a gigantic price tag, like the billion dollars in subsidies and tax breaks that the governor authorized in 2008 to encourage pharmaceutical and biotech companies to do business in Massachusetts. On still other occasions, favoring inno has meant bulldozing the people in its path — for instance, the taxi drivers whose livelihoods are being usurped by ridesharing apps like Uber. When these workers staged a variety of protests in the Boston area, Patrick intervened decisively on the side of the distant software company. Apparently convenience for the people who ride in taxis was more important than good pay for people who drive those taxis. It probably didn’t hurt that Uber had hired a former Patrick aide as a lobbyist, but the real point was, of course, innovation: Uber was the future, the taxi drivers were the past, and the path for Massachusetts was obvious.
A short while later, Patrick became something of an innovator himself. After his time as governor came to an end last year, he won a job as a managing director of Bain Capital, the private equity firm that was founded by his predecessor Mitt Romney — and that had been so powerfully denounced by Democrats during the 2012 election. Patrick spoke about the job like it was just another startup: “It was a happy and timely coincidence I was interested in building a business that Bain was also interested in building,” he told the Wall Street Journal. Romney reportedly phoned him with congratulations.
At a 2014 celebration of Governor Patrick’s innovation leadership, Google’s Eric Schmidt announced that “if you want to solve the economic problems of the U.S., create more entrepreneurs.” That sort of sums up the ideology in this corporate commonwealth: Entrepreneurs first. But how has such a doctrine become holy writ in a party dedicated to the welfare of the common man? And how has all this come to pass in the liberal state of Massachusetts?
The answer is that I’ve got the wrong liberalism. The kind of liberalism that has dominated Massachusetts for the last few decades isn’t the stuff of Franklin Roosevelt or the United Auto Workers; it’s the Route 128/suburban-professionals variety. (Senator Elizabeth Warren is the great exception to this rule.) Professional-class liberals aren’t really alarmed by oversized rewards for society’s winners. On the contrary, this seems natural to them — because they are society’s winners. The liberalism of professionals just does not extend to matters of inequality; this is the area where soft hearts abruptly turn hard.
Innovation liberalism is “a liberalism of the rich,” to use the straightforward phrase of local labor leader Harris Gruman. This doctrine has no patience with the idea that everyone should share in society’s wealth. What Massachusetts liberals pine for, by and large, is a more perfect meritocracy — a system where the essential thing is to ensure that the truly talented get into the right schools and then get to rise through the ranks of society. Unfortunately, however, as the blue-state model makes painfully clear, there is no solidarity in a meritocracy. The ideology of educational achievement conveniently negates any esteem we might feel for the poorly graduated.
This is a curious phenomenon, is it not? A blue state where the Democrats maintain transparent connections to high finance and big pharma; where they have deliberately chosen distant software barons over working-class members of their own society; and where their chief economic proposals have to do with promoting “innovation,” a grand and promising idea that remains suspiciously vague. Nor can these innovation Democrats claim that their hands were forced by Republicans. They came up with this program all on their own.
Copyright 2016 Thomas Frank
Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Book, Nick Turse’s Tomorrow’s Battlefield: U.S. Proxy Wars and Secret Ops in Africa, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
The other week, feeling sick, I spent a day on my couch with the TV on and was reminded of an odd fact of American life. More than seven months before Election Day, you can watch the 2016 campaign for the presidency at any moment of your choosing, and that’s been true since at least late last year. There is essentially never a time when some network or news channel isn’t reporting on, discussing, debating, analyzing, speculating about, or simply drooling over some aspect of the primary campaign, of Hillary, Bernie, Ted, and above all — a million times above all — The Donald (from the violence at his rallies to the size of his hands). In case you’re young and think this is more or less the American norm, it isn’t. Or wasn’t.
Truly, there is something new under the sun. Of course, in 1994 with O.J. Simpson’s white Ford Bronco chase (95 million viewers!), the 24/7 media event arrived full blown in American life and something changed when it came to the way we focused on our world and the media focused on us. But you can be sure of one thing: never in the history of television, or any other form of media, has a single figure garnered the amount of attention — hour after hour, day after day, week after week — as Donald Trump. If he’s the O.J. Simpson of twenty-first-century American politics and his run for the presidency is the eternal white Ford Bronco chase of our moment, then we’re in a truly strange world.
Or let me put it another way: this is not an election. I know the word “election” is being used every five seconds, and somewhere along the line significant numbers of Americans (particularly, this season, Republicans) continue to enter voting booths or, in the case of primary caucuses, school gyms and the like, to choose among various candidates, so it’s all still election-like. But take my word for it as a 71-year-old guy who’s been watching our politics for decades: this is not an election of the kind the textbooks once taught us was so crucial to American democracy. If, however, you’re sitting there waiting for me to tell you what it is, take a breath and don’t be too disappointed. I have no idea, though it’s certainly part bread-and-circuses spectacle, part celebrity obsession, and part media money machine.
Actually, before we go further, let me hedge my bets on the idea that Donald Trump is a twenty-first-century O.J. Simpson. It’s certainly a reasonable enough comparison, but I’ve begun to wonder about the usefulness of just about any comparison in our present situation. Even the most nightmarish of them — Donald Trump is Adolf Hitler, Benito Mussolini, or any past extreme demagogue of your choice — may actually prove to be covert gestures of consolation, reassurance, and comfort. Yes, what’s happening in our world is increasingly extreme and could hardly be weirder, we seem to have the urge to say, but it’s still recognizable. It’s something we’ve encountered before, something we’ve made sense of in the past and, in the process, overcome.
Round Up the Usual Suspects
But what if that’s not true? In some ways, the most frightening, least acceptable thing to say about our American world right now — even if Donald Trump’s overwhelming presence all but begs us to say it — is that we’ve entered uncharted territory and, under the circumstances, comparisons might actually impair our ability to come to grips with our new reality. My own suspicion: Donald Trump is only the most obvious instance of this, the example no one can miss.
In these first years of the twenty-first century, we may be witnessing a new world being born inside the hollowed-out shell of the American system. As yet, though we live with this reality every day, we evidently just can’t bear to recognize it for what it might be. When we survey the landscape, what we tend to focus on is that shell — the usual elections (in somewhat heightened form), the usual governmental bodies (a little tarnished) with the usual governmental powers (a little diminished or redistributed), including the usual checks and balances (a little out of whack), and the same old Constitution (much praised in its absence), and yes, we know that none of this is working particularly well, or sometimes at all, but it still feels comfortable to view what we have as a reduced, shabbier, and more dysfunctional version of the known.
Perhaps, however, it’s increasingly a version of the unknown. We say, for instance, that Congress is “paralyzed,” and that little can be done in a country where politics has become so “polarized,” and we wait for something to shake us loose from that “paralysis,” to return us to a Washington closer to what we remember and recognize. But maybe this is it. Maybe even if the Republicans somehow lost control of the House of Representatives and the Senate, we would still be in a situation something like what we’re now labeling paralysis. Maybe in our new American reality, Congress is actually some kind of glorified, well-lobbied, and well-financed version of a peanut gallery.
Of course, I don’t want to deny that much of what is “new” in our world has a long history. The present yawning inequality gap between the 1% and ordinary Americans first began to widen in the 1970s and — as Thomas Frank explains so brilliantly in his new book, Listen, Liberal — was already a powerful and much-discussed reality in the early 1990s, when Bill Clinton ran for president. Yes, that gap is now more like an abyss and looks ever more permanently embedded in the American system, but it has a genuine history, as for instance do 1% elections and the rise and self-organization of the “billionaire class,” even if no one, until this second, imagined that government of the billionaires, by the billionaires, and for the billionaires might devolve into government of the billionaire, by the billionaire, and for the billionaire — that is, just one of them.
Indeed, much of our shape-shifting world can be written about as a set of comparisons and in terms of historical reference points. Inequality has a history. The military-industrial complex and the all-volunteer military, like the warrior corporation, weren’t born yesterday; neither was our state of perpetual war, nor the national security state that now looms over Washington, nor its surveilling urge, the desire to know far too much about the private lives of Americans. (A little bow of remembrance to FBI Director J. Edgar Hoover is in order here.)
And yet, true as all that may be, Washington increasingly seems like a new land, sporting something like a new system in the midst of our much-described polarized and paralyzed politics. The national security state doesn’t seem faintly paralyzed or polarized to me. Nor does the Pentagon. On certain days when I catch the news, I can’t believe how strange and yet humdrum this uncharted new territory is. Remind me, for instance, where in the Constitution the Founding Fathers wrote about that national security state? And yet there it is in all its glory, all its powers, an ever more independent force in our nation’s capital. In what way, for instance, did those men of the revolutionary era prepare the ground for the Pentagon to loose its spy drones from our distant war zones over the United States? And yet, so it has. And no one even seems disturbed by the development. The news, barely noticed or noted, was instantly absorbed into what’s becoming the new normal.
Graduation Ceremonies in the Imperium
Let me mention here the almost random piece of news that recently made me wonder just what planet I was actually on. And I know you won’t believe it, but it had absolutely nothing to do with Donald Trump.
Given the carnage of America’s wars and conflicts across the Greater Middle East and Africa, which I’ve been following closely these last years, I’m unsure why this particular moment even got to me. Best guess? Maybe that, of all the once-obscure places — from Afghanistan to Yemen to Libya — in which the U.S. has been fighting recently, Somalia, where this particular little slaughter took place, seems to me like the most obscure of all. Yes, I’ve been half-attending to events there from the 1993 Black Hawk Down moment to the disastrous U.S.-backed Ethiopian invasion of 2006 to the hardly less disastrous invasion of that country by Kenyan and other African forces. Still, Somalia?
Recently, U.S. Reaper drones and manned aircraft launched a set of strikes against what the Pentagon claimed was a graduation ceremony for “low-level” foot soldiers in the Somali terror group al-Shabab. It was proudly announced that more than 150 Somalis had died in this attack. In a country where, in recent years, U.S. drones and special ops forces had carried out a modest number of strikes against individual al-Shabab leaders, this might be thought of as a distinct escalation of Washington’s endless low-level conflict there (with a raid involving U.S. special ops forces following soon after).
Now, let me try to put this in some personal context. Since I was a kid, I’ve always liked globes and maps. I have a reasonable sense of where most countries on this planet are. Still, Somalia? I have to stop and give that one some thought to truly locate it on a mental map of eastern Africa. Most Americans? Honestly, I doubt they’d have a clue. So the other day, when this news came out, I stopped a moment to take it in. If accurate, we killed 150 more or less nobodies (except to those who knew them) and maybe even a top leader or two in a country most Americans couldn’t locate on a map.
I mean, don’t you find that just a little odd, no matter how horrible the organization they were preparing to fight for? 150 Somalis? Blam!
Remind me: On just what basis was this modest massacre carried out? After all, the U.S. isn’t at war with Somalia or with al-Shabab. Of course, Congress no longer plays any real role in decisions about American war making. It no longer declares war on any group or country we fight. (Paralysis!) War is now purely a matter of executive power or, in reality, the collective power of the national security state and the White House. The essential explanation offered for the Somali strike, for instance, is that the U.S. had a small set of advisers stationed with African Union forces in that country and it was just faintly possible that those guerrilla graduates might soon prepare to attack some of those forces (and hence U.S. military personnel). It seems that if the U.S. puts advisers in place anywhere on the planet — and any day of any year they are now in scores of countries — that’s excuse enough to validate acts of war based on the “imminent” threat of their attack.
Or just think of it this way: a new, informal constitution is being written in these years in Washington. No need for a convention or a new bill of rights. It’s a constitution focused on the use of power, especially military power, and it’s being written in blood.
These days, our government (the unparalyzed one) acts regularly on the basis of that informal constitution-in-the-making, committing Somalia-like acts across significant swathes of the planet. In these years, we’ve been marrying the latest in wonder technology, our Hellfire-missile-armed drones, to executive power and slaughtering people we don’t much like in majority Muslim countries with a certain alacrity. By now, it’s simply accepted that any commander-in-chief is also our assassin-in-chief, and that all of this is part of a wartime-that-isn’t-wartime system, spreading the principle of chaos and dissolution to whole areas of the planet, leaving failed states and terror movements in its wake.
When was it, by the way, that “the people” agreed that the president could appoint himself assassin-in-chief, muster his legal beagles to write new “law” that covered any future acts of his (including the killing of American citizens), and year after year dispatch what essentially is his own private fleet of killer drones to knock off thousands of people across the Greater Middle East and parts of Africa? Weirdly enough, after almost 14 years of this sort of behavior, with ample evidence that such strikes don’t suppress the movements Washington loathes (and often only fan the flames of resentment and revenge that help them spread), neither the current president and his top officials, nor any of the candidates for his office have the slightest intention of ever grounding those drones.
And when exactly did the people say that, within the country’s vast standing military, which now garrisons much of the planet, a force of nearly 70,000 Special Operations personnel should be birthed, or that it should conduct covert missions globally, essentially accountable only to the president (if him)? And what I find strangest of all is that few in our world find such developments strange at all.
A Planet in Decline?
In some way, all of this could be said to work. At the very least, it is a functioning new system-in-the-making that we have yet to truly come to grips with, just as we haven’t come to grips with a national security state that surveils the world in a way that even science fiction writers (no less totalitarian rulers) of a previous era could never have imagined, or the strange version of media overkill that we still call an election. All of this is by now both old news and mind-bogglingly new.
Do I understand it? Not for a second.
This is not war as we knew it, nor government as we once understood it, nor are these elections as we once imagined them, nor is this democracy as it used to be conceived of, nor is this journalism of a kind ever taught in a journalism school. This is the definition of uncharted territory. It’s a genuine American terra incognita and yet in some fashion that unknown landscape is already part of our sense of ourselves and our world. In this “election” season, many remain shocked that a leading candidate for the presidency is a demagogue with a visible authoritarian side and what looks like an autocratic bent. All such labels are pinned on Donald Trump, but the new American system that’s been emerging from its chrysalis in these years already has just those tendencies. So don’t blame it all on Donald Trump. He should be far less of a shock to this country than he continues to be. After all, a Trumpian world-in-formation has paved the way for him.
Who knows? Perhaps what we’re watching is the new iteration of a very old story: a twenty-first-century version of an ancient tale of a great imperial power, perhaps the greatest ever — the “lone superpower” — sinking into decline. It’s a tale humanity has experienced often enough in the course of our long history. But lest you think once again that there’s nothing new under the sun, the context for all of this, for everything now happening in our world, is so new as to be quite literally outside of thousands of years of human experience. As the latest heat records indicate, we are, for the first time, on a planet in decline. And if that isn’t uncharted territory, what is?
Tom Engelhardt is a co-founder of the American Empire Project and the author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture. He is a fellow of the Nation Institute and runs TomDispatch.com. His latest book is Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 Tom Engelhardt
The refuge’s creation helped support nearby ranchers.
Nancy Langston Feb. 2, 2016
Reprinted with permission from High Country News
National wildlife refuges such as the one at Malheur near Burns, Oregon, have importance far beyond the current furor over who manages our public lands. Such refuges are becoming increasingly critical habitat for migratory birds because 95 percent of the wetlands along the Pacific Flyway have already been lost to development.
In some years, 25 million birds visit Malheur, and if the refuge were drained and converted to intensive cattle grazing – which is something the “occupiers” threatened to do – entire populations of ducks, sandhill cranes, and shorebirds would suffer. With their long-distance flights and distinctive songs, the migratory birds visiting Malheur’s wetlands now help to tie the continent together.
This was not always the case. By the 1930s, three decades of drainage, reclamation, and drought had decimated high-desert wetlands and the birds that depended upon them. Out of the hundreds of thousands of egrets that once nested on Malheur Lake, only 121 remained. The American population of the birds had dropped by 95 percent. It took the federal government to restore Malheur’s wetlands and recover waterbird populations, bringing back healthy populations of egrets and many other species.
Yet despite the importance of wildlife refuges to America’s birds, not everyone appreciates them. At one recent news conference, Ammon Bundy called the creation of Malheur National Wildlife Refuge “an unconstitutional act” that removed ranchers from their lands and plunged the county into an economic depression. This is not a new complaint. Since the Sagebrush Rebellion of the 1980s, rural communities in the West have blamed their poverty on the 640 million acres of federal public lands, which make up 52 percent of the land in Western states.
Rural Western communities are indeed suffering, but the cause is not the wildlife refuge system. Conservation of bird habitat did not lead to economic devastation, nor were refuge lands “stolen” from ranchers. If any group has prior claims to Malheur refuge, it is the Paiute Indian Tribe.
For at least 6,000 years, Malheur was the Paiutes’ home. It took a brutal Army campaign to force the people from their reservation, marching them through the snow to the state of Washington in 1879. Homesteaders and cattle barons then moved onto Paiute lands, squeezing as much livestock as possible onto dwindling pastures, and warring with each other over whose land was whose. Scars from this era persist more than a century later.
In 1908, President Theodore Roosevelt established the Malheur Lake Bird Reservation on the lands of the former Malheur Indian Reservation. But the refuge included only the lake itself, not the rivers that fed into it. Deprived of water, the lake shrank during droughts, and squatters moved onto the drying lakebed. Conservationists, realizing they needed to protect the Blitzen River that fed the lake, began a campaign to expand the refuge.
But the federal government never forced the ranchers to sell, as the occupiers at Malheur claimed, and the sale did not impoverish the community. In fact, it was just the opposite: During the Depression years of the 1930s, the federal government paid the Swift Corp. $675,000 for ruined grazing lands. Impoverished homesteaders who had squatted on refuge lands eventually received payments substantial enough to set them up as cattle ranchers nearby.
John Scharff, Malheur’s manager from 1935 to 1971, sought to transform local suspicion into acceptance by allowing local ranchers to graze cattle on the refuge. Yet some tension persisted. In the 1970s, when concern about overgrazing reduced – but did not eliminate – refuge grazing, violence erupted again. Some environmentalists denounced ranchers as parasites who destroyed wildlife habitat. A few ranchers responded with death threats against environmentalists and federal employees.
But violence is not the basin’s most important historical legacy. Through the decades, community members have come together to negotiate a better future. In the 1920s, poor homesteaders worked with conservationists to save the refuge from irrigation drainage. In the 1990s, Paiute tribal members, ranchers, environmentalists and federal agencies collaborated on innovative grazing plans to restore bird habitat while also giving ranchers more flexibility. In 2013, such efforts resulted in a landmark collaborative conservation plan for the refuge, and it offers great hope for the local economy and for wildlife.
The poet Gary Snyder wrote, “We must learn to know, love, and join our place even more than we love our own ideas. People who can agree that they share a commitment to the landscape – even if they are otherwise locked in struggle with each other – have at least one deep thing to share.”
Collaborative processes are difficult and time-consuming. Yet they have shown real potential to sustain both human and wildlife communities peacefully.
Nancy Langston is a contributor to Writers on the Range, the opinion service of High Country News. She is a professor of environmental history at Michigan Technological University, and the author of a history of Malheur Refuge, Where Land and Water Meet: A Western Landscape Transformed.
Reprinted from the New Economic Perspectives blog at the University of Missouri-Kansas City
Editor’s Note: William K. Black, author of “The Best Way to Rob a Bank is to Own One,” is Associate Professor of Law and Economics at the University of Missouri-Kansas City, where — according to James Galbraith — “the best economics is now being done.”
In the latest example of the New York Times’ reporters’ inability to read Paul Krugman, we have an article claiming that the “Growing Imbalance Between Germany and France Strains Their Relationship.” The article begins with Merkel’s major myth accepted as if it were unquestionable reality.
“It was a clear illustration of the dysfunction of the French-German partnership, the axis that for decades kept Europe on a united and dynamic track.
In Berlin this month, Chancellor Angela Merkel, riding high after nine years in power, delivered a strident defense in Parliament of austerity, which she has been pushing on Europe ever since a debt crisis broke out in 2009.”
No, not true on multiple grounds. First, the so-called “debt crisis” was a symptom rather than a cause. The reader will note that the year 2008, when the Great Recession became terrifying, has somehow been removed from the narrative because it would expose the misapprehension in Merkel’s myth. Prior to 2008, only Greece had debt levels that, given its abandonment of a sovereign currency, posed a material risk. The EU nations had unusually low budgetary deficits leading into the Great Recession. Indeed, that, along with the extremely low budgetary deficits of the Clinton administration (the budget went into surplus near the end of his term), is likely one of the triggers for the Great Recession.
The Great Recession caused sharp increases in deficits – as we have long known will happen as part of the “automatic stabilizers.” This is normal and speeds recovery. The eurozone and the U.S. began to come out of the Great Recession in 2009. The U.S. recovery accelerated with the addition of stimulus. In the eurozone, however, the abandonment of sovereign currencies and adoption of the euro exposed the periphery to recurrent attacks by the “bond vigilantes.” The ECB could have stopped these attacks at any time, but it was very late intervening – largely because of German resistance. Instead, Merkel used the leverage provided by the bond vigilantes and the refusal of the ECB to act to end their attacks to force increasing austerity upon the eurozone and demands for severe cuts in workers’ wages in the periphery.
Merkel’s actions in forcing austerity and her efforts to force sharp drops in workers’ wages in the periphery were not required to stop any “debt crisis.” The ECB had the ability to end the bond vigilantes’ attacks and reestablish the ability of the periphery to borrow at low cost, as it demonstrated. Merkel’s austerity demands, and her demands that the (largely) left governments in the periphery slash workers’ wages, promptly threw the entire eurozone back into a second Great Recession — and much of the periphery into a Second Great Depression. It served the desired purpose of discrediting the governing parties of the left, particularly in Spain, Portugal, and Greece, that gave in to Merkel’s mandates that they adopt masochistic macroeconomic policies.
It is also false that Merkel began demanding that the eurozone inflict austerity only in 2009. Merkel wanted to inflict austerity and her war on workers and the parties they primarily supported long before 2009. What changed in 2009 was that the ECB, the Great Recession, and the bond vigilantes gave her the leverage to successfully extort the members of the eurozone who opposed austerity and her war on workers and the parties of the left.
But it is what is left out of the quoted passage above that is most amazing. The fact that Merkel’s orders that the eurozone leaders bleed their economies through austerity and the war on workers’ wages led to a gratuitous Second Great Recession in the eurozone — and Great Depression levels of unemployment in much of the periphery — disappears. The fact that inflicting austerity and wage cuts in response to a Great Recession is economically illiterate and cruel disappears. The fact that the overall eurozone — six years after the financial crisis of 2008 and eight years after the financial bubbles popped in 2006 — has stagnated and caused tens of trillions of dollars in lost GDP and well over 10 million lost jobs is treated by the NYT article as if it were unrelated to Merkel’s infliction of austerity.
“But the French economy has grown stagnant, with unemployment stubbornly stuck near 11 percent and an unpopular government pledging to cut tens of billions in taxes on business, which many French fear will unravel their prized welfare state.”
No, the eurozone economy “has grown stagnant” and produced a Second Great Depression in much of the periphery. If France had a sovereign currency, or if the EU were to make the euro into a true sovereign currency, France could simultaneously “cut tens of billions in taxes on business” while preserving the social safety net and speeding the recovery. The same is true of the rest of the eurozone — including Germany, where Merkel’s policies have made the wealthy far wealthier and deepened the economic crisis in other eurozone nations by cutting German workers’ wages. The NYT article is disingenuous about both aspects of the German economy, noting only that “the German economy has shown signs of slowing down.” German growth was actually negative in the last quarter, and the treatment of its workers weakens the German and overall eurozone recovery.
It continues to be obvious that it is a condition of employment for NYT reporters covering the eurozone’s economic policies that they never read Paul Krugman (or most any other American economist). Consider this claim in the article:
“[Prime Minister Manuel Valls] and Mr. Hollande have alienated many members of the Socialist Party by taking a more centrist approach to economic policy, stoking suspicions that the government is favoring business at the expense of the welfare state.”
I will take this part very slowly. By my count, Krugman has written at least six columns in the NYT explaining that there actually is a powerful consensus among economists. The “centrist approach” is that austerity in response to a Great Recession is self-destructive. We have known this for at least 75 years. Modern Republicans, when they hold the presidency, always respond to a recession with a stimulus package. Valls and Hollande are moving away from a “centrist approach to economic policy.” They are doing so despite observing first-hand the self-destructive nature of austerity (and proclaiming that it is self-destructive). They do so despite the demonstrated success of stimulus in responding to the financial crisis. They do so despite the fact that the result of faux left parties adopting these economically illiterate neo-liberal economic policies is the destruction of the parties that betray their principles and the workers. Valls and Hollande are spectacularly unpopular in France because of these betrayals. It is clear why Valls and Hollande wish to avoid reading Krugman’s critique of their betrayals, but the NYT reporters have no excuse.
The reporters do not simply ignore the insanity of austerity and the plight of the eurozone’s workers – they assert that it is obvious that Merkel is correct and that the French reluctance to slash workers’ wages is obviously economically illiterate.
“Just over a decade ago, as Ms. Merkel is fond of noting, Germany was Europe’s sick economy. It recovered partly because of changes to labor laws and social welfare. Mr. Hollande now faces a similar task in an era of low or no growth.”
No. These two sentences propound multiple Merkel myths and assume (1) that France’s (and the rest of the eurozone’s) problems are the same as Germany’s issues “just over a decade ago,” (2) that Germany “recovered” due to slashing workers’ wages and social programs, and (3) that the German “solutions” would work for the eurozone as a whole.
Germany’s “reforms,” which included increasing financial deregulation, have proven disastrous. German banks finished third in the regulatory “race to the bottom” (“behind” Wall Street and the worst of the worst – the City of London). The officers that controlled Deutsche Bank and various state-owned German banks were among the leading causes of the financial crisis. German workers had lost ground even before the financial crisis and have lost even more ground since the crisis began. Inequality has also become increasingly more extreme in Germany.
The current problem in the eurozone is a critical shortage of demand exacerbated by the insanity of austerity and Merkel’s war on workers’ wages. The word “demand” and the concept, the centerpiece of the macroeconomics of recession, never appear in the article. An individual nation in which the wealthy have the political power to lower workers’ wages can increase its exports and employ more of its citizens. This obviously does not prove that the workers were overpaid. Merkel and the NYT ignore the “fallacy of composition,” which is particularly embarrassing because they are neo-mercantilists pushing the universal goal of being a net exporter. As Adam Smith emphasized, we can’t all be net exporters. A strategy that can work (for the elites) of one nation cannot logically be assumed to work for large numbers of nations.
The last thing a society should want in a recession is rapidly falling wages and prices that can create deflation (another word expunged from the NYT article because it would refute their ode to Merkel, austerity, and her war on the worker). If France were to slash workers’ wages to try to take exports from Ireland, while Ireland slashed workers’ wages to try to take exports from Spain, which did the same to take exports from Italy, the result would be deflation, a massive increase in inequality, the political destruction of any (allegedly) progressive political party that joined in the war on the worker, and a “race to Bangladesh” dynamic.
Germany’s “success” in being a very large net exporter makes it far more difficult – not easier – for any other eurozone nation to copy its export strategy successfully. As a group, the strategy cannot work for the eurozone. The strategy has, of course, not simply “not succeeded.” It has failed catastrophically. Merkel’s eurozone policies have caused trillions of dollars in extra losses in productivity, the gratuitous loss of over 10 million jobs, increased inequality, and the loss through emigration of many of the best educated young citizens of the periphery.
Hollande does not face “a similar task” to Merkel. He faces different problems and Merkel’s “solutions” are the chief causes of France’s economic stagnation rather than the answers to France’s problems.
I repeat my twin suggestions to the NYT reporters that cover the eurozone’s economy. The paper’s management should host a seminar in which Krugman educates his colleagues. Alternatively, come to UMKC and we’ll provide that seminar without charge. None of us can afford the cost of the reporters’ continuing willful ignorance of economics and their indifference to the victims of austerity and Merkel’s war on workers.