The historic March on Washington, where Dr. Martin Luther King Jr. gave his "I Have a Dream" speech, helped galvanize the movement whose crowning legislative achievement was the landmark Voting Rights Act of 1965. Recognizing the long history of racist attempts to prevent Black people from voting, that federal law required a number of southern states and districts to adhere to federal guidelines guaranteeing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like driver's licenses—to vote, even though there's no evidence of the voter fraud this is supposed to prevent. But there's plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, then closed driver's license offices in many counties with large percentages of Black voters. Wisconsin requires a government-issued photo ID but hasn't provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control in Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver's licenses, allowing at least two weeks of early voting, and taking districting away from the politicians and putting it in the hands of independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor's Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers "Aftershock," "The Work of Nations," and "Beyond Outrage," and, his most recent, "Saving Capitalism." He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
When you press Democrats on their uninspiring deeds — their lousy free trade deals, for example, or their flaccid response to Wall Street misbehavior — when you press them on any of these things, they automatically reply that this is the best anyone could have done. After all, they had to deal with those awful Republicans, and those awful Republicans wouldn’t let the really good stuff get through. They filibustered in the Senate. They gerrymandered the congressional districts. And besides, change takes a long time. Surely you don’t think the tepid-to-lukewarm things Bill Clinton and Barack Obama have done in Washington really represent the fiery Democratic soul.
So let’s go to a place that does. Let’s choose a locale where Democratic rule is virtually unopposed, a place where Republican obstruction and sabotage can’t taint the experiment.
Let's go to Boston, Massachusetts, the spiritual homeland of the professional class and a place where the ideology of modern liberalism has been permitted to grow and flourish without challenge or restraint. As the seat of American higher learning, Boston seems a natural anchor for one of the most Democratic of states, a place where elected Republicans (like the new governor) are highly unusual. This is the city that virtually invented the blue-state economic model, in which prosperity arises from higher education and the knowledge-based industries that surround it.
The coming of post-industrial society has treated this most ancient of American cities extremely well. Massachusetts routinely occupies the number one spot on the State New Economy Index, a measure of how “knowledge-based, globalized, entrepreneurial, IT-driven, and innovation-based” a place happens to be. Boston ranks high on many of Richard Florida’s statistical indices of approbation — in 2003, it was number one on the “creative class index,” number three in innovation and in high tech — and his many books marvel at the city’s concentration of venture capital, its allure to young people, or the time it enticed some firm away from some unenlightened locale in the hinterlands.
Boston’s knowledge economy is the best, and it is the oldest. Boston’s metro area encompasses some 85 private colleges and universities, the greatest concentration of higher-ed institutions in the country — probably in the world. The region has all the ancillary advantages to show for this: a highly educated population, an unusually large number of patents, and more Nobel laureates than any other city in the country.
The city’s Route 128 corridor was the original model for a suburban tech district, lined ever since it was built with defense contractors and computer manufacturers. The suburbs situated along this golden thoroughfare are among the wealthiest municipalities in the nation, populated by engineers, lawyers, and aerospace workers. Their public schools are excellent, their downtowns are cute, and back in the seventies their socially enlightened residents were the prototype for the figure of the “suburban liberal.”
Another prototype: the Massachusetts Institute of Technology, situated in Cambridge, is where our modern conception of the university as an incubator for business enterprises began. According to a report on MIT’s achievements in this category, the school’s alumni have started nearly 26,000 companies over the years, including Intel, Hewlett Packard, and Qualcomm. If you were to take those 26,000 companies as a separate nation, the report tells us, its economy would be one of the most productive in the world.
Then there are Boston’s many biotech and pharmaceutical concerns, grouped together in what is known as the “life sciences super cluster,” which, properly understood, is part of an “ecosystem” in which PhDs can “partner” with venture capitalists and in which big pharmaceutical firms can acquire small ones. While other industries shrivel, the Boston super cluster grows, with the life-sciences professionals of the world lighting out for the Athens of America and the massive new “innovation centers” shoehorning themselves one after the other into the crowded academic suburb of Cambridge.
To think about it slightly more critically, Boston is the headquarters for two industries that are steadily bankrupting middle America: big learning and big medicine, both of them imposing costs that everyone else is basically required to pay and which increase at a far more rapid pace than wages or inflation. A thousand dollars a pill, 30 grand a semester: the debts that are gradually choking the life out of people where you live are what has made this city so very rich.
Perhaps it makes sense, then, that another category in which Massachusetts ranks highly is inequality. Once the visitor leaves the brainy bustle of Boston, he discovers that this state is filled with wreckage — with former manufacturing towns in which workers watch their way of life draining away, and with cities that are little more than warehouses for people on Medicare. According to one survey, Massachusetts has the eighth-worst rate of income inequality among the states; by another metric it ranks fourth. However you choose to measure the diverging fortunes of the country’s top 10% and the rest, Massachusetts always seems to finish among the nation’s most unequal places.
Seething City on a Cliff
You can see what I mean when you visit Fall River, an old mill town 50 miles south of Boston. Median household income in that city is $33,000, among the lowest in the state; unemployment is among the highest, 15% in March 2014, nearly five years after the recession ended. Twenty-three percent of Fall River’s inhabitants live in poverty. The city lost its many fabric-making concerns decades ago and with them it lost its reason for being. People have been deserting the place for decades.
Many of the empty factories in which their ancestors worked are still standing, however. Solid nineteenth-century structures of granite or brick, these huge boxes dominate the city visually — there always seems to be one or two of them in the vista, contrasting painfully with whatever colorful plastic fast-food joint has been slapped up next door.
Most of the old factories are boarded up, unmistakable emblems of hopelessness right up to the roof. But the ones that have been successfully repurposed are in some ways even worse, filled as they often are with enterprises offering cheap suits or help with drug addiction. A clinic in the hulk of one abandoned mill has a sign on the window reading simply “Cancer & Blood.”
The effect of all this is to remind you with every prospect that this is a place and a way of life from which the politicians have withdrawn their blessing. Like so many other American scenes, this one is the product of decades of deindustrialization, engineered by Republicans and rationalized by Democrats. This is a place where affluence never returns — not because affluence for Fall River is impossible or unimaginable, but because our country’s leaders have blandly accepted a social order that constantly bids down the wages of people like these while bidding up the rewards for innovators, creatives, and professionals.
Even the city’s one real hope for new employment opportunities — an Amazon warehouse that is now in the planning stages — will serve to lock in this relationship. If all goes according to plan, and if Amazon sticks to the practices it has pioneered elsewhere, people from Fall River will one day get to do exhausting work with few benefits while being electronically monitored for efficiency, in order to save the affluent customers of nearby Boston a few pennies when they buy books or electronics.
But that is all in the future. These days, the local newspaper publishes an endless stream of stories about drug arrests, shootings, drunk-driving crashes, the stupidity of local politicians, and the lamentable surplus of “affordable housing.” The town is up to its eyeballs in wrathful bitterness against public workers. As in: Why do they deserve a decent life when the rest of us have no chance at all? It’s every man for himself here in a “competition for crumbs,” as a Fall River friend puts it.
The Great Entrepreneurial Awakening
If Fall River is pocked with empty mills, the streets of Boston are dotted with facilities intended to make innovation and entrepreneurship easy and convenient. I was surprised to discover, during the time I spent exploring the city’s political landscape, that Boston boasts a full-blown Innovation District, a disused industrial neighborhood that has actually been zoned creative — a projection of the post-industrial blue-state ideal onto the urban grid itself. The heart of the neighborhood is a building called “District Hall” — “Boston’s New Home for Innovation” — which appeared to me to be a glorified multipurpose room, enclosed in a sharply angular façade, and sharing a roof with a restaurant that offers “inventive cuisine for innovative people.” The Wi-Fi was free, the screens on the walls displayed famous quotations about creativity, and the walls themselves were covered with a high-gloss finish meant to be written on with dry-erase markers; but otherwise it was not much different from an ordinary public library. Aside from not having anything to read, that is.
This was my introduction to the innovation infrastructure of the city, much of it built up by entrepreneurs shrewdly angling to grab a piece of the entrepreneur craze. There are “co-working” spaces, shared offices for startups that can’t afford the real thing. There are startup “incubators” and startup “accelerators,” which aim to ease the innovator’s eternal struggle with an uncaring public: the Startup Institute, for example, and the famous MassChallenge, the “World’s Largest Startup Accelerator,” which runs an annual competition for new companies and hands out prizes at the end.
And then there are the innovation Democrats, led by former Governor Deval Patrick, who presided over the Massachusetts government from 2007 to 2015. He is typical of liberal-class leaders; you might even say he is their most successful exemplar. Everyone seems to like him, even his opponents. He is a witty and affable public speaker as well as a man of competence, a highly educated technocrat who is comfortable in corporate surroundings. Thanks to his upbringing in a Chicago housing project, he also understands the plight of the poor, and (perhaps best of all) he is an honest politician in a state accustomed to wide-open corruption. Patrick was also the first black governor of Massachusetts and, in some ways, an ideal Democrat for the era of Barack Obama — who, as it happens, is one of his closest political allies.
As governor, Patrick became a kind of missionary for the innovation cult. “The Massachusetts economy is an innovation economy,” he liked to declare, and he made similar comments countless times, slightly varying the order of the optimistic keywords: “Innovation is a centerpiece of the Massachusetts economy,” et cetera. The governor opened “innovation schools,” a species of ramped-up charter school. He signed the “Social Innovation Compact,” which had something to do with meeting “the private sector’s need for skilled entry-level professional talent.” In a 2009 speech called “The Innovation Economy,” Patrick elaborated the political theory of innovation in greater detail, telling an audience of corporate types in Silicon Valley about Massachusetts’s “high concentration of brainpower” and “world-class” universities, and how “we in government are actively partnering with the private sector and the universities, to strengthen our innovation industries.”
What did all of this inno-talk mean? Much of the time, it was pure applesauce — standard-issue platitudes to be rolled out every time some pharmaceutical company opened an office building somewhere in the state.
On some occasions, Patrick’s favorite buzzword came with a gigantic price tag, like the billion dollars in subsidies and tax breaks that the governor authorized in 2008 to encourage pharmaceutical and biotech companies to do business in Massachusetts. On still other occasions, favoring inno has meant bulldozing the people in its path — for instance, the taxi drivers whose livelihoods are being usurped by ridesharing apps like Uber. When these workers staged a variety of protests in the Boston area, Patrick intervened decisively on the side of the distant software company. Apparently convenience for the people who ride in taxis was more important than good pay for people who drive those taxis. It probably didn’t hurt that Uber had hired a former Patrick aide as a lobbyist, but the real point was, of course, innovation: Uber was the future, the taxi drivers were the past, and the path for Massachusetts was obvious.
A short while later, Patrick became something of an innovator himself. After his time as governor came to an end last year, he won a job as a managing director of Bain Capital, the private equity firm that was founded by his predecessor Mitt Romney—and that had been so powerfully denounced by Democrats during the 2012 election. Patrick spoke about the job as though it were just another startup: "It was a happy and timely coincidence I was interested in building a business that Bain was also interested in building," he told the Wall Street Journal. Romney reportedly phoned him with congratulations.
At a 2014 celebration of Governor Patrick’s innovation leadership, Google’s Eric Schmidt announced that “if you want to solve the economic problems of the U.S., create more entrepreneurs.” That sort of sums up the ideology in this corporate commonwealth: Entrepreneurs first. But how has such a doctrine become holy writ in a party dedicated to the welfare of the common man? And how has all this come to pass in the liberal state of Massachusetts?
The answer is that I’ve got the wrong liberalism. The kind of liberalism that has dominated Massachusetts for the last few decades isn’t the stuff of Franklin Roosevelt or the United Auto Workers; it’s the Route 128/suburban-professionals variety. (Senator Elizabeth Warren is the great exception to this rule.) Professional-class liberals aren’t really alarmed by oversized rewards for society’s winners. On the contrary, this seems natural to them — because they are society’s winners. The liberalism of professionals just does not extend to matters of inequality; this is the area where soft hearts abruptly turn hard.
Innovation liberalism is “a liberalism of the rich,” to use the straightforward phrase of local labor leader Harris Gruman. This doctrine has no patience with the idea that everyone should share in society’s wealth. What Massachusetts liberals pine for, by and large, is a more perfect meritocracy — a system where the essential thing is to ensure that the truly talented get into the right schools and then get to rise through the ranks of society. Unfortunately, however, as the blue-state model makes painfully clear, there is no solidarity in a meritocracy. The ideology of educational achievement conveniently negates any esteem we might feel for the poorly graduated.
This is a curious phenomenon, is it not? A blue state where the Democrats maintain transparent connections to high finance and big pharma; where they have deliberately chosen distant software barons over working-class members of their own society; and where their chief economic proposals have to do with promoting “innovation,” a grand and promising idea that remains suspiciously vague. Nor can these innovation Democrats claim that their hands were forced by Republicans. They came up with this program all on their own.
The other week, feeling sick, I spent a day on my couch with the TV on and was reminded of an odd fact of American life. More than seven months before Election Day, you can watch the 2016 campaign for the presidency at any moment of your choosing, and that’s been true since at least late last year. There is essentially never a time when some network or news channel isn’t reporting on, discussing, debating, analyzing, speculating about, or simply drooling over some aspect of the primary campaign, of Hillary, Bernie, Ted, and above all — a million times above all — The Donald (from the violence at his rallies to the size of his hands). In case you’re young and think this is more or less the American norm, it isn’t. Or wasn’t.
Truly, there is something new under the sun. Of course, in 1994 with O.J. Simpson’s white Ford Bronco chase (95 million viewers!), the 24/7 media event arrived full blown in American life and something changed when it came to the way we focused on our world and the media focused on us. But you can be sure of one thing: never in the history of television, or any other form of media, has a single figure garnered the amount of attention — hour after hour, day after day, week after week — as Donald Trump. If he’s the O.J. Simpson of twenty-first-century American politics and his run for the presidency is the eternal white Ford Bronco chase of our moment, then we’re in a truly strange world.
Or let me put it another way: this is not an election. I know the word “election” is being used every five seconds and somewhere along the line significant numbers of Americans (particularly, this season, Republicans) continue to enter voting booths or in the case of primary caucuses, school gyms and the like, to choose among various candidates, so it’s all still election-like. But take my word for it as a 71-year-old guy who’s been watching our politics for decades: this is not an election of the kind the textbooks once taught us was so crucial to American democracy. If, however, you’re sitting there waiting for me to tell you what it is, take a breath and don’t be too disappointed. I have no idea, though it’s certainly part bread-and-circuses spectacle, part celebrity obsession, and part media money machine.
Actually, before we go further, let me hedge my bets on the idea that Donald Trump is a twenty-first-century O.J. Simpson. It’s certainly a reasonable enough comparison, but I’ve begun to wonder about the usefulness of just about any comparison in our present situation. Even the most nightmarish of them — Donald Trump is Adolf Hitler, Benito Mussolini, or any past extreme demagogue of your choice — may actually prove to be covert gestures of consolation, reassurance, and comfort. Yes, what’s happening in our world is increasingly extreme and could hardly be weirder, we seem to have the urge to say, but it’s still recognizable. It’s something we’ve encountered before, something we’ve made sense of in the past and, in the process, overcome.
Round Up the Usual Suspects

But what if that's not true? In some ways, the most frightening, least acceptable thing to say about our American world right now—even if Donald Trump's overwhelming presence all but begs us to say it—is that we've entered uncharted territory and, under the circumstances, comparisons might actually impair our ability to come to grips with our new reality. My own suspicion: Donald Trump is only the most obvious instance of this, the example no one can miss.

In these first years of the twenty-first century, we may be witnessing a new world being born inside the hollowed-out shell of the American system. As yet, though we live with this reality every day, we evidently just can't bear to recognize it for what it might be. When we survey the landscape, what we tend to focus on is that shell—the usual elections (in somewhat heightened form), the usual governmental bodies (a little tarnished) with the usual governmental powers (a little diminished or redistributed), including the usual checks and balances (a little out of whack), and the same old Constitution (much praised in its absence). And yes, we know that none of this is working particularly well, or sometimes at all, but it still feels comfortable to view what we have as a reduced, shabbier, and more dysfunctional version of the known.
Perhaps, however, it’s increasingly a version of the unknown. We say, for instance, that Congress is “paralyzed,” and that little can be done in a country where politics has become so “polarized,” and we wait for something to shake us loose from that “paralysis,” to return us to a Washington closer to what we remember and recognize. But maybe this is it. Maybe even if the Republicans somehow lost control of the House of Representatives and the Senate, we would still be in a situation something like what we’re now labeling paralysis. Maybe in our new American reality, Congress is actually some kind of glorified, well-lobbied, and well-financed version of a peanut gallery.
Of course, I don’t want to deny that much of what is “new” in our world has a long history. The present yawning inequality gap between the 1% and ordinary Americans first began to widen in the 1970s and — as Thomas Frank explains so brilliantly in his new book, Listen, Liberal — was already a powerful and much-discussed reality in the early 1990s, when Bill Clinton ran for president. Yes, that gap is now more like an abyss and looks ever more permanently embedded in the American system, but it has a genuine history, as for instance do 1% elections and the rise and self-organization of the “billionaire class,” even if no one, until this second, imagined that government of the billionaires, by the billionaires, and for the billionaires might devolve into government of the billionaire, by the billionaire, and for the billionaire — that is, just one of them.
Indeed, much of our shape-shifting world can be written about as a set of comparisons and in terms of historical reference points. Inequality has a history. The military-industrial complex and the all-volunteer military, like the warrior corporation, weren’t born yesterday; neither was our state of perpetual war, nor the national security state that now looms over Washington, nor its surveilling urge, the desire to know far too much about the private lives of Americans. (A little bow of remembrance to FBI Director J. Edgar Hoover is in order here.)
And yet, true as all that may be, Washington increasingly seems like a new land, sporting something like a new system in the midst of our much-described polarized and paralyzed politics. The national security state doesn't seem faintly paralyzed or polarized to me. Nor does the Pentagon. On certain days when I catch the news, I can't believe how strange and yet humdrum this uncharted new territory is. Remind me, for instance, where in the Constitution the Founding Fathers wrote about that national security state? And yet there it is in all its glory, all its powers, an ever more independent force in our nation's capital. In what way, for instance, did those men of the revolutionary era prepare the ground for the Pentagon to loose its spy drones from our distant war zones over the United States? And yet, so it has done. And no one even seems disturbed by the development. The news, barely noticed or noted, was instantly absorbed into what's becoming the new normal.
Graduation Ceremonies in the Imperium
Let me mention here the almost random piece of news that recently made me wonder just what planet I was actually on. And I know you won’t believe it, but it had absolutely nothing to do with Donald Trump.
Given the carnage of America’s wars and conflicts across the Greater Middle East and Africa, which I’ve been following closely these last years, I’m unsure why this particular moment even got to me. Best guess? Maybe that, of all the once-obscure places — from Afghanistan to Yemen to Libya — in which the U.S. has been fighting recently, Somalia, where this particular little slaughter took place, seems to me like the most obscure of all. Yes, I’ve been half-attending to events there from the 1993 Blackhawk Down moment to the disastrous U.S.-backed Ethiopian invasion of 2006 to the hardly less disastrous invasion of that country by Kenyan and other African forces. Still, Somalia?
Recently, U.S. Reaper drones and manned aircraft launched a set of strikes against what the Pentagon claimed was a graduation ceremony for “low-level” foot soldiers in the Somali terror group al-Shabab. It was proudly announced that more than 150 Somalis had died in this attack. In a country where, in recent years, U.S. drones and special ops forces had carried out a modest number of strikes against individual al-Shabab leaders, this might be thought of as a distinct escalation of Washington’s endless low-level conflict there (with a raid involving U.S. special ops forces following soon after).
Now, let me try to put this in some personal context. Since I was a kid, I’ve always liked globes and maps. I have a reasonable sense of where most countries on this planet are. Still, Somalia? I have to stop and give that one some thought to truly locate it on a mental map of eastern Africa. Most Americans? Honestly, I doubt they’d have a clue. So the other day, when this news came out, I stopped a moment to take it in. If accurate, we killed 150 more or less nobodies (except to those who knew them) and maybe even a top leader or two in a country most Americans couldn’t locate on a map.
I mean, don’t you find that just a little odd, no matter how horrible the organization they were preparing to fight for? 150 Somalis? Blam!
Remind me: On just what basis was this modest massacre carried out? After all, the U.S. isn’t at war with Somalia or with al-Shabab. Of course, Congress no longer plays any real role in decisions about American war making. It no longer declares war on any group or country we fight. (Paralysis!) War is now purely a matter of executive power or, in reality, the collective power of the national security state and the White House. The essential explanation offered for the Somali strike, for instance, is that the U.S. had a small set of advisers stationed with African Union forces in that country and it was just faintly possible that those guerrilla graduates might soon prepare to attack some of those forces (and hence U.S. military personnel). It seems that if the U.S. puts advisers in place anywhere on the planet — and any day of any year they are now in scores of countries — that’s excuse enough to validate acts of war based on the “imminent” threat of their attack.
Or just think of it this way: a new, informal constitution is being written in these years in Washington. No need for a convention or a new bill of rights. It’s a constitution focused on the use of power, especially military power, and it’s being written in blood.
These days, our government (the unparalyzed one) acts regularly on the basis of that informal constitution-in-the-making, committing Somalia-like acts across significant swathes of the planet. In these years, we’ve been marrying the latest in wonder technology, our Hellfire-missile-armed drones, to executive power and slaughtering people we don’t much like in majority Muslim countries with a certain alacrity. By now, it’s simply accepted that any commander-in-chief is also our assassin-in-chief, and that all of this is part of a wartime-that-isn’t-wartime system, spreading the principle of chaos and dissolution to whole areas of the planet, leaving failed states and terror movements in its wake.
When was it, by the way, that "the people" agreed that the president could appoint himself assassin-in-chief, muster his legal beagles to write new "law" that covered any future acts of his (including the killing of American citizens), and year after year dispatch what is essentially his own private fleet of killer drones to knock off thousands of people across the Greater Middle East and parts of Africa? Weirdly enough, after almost 14 years of this sort of behavior, with ample evidence that such strikes don't suppress the movements Washington loathes (and often only fan the flames of resentment and revenge that help them spread), neither the current president and his top officials, nor any of the candidates for his office, have the slightest intention of ever grounding those drones.
And when exactly did the people say that, within the country’s vast standing military, which now garrisons much of the planet, a force of nearly 70,000 Special Operations personnel should be birthed, or that it should conduct covert missions globally, essentially accountable only to the president (if him)? And what I find strangest of all is that few in our world find such developments strange at all.
A Planet in Decline?
In some way, all of this could be said to work. At the very least, it is a functioning new system-in-the-making that we have yet to truly come to grips with, just as we haven’t come to grips with a national security state that surveils the world in a way that even science fiction writers (no less totalitarian rulers) of a previous era could never have imagined, or the strange version of media overkill that we still call an election. All of this is by now both old news and mind-bogglingly new.
Do I understand it? Not for a second.
This is not war as we knew it, nor government as we once understood it, nor are these elections as we once imagined them, nor is this democracy as it used to be conceived of, nor is this journalism of a kind ever taught in a journalism school. This is the definition of uncharted territory. It’s a genuine American terra incognita and yet in some fashion that unknown landscape is already part of our sense of ourselves and our world. In this “election” season, many remain shocked that a leading candidate for the presidency is a demagogue with a visible authoritarian side and what looks like an autocratic bent. All such labels are pinned on Donald Trump, but the new American system that’s been emerging from its chrysalis in these years already has just those tendencies. So don’t blame it all on Donald Trump. He should be far less of a shock to this country than he continues to be. After all, a Trumpian world-in-formation has paved the way for him.
Who knows? Perhaps what we’re watching is the new iteration of a very old story: a twenty-first-century version of an ancient tale of a great imperial power, perhaps the greatest ever — the “lone superpower” — sinking into decline. It’s a tale humanity has experienced often enough in the course of our long history. But lest you think once again that there’s nothing new under the sun, the context for all of this, for everything now happening in our world, is so new as to be quite literally outside of thousands of years of human experience. As the latest heat records indicate, we are, for the first time, on a planet in decline. And if that isn’t uncharted territory, what is?
National wildlife refuges such as the one at Malheur near Burns, Oregon, have importance far beyond the current furor over who manages our public lands. Such refuges are becoming increasingly critical habitat for migratory birds because 95 percent of the wetlands along the Pacific Flyway have already been lost to development.
In some years, 25 million birds visit Malheur, and if the refuge were drained and converted to intensive cattle grazing – which is something the “occupiers” threatened to do – entire populations of ducks, sandhill cranes, and shorebirds would suffer. With their long-distance flights and distinctive songs, the migratory birds visiting Malheur’s wetlands now help to tie the continent together.
This was not always the case. By the 1930s, three decades of drainage, reclamation, and drought had decimated high-desert wetlands and the birds that depended upon them. Out of the hundreds of thousands of egrets that once nested on Malheur Lake, only 121 remained. The American population of the birds had dropped by 95 percent. It took the federal government to restore Malheur’s wetlands and recover waterbird populations, bringing back healthy populations of egrets and many other species.
Yet despite the importance of wildlife refuges to America’s birds, not everyone appreciates them. At one recent news conference, Ammon Bundy called the creation of Malheur National Wildlife Refuge “an unconstitutional act” that removed ranchers from their lands and plunged the county into an economic depression. This is not a new complaint. Since the Sagebrush Rebellion of the 1980s, rural communities in the West have blamed their poverty on the 640 million acres of federal public lands, which make up 52 percent of the land in Western states.
Rural Western communities are indeed suffering, but the cause is not the wildlife refuge system. Conservation of bird habitat did not lead to economic devastation, nor were refuge lands “stolen” from ranchers. If any group has prior claims to Malheur refuge, it is the Paiute Indian Tribe.
For at least 6,000 years, Malheur was the Paiutes’ home. It took a brutal Army campaign to force the people from their reservation, marching them through the snow to the state of Washington in 1879. Homesteaders and cattle barons then moved onto Paiute lands, squeezing as much livestock as possible onto dwindling pastures, and warring with each other over whose land was whose. Scars from this era persist more than a century later.
In 1908, President Roosevelt established the Malheur Lake Bird Reservation on the lands of the former Malheur Indian Reservation. But the refuge included only the lake itself, not the rivers that fed into it. Deprived of water, the lake shrank during droughts, and squatters moved onto the drying lakebed. Conservationists, realizing they needed to protect the Blitzen River that fed the lake, began a campaign to expand the refuge.
But the federal government never forced the ranchers to sell, as the occupiers at Malheur claimed, and the sale did not impoverish the community. In fact, it was just the opposite: During the Depression years of the 1930s, the federal government paid the Swift Corp. $675,000 for ruined grazing lands. Impoverished homesteaders who had squatted on refuge lands eventually received payments substantial enough to set them up as cattle ranchers nearby.
John Scharff, Malheur’s manager from 1935 to 1971, sought to transform local suspicion into acceptance by allowing local ranchers to graze cattle on the refuge. Yet some tension persisted. In the 1970s, when concern about overgrazing reduced – but did not eliminate – refuge grazing, violence erupted again. Some environmentalists denounced ranchers as parasites who destroyed wildlife habitat. A few ranchers responded with death threats against environmentalists and federal employees.
But violence is not the basin’s most important historical legacy. Through the decades, community members have come together to negotiate a better future. In the 1920s, poor homesteaders worked with conservationists to save the refuge from irrigation drainage. In the 1990s, Paiute tribal members, ranchers, environmentalists and federal agencies collaborated on innovative grazing plans to restore bird habitat while also giving ranchers more flexibility. In 2013, such efforts resulted in a landmark collaborative conservation plan for the refuge, and it offers great hope for the local economy and for wildlife.
The poet Gary Snyder wrote, “We must learn to know, love, and join our place even more than we love our own ideas. People who can agree that they share a commitment to the landscape – even if they are otherwise locked in struggle with each other – have at least one deep thing to share.”
Collaborative processes are difficult and time-consuming. Yet they have shown the potential to peacefully sustain both human and wildlife communities.
Nancy Langston is a contributor to Writers on the Range, the opinion service of High Country News. She is a professor of environmental history at Michigan Technological University, and the author of a history of Malheur Refuge, Where Land and Water Meet: A Western Landscape Transformed.
Reprinted from the New Economic Perspectives blog at the University of Missouri-Kansas City
Editor’s Note: William K. Black, author of “The Best Way to Rob a Bank is to Own One,” is Associate Professor of Law and Economics at the University of Missouri-Kansas City, where — according to James Galbraith — “the best economics is now being done.”
In the latest example of New York Times reporters’ inability to read Paul Krugman, we have an article claiming that the “Growing Imbalance Between Germany and France Strains Their Relationship.” The article begins with Merkel’s major myth accepted as if it were unquestionable reality.
“It was a clear illustration of the dysfunction of the French-German partnership, the axis that for decades kept Europe on a united and dynamic track.
In Berlin this month, Chancellor Angela Merkel, riding high after nine years in power, delivered a strident defense in Parliament of austerity, which she has been pushing on Europe ever since a debt crisis broke out in 2009.”
No, not true on multiple grounds. First, the so-called “debt crisis” was a symptom rather than a cause. The reader will note that the year 2008, when the Great Recession became terrifying, has somehow been removed from the narrative because it would expose the misapprehension in Merkel’s myth. Prior to 2008, only Greece, given its abandonment of a sovereign currency, had debt levels that posed a material risk. The EU nations had unusually low budgetary deficits leading into the Great Recession. Indeed, that, along with the extremely low budgetary deficits of the Clinton administration (the budget went into surplus near the end of his term), is likely one of the triggers for the Great Recession.
The Great Recession caused sharp increases in deficits – as we have long known will happen as part of the “automatic stabilizers.” This is normal and speeds recovery. The eurozone and the U.S. began to come out of the Great Recession in 2009. The U.S. recovery accelerated with the addition of stimulus. In the eurozone, however, the abandonment of sovereign currencies and adoption of the euro exposed the periphery to recurrent attacks by the “bond vigilantes.” The ECB could have stopped these attacks at any time, but it was very late intervening – largely because of German resistance. Instead, Merkel used the leverage provided by the bond vigilantes and the refusal of the ECB to act to end their attacks to force increasing austerity upon the eurozone and demands for severe cuts in workers’ wages in the periphery.
Merkel’s actions in forcing austerity and efforts to force sharp drops in workers’ wages in the periphery were not required to stop any “debt crisis.” The ECB had the ability to end the bond vigilantes’ attacks and reestablish the ability of the periphery to borrow at low cost, as it demonstrated. Merkel’s austerity demands and demands that (largely) left governments in the periphery slash workers’ wages promptly threw the entire eurozone back into a second Great Recession – and much of the periphery into a Second Great Depression. It had the desired effect of discrediting the governing parties of the left, particularly in Spain, Portugal, and Greece, that gave in to Merkel’s mandates that they adopt masochistic macroeconomic policies.
It is also false that Merkel began demanding that the eurozone inflict austerity only in 2009. Merkel wanted to inflict austerity and wage her war on the workers and the parties they primarily supported long before 2009. What changed in 2009 was that the ECB, the Great Recession, and the bond vigilantes gave her the leverage to successfully extort the members of the eurozone who opposed austerity and her war on workers and the parties of the left.
But it is what is left out of the quoted passage above that is most amazing. The fact that Merkel’s orders that the eurozone leaders bleed their economies through austerity and the war on workers’ wages led to a gratuitous Second Great Recession in the eurozone – and to Great Depression levels of unemployment in much of the periphery – disappears. The fact that inflicting austerity and wage cuts in response to a Great Recession is economically illiterate and cruel disappears. The fact that the overall eurozone – six years after the financial crisis of 2008 and eight years after the financial bubbles popped in 2006 – has stagnated and caused tens of trillions of dollars in lost GDP and well over 10 million lost jobs is treated by the NYT article as if it were unrelated to Merkel’s infliction of austerity.
“But the French economy has grown stagnant, with unemployment stubbornly stuck near 11 percent and an unpopular government pledging to cut tens of billions in taxes on business, which many French fear will unravel their prized welfare state.”
No, the eurozone economy “has grown stagnant” and produced a Second Great Depression in much of the periphery. If France had a sovereign currency, or if the EU were to make the euro into a true sovereign currency, France could simultaneously “cut tens of billions in taxes on business” while preserving the social safety net and speeding the recovery. The same is true of the rest of the eurozone – including Germany, where Merkel’s policies have made the wealthy far wealthier and deepened the economic crisis in other eurozone nations by cutting German workers’ wages. The NYT article is disingenuous about both aspects of the German economy, noting only that “the German economy has shown signs of slowing down.” German growth was actually negative in the last quarter, and the treatment of its workers weakens the German and overall eurozone recovery.
It continues to be obvious that it is a condition of employment for NYT reporters covering the eurozone’s economic policies that they never read Paul Krugman (or most any other American economist). Consider this claim in the article:
“[Prime Minister Manuel Valls] and Mr. Hollande have alienated many members of the Socialist Party by taking a more centrist approach to economic policy, stoking suspicions that the government is favoring business at the expense of the welfare state.”
I will take this part very slowly. By my count, Krugman has written at least six columns in the NYT explaining that there actually is a powerful consensus among economists. The “centrist approach” is that austerity in response to a Great Recession is self-destructive. We have known this for at least 75 years. Modern Republicans, when they hold the presidency, always respond to a recession with a stimulus package. Valls and Hollande are moving away from a “centrist approach to economic policy.” They are doing so despite observing first-hand the self-destructive nature of austerity (and proclaiming that it is self-destructive). They do so despite the demonstrated success of stimulus in responding to the financial crisis. They do so despite the fact that the result of the faux left parties adopting these economically illiterate neo-liberal economic policies is the destruction of the parties that betray their principles and the workers. Valls and Hollande are spectacularly unpopular in France because of these betrayals. It is clear why Valls and Hollande wish to avoid reading Krugman’s critique of their betrayals, but the NYT reporters have no excuse.
The reporters do not simply ignore the insanity of austerity and the plight of the eurozone’s workers – they assert that it is obvious that Merkel is correct and that the French reluctance to slash workers’ wages is obviously economically illiterate.
“Just over a decade ago, as Ms. Merkel is fond of noting, Germany was Europe’s sick economy. It recovered partly because of changes to labor laws and social welfare. Mr. Hollande now faces a similar task in an era of low or no growth.”
No. These two sentences propound multiple Merkel myths and assume (1) that France’s (and the rest of the eurozone’s) problems are the same as Germany’s issues “just over a decade ago,” (2) that Germany “recovered” due to slashing workers’ wages and social programs, and (3) that the German “solutions” would work for the eurozone as a whole.
Germany’s “reforms,” which included increasing financial deregulation, have proven disastrous. German banks finished third in the regulatory “race to the bottom” (“behind” Wall Street and the worst of the worst – the City of London). The officers that controlled Deutsche Bank and various state-owned German banks were among the leading causes of the financial crisis. German workers had lost ground even before the financial crisis and have lost even more ground since the crisis began. Inequality has also become increasingly more extreme in Germany.
The current problem in the eurozone is a critical shortage of demand exacerbated by the insanity of austerity and Merkel’s war on workers’ wages. The word “demand” and the concept, the centerpiece of the macroeconomics of recession, never appear in the article. An individual nation in which the wealthy have the political power to lower workers’ wages can increase its exports and employ more of its citizens. This obviously does not prove that the workers were overpaid. Merkel and the NYT ignore the “fallacy of composition,” which is particularly embarrassing because they are neo-mercantilists pushing the universal goal of being a net exporter. As Adam Smith emphasized, we can’t all be net exporters. A strategy that can work (for the elites) of one nation cannot logically be assumed to work for large numbers of nations.
The last thing a society should want in a recession is rapidly falling wages and prices that can create deflation (another word expunged from the NYT article because it would refute its ode to Merkel, austerity, and her war on the worker). If France were to slash workers’ wages to try to take exports from Ireland, while Ireland slashed workers’ wages to try to take exports from Spain, which did the same to take exports from Italy, the result would be deflation, a massive increase in inequality, the political destruction of any (allegedly) progressive political party that joined in the war on the worker, and a “race to Bangladesh” dynamic.
Germany’s “success” in being a very large net exporter makes it far more difficult – not easier – for any other eurozone nation to copy its export strategy successfully. As a group, the strategy cannot work for the eurozone. The strategy has, of course, not simply “not succeeded.” It has failed catastrophically. Merkel’s eurozone policies have caused trillions of dollars in extra losses in productivity, the gratuitous loss of over 10 million jobs, increased inequality, and the loss through emigration of many of the best educated young citizens of the periphery.
Hollande does not face “a similar task” to Merkel. He faces different problems and Merkel’s “solutions” are the chief causes of France’s economic stagnation rather than the answers to France’s problems.
I repeat my twin suggestions to the NYT reporters that cover the eurozone’s economy. The paper’s management should host a seminar in which Krugman educates his colleagues. Alternatively, come to UMKC and we’ll provide that seminar without charge. None of us can afford the cost of the reporters’ continuing willful ignorance of economics and their indifference to the victims of austerity and Merkel’s war on workers.
Editor’s Note: Rebecca Solnit is one of the best writers in America because she’s one of the most original thinkers. Here she reminds us of the revolutionary power of hope, and how hope overturns old regimes from the bottom up.
There have undoubtedly been stable periods in human history, but you and your parents, grandparents, and great-grandparents never lived through one, and neither will any children or grandchildren you may have or come to have. Everything has been changing continuously, profoundly — from the role of women to the nature of agriculture. For the past couple of hundred years, change has been accelerating in both magnificent and nightmarish ways.
Yet when we argue for change, notably changing our ways in response to climate change, we’re arguing against people who claim we’re disrupting a stable system. They insist that we’re rocking the boat unnecessarily.
I say: rock that boat. It’s a lifeboat; maybe the people in it will wake up and start rowing. Those who think they’re hanging onto a stable order are actually clinging to the wreckage of the old order, a ship already sinking, that we need to leave behind.
As you probably know, the actual oceans are rising — almost eight inches since 1880, and that’s only going to accelerate. They’re also acidifying, because they’re absorbing significant amounts of the carbon we continue to pump into the atmosphere at record levels. The ice that covers the polar seas is shrinking, while the ice shields that cover Antarctica and Greenland are melting. The water locked up in all the polar ice, as it’s unlocked by heat, is going to raise sea levels staggeringly, possibly by as much as 200 feet at some point in the future, how distant we do not know. In the temperate latitudes, warming seas breed fiercer hurricanes.
The oceans are changing fast, and for the worse. Fish stocks are dying off, as are shellfish. In many acidified oceanic regions, their shells are actually dissolving or failing to form, which is one of the scariest, most nightmarish things I’ve ever heard. So don’t tell me that we’re rocking a stable boat on calm seas. The glorious 10,000-year period of stable climate in which humanity flourished and then exploded to overrun the Earth and all its ecosystems is over.
But responding to these current cataclysmic changes means taking on people who believe, or at least assert, that those of us who want to react and act are gratuitously disrupting a stable system that’s working fine. It isn’t stable. It is working fine — in the short term and the most limited sense — for oil companies and the people who profit from them and for some of us in the particularly cushy parts of the world who haven’t been impacted yet by weather events like, say, the recent torrential floods in Japan or southern Nevada and Arizona, or the monsoon versions of the same that have devastated parts of India and Pakistan, or the drought that has mummified my beloved California, or the wildfires of Australia.
The problem, of course, is that the people who most benefit from the current arrangements have effectively purchased a lot of politicians, and that a great many of the rest of them are either hopelessly dim or amazingly timid. Most of the Democrats recognize the reality of climate change but not the urgency of doing something about it. Many of the Republicans used to — John McCain has done an amazing about-face from being a sane voice on climate to a shrill denier — and they present a horrific obstacle to any international treaties.
Put it this way: in one country, one party holding 45 out of 100 seats in one legislative house, while serving a minority of the very rich, can basically block what quite a lot of the other seven billion people on Earth want and need, because a two-thirds majority in the Senate must consent to any international treaty the U.S. signs. Which is not to say much for the president, whose drill-baby-drill administration only looks good compared to the petroleum servants he faces, when he bothers to face them and isn’t just one of them. History will despise them all and much of the world does now, but as my mother would have said, they know which side their bread is buttered on.
As it happens, the butter is melting and the bread is getting more expensive. Global grain production is already down several percent thanks to climate change, says a terrifying new United Nations report. Declining crops cause food shortages and rising food prices, creating hunger and even famine for the poorest on Earth, and also sometimes cause massive unrest. Rising bread prices were one factor that helped spark the Arab Spring in 2011. Anyone who argues that doing something about global warming will be too expensive is dodging just how expensive unmitigated climate change is already proving to be.
It’s only a question of whether the very wealthy or the very poor will pay. Putting it that way, however, devalues all the nonmonetary things at stake, from the survival of myriad species to our confidence in the future. And yeah, climate change is here, now. We’ve already lost a lot and we’re going to lose more, but there’s a difference between terrible and apocalyptic. We still have some control over how extreme it gets. That’s not a great choice, but it’s the choice we have. There’s still a window open for action, but it’s closing. As the Secretary-General of the World Meteorological Organization, Michel Jarraud, bluntly put it recently, “We are running out of time.”
New and Renewable Energies
The future is not yet written. Look at the world we’re in at this very moment. The Keystone XL tar sands pipeline was supposed to be built years ago, but activists catalyzed by the rural and indigenous communities across whose land it would go have stopped it so far, and made what was supposed to be a done deal a contentious issue. Activists changed the outcome.
Fracking has been challenged on the state level, and banned in townships and counties from upstate New York to central California. (It has also been banned in two Canadian provinces, France, and Bulgaria.) The fossil-fuel divestment movement has achieved a number of remarkable victories in its few bare years of existence and more are on the way. The actual divestments and commitments to divest fossil fuel stocks by various institutions ranging from the city of Seattle to the British Medical Association are striking. But the real power of the movement lies in the way it has called into question the wisdom of investing in fossil fuel corporations. Even mainstream voices like the British Parliament’s Environmental Audit Committee and publications like Forbes are now beginning to question whether they are safe places to put money. That’s a sea change.
Renewable energy has become more efficient, technologically sophisticated, and cheaper — the price of solar power in relation to the energy it generates has plummeted astonishingly over the past three decades and wind technology keeps getting better. While Americans overall are not yet curtailing their fossil-fuel habits, many individuals and communities are choosing other options, and those options are becoming increasingly viable. A Stanford University scientist has proposed a plan to allow each of the 50 states to run on 100% renewable energy by 2050.
Since, according to the latest report of the U.N.’s Intergovernmental Panel on Climate Change, fossil fuel reserves still in the ground are “at least four times larger than could safely be burned if global warming is to be kept to a tolerable level,” it couldn’t be more important to reach global agreements to do things differently on a planetary scale. Notably, most of those carbon reserves must be left untapped, and the modest steps already taken locally and ad hoc show that such changes are indeed possible and that an encouraging number of us want to pursue them.
We can do it. And we is the key word here. The world is not going to be saved by individual acts of virtue; it’s going to be saved, if it is to be saved, by collective acts of social and political change. That’s why I’m marching this Sunday with tens or maybe hundreds of thousands of others in New York City — to pressure the United Nations as it meets to address climate change. That’s why people who care about the future state of our planet will also be marching and demonstrating in New Delhi, Rio de Janeiro, Paris, Berlin, Melbourne, Kathmandu, Dublin, Manila, Seoul, Mumbai, Istanbul, and so many smaller places.
Mass movements work. Unarmed citizens have changed the course of history countless times in the modern era. When we come together as civil society, we have the capacity to transform policies, change old ways of doing things, and sometimes even topple regimes. And it is about governments. Like it or not, the global treaties, compacts, and agreements we need can only be made by governments, and governments will make those agreements when the pressure to do so is greater than the pressure not to. We can and must be that pressure.
The Long View from One Window
I lived in the same apartment for 25 years, moving into a poor but thriving black community in 1981 and out of the far more affluent, paler, and less neighborly place it had become in 2006. A lot of people moved in and out in that period, many of them staying only a year or two. Those transients always seemed to believe that the neighborhood they were passing through was a stable one. You had to be slower than change and stick around to see it. I saw it and it helped me learn how to take a historical view of things.
It’s crazy that anyone speaks as if our world is not undergoing rapid change, when the view from the window called history shows nothing but transformation, both incremental and dramatic. Exactly 25 years ago this month, Eastern Europe was astir. Remember that back then there was still a Soviet bloc, and a Soviet Union, and an Iron Curtain, and a Berlin Wall, and a Cold War. Most people thought those were permanent fixtures, but in the summer of 1989, Hungary decided to let East Germans (who were permitted to travel freely to that communist country) stream over to the West.
Thousands of people, tired of life in the totalitarian east, fled. Poland, Czechoslovakia, and Hungary, as well as East Germany, were already electrified by a resurgent civil society and activist communities that had dared to organize in the face of repression. At the time, politicians and pundits in the West were making careers out of explaining, among so many other things, why German reunification wasn’t going to happen in anyone’s lifetime. And they probably would have been proven right if people had stayed home and done nothing, if they hadn’t begun to hope and acted on that hope.
The bureaucrats on both sides of the Berlin Wall were still talking about the possibility of demilitarizing it when citizens showed up en masse and the guards began abandoning their posts. On that epochal night of November 9, 1989, the people made whole what had been broken. The lesson: showing up is half the battle.
British Prime Minister Margaret Thatcher had been so unnerved by developments in the Soviet Union’s Eastern European holdings that she went to Moscow, two months before the fall of the wall, to implore Soviet leader Mikhail Gorbachev to prevent any such thing. That was early September 1989. “No dramatic change in the situation in Czechoslovakia can be expected,” predicted a Czech official two months before a glorious popular uprising, remembered as the Velvet Revolution, erupted and abolished the government in which he was an official.
There are three things to note about those changes in 1989. First, most people in power dismissed the possibility that such extraordinary change could happen or deplored what it might bring. They were comfortable enough with things as they were, even though the status quo was several kinds of scary and awful. In other words, the status quo likes the status quo and dislikes change. Second, everything changed despite them, thanks to grassroots organizing and civil society, forces that — we are now regularly assured — are pointless and irrelevant. Third, the world that existed then has been largely swept away: the Soviet Union, the global alignments of that time, the idea of a binary world of communism and capitalism, and the policies that had kept us on the brink of nuclear annihilation for decades. We live in a very different world now (though nuclear weapons are still a terrible problem). Things do change.
Maybe, in fact, there’s a fourth point to note as well. That, important as they were, the front-page stories about the liberation of Eastern Europe weren’t what mattered most all those years ago. After all, hidden away deep inside the New York Times that autumn, you can find a dozen or so articles about global warming, as the newly recognized phenomenon was then called. And small as they were, anyone reading them now can see that, even so long ago, the essential problem and peril to our world were already clear.
The thought of what might have been accomplished, had a people’s movement arisen then to face global warming, could break your heart. That, after all, was still a time when the Earth’s atmosphere held just above 350 parts per million of carbon dioxide, the maximum safe level for a sustainable survivable planet, not the 400 parts per million of the present moment (“142% of the pre-industrial era” level of carbon, the World Meteorological Organization notes). In other words, we’ve been steadily filling the atmosphere with greenhouse gases and so imperiling the planet and humanity since we knew what we were doing.
The Great Smog and the Big Wind
In that fall a quarter of a century ago, the world changed profoundly right before our eyes. Then we settled back into the short-term, ahistorical view that things are really pretty stable, that ordinary people have no power, and that the world can’t be changed. With that in mind, it’s worth looking at Germany today. Maybe because Germans know better than us that things can change for the worse or the better fast, that the world is not a stable and settled place, and that we do shape it, they have been willing to change.
At one point last spring, cold, cloudy Germany managed to get almost 75% of its electricity from renewable sources. Scotland — cold, gray, oil-rich Scotland! — is on track to achieve 100% renewable electrical generation by 2020 and has already hit the 40% mark. Spain now generates about half its electricity through clean and renewable sources. Other European countries have similar accomplishments. In fact, many of the changes that we in the United States will be marching for this Sunday have already begun happening, sometimes on a significant scale, elsewhere.
To remember how radical this new Europe is, recall that most of these places were burning coal not just in power plants or factories but in homes, too, not so many decades ago. Everyone deplores the horrific air of Beijing and other Chinese cities now, but few remember that many European cities were similarly foul with smoke and smog from the industrial revolution into the postwar era. In December 1952, for instance, the “Great Smog” of London reduced daytime visibility to a few yards and killed about 4,000 people in three days.
A decade before that, in response to the war Germany started, North Americans radically reduced their use of private vehicles and gasoline and planted more than 20 million victory gardens, producing vast quantities of food by non-industrial means. We have done that; we could (and must) do it again.
At least, we don’t burn coal in our homes any more, and in the U.S. we’ve retired 178 coal-fired power plants, are phasing out many more, and have prevented many new ones from being built. The renewable energy sources that were, people insisted, too minor or unreliable or expensive or new are now beginning to work well, and the price to produce energy in such a fashion is dropping rapidly. UBS, the European investment giant, recently counseled that power plants and centralized power generation are no longer good investments, since decentralized renewables are likely to replace them.
Of course, Germany and Britain are still burning coal, and Poland remains a giant coal mine. Europe is not a perfect renewable energy paradise, just a part of the world that demonstrates the viability of changing how we produce and consume energy. We are already changing, even if not fast enough, not by a long shot, at least not yet. The same goes for divesting from fossil-fuel investments, even though dozens of universities, cities, religious institutions, and foundations have already committed to doing so, and some have by now actually purged their portfolios. The excuse that change is impossible is no longer available, because many places and entities have already changed.
If you want to know how potentially powerful you are, ask your enemies. The misogynists who attack feminism and try to intimidate feminists into silence only demonstrate in a roundabout way that feminism really is changing the world; they are the furious backlash and so the proof that something meaningful is at stake. The climate movement is similarly upsetting a lot of powerful people and institutions; to grasp that, you just have to look at the tsunamis of money spent opposing specific measures and misinforming the public. The carbon barons are demonstrating that we could change the world and that they don’t want us to.
We are powerful, and we need to become more so in the year ahead as a major conference approaches in Paris in December 2015, where the climate agreements we need could be hammered out. Or not. This is, after all, a sequel to the Copenhagen conference of 2009, where representatives of many smaller and more vulnerable nations, as well as citizens’ groups, were eager for a treaty that took on climate change in significant ways, only to have their hopes crushed by the recalcitrant governments of the United States and China.
Right now, we are in a churning sea of change, of climate change, of subtle changes in everyday life, of powerful efforts by elites to serve themselves and damn the rest of us, and of increasingly powerful activist and social-movement campaigns to make a world that benefits more beings, human and otherwise, in the longer term. Every choice you make aligns you with one set of these forces or another. That includes doing nothing, which means aligning yourself with the worst of the status quo dragging us down in that ocean of carbon and consumption.
To make personal changes is to do too little. Only great movements, only collective action can save us now. Only is a scary word, but when the ship is sinking, it can be an encouraging one as well. It can hold out hope. The world has changed again and again in ways that, until they happened, would have been considered improbable by just about everyone on the planet. It is changing now and the direction is up to us.
There will be another story to be told about what we did a quarter century after civil society toppled the East Bloc regimes, what we did in the pivotal years of 2014 and 2015. All we know now is that it is not yet written, and that we who live at this very moment have the power to write it with our lives and acts.
In his novel The Plague, Albert Camus describes how death comes to an ugly French port in Algeria.
Thanks to an infestation of rats and the fleas they carry, the bubonic plague descends upon the city in the spring and intensifies during the hot summer. After a short period of denial, the residents panic, then sink into despondency and alcoholism. The port is put under quarantine. Undeterred by the apathy of the population and the danger of exposure, a small number of courageous individuals mobilize to fight the epidemic and eventually beat back the invader.
Camus took great care to detail the symptoms of the disease. But for all his medical exactitude, the French writer was not primarily interested in epidemiology. His inspiration was a different kind of infection. The novel is set some time in the 1940s. The plague is Nazism, and those who fight the disease stand in for the heroes of the French Resistance. It is a supremely apt allegory, for did not the Nazis claim that their victims were vermin? Camus surely must have enjoyed reincarnating the German fascists as the lowest of the low: bloodsucking fleas and desperate rats.
Except for some isolated cases, the twin plagues of Nazism and bubonic plague are behind us. But now it seems that a different pair of plagues is in our midst.
Today’s headlines are filled with similar stories of the spread of death and destruction in the Middle East and Africa. American commentators worry that these plagues will burst their borders and somehow spread to these shores. And, as in Camus’s novel, these diseases point to something larger, not the imposition of a new malignant system but the breakdown of the existing order.
In West Africa, the plague is Ebola, a terrifying fever that ends in massive hemorrhaging. The mortality rate, if untreated, is as high as that of bubonic plague. But at least with the modern version of the Black Death, treatment brings the mortality rate down to 15 percent. Ebola, by contrast, resists treatment. There are no vaccines for this hemorrhagic fever—though there’s promising news out of Canada—and the few treatments that have been used remain highly experimental. Doctors and officials establish quarantines and hope the disease will burn itself out. With airlines shutting down service to the infected region, hampering efforts to deliver medical supplies, the disease continues to rage.
Ebola has so far claimed around 1,500 lives. This is terrible, of course, but it pales in comparison to how many children succumb to diarrhea in Africa. According to a 2010 report, 2,000 African children die every day of a disease that can be prevented through relatively cheap methods: safe water and hygiene. But diarrhea is not a communicable disease in the same sense as the plague or Ebola. And no one in the United States worries that a summit of African leaders or the repatriation of infected patients will spread an epidemic of diarrhea stateside. Ebola monopolizes the headlines because what grabs attention is fear (along with the usual colonial images of Africans as dirty and irresponsible).
The panic is, of course, more acute in the areas hardest hit by Ebola. Consider the case of Kandeh Kamara, a brave 21-year-old who volunteered to help fight the disease in Sierra Leone. He was promptly drafted to become a “burial boy” responsible for dealing with the corpses of the infected. “In doing their jobs, the burial boys have been cast out of their communities because of fear that they will bring the virus home with them,” write Adam Nossiter and Ben Solomon in a powerful piece in The New York Times. Talk about thankless tasks. Kandeh Kamara initially received no payment for his work and had to beg for food on the street. He now gets $6 a day and hopes to rent an apartment, though landlords often refuse to lease to the burial boys.
Ebola is bad news, but it hasn’t generated the same kind of fury as that other fast-spreading scourge, namely the Islamic State (IS). The recent beheading of U.S. journalist James Foley has ratcheted up the outrage of U.S. observers.
It’s certainly not the first beheading that IS has done. The group specializes in meting out barbarous punishments—decapitation, crucifixion, amputations. But just as Ebola’s impact became real for Americans when it infected people “like us”—two U.S. missionaries in Liberia—the United States was prompted to act against IS when it began killing non-Muslims, first the stranded Yazidis and then the abducted journalist.
IS has spread quickly, and so has the panic that has accompanied its territorial acquisition. There have been the inevitable analogies to Nazism. But even those who don’t invoke Hitler are quick to use Manichean language to describe the IS challenge.
“We can see evil through the eye slits of the ski mask worn by Foley’s killer,” writes David Ignatius in a Washington Post commentary entitled “The New Battle Against Evil.” “But stopping that evil is a harder task.”
The IS killers are a nasty piece of work, and their ideology is thoroughly malign. But I hesitate to use the language of good and evil. Such moralistic terminology presumes that they, the beheaders, are a Satanic force that can only be exorcised with whatever version of holy water our angelic forces dispense—air strikes, boots on the ground, military aid to the Kurdish peshmerga, efforts in the community to dissuade angry young men from taking the next flight to Mosul.
We, on the other hand, are good. We would never behead anyone. Those we execute “deserve” their punishment (though the occasional innocent person might inadvertently fall through the cracks). And the civilian casualties from our military offensives, because we are by definition good, are simply mistakes. After all, we don’t publicly celebrate the deaths of Afghan civilians from our drone strikes (45 in 2013 alone) or the deaths of over 400 children in Gaza. But our protestations of innocence are little consolation to the families of the victims.
At what point do mistakes aggregate into something evil? At the very least, do they prevent us from claiming the mantle of good? And, of course, it’s not just the mistakes that are problematic but also the deliberate policies that, for instance, align Washington with dictators and other murderous actors. U.S. disgust with IS may already have prompted intelligence sharing with the regime in Damascus, though the Obama administration has denied such deals.
Camus had some choice words for those who are reluctant to call evil by its name. “Our townsfolk were like everybody else, wrapped up in themselves; in other words they were humanists: they disbelieved in pestilences,” he wrote in The Plague. “A pestilence isn’t a thing made to man’s measure; therefore we tell ourselves that pestilence is a mere bogy of the mind, a bad dream that will pass away. But it doesn’t always pass away and, from one bad dream to another, it is men who pass away, and the humanists first of all, because they haven’t taken their precautions.”
Humanists perhaps disbelieve in pestilences. “I used to not believe in evil,” confesses Richard Cohen this week in a Washington Post column declaring a “return of evil” with ISIS. Once a liberal humanist, Cohen long ago remade himself into a liberal hawk.
I still consider myself a humanist. But my brand of humanism sees pestilence everywhere. Indeed, I tend to see pestilence not only in the acts of individuals but in the structures within which the plague takes root and spreads. And this is where the two plagues intersect, Ebola and IS. They both prosper where the immune system is weak.
When it comes to medical infrastructure, Africa definitely has a compromised immune system. The continent has been hit hard by HIV/AIDS (70 percent of those living with HIV are in Africa), cholera (major outbreaks took place recently in Senegal, Zimbabwe, and Sierra Leone), and malaria (an African child dies every minute from this disease). Ebola has spread rapidly because of critical shortages in medical staff and supplies.
But the deeper reason is environmental: the clear-cutting of forests that have served as a traditional barrier to pathogens. West Africa has one of the fastest rates of deforestation in the world, losing nearly a million hectares a year. The forests are Africa’s natural defenses, and Ebola is a sign that these defenses have been fatally weakened. What used to stay in remote villages now spreads quickly to urban areas.
The recent victories of IS in Syria and Iraq, meanwhile, suggest not a breakdown in the environmental system but in the political one. IS is not simply a band of serial killers. They have a distinct ideology and set of political motives. Nor does it matter whether they are operating in a formally dictatorial or democratic environment. IS thrives both where Assad rules with an iron fist and where Saddam is long gone.
The common denominator is chaos. IS has ruthlessly expanded in the grey areas beyond the reach of the rule of law. In Syria, it has prospered in regions that already broke loose from the country during the uprising. In Iraq, it took advantage of a paralyzing conflict between Shi’a and Sunni that left the northern reaches of the country tenuously connected to the central government.
Local governance, whether it’s democratic or authoritarian, serves the same function as the forests of West Africa. Such governance holds society together. When it deteriorates, the very cellular structure breaks down. In Ebola, the cell walls fray and the patient bleeds out. With a virus like IS, the fibers of the social fabric fray and large sections of the country bleed out.
There are, of course, many differences between a pestilence like Ebola and a movement like IS. But they are both the result of systemic breakdown. They are opportunistic infections.
In both cases there are no magic pills. Even if we come up with an antidote to this version of Ebola, as long as we continue to cut down the forests of Africa, more potent versions will continue to appear and spread. And if we attempt to obliterate IS only with bombs or boots on the ground, it will simply pop up somewhere else where the conditions favor such desperate efforts to create a totalitarian order. Instead we should focus on the conditions that give rise to these phenomena—and our role in helping to perpetuate these conditions.
Camus recommended vigilance. Pestilence, he concluded, “bides its time in bedrooms, cellars, trunks, and bookshelves and…perhaps the day would come when, for the bane and the enlightening of men, it would rouse up its rats again and send them forth to die in a happy city.” The current plagues have certainly been a bane. Whether they also help to enlighten us remains to be seen.
John Feffer is the director of Foreign Policy In Focus.
Here are some tough words about the Obama presidency from Cornel West, who argues persuasively that the fetish for the middle ground in politics often makes for poor leadership.
In the interview Thomas Frank asks West, “What on earth ails the man? Why can’t he fight the Republicans? Why does he need to seek a grand bargain?”
“I think Obama, his modus operandi going all the way back to when he was head of the [Harvard] Law Review, first editor of the Law Review and didn’t have a piece in the Law Review. He was chosen because he always occupied the middle ground. He doesn’t realize that a great leader, a statesperson, doesn’t just occupy middle ground. They occupy higher ground or the moral ground or even sometimes the holy ground. But the middle ground is not the place to go if you’re going to show courage and vision. And I think that’s his modus operandi. He always moves to the middle ground. It turned out that historically, this was not a moment for a middle-ground politician. We needed a high-ground statesperson and it’s clear now he’s not the one.”
West also says:
“He posed as a progressive and turned out to be counterfeit. We ended up with a Wall Street presidency, a drone presidency, a national security presidency. The torturers go free. The Wall Street executives go free. The war crimes in the Middle East, especially now in Gaza, the war criminals go free. And yet, you know, he acted as if he was both a progressive and as if he was concerned about the issues of serious injustice and inequality and it turned out that he’s just another neoliberal centrist with a smile and with a nice rhetorical flair. And that’s a very sad moment in the history of the nation because we are—we’re an empire in decline. Our culture is in increasing decay. Our school systems are in deep trouble. Our political system is dysfunctional. Our leaders are more and more bought off with legalized bribery and normalized corruption in Congress and too much of our civil life. You would think that we needed somebody—a Lincoln-like figure who could revive some democratic spirit and democratic possibility.”
The great novelist Wallace Stegner sorted the conflicting impulses in his beloved American West into two camps. There were the “boomers” who saw the frontier as an opportunity to get rich quick and move on: the conquistadors, the gold miners, the buffalo hunters, the land scalpers, and the dam-building good ol’ boys. They are still with us, trying to drill and frack their way to Easy Street across our public lands. Then there were those Stegner called the “nesters” or “stickers” who came to stay and struggled to understand the land and its needs. Their quest was to become native.
That division between boomers and nesters is, of course, too simple. All of us have the urge to consume and move on, as well as the urge to nest, so our choices are rarely clear or final. Today, that old struggle in the American West is intensifying as heat-parched, beetle-gnawed forests ignite in annual epic firestorms, reservoirs dry up, and Rocky Mountain snow is ever more stained with blowing desert dust.
The modern version of nesters are the conservationists who try to partner with the ecosystems where they live. Wounded landscapes, for example, can often be restored by unleashing nature’s own self-healing powers. The new nesters understand that you cannot steer and control an ecosystem, but you might be able to dance with one. Sage Sorenson dances with beavers.
Dances with Beavers
The dance floor is my Utah backyard, which, like most backyards out here, is a watershed. At its top is the Aquarius Plateau, the horizon I see from my deck, a gracefully rolling forest of pines and aspens that stretches for 50 miles to the south, 20 miles wide at its midpoint, and reaches 11,300 feet at its highest ridge.
The forest on top of the plateau is unique, as trees rarely grow almost two miles above sea level. That high forest is heated by the deserts that fall away around the plateau’s shoulders, culminating in the amber, bone, and honey-toned canyons of Capitol Reef National Park on its eastern flank and on the west by Grand Staircase Escalante National Monument.
During a long career with the Bureau of Land Management, Sage Sorenson saw firsthand how beavers created rich green habitat out of overgrazed and burned-over land. Now retired, he calls himself a “beaver believer” and devotes his days to monitoring and protecting scattered “remnant” beaver colonies in our region. Quietly but persistently, he advocates for their reintroduction onto stressed landscapes that need their services.
Beavers are the original geo-engineers. It’s no exaggeration to credit them for their major role in building the North American landscape. In pre-colonial times, there were as many as 400 million of them. They used their big buckteeth and tough paddle-tails to build dams across every stream imaginable, spreading water to a Noah’s Ark-worth of creatures that thrive in the wet habitats they create. Now, of course, they are mostly long gone from the land, and conservationists want them back.
Sorenson recently trained and got certified to trap and transport beavers in anticipation of restocking the streams that tumble down the Aquarius Plateau. He is convinced that it is only a matter of time before they are reintroduced. After all, several of those streams have already been scientifically assessed and identified as prime candidates for such a reintroduction program. But when I talked to him at a café in the small hamlet of Boulder, Utah, he was feeling discouraged.
A remnant colony of beavers along North Creek, he told me, is just about gone. Over the last two years, at least 34 of them have been illegally shot or legally trapped by a local irrigation company. Although beaver reintroduction is getting rave reviews in places like Scotland, where the last one was trapped out hundreds of years ago, and Oregon, where they are healing land hammered by logging, in Utah the road back will be rough.
Flat-Tail Climate Hero
Beavers were once abundant across the Aquarius Plateau, but they have now retreated to its high headwaters where they do not compete with cattle or cowboys with guns. Visiting them requires strong lungs for steep hikes and sturdy boots to navigate flooded meadows. Up close, beavers look like especially large rodents that swim. Call them cute if you care to, but a wet mammal that smells like its mud hut is neither cuddly nor charismatic. They are not, in other words, like the penguins or polar bears that adorn fundraising appeals from wildlife advocates.
Nevertheless, as Sage patiently explains, they are key to the restoration of damaged watersheds. First, their dams create ponds and wetlands for diverse plants, amphibians, fish, and fowl. Eventually, those ponds fill with silt and become meadows, creating yet more habitat for another round of plants and animals.
Letting beavers do their work is one powerful way to make the land and its creatures resilient in a time of climatological stress. For example, across the planet a wide range of amphibians, including frogs and salamanders, are declining fast, becoming rare or extinct. Their sudden decline may be due to habitat loss, pollution, viruses enabled by a warming climate, or all of the above, but their disappearance is one more measure of the ecological catastrophe now underway. Beavers make wet habitat where amphibians can recover and thrive.
The aquatic insects that bloom in wetlands feed populations of stressed songbirds. Their ponds shelter fingerling fish — beavers are vegetarians — and baby ducks. Beavers are ecological servants par excellence who give life to the land. They are not only beneficial agents of biodiversity, however: humans benefit, too.
In Western forests, the beaver’s stick-in-the-mud architecture spreads, slows, and deepens the flow of water from spring runoff so that it recharges underground aquifers, springs, and seeps. Slowing that runoff means that the streams feeding reservoirs last longer, possibly all summer. That’s important for local agriculture, which depends on irrigation. Beaver dams improve water quality by trapping sediment that filters pollution. A lush-green landscape also inhibits landslides, floods, and fire. So beavers are not only good for the usual crew of endangered species, but also for millions of humans whose drinking water originates in heat-stressed watersheds that could be restored by the beaver’s hydrological habits.
Considering all the benefits beavers bring with them, why haven’t we rushed to return them to their keystone role in the Western landscape? The simple answer to a complicated question is one word: cows.
When beavers re-occupy their historic homelands, they compete with the human economy that once drove them deep into the wilds. Farmers and ranchers who irrigate their fields via ditches and culverts hate them. There are simple techniques to guard against beavers clogging irrigation systems but they are either unlearned or resisted as yet another example of unwanted government intrusion on Western life. Across the rural West, ranchers have power and influence way beyond their numbers or their contribution to the economy.
The Elephant in the Room Is a Cow
One man’s keystone species is another’s varmint. For conservationists like Sorenson who are devoted to bringing beavers back, seeing one with a bullet hole in it is not just sad, but taken as a very personal warning. Despite the popularity and success of beaver reintroduction elsewhere, in much of the American West it runs into an outsized obstacle — the iconic western cow. Not ol’ Bossy chewing a cud in Wisconsin, but the wild steer chased by a cowboy with a lasso yelling “yeeha!” That cow is sacred.
In reality, cattle ranching is a tough, marginal business in this part of America and grazing on public lands makes it possible. In other words, it’s heavily subsidized by distant taxpayers. Those grazing fees Cliven Bundy objects to cost less than a buck and a half per cow per month for all it can eat on federal land — food stamps for cows, indeed. Cattle ranchers, whose families have been on the land for generations, think of grazing allotments on federal land as an entitlement, even if that attitude contradicts the image of the independent cowboy they cherish. About 250 million acres — or more than half of the federal lands administered by the Forest Service and Bureau of Land Management — are open to cattle grazing, and that’s a large arena where cowboys and conservationists compete.
Moving cows out of sensitive riparian areas (streams and springs) or putting competitors like wolves and beavers onto the land with them is seen by ranchers as the start of a slippery slope that might lead to removing cows altogether. That is, however, unlikely. In the West, cows rule. The soundtrack of Manifest Destiny may once have been the sharp crack of gunfire aimed at Indians and wolves, but it was followed by a mellow moo. Cows graze over the bones of bison and the other creatures we eliminated to make room for them.
Our Dams, Not Theirs
Like the beavers they replaced, cows have reshaped the land — not, in their case, by creating habitat but by destroying it. The pioneers who first came upon southern Utah described the vast grasslands they found there. That grass is long gone. The soil blew away, too, and rusting fences now swing above gullies or are buried under dunes. When millions of cows and sheep were let loose on that fragile soil, massive erosion and the disappearance of that vast native grassland followed. It never came back. When Congress finally stepped in and passed grazing regulations in 1934, improvements followed.
Conservationists claim that cows are today contributing to the die-off of the West’s beloved aspen groves by eating tree seedlings and short-circuiting forest succession. They also spread highly flammable cheat grass in their voluminous poop. But whatever damage cows do directly to public lands pales in comparison to the way the infrastructure necessary for the cattle business has captured western water sources and de-watered western lands.
Stegner’s boomers dammed thousands of rivers and streams, while building pipelines through our national forests down to valley floors. Aqueducts, canals, and tunnels followed. The growth of many western towns is rooted in the building of a water infrastructure that has allowed us to suck the forests dry in order to irrigate the fields of alfalfa that feed those cows. And yet — hold onto your hats for this — only a minuscule 3% of the nation’s beef is raised in the West.
Yet at least 80% of the water out here goes to alfalfa and other cow-food crops. When you get those dire warnings about the Colorado River going dry and Phoenix and Vegas blowing away, remember this: because the cattlemen own the rights, cows get the lion’s share of whatever water is left after the western watersheds are baked and burned. We grow so much cow-food that we now essentially export our precious water to China in the form of alfalfa.
Beavers as Underdogs
Now maybe you’re beginning to see just why the odds are so stacked against the lowly beaver. Americans have forgotten the formative nature of our relationship with that creature. Not only did European explorers encounter a landscape that had been thoroughly carved out and watered by them, but a robust trade in beaver pelts drove settlement. Pelts that were made into warm hats for wealthy people were a kind of rodent gold, and trappers couldn’t get enough of them.
Under the grinding wheel of a voracious commerce in furs, beavers were so trapped-out that they seemed to be headed for the fate of the once plentiful but now extinct passenger pigeon. This precipitous decline was reversed by one of North America’s earliest conservation campaigns.
In the 1920s, a Canadian Indian named Grey Owl captured the public imagination through the new medium of film. He lived on a lake with his wife, Anahareo, and raised orphaned beaver kits, explaining their ecological importance and the consequences of their loss to a public unfamiliar with the beaver’s role in keeping forests healthy. As the original beaver-believers cuddled their kits, audiences ooohed and aaahed.
Eventually Grey Owl was exposed as Archie Belaney, an Englishman posing as an Indian, but by then the message he had delivered had been translated into governance. Beaver trapping was strictly regulated across most of the West and eventually many colonies recovered. Today, there are far more beavers in North America, perhaps 10 million, than at their near-extinction moment, but their distribution on the land remains thin and uneven. Once upon a time, hundreds of millions of them helped create the American landscape. It would be fitting if, in the era of global warming, the beaver’s influence came full circle, this time as a means of making heat-stressed landscapes more resilient.
Are Beavers a Plot Against Humanity?
Most of the land in the American West is federally owned and managed, despite recent schemes by local tea-hadis to take it over and sell it to the highest bidder (or closest crony). Because federal lands are a national treasure that we own together, there are rules for their sustainable use and sanctions for their abuse. Those rules and policies are negotiated by stakeholders and change over time. That is happening now as our forests and grasslands are baked by prolonged drought.
In 2009, a Utah Beaver Advisory Committee composed of wildlife biologists, forest rangers, ranchers, trappers, farmers, and conservationists hammered out a plan to restore healthy beaver populations to their historic range across Utah “where appropriate.” The beaver’s ecological service was finally acknowledged, but with the proviso that it be balanced against “human needs.” Getting such an endorsement for restoration and protection, however qualified, was an important first step and a catalyst for a grassroots campaign to “leave it to beavers.”
An agreement had been reached among stakeholders traditionally at odds. It was a rare feat of consensus building in a political environment where acrimony generally reigns supreme and it could have been a model for resolving other conflicts over land use and regulation. Instead, local politicians, in a panic that beavers might “steal” water, have effectively resisted it.
Joe Wheaton, who teaches watershed hydrology and restoration at Utah State University, says the science on this is clear: there is no net water loss downstream from beaver dams. If anything, they only increase a watershed’s capacity by capturing water that would otherwise be lost to floods. But the cattlemen aren’t buying it. Science, you see, is just another liberal ideology. As a Kane County commissioner put it succinctly, “Beavers are an environmentalist plot.” Think of those dead beavers along North Creek that Sage Sorenson described to me as collateral damage in the ideological civil war now raging across the region.
You Can’t Drink an F-35
The Grand Canyon Trust and a local citizens group, Boulder Community Alliance, have tried to fill the gap between the advisory group’s clear intention and the state’s hesitance to overrule obstructionist county commissioners and actually implement the plan. The Trust recruited local volunteers and trained them to assess canyon drainages using the best scientific criteria and methods available. Several streams were identified as candidates for beaver reintroduction.
Volunteers monitor and report on the few existing beaver settlements like the one being decimated in North Creek. Through education and advocacy they are building a constituency for putting beavers back on the land to do their job. They have faith that the benefits of beaver reintroduction will become obvious as re-habitation happens. When the time comes to move beavers into new streams, they will be ready.
The kind of homegrown resilience practiced by Sage Sorenson and thousands of other backyard conservationists gets a paltry piece of the taxpayer pie compared, say, to homeland security. I used to say that in the long run we’d be wiser to invest in restoring watersheds than in putting a camera on every corner. As it happens, given the tenacious drought now spreading across the West and Southwest, the long run seems to be here, sooner than expected. Even the Pentagon now acknowledges that ecological catastrophe sows human turmoil and suffering that eventually blows back our way. For the cost of just one of the 2,400 F-35 fighter jets we are committed to buying at historic prices, we could restore the stressed Aquarius watershed.
But the beavers don’t care what we do. They just do their own thing. They are like their human partners: persistent and oh so local.
Saving the World, Stick by Stick
Each ecosystem has its own particular dynamic. There are endless variables to understand. That’s why conservation work is ultimately local. It focuses on improvements in this river and that forest, specific habitats and watersheds with specific conditions and a set of specific inhabitants and users.
The world we aim to save is a planet of mundane dirt, air, and water that, when woven together, somehow becomes a transcendent whole. It’s a diverse universe of living plants and critters not well-suited for one big solution. Rather, it calls forth a million small solutions that add up, like the natural world itself, to a whole greater than the sum of its parts. Or perhaps there are no parts at all, just participants.
Will introducing beavers into wounded watersheds save the world? The answer is: yes. That and all the other acts of restoration, protection, and restraint, small and large, individual and collective, taken together over time. Sure, it’s not the same as the U.S. taxing carbon or China abandoning coal. Restoring a watershed doesn’t curb the corporations that reduce communities to commodities. But in addition to the global goals we support, our responses to ecological crisis must be grounded in the places where we live, especially in the watersheds that nourish our bodies.
Rewilding tattered land is holistic because it sees and honors connectivity. It trades hubris for humility by acknowledging complexity and limitations. Its ultimate goal is landscape health and resilience, not the well-being of a small handful of stakeholders.
If we want to construct a healthy and resilient world for ourselves and our fellow creatures, we could do worse than look to the lowly beavers for hints on how it can be done. They build a vibrant world for themselves and so many others by weaving one small limb into another, stick by stick by stick.