Jon Ralston, the dean of political reporting in Nevada, has spread nothing less than a pack of lies about what went down at the state’s Democratic convention on Saturday. And the fact-averse, oligarchic national media has run riot with these provable falsehoods. No chairs were thrown at the convention Saturday. No death threats were made against the convention’s chair, Roberta Lange. And Bernie Sanders delegates were not simply mad because their louder shouting was ignored.
Rachel Maddow ran a deceptive clip on MSNBC saying chairs were thrown, while reportedly showing footage of chairs being thrown at a wrestling production. (I cannot yet find the original Maddow clip.) People on social media then insisted that networks had shown actual footage of chairs thrown at the convention. Maddow retreated only a bit by having Ralston on to say that even though he had not seen chairs thrown, other eyewitnesses had told him the video was wrong. CNN had Debbie Wasserman Schultz on to denounce Bernie Bros throwing chairs at the stage.
The biggest truth I’ve learned during this election season:
It’s not about the candidates. It’s never just about the candidates and the “horse race.” It’s always about something much, much bigger. And most of us — because of our fascination with celebrity and the seemingly larger-than-life individuals in history — end up acting like enablers of the pernicious lie that it’s just about Bernie and Hillary (and their personal ambitions).
It’s difficult to conceive of a candidate who’s committed to the more prosaic truth that it’s really about all of us … and about the hard work of building a movement, and about our responsibility to work for the success of our principles down-ballot, and in our neighborhood.
It goes without saying that in a democracy everyone is entitled to his or her own opinions. The trouble starts when people think they are also entitled to their own facts.
Away out West, on the hundreds of millions of acres of public lands that most Americans take for granted (if they are aware of them at all), the trouble is deep, widespread, and won’t soon go away. Last winter’s armed takeover and 41-day occupation of Malheur National Wildlife Refuge in southeastern Oregon is a case in point. It was carried out by people who, if they hadn’t been white and dressed as cowboys, might have been called “terrorists” and treated as such. Their interpretation of the history of western lands and of the judicial basis for federal land ownership — or at least that of their leaders, since they weren’t exactly a band of intellectuals — was only loosely linked to reality.
At least some of them took inspiration from the notion that Jesus Christ wrote the Constitution (which would be news to the Deists, like James Madison, who were its actual authors) and that it prohibits federal ownership of any land excepting administrative sites within the United States — a contention that more than two centuries of American jurisprudence has emphatically repudiated.
The troubling thing is that similar delusions infect pockets of unrest throughout the West, lending a kind of twisted legitimacy to efforts at both the state and national level to transfer western public lands to states and counties. To be sure, not all the proponents of this liquidation of America’s national patrimony subscribe to wing-nut doctrines; sometimes they just use them.
Greed can suffice to motivate those who lust for the real estate bonanzas and resource giveaways that would result if states gained title to, say, the 264 million acres presently controlled by the Bureau of Land Management (BLM). General combativeness and hostility toward government also play their roles, and the usual right-wing mega-donors, including the Koch brothers, pump money into a bewildering array of agitator groups to help keep the fires of resentment burning.
The louder the drum chant of crazy “facts” gets, the more the Alice-in-Wonderland logic behind them threatens to seize the popular narrative about America’s public lands — how they came to be and what they represent. This, in turn, prepares the way for the betrayal of one of the nation’s deepest traditions and for the loss of yet more of its natural heritage. Meanwhile, those who value American public lands have been laggard in articulating an updated vision for those open spaces appropriate to the twenty-first century and capable of expressing what the unsettled “fruited plains” and “purple mountain majesties” of the West still mean for our national experience and our capacity to meet the challenges of the future.
The Malice at Malheur
The leaders of the Malheur occupation, Ammon and Ryan Bundy, are the sons of Cliven Bundy, a Nevada rancher and public lands scofflaw who gained notoriety two years ago following a standoff with federal law enforcement officers. Back in the 1990s, the elder Bundy had stopped paying grazing fees, claiming that the federal government had no authority to regulate the public lands where his cattle fed. In 2014, with Bundy $1.1 million in arrears and his grazing permits transferred to the local county government, the Bureau of Land Management moved to round up and confiscate his 400 head of cattle.
Via social media, Bundy appealed to militia and “patriot” groups for support, and hundreds of armed resisters rallied to his ranch 90 miles north of Las Vegas. When the ensuing showdown threatened to become a bloodbath like the Waco siege of 1993, the authorities withdrew.
The government’s retreat and its failure to arrest members of the Bundy family or their allies for acts of armed resistance set the stage for the Malheur takeover, but the roots of the incident go back to the Sagebrush Rebellion of the 1970s and 1980s and the Wise Use Movement that succeeded it. The Sagebrush Rebellion was triggered by a national inventory of public lands to identify areas appropriate for designation as “wilderness” (under the National Wilderness Preservation System). Its advocates also protested the enforcement of government protections for archaeological sites and endangered species. Wise Use groups echoed those complaints and essentially argued against anything the environmental movement was for, urging the amped-up exploitation of natural resources on western lands.
Ammon Bundy put his own rogue-Mormon spin on that message by claiming divine inspiration and sanction for his actions. Ostensibly, the Malheur occupation was intended to show support for nearby ranchers Dwight and Steven Hammond, who faced jail terms for setting illegal range fires (and who immediately distanced themselves from the occupation). But Bundy didn’t stop there. He called on “patriots all over the country” to join his cause and help “free up” federal land for ranching, mining, and logging, pointedly adding, “We need you to bring your guns.”
Malheur was an odd place for white guys to make a stand in favor of “returning” federal land to its “rightful owners” — that is, themselves. The refuge was established in 1908 when Teddy Roosevelt declared a modest area of public domain to be a wildlife refuge. If anyone then occupied the land, it was members of the Burns Paiute tribe, not white settlers. In the 1930s, the refuge expanded when the government bought the bankrupt remnants of a former cattle baron’s empire. At the time, Malheur was its own mini-Dust Bowl. The purchase, which enlarged protection for once-fabulous wetlands supporting thousands of migrating birds, was essentially a bailout.
The people who joined the Bundys in the Malheur occupation were a strange lot. Few had any relationship to ranching or actual cows, aside from sitting down to eat a hamburger. Some were ex-military; others claimed to be (but weren’t). Quite a few had links to Tea Party groups or to “patriot” organizations including the Oath Keepers, the Three Percenters, and an assortment of other militia outfits. One described himself as “an old hippie from San Francisco,” jazzed by the excitement of the occupation and uncaring about its purposes. He also happened to be a convicted murderer (second degree) — of his father.
Straight thinking was not a requirement for admission to the occupiers’ cause. The fellow who photogenically rode his horse around the refuge while displaying a large American flag, for example, turned out to be acutely concerned lest the federal government divest itself of public lands. He feared the loss of access to cherished places where he liked to ride his horse. Because of that, he joined an armed effort aimed at forcing the government to do exactly what he didn’t want. Go figure.
Following the shooting death of LaVoy Finicum, the Malheur occupier who committed suicide-by-cop at a roadblock on January 26th, the occupation unraveled. At last count, the Bundy brothers and 24 others had been arrested and charged with a laundry list of crimes, including conspiracy to prevent federal employees from carrying out their duties and destruction of public property. All but one or two of them are still in jail.
Nor did the feds stop there. They finally nabbed Cliven Bundy at an airport after he attended a memorial service for Finicum, and also charged 18 others in connection with the 2014 Nevada standoff. Some of the 18 were already in custody for their involvement at Malheur. Bundy’s illegal cattle, which the government unsuccessfully tried to confiscate in 2014, remain at large.
More Mad Cowboy Disease in Utah
Despite the government’s thorough, if belated, crackdown, the hostility toward public lands on display at Malheur has hardly been contained. Such resentments are of a piece with the anger suffusing the presidential campaigns, although paradoxically enough Donald Trump has spoken out in favor of retaining federal lands. (Ted Cruz, by contrast, campaigned against Trump in Nevada by promising to “fight day and night to return full control of Nevada’s lands to its rightful owners, its citizens.”)
The darkest side of this “movement” is undoubtedly its well-documented association with armed militia groups and their persistent threat of violence. Gunmen from the Oath Keepers, for instance, obstructed federal officials from shutting down mines violating environmental regulations in both Oregon and Montana. According to the Southern Poverty Law Center, the current, rapid growth of militia groups is unprecedented and appears to have been spurred by the 2014 standoff at the Bundy ranch. Notices for “meet-ups” among “patriots” to show support for the incarcerated Bundys and the “martyred” Finicum are abundant on social media.
A similar virus has infected several western state legislatures, including those of Montana, Oregon, Wyoming, and Nevada. Nevada Assemblywoman Michele Fiore, who hovered at the fringes of the Malheur occupation, for instance, introduced a bill in the state legislature to transfer federal lands there to state control, irrespective of federal wishes. Considered patently unconstitutional, it was quickly dismissed. A Nevada senate resolution calling on Washington, D.C., to initiate action to transfer those lands received more serious consideration.
The game is being played more cagily in Utah. There, lawmakers approved legislation in March that authorized and partly funded the state’s attorney general to sue the federal government for title to approximately 30 million acres of Utah public lands. The suit would pursue strategies advanced via a study produced by a New Orleans law firm outlining “legitimate legal theories” that, it contended, might lead to the wholesale transfer of lands to the state.
The expected cost of the litigation has been estimated at $14 million, and Utah has sought allies among other western states. So far, it has found no takers willing to join the suit, possibly because other attorneys general have concluded that the legal theories behind it are rubbish.
Utah has also exported its anti-federalism to Capitol Hill. One of its congressmen, Rob Bishop, currently chairs the House Natural Resources Committee and sympathetically held hearings in February on several bills, introduced by representatives from Alaska, Idaho, and Utah, that would place federal lands under state control. Lisa Murkowski, a Republican from Alaska and chair of the Energy and Natural Resources Committee, has promoted similar bills in the Senate.
Hanging on to “the Solace of Open Spaces”
Lost among the headlines, sound bites, and posturing is any serious discussion of America’s public lands and their purposes. Ammon Bundy was completely correct, early in the occupation of Malheur, when he said, “This refuge is rightfully owned by the people.” His problem was that his definition of “people” only included people like him. The Burns Paiute tribe, whose ancestral homeland includes Malheur and whose sacred sites are protected by federal law, certainly did not figure into his plans. The thousands of annual visitors to Malheur, who appreciate its 320 bird species and other wildlife, and the millions more who support the National Wildlife Refuge System, also seem not to be the “people” Bundy had in mind. The same might be said for anyone attracted to the idea of intact natural landscapes and functioning ecosystems.
The greatest vulnerability of America’s public lands is that the millions of their rightful owners scarcely know they exist. Ask the average New Yorker what the Bureau of Land Management is, and the odds are that you’ll get a confused stare. Even many people in the West, who live close to those public lands, have trouble differentiating the National Parks from the National Forests, though those two classes of land are administered for substantially different purposes by two different government departments, Interior and Agriculture. Yet most people agree that the wild open spaces of the nation’s grandest landscapes constitute a collective treasure.
In essence, they are our national commons, our shared resource, not just for material goods, like timber, clean water, and minerals, but for recreation and inspiration. Seventy percent of all hunters are said to use public lands, and the percentages of birders, campers, hikers, and other recreationists must be at least as high. Public lands also help buffer us against the uncertainties of the future. Only public lands, for instance, spread unbroken over great enough distances to offer the connectedness that many plants and animals will require to adapt, to the extent possible, to a warming climate. Moreover, as the struggle to wean the economy away from fossil fuels continues, only public lands, with their unified federal ownership, are susceptible to the kind of sweeping shift in national energy policy necessary to “keep it in the ground.”
For all these reasons, the future of the nation’s 640 million acres of public lands deserves a more prominent place in our national discourse. The patterns of the past, emphasizing extractive, industrial uses of those lands, have long been in decline. An alternate path focused on restoration and biodiversity conservation has instead steadily gained traction, and indeed, its priorities — which include making room for endangered species — have inspired many of the objections of the Malheur occupiers.
Two things are certain: when large acreages of public domain are transferred to the states, significant portions of them end up being sold off to private interests. That creates a new kind of inequality that, in the natural world, parallels this era’s growing economic gap between rich and poor. It is an inequality of access to big, wild lands and to the ineffable something that Wyoming writer Gretel Ehrlich called the “solace of open spaces” and Pulitzer-winning novelist Wallace Stegner termed “the native home of hope.”
Thanks to the great western commons, which the Bundys and their legislative champions would like to dismantle, all Americans still enjoy the freedom to roam on some of the most spectacular lands on the planet. That access and that connection have been part of the American experience from Plymouth Rock through the westward migration to the present day. It is part of what makes us Americans.
The Depression-era folksinger Woody Guthrie understood the issues attending the privatization of common land. He offered his opinion of them in the least sung verse of his most famous song:
“There was a big high wall there that tried to stop me
Sign was painted, said: ‘Private Property’
But on the back side it didn’t say nothing —
This land was made for you and me.”
In January 2011, I presented a futures research project to the Progressive Caucus in Congress, then the largest of all the caucuses in that body. The report, Progressives 2040 — which was sponsored by ProgressiveCongress.org and published by Demos — analyzed a large set of major trends that would shape the future of the progressive movement for the next three decades, and offered a set of scenarios that illustrated how these trends might work together to create a range of possible futures that the movement will need to be prepared for.
This is the first post in “What We Know About The Progressive Future” — a series that I imagine will be a long (probably 10-12 post) look at that research five years on, updating my conclusions and taking a fresh look at the big drivers and high leverage points that will determine the future of our movement.
For most pundits, the most striking thing about the Iowa Caucus was the virtual tie between the two Democratic candidates (which portends a longer and perhaps more exciting election season and higher ratings for those in the media to look forward to), and the surprising 1-2-and-3 order of Cruz, Trump, and Rubio. I’m writing this less than 24 hours after the caucuses ended, and more than enough on both these topics has already been written by others (for God’s sake, people, it’s just Iowa), so I’m going to spare you another analysis ex cathedra from my belly button as to What It All Means For November.
I’m far more interested in another trend that emerged last night — a small detail that will almost certainly have a much longer historical tail than anything else that might happen between now and Election Day. This trend was crystallized by the stunning fact that Bernie Sanders got 85% of the votes of caucus-goers under 30.
That’s not a typo. Eighty-five percent.
That’s a number that strategists from every end of the political spectrum need to be paying attention to, because it is heralding the arrival of the Millennial Generation as a political force to be reckoned with.
My report saw this coming. Back in January 2011, I wrote this about them:
The Millennial generation (born 1980-2000) is the largest and most ethnically diverse generation in American history, with 44% identifying as members of a racial minority. They are the most globally connected generation to date: they travel more, speak more languages, and have friends all over the world. They are more progressive in their core values and attitudes than any cohort we’ve seen in at least a century. And they are rising fast: by 2020, they will be outvoting their elders, dominating elections and bringing their own priorities to the table. We can expect the Millennials to launch their first serious presidential candidate in 2020, and elect their first president probably no later than 2024.
Perhaps the most important fact about the Millennials is the sheer size of this generation. They’re the first cohort we’ve seen in the past 40 years that’s actually big enough to swamp their Boomer parents, whose interests and worldviews have dominated American politics ever since the youngest of them hit voting age in the late 1970s. The Boom was the biggest generation in American history, to the point where their sheer size itself was transformative (as they say: quantity has a quality all its own). But the Millennials are even bigger. And between now and 2020, the youngest edge of this generation will finally turn 18 and register to vote. The results stand to be at least as transformative for us as a nation as the moment when the Boomers themselves arrived.
Conservative Millennials? Don’t hold your breath
Any number of GOP pundits have written thumb-sucking articles explaining how this cohort is going to become more conservative as it ages (because every generation does, right?). Feel free to rip those up: it’s not likely to happen, for several reasons — starting with the fact that no, not every generation does. The Boomers did, because from left to right and from youth through their approaching old age, they’ve shared a belief in radical individualism — the primacy of the individual over any claims made by society — that fed everything from Evangelicalism and free market fundamentalism on the right to New Age religions and social experimentation on the left. That individualism is the one shining through-line that defines everything that generation has ever embraced. It made them hippies. And it also made them vote for Reagan.
The Millennials are their historical opposite number — a generation raised from babyhood to cooperate, share, include, network, and self-organize. They value conformity (Boomers and Xers are horrified by the “calling out” ritual that Millennials run on each other constantly as they vigilantly police each other’s behavior. We’d have choked on our own spit before telling each other what to say, think or do; and would have rightly expected to be told to fuck off if we tried it), and as this pervades their politics in the coming decades, it’s going to involve a lot of telling other people how they should live. That’s how their GI grandparents created and enforced the great American Consensus of the ’40s, ’50s, and ’60s, and it’s how they’re going to re-create a new consensus about the Next America they’re going to build.
That bred-in-the-bone collectivism is likely to be as durable a lifelong feature as Boomer individualism has been; but it stands in stark opposition to conservatism as it’s currently constituted. It’s possible to imagine another, distinctively Millennial form of conservatism emerging in time — but it would have to be rooted in the idea of a strong social contract, one that obligates individuals to cede some of their desires to the greater good, represented by trusted authorities — and is willing to use social shame as an enforcement mechanism. The GOP is a long way from offering any narratives along these lines now. If they do emerge, it could take another 20 years or more, becoming something today’s Millennials embrace as they age on through their 40s, 50s and 60s.
Other conservatives hold out hope that the all-time-high number of Millennials from immigrant families will benefit them in time, since the usual pattern has been for second-generation immigrants (the first generation born here) to do very well educationally and economically, and to vote more conservatively than either their parents or their third-gen kids. That might be a very plausible scenario — except for the nasty fact that Millennials have already grown up scarred and terrorized by a GOP that has never been able to lay off immigrant-bashing. Again, it’s going to take a radical change within that party — plus another 15 years of over-the-top effort — before winning even the grudging trust of a generation already marinated in decades of conservative anti-immigrant hysteria becomes even remotely likely.
In any event: anybody waiting for the Great Millennial Conservative Revival probably shouldn’t hold their breath. If it comes at all, it’s going to be a very long while indeed. In the meantime, these young adults have a revolution to pull off. And that moment is coming — much sooner than anybody seems ready to think.
Millennials and Elections
Obama, to his credit, was the first candidate to recognize the raw political power and profound unrest of this rising cohort in 2008. Even though fully half of the Millennial generation was still too young to vote, his overt efforts to capture the energy and attention of the half that could were a conscious strategy. The Millennials ended up supplying him with the margin that put him over the top in the election — support he later rewarded by bringing home the troops (most of whom were Millennials) and restructuring the federal student loan program to make over $30 billion more in Pell Grants available and reduce the loan burden on new graduates (both of which were policies I pointed to in my original 2011 report).
But the Millennials want more. They’re looking into a future that most of them understand is a fatal dead end without a radical, rip-up-the-floorboards restructuring of how the entire planet works — how we do everything from energy and money to community and education to transportation and agriculture. This yearning for a different kind of world even has the potential to upend our current understandings of “right” and “left,” as I wrote in my report:
Some research suggests that this generation’s politics lean toward the “independent” and the “centrist.” However, those words don’t mean the same thing to under-30 Americans that they do to older ones. The self-described “independents” also express core values that are deeply collectivist and inclusive, which gives them a strong affinity for progressive ideas and solutions. (Studies by Pew and Barna have even found these same affinities among self-identified conservatives in this cohort.) Likewise, these “centrists” see their generation’s communal focus on a shared future and shared prosperity as a matter of plain common sense. To them, “we’re in this together” is not a radical idea; indeed, it stands at the center of their politics.
The Millennials spurned Hillary in 2008 because they were craving a true change candidate — and Obama promised to be that. But in the end, the change he could deliver wasn’t enough. And that’s why this generation is going, overwhelmingly, for Bernie Sanders, whom they see as sitting entirely outside the corrupt party system that made it impossible for Obama to give them the goods, unbeholden to Wall Street, uncontaminated by party cronyism, unfiltered in the media — someone who seems to be entirely their own. This is what their candidates look like — and are going to continue to look like for the next several election cycles.
Given that the youngest 15% or so of the Millennial cohort is still too young to vote, it’s not clear that the Millennials will get their revolution this year. My prediction above that they’d dominate our elections by 2020 was based on the fact that that’s when the very tail end of the cohort — the ones born in 2000 — will all have reached adulthood, putting them finally at their full political strength. Whether or not they show up for 2016 is also complicated by a few other factors, including:
How disillusioned the older ones are following their experience with Obama, whom many of them feel very disappointed by — a real problem that surfaced in 2012, when many of them didn’t return to the polls.
The general tendency of young adults in their 20s to not vote. Voting is a behavior that becomes more reliable with age. By 2020, the oldest Millennials will be 40, and half will be over 30 — which means they should start showing up far more regularly.
Persistent efforts on the part of the GOP to disenfranchise students, which have large effects in some parts of the country.
How well Sanders survives the onslaught of conservative attacks that we all know are coming.
It’s safe to say that the Millennials will be a vastly bigger factor in 2016 than they were in either 2008 or 2012 — and that Sanders’ success to date can and should be interpreted as this generation’s announcement of its growing political presence with far louder and more insistent authority than we’ve ever heard from them before.
However, in this election cycle, it’s not at all clear that it will be enough to get them what they want. We are tantalizingly close to a generational tipping point, but have not quite arrived at it just yet. But by the next cycle, that point will almost certainly be well behind us — and from then on, for the next 40 years, our politics will be pretty much entirely dominated, owned, and determined by the Millennials’ collectivist worldviews, interests, desires, and priorities. They will, this time or next, succeed in voting themselves the transformation they seek. It’s not a question of if, but when.
What we’re seeing when we look at the Bernie Sanders phenomenon is a direct window into our own political future. When will it emerge? Maybe not today, and maybe not this November — but it’s coming soon, and it or something like it will be the dominant political reality for the rest of our lives.
Sara Robinson is a Seattle-based futurist and veteran blogger on culture, politics, and religion. Since 2006, her work (gathered in the Archive section of her blog) has regularly appeared at Orcinus, Our Future, Group News Blog, and Alternet. She’s also written for Salon, Huffington Post, Grist, the New Republic, New York Magazine, Firedoglake, and many other sites.
Robinson holds an MS in Futures Studies from the University of Houston, and a BA in Journalism from the USC Annenberg School of Communication. She was a Schumann Fellow, and also held senior fellowships at the Campaign for America’s Future and the Commonweal Foundation. She currently serves on the national board of NARAL Pro-Choice America.
A crowning achievement of the historic March on Washington, where Dr. Martin Luther King gave his “I have a dream” speech, was pushing through the landmark Voting Rights Act of 1965. Recognizing the history of racist attempts to prevent Black people from voting, that federal law forced a number of southern states and districts to adhere to federal guidelines allowing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like driver’s licenses—to vote, even though there’s no evidence of the voter fraud this is supposed to prevent. But there’s plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, closed driver’s license offices in counties with large percentages of black voters. Wisconsin requires a government-issued photo ID but hasn’t provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control of Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver’s licenses, allowing at least two weeks of early voting, and taking districting away from the politicians and putting it under independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
When you press Democrats on their uninspiring deeds — their lousy free trade deals, for example, or their flaccid response to Wall Street misbehavior — when you press them on any of these things, they automatically reply that this is the best anyone could have done. After all, they had to deal with those awful Republicans, and those awful Republicans wouldn’t let the really good stuff get through. They filibustered in the Senate. They gerrymandered the congressional districts. And besides, change takes a long time. Surely you don’t think the tepid-to-lukewarm things Bill Clinton and Barack Obama have done in Washington really represent the fiery Democratic soul.
So let’s go to a place that does. Let’s choose a locale where Democratic rule is virtually unopposed, a place where Republican obstruction and sabotage can’t taint the experiment.
Let's go to Boston, Massachusetts, the spiritual homeland of the professional class and a place where the ideology of modern liberalism has been permitted to grow and flourish without challenge or restraint. As the seat of American higher learning, Boston seems a natural anchor for one of the most Democratic of states, a place where elected Republicans (like the new governor) are highly unusual. This is the city that virtually invented the blue-state economic model, in which prosperity arises from higher education and the knowledge-based industries that surround it.
The coming of post-industrial society has treated this most ancient of American cities extremely well. Massachusetts routinely occupies the number one spot on the State New Economy Index, a measure of how “knowledge-based, globalized, entrepreneurial, IT-driven, and innovation-based” a place happens to be. Boston ranks high on many of Richard Florida’s statistical indices of approbation — in 2003, it was number one on the “creative class index,” number three in innovation and in high tech — and his many books marvel at the city’s concentration of venture capital, its allure to young people, or the time it enticed some firm away from some unenlightened locale in the hinterlands.
Boston’s knowledge economy is the best, and it is the oldest. Boston’s metro area encompasses some 85 private colleges and universities, the greatest concentration of higher-ed institutions in the country — probably in the world. The region has all the ancillary advantages to show for this: a highly educated population, an unusually large number of patents, and more Nobel laureates than any other city in the country.
The city’s Route 128 corridor was the original model for a suburban tech district, lined ever since it was built with defense contractors and computer manufacturers. The suburbs situated along this golden thoroughfare are among the wealthiest municipalities in the nation, populated by engineers, lawyers, and aerospace workers. Their public schools are excellent, their downtowns are cute, and back in the seventies their socially enlightened residents were the prototype for the figure of the “suburban liberal.”
Another prototype: the Massachusetts Institute of Technology, situated in Cambridge, is where our modern conception of the university as an incubator for business enterprises began. According to a report on MIT’s achievements in this category, the school’s alumni have started nearly 26,000 companies over the years, including Intel, Hewlett Packard, and Qualcomm. If you were to take those 26,000 companies as a separate nation, the report tells us, its economy would be one of the most productive in the world.
Then there are Boston’s many biotech and pharmaceutical concerns, grouped together in what is known as the “life sciences super cluster,” which, properly understood, is part of an “ecosystem” in which PhDs can “partner” with venture capitalists and in which big pharmaceutical firms can acquire small ones. While other industries shrivel, the Boston super cluster grows, with the life-sciences professionals of the world lighting out for the Athens of America and the massive new “innovation centers” shoehorning themselves one after the other into the crowded academic suburb of Cambridge.
To think about it slightly more critically, Boston is the headquarters for two industries that are steadily bankrupting middle America: big learning and big medicine, both of them imposing costs that everyone else is basically required to pay and which increase at a far more rapid pace than wages or inflation. A thousand dollars a pill, 30 grand a semester: the debts that are gradually choking the life out of people where you live are what has made this city so very rich.
Perhaps it makes sense, then, that another category in which Massachusetts ranks highly is inequality. Once the visitor leaves the brainy bustle of Boston, he discovers that this state is filled with wreckage — with former manufacturing towns in which workers watch their way of life draining away, and with cities that are little more than warehouses for people on Medicare. According to one survey, Massachusetts has the eighth-worst rate of income inequality among the states; by another metric it ranks fourth. However you choose to measure the diverging fortunes of the country’s top 10% and the rest, Massachusetts always seems to finish among the nation’s most unequal places.
Seething City on a Cliff
You can see what I mean when you visit Fall River, an old mill town 50 miles south of Boston. Median household income in that city is $33,000, among the lowest in the state; unemployment is among the highest, 15% in March 2014, nearly five years after the recession ended. Twenty-three percent of Fall River’s inhabitants live in poverty. The city lost its many fabric-making concerns decades ago and with them it lost its reason for being. People have been deserting the place for decades.
Many of the empty factories in which their ancestors worked are still standing, however. Solid nineteenth-century structures of granite or brick, these huge boxes dominate the city visually — there always seems to be one or two of them in the vista, contrasting painfully with whatever colorful plastic fast-food joint has been slapped up next door.
Most of the old factories are boarded up, unmistakable emblems of hopelessness right up to the roof. But the ones that have been successfully repurposed are in some ways even worse, filled as they often are with enterprises offering cheap suits or help with drug addiction. A clinic in the hulk of one abandoned mill has a sign on the window reading simply “Cancer & Blood.”
The effect of all this is to remind you with every prospect that this is a place and a way of life from which the politicians have withdrawn their blessing. Like so many other American scenes, this one is the product of decades of deindustrialization, engineered by Republicans and rationalized by Democrats. This is a place where affluence never returns — not because affluence for Fall River is impossible or unimaginable, but because our country’s leaders have blandly accepted a social order that constantly bids down the wages of people like these while bidding up the rewards for innovators, creatives, and professionals.
Even the city’s one real hope for new employment opportunities — an Amazon warehouse that is now in the planning stages — will serve to lock in this relationship. If all goes according to plan, and if Amazon sticks to the practices it has pioneered elsewhere, people from Fall River will one day get to do exhausting work with few benefits while being electronically monitored for efficiency, in order to save the affluent customers of nearby Boston a few pennies when they buy books or electronics.
But that is all in the future. These days, the local newspaper publishes an endless stream of stories about drug arrests, shootings, drunk-driving crashes, the stupidity of local politicians, and the lamentable surplus of “affordable housing.” The town is up to its eyeballs in wrathful bitterness against public workers. As in: Why do they deserve a decent life when the rest of us have no chance at all? It’s every man for himself here in a “competition for crumbs,” as a Fall River friend puts it.
The Great Entrepreneurial Awakening
If Fall River is pocked with empty mills, the streets of Boston are dotted with facilities intended to make innovation and entrepreneurship easy and convenient. I was surprised to discover, during the time I spent exploring the city’s political landscape, that Boston boasts a full-blown Innovation District, a disused industrial neighborhood that has actually been zoned creative — a projection of the post-industrial blue-state ideal onto the urban grid itself. The heart of the neighborhood is a building called “District Hall” — “Boston’s New Home for Innovation” — which appeared to me to be a glorified multipurpose room, enclosed in a sharply angular façade, and sharing a roof with a restaurant that offers “inventive cuisine for innovative people.” The Wi-Fi was free, the screens on the walls displayed famous quotations about creativity, and the walls themselves were covered with a high-gloss finish meant to be written on with dry-erase markers; but otherwise it was not much different from an ordinary public library. Aside from not having anything to read, that is.
This was my introduction to the innovation infrastructure of the city, much of it built up by entrepreneurs shrewdly angling to grab a piece of the entrepreneur craze. There are “co-working” spaces, shared offices for startups that can’t afford the real thing. There are startup “incubators” and startup “accelerators,” which aim to ease the innovator’s eternal struggle with an uncaring public: the Startup Institute, for example, and the famous MassChallenge, the “World’s Largest Startup Accelerator,” which runs an annual competition for new companies and hands out prizes at the end.
And then there are the innovation Democrats, led by former Governor Deval Patrick, who presided over the Massachusetts government from 2007 to 2015. He is typical of liberal-class leaders; you might even say he is their most successful exemplar. Everyone seems to like him, even his opponents. He is a witty and affable public speaker as well as a man of competence, a highly educated technocrat who is comfortable in corporate surroundings. Thanks to his upbringing in a Chicago housing project, he also understands the plight of the poor, and (perhaps best of all) he is an honest politician in a state accustomed to wide-open corruption. Patrick was also the first black governor of Massachusetts and, in some ways, an ideal Democrat for the era of Barack Obama — who, as it happens, is one of his closest political allies.
As governor, Patrick became a kind of missionary for the innovation cult. “The Massachusetts economy is an innovation economy,” he liked to declare, and he made similar comments countless times, slightly varying the order of the optimistic keywords: “Innovation is a centerpiece of the Massachusetts economy,” et cetera. The governor opened “innovation schools,” a species of ramped-up charter school. He signed the “Social Innovation Compact,” which had something to do with meeting “the private sector’s need for skilled entry-level professional talent.” In a 2009 speech called “The Innovation Economy,” Patrick elaborated the political theory of innovation in greater detail, telling an audience of corporate types in Silicon Valley about Massachusetts’s “high concentration of brainpower” and “world-class” universities, and how “we in government are actively partnering with the private sector and the universities, to strengthen our innovation industries.”
What did all of this inno-talk mean? Much of the time, it was pure applesauce — standard-issue platitudes to be rolled out every time some pharmaceutical company opened an office building somewhere in the state.
On some occasions, Patrick’s favorite buzzword came with a gigantic price tag, like the billion dollars in subsidies and tax breaks that the governor authorized in 2008 to encourage pharmaceutical and biotech companies to do business in Massachusetts. On still other occasions, favoring inno has meant bulldozing the people in its path — for instance, the taxi drivers whose livelihoods are being usurped by ridesharing apps like Uber. When these workers staged a variety of protests in the Boston area, Patrick intervened decisively on the side of the distant software company. Apparently convenience for the people who ride in taxis was more important than good pay for people who drive those taxis. It probably didn’t hurt that Uber had hired a former Patrick aide as a lobbyist, but the real point was, of course, innovation: Uber was the future, the taxi drivers were the past, and the path for Massachusetts was obvious.
A short while later, Patrick became something of an innovator himself. After his time as governor came to an end last year, he won a job as a managing director of Bain Capital, the private equity firm that was founded by his predecessor Mitt Romney — and that had been so powerfully denounced by Democrats during the 2012 election. Patrick spoke about the job like it was just another startup: "It was a happy and timely coincidence I was interested in building a business that Bain was also interested in building," he told the Wall Street Journal. Romney reportedly phoned him with congratulations.
At a 2014 celebration of Governor Patrick’s innovation leadership, Google’s Eric Schmidt announced that “if you want to solve the economic problems of the U.S., create more entrepreneurs.” That sort of sums up the ideology in this corporate commonwealth: Entrepreneurs first. But how has such a doctrine become holy writ in a party dedicated to the welfare of the common man? And how has all this come to pass in the liberal state of Massachusetts?
The answer is that I’ve got the wrong liberalism. The kind of liberalism that has dominated Massachusetts for the last few decades isn’t the stuff of Franklin Roosevelt or the United Auto Workers; it’s the Route 128/suburban-professionals variety. (Senator Elizabeth Warren is the great exception to this rule.) Professional-class liberals aren’t really alarmed by oversized rewards for society’s winners. On the contrary, this seems natural to them — because they are society’s winners. The liberalism of professionals just does not extend to matters of inequality; this is the area where soft hearts abruptly turn hard.
Innovation liberalism is “a liberalism of the rich,” to use the straightforward phrase of local labor leader Harris Gruman. This doctrine has no patience with the idea that everyone should share in society’s wealth. What Massachusetts liberals pine for, by and large, is a more perfect meritocracy — a system where the essential thing is to ensure that the truly talented get into the right schools and then get to rise through the ranks of society. Unfortunately, however, as the blue-state model makes painfully clear, there is no solidarity in a meritocracy. The ideology of educational achievement conveniently negates any esteem we might feel for the poorly graduated.
This is a curious phenomenon, is it not? A blue state where the Democrats maintain transparent connections to high finance and big pharma; where they have deliberately chosen distant software barons over working-class members of their own society; and where their chief economic proposals have to do with promoting “innovation,” a grand and promising idea that remains suspiciously vague. Nor can these innovation Democrats claim that their hands were forced by Republicans. They came up with this program all on their own.
The other week, feeling sick, I spent a day on my couch with the TV on and was reminded of an odd fact of American life. More than seven months before Election Day, you can watch the 2016 campaign for the presidency at any moment of your choosing, and that’s been true since at least late last year. There is essentially never a time when some network or news channel isn’t reporting on, discussing, debating, analyzing, speculating about, or simply drooling over some aspect of the primary campaign, of Hillary, Bernie, Ted, and above all — a million times above all — The Donald (from the violence at his rallies to the size of his hands). In case you’re young and think this is more or less the American norm, it isn’t. Or wasn’t.
Truly, there is something new under the sun. Of course, in 1994 with O.J. Simpson’s white Ford Bronco chase (95 million viewers!), the 24/7 media event arrived full blown in American life and something changed when it came to the way we focused on our world and the media focused on us. But you can be sure of one thing: never in the history of television, or any other form of media, has a single figure garnered the amount of attention — hour after hour, day after day, week after week — as Donald Trump. If he’s the O.J. Simpson of twenty-first-century American politics and his run for the presidency is the eternal white Ford Bronco chase of our moment, then we’re in a truly strange world.
Or let me put it another way: this is not an election. I know the word “election” is being used every five seconds and somewhere along the line significant numbers of Americans (particularly, this season, Republicans) continue to enter voting booths or in the case of primary caucuses, school gyms and the like, to choose among various candidates, so it’s all still election-like. But take my word for it as a 71-year-old guy who’s been watching our politics for decades: this is not an election of the kind the textbooks once taught us was so crucial to American democracy. If, however, you’re sitting there waiting for me to tell you what it is, take a breath and don’t be too disappointed. I have no idea, though it’s certainly part bread-and-circuses spectacle, part celebrity obsession, and part media money machine.
Actually, before we go further, let me hedge my bets on the idea that Donald Trump is a twenty-first-century O.J. Simpson. It’s certainly a reasonable enough comparison, but I’ve begun to wonder about the usefulness of just about any comparison in our present situation. Even the most nightmarish of them — Donald Trump is Adolf Hitler, Benito Mussolini, or any past extreme demagogue of your choice — may actually prove to be covert gestures of consolation, reassurance, and comfort. Yes, what’s happening in our world is increasingly extreme and could hardly be weirder, we seem to have the urge to say, but it’s still recognizable. It’s something we’ve encountered before, something we’ve made sense of in the past and, in the process, overcome.
Round Up the Usual Suspects
But what if that's not true? In some ways, the most frightening, least acceptable thing to say about our American world right now — even if Donald Trump's overwhelming presence all but begs us to say it — is that we've entered uncharted territory and, under the circumstances, comparisons might actually impair our ability to come to grips with our new reality. My own suspicion: Donald Trump is only the most obvious instance of this, the example no one can miss.
In these first years of the twenty-first century, we may be witnessing a new world being born inside the hollowed-out shell of the American system. As yet, though we live with this reality every day, we evidently just can't bear to recognize it for what it might be. When we survey the landscape, what we tend to focus on is that shell — the usual elections (in somewhat heightened form), the usual governmental bodies (a little tarnished) with the usual governmental powers (a little diminished or redistributed), including the usual checks and balances (a little out of whack), and the same old Constitution (much praised in its absence), and yes, we know that none of this is working particularly well, or sometimes at all, but it still feels comfortable to view what we have as a reduced, shabbier, and more dysfunctional version of the known.
Perhaps, however, it’s increasingly a version of the unknown. We say, for instance, that Congress is “paralyzed,” and that little can be done in a country where politics has become so “polarized,” and we wait for something to shake us loose from that “paralysis,” to return us to a Washington closer to what we remember and recognize. But maybe this is it. Maybe even if the Republicans somehow lost control of the House of Representatives and the Senate, we would still be in a situation something like what we’re now labeling paralysis. Maybe in our new American reality, Congress is actually some kind of glorified, well-lobbied, and well-financed version of a peanut gallery.
Of course, I don’t want to deny that much of what is “new” in our world has a long history. The present yawning inequality gap between the 1% and ordinary Americans first began to widen in the 1970s and — as Thomas Frank explains so brilliantly in his new book, Listen, Liberal — was already a powerful and much-discussed reality in the early 1990s, when Bill Clinton ran for president. Yes, that gap is now more like an abyss and looks ever more permanently embedded in the American system, but it has a genuine history, as for instance do 1% elections and the rise and self-organization of the “billionaire class,” even if no one, until this second, imagined that government of the billionaires, by the billionaires, and for the billionaires might devolve into government of the billionaire, by the billionaire, and for the billionaire — that is, just one of them.
Indeed, much of our shape-shifting world can be written about as a set of comparisons and in terms of historical reference points. Inequality has a history. The military-industrial complex and the all-volunteer military, like the warrior corporation, weren’t born yesterday; neither was our state of perpetual war, nor the national security state that now looms over Washington, nor its surveilling urge, the desire to know far too much about the private lives of Americans. (A little bow of remembrance to FBI Director J. Edgar Hoover is in order here.)
And yet, true as all that may be, Washington increasingly seems like a new land, sporting something like a new system in the midst of our much-described polarized and paralyzed politics. The national security state doesn’t seem faintly paralyzed or polarized to me. Nor does the Pentagon. On certain days when I catch the news, I can’t believe how strange and yet humdrum this uncharted new territory is. Remind me, for instance, where in the Constitution the Founding Fathers wrote about that national security state? And yet there it is in all its glory, all its powers, an ever more independent force in our nation’s capital. In what way, for instance, did those men of the revolutionary era prepare the ground for the Pentagon to loose its spy drones from our distant war zones over the United States? And yet, so it has. And no one even seems disturbed by the development. The news, barely noticed or noted, was instantly absorbed into what’s becoming the new normal.
Graduation Ceremonies in the Imperium
Let me mention here the almost random piece of news that recently made me wonder just what planet I was actually on. And I know you won’t believe it, but it had absolutely nothing to do with Donald Trump.
Given the carnage of America’s wars and conflicts across the Greater Middle East and Africa, which I’ve been following closely these last years, I’m unsure why this particular moment even got to me. Best guess? Maybe that, of all the once-obscure places — from Afghanistan to Yemen to Libya — in which the U.S. has been fighting recently, Somalia, where this particular little slaughter took place, seems to me like the most obscure of all. Yes, I’ve been half-attending to events there from the 1993 Blackhawk Down moment to the disastrous U.S.-backed Ethiopian invasion of 2006 to the hardly less disastrous invasion of that country by Kenyan and other African forces. Still, Somalia?
Recently, U.S. Reaper drones and manned aircraft launched a set of strikes against what the Pentagon claimed was a graduation ceremony for “low-level” foot soldiers in the Somali terror group al-Shabab. It was proudly announced that more than 150 Somalis had died in this attack. In a country where, in recent years, U.S. drones and special ops forces had carried out a modest number of strikes against individual al-Shabab leaders, this might be thought of as a distinct escalation of Washington’s endless low-level conflict there (with a raid involving U.S. special ops forces following soon after).
Now, let me try to put this in some personal context. Since I was a kid, I’ve always liked globes and maps. I have a reasonable sense of where most countries on this planet are. Still, Somalia? I have to stop and give that one some thought to truly locate it on a mental map of eastern Africa. Most Americans? Honestly, I doubt they’d have a clue. So the other day, when this news came out, I stopped a moment to take it in. If accurate, we killed 150 more or less nobodies (except to those who knew them) and maybe even a top leader or two in a country most Americans couldn’t locate on a map.
I mean, don’t you find that just a little odd, no matter how horrible the organization they were preparing to fight for? 150 Somalis? Blam!
Remind me: On just what basis was this modest massacre carried out? After all, the U.S. isn’t at war with Somalia or with al-Shabab. Of course, Congress no longer plays any real role in decisions about American war making. It no longer declares war on any group or country we fight. (Paralysis!) War is now purely a matter of executive power or, in reality, the collective power of the national security state and the White House. The essential explanation offered for the Somali strike, for instance, is that the U.S. had a small set of advisers stationed with African Union forces in that country and it was just faintly possible that those guerrilla graduates might soon prepare to attack some of those forces (and hence U.S. military personnel). It seems that if the U.S. puts advisers in place anywhere on the planet — and any day of any year they are now in scores of countries — that’s excuse enough to validate acts of war based on the “imminent” threat of their attack.
Or just think of it this way: a new, informal constitution is being written in these years in Washington. No need for a convention or a new bill of rights. It’s a constitution focused on the use of power, especially military power, and it’s being written in blood.
These days, our government (the unparalyzed one) acts regularly on the basis of that informal constitution-in-the-making, committing Somalia-like acts across significant swathes of the planet. In these years, we’ve been marrying the latest in wonder technology, our Hellfire-missile-armed drones, to executive power and slaughtering people we don’t much like in majority Muslim countries with a certain alacrity. By now, it’s simply accepted that any commander-in-chief is also our assassin-in-chief, and that all of this is part of a wartime-that-isn’t-wartime system, spreading the principle of chaos and dissolution to whole areas of the planet, leaving failed states and terror movements in its wake.
When was it, by the way, that "the people" agreed that the president could appoint himself assassin-in-chief, muster his legal beagles to write new "law" that covered any future acts of his (including the killing of American citizens), and year after year dispatch what essentially is his own private fleet of killer drones to knock off thousands of people across the Greater Middle East and parts of Africa? Weirdly enough, after almost 14 years of this sort of behavior, with ample evidence that such strikes don't suppress the movements Washington loathes (and often only fan the flames of resentment and revenge that help them spread), neither the current president and his top officials, nor any of the candidates for his office have the slightest intention of ever grounding those drones.
And when exactly did the people say that, within the country’s vast standing military, which now garrisons much of the planet, a force of nearly 70,000 Special Operations personnel should be birthed, or that it should conduct covert missions globally, essentially accountable only to the president (if him)? And what I find strangest of all is that few in our world find such developments strange at all.
A Planet in Decline?
In some way, all of this could be said to work. At the very least, it is a functioning new system-in-the-making that we have yet to truly come to grips with, just as we haven’t come to grips with a national security state that surveils the world in a way that even science fiction writers (no less totalitarian rulers) of a previous era could never have imagined, or the strange version of media overkill that we still call an election. All of this is by now both old news and mind-bogglingly new.
Do I understand it? Not for a second.
This is not war as we knew it, nor government as we once understood it, nor are these elections as we once imagined them, nor is this democracy as it used to be conceived of, nor is this journalism of a kind ever taught in a journalism school. This is the definition of uncharted territory. It’s a genuine American terra incognita and yet in some fashion that unknown landscape is already part of our sense of ourselves and our world. In this “election” season, many remain shocked that a leading candidate for the presidency is a demagogue with a visible authoritarian side and what looks like an autocratic bent. All such labels are pinned on Donald Trump, but the new American system that’s been emerging from its chrysalis in these years already has just those tendencies. So don’t blame it all on Donald Trump. He should be far less of a shock to this country than he continues to be. After all, a Trumpian world-in-formation has paved the way for him.
Who knows? Perhaps what we’re watching is the new iteration of a very old story: a twenty-first-century version of an ancient tale of a great imperial power, perhaps the greatest ever — the “lone superpower” — sinking into decline. It’s a tale humanity has experienced often enough in the course of our long history. But lest you think once again that there’s nothing new under the sun, the context for all of this, for everything now happening in our world, is so new as to be quite literally outside of thousands of years of human experience. As the latest heat records indicate, we are, for the first time, on a planet in decline. And if that isn’t uncharted territory, what is?
“Bernie did well last weekend but he can’t possibly win the nomination,” a friend told me for what seemed like the thousandth time, attaching an article from the Washington Post that shows how far behind Bernie remains in delegates.
Wait a minute. Last Tuesday, Sanders won 78 percent of the vote in Idaho and 79 percent in Utah. This past Saturday, he took 82 percent of the vote in Alaska, 73 percent in Washington, and 70 percent in Hawaii.
In fact, since March 15, Bernie has won six out of the seven Democratic primary contests with an average margin of victory of 40 points. Those victories have given him roughly one hundred additional pledged delegates.
As of now, Hillary Clinton has 54.9 percent of the pledged delegates to Bernie Sanders's 45.1 percent. That's still a sizable gap – but it doesn't make Bernie an impossibility.
Moreover, there are 22 states to go with nearly 45 percent of pledged delegates still up for grabs – and Bernie has positive momentum in almost all of them.
Hillary Clinton’s lead in superdelegates will vanish if Bernie gains a majority of pledged delegates.
Bernie is outpacing Hillary Clinton in fundraising. In February, he raised $42 million (from 1.4 million contributions, averaging $30 each), compared to her $30 million. In January he raised $20 million to her $15 million.
By any measure, the enthusiasm for Bernie is huge and keeps growing. He's packing stadiums, young people are flocking to volunteer, and support is rising among the middle-aged and boomers.
In Idaho and Alaska he exceeded the record primary turnout of 2008, bringing in thousands of new voters. He did the same in Colorado, Kansas, Maine, and Michigan.
Yet if you read the Washington Post or the New York Times, or watch CNN or even MSNBC, or listen to the major pollsters and pundits, you’d come to the same conclusion as my friend. Every success by Bernie is met with a story or column or talking head whose message is “but he can’t possibly win.”
Some Sanders supporters speak in dark tones about a media conspiracy against Bernie. That’s baloney. The mainstream media are incapable of conspiring with anyone or anything. They wouldn’t dare try. Their reputations are on the line. If the public stops trusting them, their brands are worth nothing.
The real reason the major media can't see what's happening is that they exist inside the bubble of establishment politics, centered in Washington, and the bubble of establishment power, centered in New York.
As such, the major national media are interested mainly in personalities and in the money behind the personalities. Political reporting is dominated by stories about the quirks and foibles of the candidates, and about the people and resources behind them.
Within this frame of reference, it seems nonsensical that a 74-year-old Jew from Vermont, originally from Brooklyn, who calls himself a Democratic socialist, who’s not a Democratic insider and wasn’t even a member of the Democratic Party until recently, who has never been a fixture in the Washington or Manhattan circles of power and influence, and who has no major backers among the political or corporate or Wall Street elites of America, could possibly win the nomination.
But precisely because the major media are habituated to paying attention to personalities, they haven’t been attending to Bernie’s message – or to its resonance among Democratic and independent voters (as well as many Republicans). The major media don’t know how to report on movements.
In addition, because the major media depend on the wealthy and powerful for revenues, because their reporters and columnists rely on the establishment for news and access, because their top media personalities socialize with the rich and powerful and are themselves rich and powerful, and because their publishers and senior executives are themselves part of the establishment, the major media have come to see much of America through the eyes of the establishment.
So it’s understandable, even if unjustifiable, that the major media haven’t noticed how determined Americans are to reverse the increasing concentration of wealth and political power that has been eroding our economy and democracy. And it’s understandable, even if unjustifiable, that they continue to marginalize Bernie Sanders.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.
National wildlife refuges such as the one at Malheur near Burns, Oregon, have importance far beyond the current furor over who manages our public lands. Such refuges are becoming increasingly critical habitat for migratory birds because 95 percent of the wetlands along the Pacific Flyway have already been lost to development.
In some years, 25 million birds visit Malheur, and if the refuge were drained and converted to intensive cattle grazing – which is something the “occupiers” threatened to do – entire populations of ducks, sandhill cranes, and shorebirds would suffer. With their long-distance flights and distinctive songs, the migratory birds visiting Malheur’s wetlands now help to tie the continent together.
This was not always the case. By the 1930s, three decades of drainage, reclamation, and drought had decimated high-desert wetlands and the birds that depended upon them. Out of the hundreds of thousands of egrets that once nested on Malheur Lake, only 121 remained. The American population of the birds had dropped by 95 percent. It took the federal government to restore Malheur’s wetlands and recover waterbird populations, bringing back healthy populations of egrets and many other species.
Yet despite the importance of wildlife refuges to America’s birds, not everyone appreciates them. At one recent news conference, Ammon Bundy called the creation of Malheur National Wildlife Refuge “an unconstitutional act” that removed ranchers from their lands and plunged the county into an economic depression. This is not a new complaint. Since the Sagebrush Rebellion of the 1980s, rural communities in the West have blamed their poverty on the 640 million acres of federal public lands, which make up 52 percent of the land in Western states.
Rural Western communities are indeed suffering, but the cause is not the wildlife refuge system. Conservation of bird habitat did not lead to economic devastation, nor were refuge lands “stolen” from ranchers. If any group has prior claims to Malheur refuge, it is the Paiute Indian Tribe.
For at least 6,000 years, Malheur was the Paiutes’ home. It took a brutal Army campaign to force the people from their reservation, marching them through the snow to the state of Washington in 1879. Homesteaders and cattle barons then moved onto Paiute lands, squeezing as much livestock as possible onto dwindling pastures, and warring with each other over whose land was whose. Scars from this era persist more than a century later.
In 1908, President Roosevelt established the Malheur Lake Bird Reservation on the lands of the former Malheur Indian Reservation. But the refuge included only the lake itself, not the rivers that fed into it. Deprived of water, the lake shrank during droughts, and squatters moved onto the drying lakebed. Conservationists, realizing they needed to protect the Blitzen River that fed the lake, began a campaign to expand the refuge.
But the federal government never forced the ranchers to sell, as the occupiers at Malheur claimed, and the sale did not impoverish the community. In fact, it was just the opposite: During the Depression years of the 1930s, the federal government paid the Swift Corp. $675,000 for ruined grazing lands. Impoverished homesteaders who had squatted on refuge lands eventually received payments substantial enough to set them up as cattle ranchers nearby.
John Scharff, Malheur’s manager from 1935 to 1971, sought to transform local suspicion into acceptance by allowing local ranchers to graze cattle on the refuge. Yet some tension persisted. In the 1970s, when concern about overgrazing reduced – but did not eliminate – refuge grazing, violence erupted again. Some environmentalists denounced ranchers as parasites who destroyed wildlife habitat. A few ranchers responded with death threats against environmentalists and federal employees.
But violence is not the basin’s most important historical legacy. Through the decades, community members have come together to negotiate a better future. In the 1920s, poor homesteaders worked with conservationists to save the refuge from irrigation drainage. In the 1990s, Paiute tribal members, ranchers, environmentalists and federal agencies collaborated on innovative grazing plans to restore bird habitat while also giving ranchers more flexibility. In 2013, such efforts resulted in a landmark collaborative conservation plan for the refuge, and it offers great hope for the local economy and for wildlife.
The poet Gary Snyder wrote, “We must learn to know, love, and join our place even more than we love our own ideas. People who can agree that they share a commitment to the landscape – even if they are otherwise locked in struggle with each other – have at least one deep thing to share.”
Collaborative processes are difficult and time-consuming. Yet they have proven their potential to peacefully sustain both human and wildlife communities.
Nancy Langston is a contributor to Writers on the Range, the opinion service of High Country News. She is a professor of environmental history at Michigan Technological University, and the author of a history of Malheur Refuge, Where Land and Water Meet: A Western Landscape Transformed.
If you’ve been following the DNC broken firewall scandal(*) of yesterday with an open mind, here’s a question that should interest you: Why didn’t the DNC fix the software problem when it was first reported to them by the Sanders campaign itself back in October? Is this just a case of incompetence, or did someone have something to gain by keeping the firewall weak? (Remember: the Sanders campaign suspected that their own private data had been accessed at that time).
And here’s a related question: If the Sanders campaign intended to make illicit use of this broken firewall, why did it come forth itself and report the problem again yesterday? Or for that matter in October in the first place? If there’s some sort of malfeasance in this case, where is it likely to reside? And finally, has the real scandal yet been fully revealed?
Perhaps all these questions have completely banal answers, but in our political house of mirrors it’s often difficult to recognize the real Occam and his razor.
* Note: The “scandal” erupted yesterday, whereas the real problem — the firewall bug and the DNC’s failure to fix it — goes back months, if not years.
Nearly 70,000 [Muslim] clerics [from around the world] came together and passed a fatwa [i.e. Islamic legal decree] against terrorist organizations, including IS, Taliban and al-Qaida. These are “not Islamic organizations,” the clerics said to a sea of followers, adding that the members of these outfits were “not Muslims”.
As documented by Metrocosm, what Americans assume about Muslim support for ISIS is very different from reality:
According to a Brookings report from last January:
40% of Americans believe most Muslims oppose ISIS.
14% think most Muslims support ISIS.
And 44% (the plurality) of Americans believe Muslim views are evenly balanced on the issue.
In Lebanon, a victim of one of the most recent attacks, almost every person surveyed who gave an opinion had an unfavorable view of ISIS, including 99% with a very unfavorable opinion. Distaste toward ISIS was shared by Lebanese Sunni Muslims (98% unfavorable) and by 100% of Shia Muslims and Lebanese Christians.
Israelis (97%) and Jordanians (94%) were also strongly opposed to ISIS as of spring 2015, including 91% of Israeli Arabs. And 84% in the Palestinian territories had a negative view of ISIS, both in the Gaza Strip (92%) and the West Bank (79%).
Indeed, as we’ve previously pointed out, Muslim leaders have been speaking out against Islamic terrorism for years … but we never hear about it from the mainstream American media.
Father Elias Mallon of the Catholic Near East Welfare Association remarks:
“Why aren’t Muslims speaking out against these atrocities?” The answer is: Muslims have been speaking out in the strongest terms, condemning the crimes against humanity committed by [extremists] in the name of Islam.
And Rabbi Marc Schneier notes in the Washington Post that the moderate Muslim majority is speaking out against the extremists … but “we’re just not listening.”