Nevada City-based non-profit Sierra Streams Institute is partnering with the Cancer Prevention Institute of California to launch an important new study on the health consequences of living in a mining-impacted community.
Sierra Streams Institute is currently seeking women over the age of 18 with a history of breast cancer who currently live in western Nevada County to participate in this exciting research project. Participants will be asked to provide a urine sample and toenail clippings and to complete a brief questionnaire. The institute is also planning a subsequent study involving in-home environmental sampling and is awaiting final approval of the study protocols.
This study, funded by state tobacco taxes through the California Breast Cancer Research Program, will focus on the amount of cadmium and arsenic in the bodies of women with and without breast cancer residing in historical Gold Country.
These two metals are of interest because they are found at high levels throughout Gold Country, are known carcinogens and may play a role in developing breast cancer. The three most populous counties in Gold Country, including Nevada County, have breast cancer rates that rank in the top ten counties in California.
To volunteer for the CHIME (Community Health Impacts of Mining Exposure) study or to learn more about this ground-breaking research, please visit the Sierra Streams Institute website at: http://www.
How the Raid on Malheur Screened a Future Raid on Real Estate
By William deBuys
Reprinted with permission from Tomdispatch.com
It goes without saying that in a democracy everyone is entitled to his or her own opinions. The trouble starts when people think they are also entitled to their own facts.
Away out West, on the hundreds of millions of acres of public lands that most Americans take for granted (if they are aware of them at all), the trouble is deep, widespread, and won’t soon go away. Last winter’s armed takeover and 41-day occupation of Malheur National Wildlife Refuge in southeastern Oregon is a case in point. It was carried out by people who, if they hadn’t been white and dressed as cowboys, might have been called “terrorists” and treated as such. Their interpretation of the history of western lands and of the judicial basis for federal land ownership — or at least that of their leaders, since they weren’t exactly a band of intellectuals — was only loosely linked to reality.
At least some of them took inspiration from the notion that Jesus Christ wrote the Constitution (which would be news to the Deists, like James Madison, who were its actual authors) and that it prohibits federal ownership of any land excepting administrative sites within the United States — a contention that more than two centuries of American jurisprudence has emphatically repudiated.
The troubling thing is that similar delusions infect pockets of unrest throughout the West, lending a kind of twisted legitimacy to efforts at both the state and national level to transfer western public lands to states and counties. To be sure, not all the proponents of this liquidation of America’s national patrimony subscribe to wing-nut doctrines; sometimes they just use them.
Greed can suffice to motivate those who lust for the real estate bonanzas and resource giveaways that would result if states gained title to, say, the 264 million acres presently controlled by the Bureau of Land Management (BLM). General combativeness and hostility toward government also play their roles, and the usual right-wing mega-donors, including the Koch brothers, pump money into a bewildering array of agitator groups to help keep the fires of resentment burning.
The louder the drum chant of crazy “facts” gets, the more the Alice-in-Wonderland logic behind them threatens to seize the popular narrative about America’s public lands — how they came to be and what they represent. This, in turn, prepares the way for the betrayal of one of the nation’s deepest traditions and for the loss of yet more of its natural heritage. Conversely, those who value American public lands have been laggard in articulating an updated vision for those open spaces appropriate to the twenty-first century and capable of expressing what the unsettled “fruited plains” and “purple mountain majesties” of the West still mean for our national experience and our capacity to meet the challenges of the future.
The Malice at Malheur
The leaders of the Malheur occupation, Ammon and Ryan Bundy, are the sons of Cliven Bundy, a Nevada rancher and public lands scofflaw who gained notoriety two years ago following a standoff with federal law enforcement officers. Back in the 1990s, the elder Bundy had stopped paying grazing fees, claiming that the federal government had no authority to regulate the public lands where his cattle fed. In 2014, with Bundy $1.1 million in arrears and his grazing permits transferred to the local county government, the Bureau of Land Management moved to round up and confiscate his 400 head of cattle.
Via social media, Bundy appealed to militia and “patriot” groups for support, and hundreds of armed resisters rallied to his ranch 90 miles north of Las Vegas. When the ensuing showdown threatened to become a bloodbath like the Waco siege of 1993, the authorities withdrew.
The government’s retreat and its failure to arrest members of the Bundy family or their allies for acts of armed resistance set the stage for the Malheur takeover, but the roots of the incident go back to the Sagebrush Rebellion of the 1970s and 1980s and the Wise Use Movement that succeeded it. The Sagebrush Rebellion was triggered by a national inventory of public lands to identify areas appropriate for designation as “wilderness” (under the National Wilderness Preservation System). Its advocates also protested the enforcement of government protections for archaeological sites and endangered species. Wise Use groups echoed those complaints and essentially argued against anything the environmental movement was for, urging the amped-up exploitation of natural resources on western lands.
Ammon Bundy put his own rogue-Mormon spin on that message by claiming divine inspiration and sanction for his actions. Ostensibly, the Malheur occupation was intended to show support for nearby ranchers Dwight and Stephen Hammond, who faced jail terms for setting illegal range fires (and who immediately distanced themselves from the occupation). But Bundy didn’t stop there. He called on “patriots all over the country” to join his cause and help “free up” federal land for ranching, mining, and logging, pointedly adding, “We need you to bring your guns.”
Malheur was an odd place for white guys to make a stand in favor of “returning” federal land to its “rightful owners” — that is, themselves. The refuge was established in 1908 when Teddy Roosevelt declared a modest area of public domain to be a wildlife refuge. If anyone then occupied the land, it was members of the Burns Paiute tribe, not white settlers. In the 1930s, the refuge expanded when the government bought the bankrupt remnants of a former cattle baron’s empire. At the time, Malheur was its own mini-Dust Bowl. The purchase, which enlarged protection for once-fabulous wetlands supporting thousands of migrating birds, was essentially a bailout.
The people who joined the Bundys in the Malheur occupation were a strange lot. Few had any relationship to ranching or actual cows, aside from sitting down to eat a hamburger. Some were ex-military; others claimed to be (but weren’t). Quite a few had links to Tea Party groups or to “patriot” organizations including the Oath Keepers, the Three Percenters, and an assortment of other militia outfits. One described himself as “an old hippie from San Francisco,” jazzed by the excitement of the occupation and uncaring about its purposes. He also happened to be a convicted murderer (second degree) — of his father.
Straight thinking was not a requirement for admission to the occupiers’ cause. The fellow who photogenically rode his horse around the refuge while displaying a large American flag, for example, turned out to be acutely concerned lest the federal government divest itself of public lands. He feared the loss of access to cherished places where he liked to ride his horse. Because of that, he joined an armed effort aimed at forcing the government to do exactly what he didn’t want. Go figure.
Following the shooting death of LaVoy Finicum, the Malheur occupier who committed suicide-by-cop at a roadblock on January 26th, the occupation unraveled. At last count, the Bundy brothers and 24 others had been arrested and charged with a laundry list of crimes, including conspiracy to prevent federal employees from carrying out their duties and destruction of public property. All but one or two of them are still in jail.
Nor did the feds stop there. They finally nabbed Cliven Bundy at an airport after he attended a memorial service for Finicum, and also charged 18 others in connection with the 2014 Nevada standoff. Some of the 18 were already in custody for their involvement at Malheur. Bundy’s illegal cattle, which the government unsuccessfully tried to confiscate in 2014, remain at large.
More Mad Cowboy Disease in Utah
Despite the government’s thorough, if belated, crackdown, the hostility toward public lands on display at Malheur has hardly been contained. Such resentments are of a piece with the anger suffusing the presidential campaigns, although paradoxically enough Donald Trump has spoken out in favor of retaining federal lands. (Ted Cruz, by contrast, campaigned against Trump in Nevada by promising to “fight day and night to return full control of Nevada’s lands to its rightful owners, its citizens.”)
The darkest side of this “movement” is undoubtedly its well-documented association with armed militia groups and their persistent threat of violence. Gunmen from the Oath Keepers, for instance, obstructed federal officials from shutting down mines violating environmental regulations in both Oregon and Montana. According to the Southern Poverty Law Center, the current, rapid growth of militia groups is unprecedented and appears to have been spurred by the 2014 standoff at the Bundy ranch. Notices for “meet-ups” among “patriots” to show support for the incarcerated Bundys and the “martyred” Finicum are abundant on social media.
A similar virus has infected several western state legislatures, including those of Montana, Oregon, Wyoming, and Nevada. Representative Michele Fiore, who hovered at the fringes of the Malheur occupation, for instance, introduced a bill in the Nevada legislature to transfer federal lands there to state control, irrespective of federal wishes. Considered patently unconstitutional, it was quickly dismissed. A Nevada senate resolution calling on Washington, D.C., to initiate action to transfer those lands received more serious consideration.
The game is being played more cagily in Utah. There, lawmakers approved legislation in March that authorized and partly funded the state’s attorney general to sue the federal government for title to approximately 30 million acres of Utah public lands. The suit would pursue strategies advanced via a study produced by a New Orleans law firm outlining “legitimate legal theories” that, it contended, might lead to the wholesale transfer of lands to the state.
The expected cost of the litigation has been estimated at $14 million, and Utah has sought allies among other western states. So far, it has found no takers willing to join the suit, possibly because other attorneys general have concluded that the legal theories behind it are rubbish.
Utah has also exported its anti-federalism to Capitol Hill. One of its congressmen, Rob Bishop, currently chairs the House Natural Resources Committee and sympathetically held hearings in February on several bills, introduced by representatives from Alaska, Idaho, and Utah, that would place federal lands under state control. Lisa Murkowski, a Republican from Alaska and chair of the Energy and Natural Resources Committee, has promoted similar bills in the Senate.
Hanging on to “the Solace of Open Spaces”
Lost among the headlines, sound bites, and posturing is any serious discussion of America’s public lands and their purposes. Ammon Bundy was completely correct, early in the occupation of Malheur, when he said, “This refuge is rightfully owned by the people.” His problem was that his definition of “people” only included people like him. The Burns Paiute tribe, whose ancestral homeland includes Malheur and whose sacred sites are protected by federal law, certainly did not figure into his plans. The thousands of annual visitors to Malheur, who appreciate its 320 bird species and other wildlife, and the millions more who support the National Wildlife Refuge System, also seem not to be the “people” Bundy had in mind. The same might be said for anyone attracted to the idea of intact natural landscapes and functioning ecosystems.
The greatest vulnerability of America’s public lands is that the millions of their rightful owners scarcely know they exist. Ask the average New Yorker what the Bureau of Land Management is, and the odds are that you’ll get a confused stare. Even many people in the West, who live close to those public lands, have trouble differentiating the National Parks from the National Forests, though those two classes of land are administered for substantially different purposes by two different government departments, Interior and Agriculture. Yet most people agree that the wild open spaces of the nation’s grandest landscapes constitute a collective treasure.
In essence, they are our national commons, our shared resource, not just for material goods, like timber, clean water, and minerals, but for recreation and inspiration. Seventy percent of all hunters are said to use public lands, and the percentages of birders, campers, hikers, and other recreationists must be at least as high. Public lands also help buffer us against the uncertainties of the future. Only public lands, for instance, spread unbroken over great enough distances to offer the connectedness that many plants and animals will require to adapt, to the extent possible, to a warming climate. Moreover, as the struggle to wean the economy away from fossil fuels continues, only public lands, with their unified federal ownership, are susceptible to the kind of sweeping shift in national energy policy necessary to “keep it in the ground.”
For all these reasons, the future of the nation’s 640 million acres of public lands deserves a more prominent place in our national discourse. The patterns of the past, emphasizing extractive, industrial uses of those lands, have long been in decline. An alternate path focused on restoration and biodiversity conservation has instead steadily gained traction, and indeed, its priorities — which include making room for endangered species — have inspired many of the objections of the Malheur occupiers.
One thing is certain: when large acreages of public domain are transferred to the states, significant portions of them end up being sold off to private interests. That creates a new kind of inequality that, in the natural world, parallels this era’s growing economic gap between rich and poor. It is an inequality of access to big, wild lands and to the ineffable something that Wyoming writer Gretel Ehrlich called the “solace of open spaces” and Pulitzer-winning novelist Wallace Stegner termed “the native home of hope.”
Thanks to the great western commons, which the Bundys and their legislative champions would like to dismantle, all Americans still enjoy the freedom to roam on some of the most spectacular lands on the planet. That access and that connection have been part of the American experience from Plymouth Rock through the westward migration to the present day. It is part of what makes us Americans.
The Depression-era folksinger Woody Guthrie understood the issues attending the privatization of common land. He offered his opinion of them in the least sung verse of his most famous song:
“There was a big high wall there that tried to stop me
Sign was painted, said: “Private Property”
But on the back side it didn’t say nothing —
This land was made for you and me.”
William deBuys, a TomDispatch regular, is the author of eight books, the most recent of which is The Last Unicorn: A Search for One of Earth’s Rarest Creatures. He has written extensively on water, drought, and climate in the West, including A Great Aridness: Climate Change and the Future of the American Southwest. Based in New Mexico, he has managed ranches and devised cooperative grazing programs involving both ranchers and government land managers.
Follow TomDispatch on Twitter and join it on Facebook. Check out the newest Dispatch Book, Nick Turse’s Next Time They’ll Come to Count the Dead, and Tom Engelhardt’s latest book, Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World.
Copyright 2016 William deBuys
Editor’s Note: Proponents of Modern Monetary Theory will appreciate this article as an example of their fundamental tenet: a “sovereign government is the monopoly supplier of its currency and can issue currency of any denomination in physical or non-physical forms. As such the government has an unlimited capacity to pay for the things it wishes to purchase and to fulfill promised future payments, and has an unlimited ability to provide funds to the other sectors. Thus, insolvency and bankruptcy of this government is not possible. It can always pay.”
“Print the money” has been called crazy talk, but it may be the only sane solution to a $19 trillion federal debt that has doubled in the last 10 years. The solution of Abraham Lincoln and the American colonists can still work today.
By Ellen Brown
“Reckless,” “alarming,” “disastrous,” “swashbuckling,” “playing with fire,” “crazy talk,” “lost in a forest of nonsense”: these are a few of the labels applied by media commentators to Donald Trump’s latest proposal for dealing with the federal debt. On Monday, May 9th, the presumptive Republican presidential candidate said on CNN, “You print the money.”
The remark was in response to a firestorm created the previous week, when Trump was asked if the US should pay its debt in full or possibly negotiate partial repayment. He replied, “I would borrow, knowing that if the economy crashed, you could make a deal.” Commentators took this to mean a default. On May 9, Trump countered that he was misquoted:
People said I want to go and buy debt and default on debt – these people are crazy. This is the United States government. First of all, you never have to default because you print the money, I hate to tell you, okay? So there’s never a default.
That remark wasn’t exactly crazy. It echoed one by former Federal Reserve Chairman Alan Greenspan, who said in 2011:
The United States can pay any debt it has because we can always print money to do that. So there is zero probability of default.
Paying the government’s debts by just issuing the money is as American as apple pie – if you go back far enough. Benjamin Franklin attributed the remarkable growth of the American colonies to this innovative funding solution. Abraham Lincoln revived the colonial system of government-issued money when he endorsed the printing of $450 million in US Notes or “greenbacks” during the Civil War. The greenbacks not only helped the Union win the war but triggered a period of robust national growth and saved the taxpayers about $14 billion in interest payments.
But back to Trump. He went on to explain:
I said if we can buy back government debt at a discount – in other words, if interest rates go up and we can buy bonds back at a discount – if we are liquid enough as a country we should do that.
Apparently he was referring to the fact that when interest rates go up, long-term bonds at the lower rate become available on the secondary market at a discount. Anyone who holds the bonds to maturity still gets full value, but many investors want to cash out early and are willing to take less. As explained on MorningStar.com:
If a bond with a 5% coupon and a ten-year maturity is sold on the secondary market today while newly issued ten-year bonds have a 6% coupon, then the 5% bond will sell for $92.56 (par value $100).
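The $92.56 figure works out if the cash flows are discounted with semiannual compounding, the usual convention for U.S. bonds. A minimal sketch of the calculation (the function name and structure are illustrative, not Morningstar’s):

```python
def bond_price(face, coupon_rate, market_rate, years, freq=2):
    """Present value of a fixed-coupon bond discounted at the market rate.
    freq=2 models the standard U.S. semiannual coupon convention."""
    n = years * freq                    # number of coupon periods
    c = face * coupon_rate / freq       # coupon payment per period
    y = market_rate / freq              # discount rate per period
    pv_coupons = c * (1 - (1 + y) ** -n) / y  # annuity of coupon payments
    pv_face = face * (1 + y) ** -n            # face value repaid at maturity
    return pv_coupons + pv_face

# A 5% ten-year bond priced when newly issued ten-year bonds yield 6%:
print(round(bond_price(100, 0.05, 0.06, 10), 2))  # 92.56
```

Discounted annually instead (`freq=1`), the same bond prices at about $92.64, so the quoted $92.56 reflects the semiannual convention.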
But critics still were not satisfied. In an article titled “Why Donald Trump’s Debt Proposal Is Reckless,” CNNMoney said:
[T]he federal government doesn’t have any money to buy debt back with. The U.S. already has $19 trillion in debt. Trump’s plan would require the U.S. Treasury to issue new debt to buy old debt.
Trump, however, was not talking about borrowing the money. He was talking about printing the money. CNNMoney’s response was:
That can cause inflation (or even hyperinflation), and send prices of everything from food to rent skyrocketing.
The Hyperinflation that Wasn’t
CNN was not alone in calling the notion of printing our way out of debt recklessly inflationary. But would it be? The Federal Reserve has already bought $4.5 trillion in assets, $2.7 trillion of which were federal securities, simply by “printing the money.”
When the Fed’s QE program was initiated, critics called it recklessly hyperinflationary. But it did not even create the modest 2% inflation the Fed was aiming for. QE was combined with ZIRP – zero interest rates for banks – encouraging borrowing for speculation, driving up the stock market and real estate. But the Consumer Price Index, productivity and jobs barely budged.
While the Fed has stopped its QE program for the time being, the European Central Bank and the Bank of Japan have jumped in, buying back massive amounts of their own governments’ debts by simply issuing the money. There too, the inflation needle has barely budged. As noted on CNBC in February:
Central banks have been pumping money into the global economy without a whole lot to show for it other than sharply higher stock prices, and even that has been on the downturn for the past year.
Growth remains anemic, and worries are escalating that the U.S. and the rest of the world are on the brink of a recession, despite bargain-basement interest rates and trillions in liquidity.
Helicopter Money Goes Mainstream
European economists and central bankers are wringing their hands over what to do about a flagging economy despite radical austerity measures and increasingly unrepayable debt. One suggestion gaining traction is “helicopter money” – just issue money and drop it directly into the economy in some way. In QE as done today, the newly issued money makes it no further than the balance sheets of banks. It does not get into the producing economy or the pockets of consumers, where it would need to go in order to create the demand necessary to stimulate productivity. Helicopter money would create that demand. Proposed alternatives include a universal national dividend; zero or low interest loans to local governments; and “people’s QE” for infrastructure, job creation, student debt relief, etc.
Simply buying back federal securities with money issued by the central bank (or the U.S. Treasury) would also get money into the real economy, if Congress were allowed to increase its budget in tandem. As observed in The Economist on May 1, 2016:
Advocates of helicopter money do not really intend to throw money out of aircraft. Broadly speaking, they argue for fiscal stimulus—in the form of government spending, tax cuts or direct payments to citizens—financed with newly printed money rather than through borrowing or taxation. Quantitative easing (QE) qualifies, so long as the central bank buying the government bonds promises to hold them to maturity, with interest payments and principal remitted back to the government like most central-bank profits.
As Dean Baker, co-director of the Center for Economic and Policy Research in Washington, wrote in response to the debt ceiling crisis in November 2010:
There is no reason that the Fed can’t just buy this debt (as it is largely doing) and hold it indefinitely. If the Fed holds the debt, there is no interest burden for future taxpayers. The Fed refunds its interest earnings to the Treasury every year. Last year the Fed refunded almost $80 billion in interest to the Treasury, nearly 40 percent of the country’s net interest burden. And the Fed has other tools to ensure that the expansion of the monetary base required to purchase the debt does not lead to inflation.
An even cleaner solution would be to simply void out the debt held by the Fed. That was the 2011 proposal of then-presidential candidate Ron Paul for dealing with the debt ceiling crisis. As his proposal was explained in Time Magazine, today the Treasury pays interest on its securities to the Fed, which returns 90% of these payments to the Treasury. Despite this shell game of payments, the $1.7 trillion in US bonds owned by the Fed is still counted toward the debt ceiling. Paul’s plan:
Get the Fed and the Treasury to rip up that debt. It’s fake debt anyway. And the Fed is legally allowed to return the debt to the Treasury to be destroyed.
Congressman Alan Grayson, a Democrat, also endorsed this proposal.
Financial author Richard Duncan makes a strong case for going further than just monetizing existing debt. He argues that under current market conditions, the US could actually rebuild its collapsing infrastructure by just printing the money, without causing price inflation. Prices go up when demand (money) exceeds supply (goods and services); and with automation and the availability of cheap labor in vast global markets today, supply can keep up with demand for decades to come. Duncan observes:
The combination of fiat money and Globalization creates a unique moment in history where the governments of the developed economies can print money on an aggressive scale without causing inflation. They should take advantage of this once-in-history opportunity . . . .
Returning the Power to Create Money to the People
The right of government to issue its own money was one of the principles for which the American Revolution was fought. Americans are increasingly waking up to the fact that the vast majority of the money supply is no longer issued by the government but is created by private banks when they make loans; and that with that power goes enormous power over the economy itself.
The issue that should be debated is one that dominated political discussion in the 19th century but that few candidates are even aware of today: should creation and control of the money supply be public or private? Donald Trump’s willingness to transgress the conservative taboo against public money creation is a welcome step in opening that debate.
The March/April 2016 issue of the beautiful environmental magazine, Orion, features an article about “a revolution in the science of dirt,” which — it claims — “is transforming American agriculture.” The article is called “Dirt First” and is written by Kristin Ohlson. Like most good stories, this one has a hero, in this case Rick Haney, a USDA soil scientist.
“Our entire agriculture industry is based on chemical inputs, but soil is not a chemistry set,” Haney explains. “It’s a biological system. We’ve treated it like a chemistry set because the chemistry is easier to measure than the soil biology.”
In nature, of course, plants grow like mad without added synthetic fertilizer, thanks to a multimillion-year-old partnership with soil microorganisms. Plants pull carbon dioxide from the atmosphere through photosynthesis and create a carbon syrup. About 60 percent of this fuels the plant’s growth, with the remainder exuded through the roots to soil microorganisms, which trade mineral nutrients they’ve liberated from rocks, sand, silt, and clay—in other words, fertilizer—for their share of the carbon bounty. Haney insists that ag scientists are remiss if they don’t pay more attention to this natural partnership.
“I’ve had scientific colleagues tell me they raised 300 bushels of corn [per acre] with an application of fertilizer, and I ask how the control plots, the ones without the fertilizer, did,” Haney says. “They tell me 220 bushels of corn. How is that not the story? How is raising 220 bushels of corn without fertilizer not the story?” If the natural processes at work in even the tired soil of a test plot can produce 220 bushels of corn, he argues, the yields of farmers consciously building soil health can be much higher.
Less than 50 percent of the synthetic fertilizer that farmers apply to most crops is actually used by plants, with much of the rest running off into drainage ditches and streams and, later, concentrating with disastrous effects in lakes and oceans. Witness the oxygen-free dead zone in the Gulf of Mexico or tap water tainted by neurotoxin-producing algae in Ohio: both phenomena are tied to fertilizer runoff. Farmers often apply fertilizer based on advice from manufacturers and university extension agents who are faithful to the agrochemical mindset, using formulas that tie X amount of desired yield to Y pounds of fertilizer applied per acre. Or they apply fertilizer based on a standard test that gauges the amount of inorganic nitrogen, potassium, and phosphorus—the basic ingredients of chemical fertilizers, often referred to as NPK—in a soil sample. Or they apply what they put on the year before, or what their neighbor applied, and then maybe a little bit more, hoping for a jackpot combination of rain, sunshine, and a good market.
The standard soil test, developed some sixty years ago, focuses only on the chemical properties of soil. Haney began developing his test in the early 1990s to focus instead on the soil’s biology. Based on the vigor of the microscopic community in a farmer’s soil, his recommendations usually call for far less than what the farmer hears elsewhere. The yields of those who heed his advice often remain the same, or rise.
Read the full article here: “Dirt First”
Scientists believe that simple land management techniques can increase the rate at which carbon is absorbed from the atmosphere and stored in soils.
For many climate change activists, the latest rallying cry has been, “Keep it in the ground,” a call to slow and stop drilling for fossil fuels. But for a new generation of land stewards, the cry is becoming, “Put it back in the ground!”
As an avid gardener and former organic farmer, I know the promise that soil holds: Every ounce supports a plethora of life. Now, evidence suggests that soil may also be a key to slowing and reversing climate change.
“I think the future is really bright,” said Loren Poncia, an energetic Northern Californian cattle rancher. Poncia’s optimism stems from the hope he sees in carbon farming, which he has implemented on his ranch. Carbon farming uses land management techniques that increase the rate at which carbon is absorbed from the atmosphere and stored in soils. Scientists, policy makers, and land stewards alike are hopeful about its potential to mitigate climate change.
Carbon is the key ingredient to all life. It is absorbed by plants from the atmosphere as carbon dioxide and, with the energy of sunlight, converted into simple sugars that build more plant matter. Some of this carbon is consumed by animals and cycled through the food chain, but much of it is held in soil as roots or decaying plant matter. Historically, soil has been a carbon sink, a place of long-term carbon storage.
But many modern land management techniques, including deforestation and frequent tilling, expose soil-bound carbon to oxygen, limiting the soil’s absorption and storage potential. In fact, carbon released from soil is estimated to contribute one-third of global greenhouse gas emissions, according to the Food and Agriculture Organization of the United Nations.
Ranchers and farmers have the power to address that issue. Pastures make up 3.3 billion hectares, or 67 percent, of the world’s farmland. Carbon farming techniques can sequester up to 50 tons of carbon per hectare over a pasture’s lifetime. This motivates some ranchers and farmers to do things a little differently.
“It’s what we think about all day, every day,” said Sallie Calhoun of Paicines Ranch on California’s central coast. “Sequestering soil carbon is essentially creating more life in the soil, since it’s all fed by photosynthesis. It essentially means more plants into every inch of soil.”
Calhoun’s ranch sits in fertile, rolling California pastureland about an hour’s drive east of Monterey Bay. She intensively manages her cattle’s grazing, moving them every few days across 7,000 acres. This avoids compaction, which decreases soil productivity, and also allows perennial grasses to grow back between grazing periods. Perennial grasses, like sorghum and bluestems, have long root systems that sequester far more carbon than their annual cousins.
By starting with a layer of compost, Calhoun has also turned her new vineyard into an effective carbon sink. Compost is potent for carbon sequestration because of how it enhances otherwise unhealthy soil, enriching it with nutrients and microbes that increase its capacity to harbor plant growth. Compost also increases water-holding capacity, which helps plants thrive even in times of drought. She plans to till the land only once, when she plants the grapes, to avoid releasing stored carbon back into the atmosphere.
Managed grazing and compost application are just two of the 35 practices that the Natural Resources Conservation Service recommends for carbon sequestration. All 35 have been shown to sequester carbon, though some are better documented than others.
David Lewis, director of the University of California Cooperative Extension, says the techniques Calhoun uses, as well as stream restoration, are some of the most common. Lewis has worked with the Marin Carbon Project, a collaboration of researchers, ranchers, and policy makers, to study and implement carbon farming in Marin County, California. The research has been promising: They found that one application of compost doubled the production of grass and increased carbon sequestration by up to 70 percent. Similarly, stream and river ecosystems, which harbor lots of dense, woody vegetation, can sequester up to one ton of carbon, or as much as a car emits in a year, in just a few feet along their beds.
On his ranch, Poncia has replanted five miles of streams with native shrubs and trees, and has applied compost to all of his 800 acres of pasture. The compost-fortified grasses are more productive and have allowed him to double the number of cattle his land supports. This has had financial benefits. Ten years ago, Poncia was selling veterinary pharmaceuticals to subsidize his ranch. But, with the increase in cattle, he has been able to take up ranching full time. Plus, his ranch sequesters the same amount of carbon each year as is emitted by 81 cars.
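The ranch figures quoted above can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope estimate, not the article’s own methodology; the acre-to-hectare conversion, the EPA per-car CO2 figure, and the carbon-to-CO2 mass ratio are outside assumptions.

```python
# Back-of-envelope check of the ranch figures cited in the article.
# Assumptions (not from the article): 1 acre = 0.4047 ha; a typical
# passenger car emits roughly 4.6 metric tons of CO2 per year (an EPA
# figure), of which carbon is 12/44 by mass.

ACRES = 800
HECTARES = ACRES * 0.4047          # ~324 ha of compost-treated pasture

CARS = 81                          # the article's "81 cars" equivalence
CO2_PER_CAR_T = 4.6                # metric tons of CO2 per car per year
C_FRACTION = 12 / 44               # carbon's share of CO2, by mass

carbon_per_year_t = CARS * CO2_PER_CAR_T * C_FRACTION
rate_t_per_ha = carbon_per_year_t / HECTARES

print(f"~{carbon_per_year_t:.0f} t of carbon per year across {HECTARES:.0f} ha")
print(f"~{rate_t_per_ha:.2f} t of carbon per hectare per year")
```

At roughly 0.25–0.3 tons of carbon per hectare per year (the lower figure if one uses the article’s own one-ton-per-car equivalence), accumulating the 50 tons per hectare cited earlier would take on the order of a century or more, consistent with the article’s framing of that total as occurring “over a pasture’s lifetime.”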
Much of the research on carbon farming focuses on rangelands, which are open grasslands, because they make up such a large portion of ecosystems across the planet. They are also, after all, where we grow a vast majority of our food.
“Many of the skeptics of carbon farming think we should be planting forests instead,” Poncia said. “I think forests are a no-brainer, but there are millions of acres of rangelands across the globe and they are not sequestering as much carbon as they could be.”
The potential of carbon farming lies in wide-scale implementation. The Carbon Cycle Institute, which grew out of the Marin Carbon Project with the ambition of applying the research and lessons to other communities in California and nationally, is taking up that task.
“It really all comes back to this,” said Torri Estrada, pointing to a messy white board with the words SOIL CARBON scrawled in big letters. Estrada is managing director of the Carbon Cycle Institute, where he is working to attract more ranchers and farmers to carbon farming. The white board maps the intricate web of organizations and strategies the institute works with. They provide technical assistance and resources to support land stewards in making the transition.
For interested stewards, implementation and its associated costs vary. It could be as simple as a one-time compost application or as intensive as a lifetime of managing different techniques. But for all, the process starts with assessing a land’s sequestration potential and deciding which techniques fit a steward’s budget and goals. COMET-Farm, an online tool produced by the U.S. Department of Agriculture, can help estimate a ranch’s carbon inputs and outputs.
The institute also works with state and national policy makers to provide economic incentives for these practices. “If the U.S. government would buy carbon credits from farmers, we would produce them,” Poncia said. These credits are one way the government could pay farmers to mitigate climate change. “Farmers overproduce everything. So, if they can fund that, we will produce them,” he said. While he is already sequestering carbon, Poncia says that he could do more, given the funding.
Estrada sees the bigger potential of carbon farming to help spur a more fundamental conversation about how we relate to the land. “We’re sitting down with ranchers and having a conversation, and carbon is just the medium for that,” he said. Through this work, Estrada has watched ranchers take a more holistic approach to their management.
On his ranch, Poncia has shifted from thinking about himself as a grass farmer growing feed for his cattle to a soil farmer with the goal of increasing the amount of life in every inch of soil.
Sunday, April 17th was the designated moment. The world’s leading oil producers were expected to bring fresh discipline to the chaotic petroleum market and spark a return to high prices. Meeting in Doha, the glittering capital of petroleum-rich Qatar, the oil ministers of the Organization of the Petroleum Exporting Countries (OPEC), along with such key non-OPEC producers as Russia and Mexico, were scheduled to ratify a draft agreement obliging them to freeze their oil output at current levels. In anticipation of such a deal, oil prices had begun to creep inexorably upward, from $30 per barrel in mid-January to $43 on the eve of the gathering. But far from restoring the old oil order, the meeting ended in discord, driving prices down again and revealing deep cracks in the ranks of global energy producers.
It is hard to overstate the significance of the Doha debacle. At the very least, it will perpetuate the low oil prices that have plagued the industry for the past two years, forcing smaller firms into bankruptcy and erasing hundreds of billions of dollars of investments in new production capacity. It may also have obliterated any future prospects for cooperation between OPEC and non-OPEC producers in regulating the market. Most of all, however, it demonstrated that the petroleum-fueled world we’ve known these last decades — with oil demand always thrusting ahead of supply, ensuring steady profits for all major producers — is no more. Replacing it is an anemic, possibly even declining, demand for oil that is likely to force suppliers to fight one another for ever-diminishing market shares.
The Road to Doha
Before the Doha gathering, the leaders of the major producing countries expressed confidence that a production freeze would finally halt the devastating slump in oil prices that began in mid-2014. Most of them are heavily dependent on petroleum exports to finance their governments and keep restiveness among their populaces at bay. Both Russia and Venezuela, for instance, rely on energy exports for approximately 50% of government income, while for Nigeria it’s more like 75%. So the plunge in prices had already cut deep into government spending around the world, causing civil unrest and even in some cases political turmoil.
No one expected the April 17th meeting to result in an immediate, dramatic price upturn, but everyone hoped that it would lay the foundation for a steady rise in the coming months. The leaders of these countries were well aware of one thing: to achieve such progress, unity was crucial. Otherwise they were not likely to overcome the various factors that had caused the price collapse in the first place. Some of these were structural and embedded deep in the way the industry had been organized; some were the product of their own feckless responses to the crisis.
On the structural side, global demand for energy had, in recent years, ceased to rise quickly enough to soak up all the crude oil pouring onto the market, thanks in part to new supplies from Iraq and especially from the expanding shale fields of the United States. This oversupply triggered the initial 2014 price drop when Brent crude — the international benchmark blend — went from a high of $115 on June 19th to $77 on November 26th, the day before a fateful OPEC meeting in Vienna. The next day, OPEC members, led by Saudi Arabia, failed to agree on either production cuts or a freeze, and the price of oil went into freefall.
The failure of that November meeting has been widely attributed to the Saudis’ desire to kill off new output elsewhere — especially shale production in the United States — and to restore their historic dominance of the global oil market. Many analysts were also convinced that Riyadh was seeking to punish regional rivals Iran and Russia for their support of the Assad regime in Syria (which the Saudis seek to topple).
The rejection, in other words, was meant to fulfill two tasks at the same time: blunt or wipe out the challenge posed by North American shale producers and undermine two economically shaky energy powers that opposed Saudi goals in the Middle East by depriving them of much needed oil revenues. Because Saudi Arabia could produce oil so much more cheaply than other countries — for as little as $3 per barrel — and because it could draw upon hundreds of billions of dollars in sovereign wealth funds to meet any budget shortfalls of its own, its leaders believed it more capable of weathering any price downturn than its rivals. Today, however, that rosy prediction is looking grimmer as the Saudi royals begin to feel the pinch of low oil prices, and find themselves cutting back on the benefits they had been passing on to an ever-growing, potentially restive population while still financing a costly, inconclusive, and increasingly disastrous war in Yemen.
Many energy analysts became convinced that Doha would prove the decisive moment when Riyadh would finally be amenable to a production freeze. Just days before the conference, participants expressed growing confidence that such a plan would indeed be adopted. After all, preliminary negotiations between Russia, Venezuela, Qatar, and Saudi Arabia had produced a draft document that most participants assumed was essentially ready for signature. The only sticking point: the nature of Iran’s participation.
The Iranians were, in fact, agreeable to such a freeze, but only after they were allowed to raise their relatively modest daily output to levels achieved in 2012 before the West imposed sanctions in an effort to force Tehran to agree to dismantle its nuclear enrichment program. Now that those sanctions were, in fact, being lifted as a result of the recently concluded nuclear deal, Tehran was determined to restore the status quo ante. On this, the Saudis balked, having no wish to see their arch-rival obtain added oil revenues. Still, most observers assumed that, in the end, Riyadh would agree to a formula allowing Iran some increase before a freeze. “There are positive indications an agreement will be reached during this meeting… an initial agreement on freezing production,” said Nawal Al-Fuzaia, Kuwait’s OPEC representative, echoing the views of other Doha participants.
But then something happened. According to people familiar with the sequence of events, Saudi Arabia’s Deputy Crown Prince and key oil strategist, Mohammed bin Salman, called the Saudi delegation in Doha at 3:00 a.m. on April 17th and instructed them to spurn a deal that provided leeway of any sort for Iran. When the Iranians — who chose not to attend the meeting — signaled that they had no intention of freezing their output to satisfy their rivals, the Saudis rejected the draft agreement they had helped negotiate and the assembly ended in disarray.
Geopolitics to the Fore
Most analysts have since suggested that the Saudi royals simply considered punishing Iran more important than raising oil prices. No matter the cost to them, in other words, they could not bring themselves to help Iran pursue its geopolitical objectives, including giving yet more support to Shiite forces in Iraq, Syria, Yemen, and Lebanon. Already feeling pressured by Tehran and ever less confident of Washington’s support, they were ready to use any means available to weaken the Iranians, whatever the danger to themselves.
“The failure to reach an agreement in Doha is a reminder that Saudi Arabia is in no mood to do Iran any favors right now and that their ongoing geopolitical conflict cannot be discounted as an element of the current Saudi oil policy,” said Jason Bordoff of the Center on Global Energy Policy at Columbia University.
Many analysts also pointed to the rising influence of Deputy Crown Prince Mohammed bin Salman, entrusted with near-total control of the economy and the military by his aging father, King Salman. As Minister of Defense, the prince has spearheaded the Saudi drive to counter the Iranians in a regional struggle for dominance. Most significantly, he is the main force behind Saudi Arabia’s ongoing intervention in Yemen, aimed at defeating the Houthi rebels, a largely Shia group with loose ties to Iran, and restoring deposed former president Abd Rabbuh Mansur Hadi. After a year of relentless U.S.-backed airstrikes (including the use of cluster bombs), the Saudi intervention has, in fact, failed to achieve its intended objectives, though it has produced thousands of civilian casualties, provoking fierce condemnation from U.N. officials, and created space for the rise of al-Qaeda in the Arabian Peninsula. Nevertheless, the prince seems determined to keep the conflict going and to counter Iranian influence across the region.
For Prince Mohammed, the oil market has evidently become just another arena for this ongoing struggle. “Under his guidance,” the Financial Times noted in April, “Saudi Arabia’s oil policy appears to be less driven by the price of crude than global politics, particularly Riyadh’s bitter rivalry with post-sanctions Tehran.” This seems to have been the backstory for Riyadh’s last-minute decision to scuttle the talks in Doha. On April 16th, for instance, Prince Mohammed couldn’t have been blunter to Bloomberg, even if he didn’t mention the Iranians by name: “If all major producers don’t freeze production, we will not freeze production.”
With the proposed agreement in tatters, Saudi Arabia is now expected to boost its own output, ensuring that prices will remain bargain-basement low and so deprive Iran of any windfall from its expected increase in exports. The kingdom, Prince Mohammed told Bloomberg, was prepared to immediately raise production from its current 10.2 million barrels per day to 11.5 million barrels and could add another million barrels “if we wanted to” in the next six to nine months. With Iranian and Iraqi oil heading for market in larger quantities, that’s the definition of oversupply. It would certainly ensure Saudi Arabia’s continued dominance of the market, but it might also wound the kingdom in a major way, if not fatally.
A New Global Reality
No doubt geopolitics played a significant role in the Saudi decision, but that’s hardly the whole story. Overshadowing discussions about a possible production freeze was a new fact of life for the oil industry: the past would be no predictor of the future when it came to global oil demand. Whatever the Saudis think of the Iranians or vice versa, their industry is being fundamentally transformed, altering relationships among the major producers and eroding their inclination to cooperate.
Until very recently, it was assumed that the demand for oil would continue to expand indefinitely, creating space for multiple producers to enter the market, and for ones already in it to increase their output. Even when supply outran demand and drove prices down, as has periodically occurred, producers could always take solace in the knowledge that, as in the past, demand would eventually rebound, jacking prices up again. Under such circumstances and at such a moment, it was just good sense for individual producers to cooperate in lowering output, knowing that everyone would benefit sooner or later from the inevitable price increase.
But what happens if confidence in the eventual resurgence of demand begins to wither? Then the incentives to cooperate begin to evaporate, too, and it’s every producer for itself in a mad scramble to protect market share. This new reality — a world in which “peak oil demand,” rather than “peak oil,” will shape the consciousness of major players — is what the Doha catastrophe foreshadowed.
At the beginning of this century, many energy analysts were convinced that we were on the verge of “peak oil”: a peak, that is, in the output of petroleum, with planetary reserves exhausted long before the demand for oil disappeared, triggering a global economic crisis. As a result of advances in drilling technology, however, the supply of oil has continued to grow, while demand has unexpectedly begun to stall. This can be traced both to slowing economic growth globally and to an accelerating “green revolution” in which the planet will be transitioning to non-carbon fuel sources. With most nations now committed to measures aimed at reducing emissions of greenhouse gases under the just-signed Paris climate accord, the demand for oil is likely to experience significant declines in the years ahead. In other words, global oil demand will peak long before supplies begin to run low, creating a monumental challenge for the oil-producing countries.
This is no theoretical construct. It’s reality itself. Net consumption of oil in the advanced industrialized nations has already dropped from 50 million barrels per day in 2005 to 45 million barrels in 2014. Further declines are in store as strict fuel efficiency standards for the production of new vehicles and other climate-related measures take effect, the price of solar and wind power continues to fall, and other alternative energy sources come on line. While the demand for oil does continue to rise in the developing world, even there it’s not climbing at rates previously taken for granted. With such countries also beginning to impose tougher constraints on carbon emissions, global consumption is expected to reach a peak and begin an inexorable decline. According to experts Thijs Van de Graaf and Aviel Verbruggen, overall world peak demand could be reached as early as 2020.
In such a world, high-cost oil producers will be driven out of the market and the advantage — such as it is — will lie with the lowest-cost ones. Countries that depend on petroleum exports for a large share of their revenues will come under increasing pressure to move away from excessive reliance on oil. This may have been another consideration in the Saudi decision at Doha. In the months leading up to the April meeting, senior Saudi officials dropped hints that they were beginning to plan for a post-petroleum era and that Deputy Crown Prince bin Salman would play a key role in overseeing the transition.
On April 1st, the prince himself indicated that steps were underway to begin this process. As part of the effort, he announced, he was planning an initial public offering of shares in state-owned Saudi Aramco, the world’s number one oil producer, and would transfer the proceeds, an estimated $2 trillion, to its Public Investment Fund (PIF). “IPOing Aramco and transferring its shares to PIF will technically make investments the source of Saudi government revenue, not oil,” the prince pointed out. “What is left now is to diversify investments. So within 20 years, we will be an economy or state that doesn’t depend mainly on oil.”
For a country that more than any other has rested its claim to wealth and power on the production and sale of petroleum, this is a revolutionary statement. If Saudi Arabia says it is ready to begin a move away from reliance on petroleum, we are indeed entering a new world in which, among other things, the titans of oil production will no longer hold sway over our lives as they have in the past.
This, in fact, appears to be the outlook adopted by Prince Mohammed in the wake of the Doha debacle. In announcing the kingdom’s new economic blueprint on April 25th, he vowed to liberate the country from its “addiction” to oil. This will not, of course, be easy to achieve, given the kingdom’s heavy reliance on oil revenues and lack of plausible alternatives. The 30-year-old prince could also face opposition from within the royal family to his audacious moves (as well as his blundering ones in Yemen and possibly elsewhere). Whatever the fate of the Saudi royals, however, if predictions of a future peak in world oil demand prove accurate, the debacle in Doha will be seen as marking the beginning of the end of the old oil order.
Michael T. Klare, a TomDispatch regular, is a professor of peace and world security studies at Hampshire College and the author, most recently, of The Race for What’s Left. A documentary movie version of his book Blood and Oil is available from the Media Education Foundation. Follow him on Twitter at @mklare1.
Copyright 2016 Michael T. Klare
In January 2011, I presented a futures research project to the Progressive Caucus in Congress, then the largest of all the caucuses in that body. The report, Progressives 2040 — which was sponsored by ProgressiveCongress.org and published by Demos — analyzed a large set of major trends that would shape the future of the progressive movement for the next three decades, and offered a set of scenarios that illustrated how these trends might work together to create a range of possible futures that the movement will need to be prepared for.
This is the first post in “What We Know About The Progressive Future” — a series that I imagine will be a long (probably 10-12 post) look at that research five years on, updating my conclusions and taking a fresh look at the big drivers and high leverage points that will determine the future of our movement.
For most pundits, the most striking thing about the Iowa Caucus was the virtual tie between the two Democratic candidates (which portends a longer and perhaps more exciting election season and higher ratings for those in the media to look forward to), and the surprising 1-2-and-3 order of Cruz, Trump, and Rubio. I’m writing this less than 24 hours after the caucuses ended, and more than enough on both these topics has already been written by others (for God’s sake, people, it’s just Iowa), so I’m going to spare you another analysis ex cathedra from my belly button as to What It All Means For November.
I’m far more interested in another trend that emerged last night — a small detail that will almost certainly have a much longer historical tail than anything else that might happen between now and Election Day. This trend was crystallized by the stunning fact that Bernie Sanders got 85% of the votes of caucus-goers under 30.
That’s not a typo. Eighty-five percent.
That’s a number that strategists from every end of the political spectrum need to be paying attention to, because it is heralding the arrival of the Millennial Generation as a political force to be reckoned with.
My report saw this coming. Back in January 2011, I wrote this about them:
The Millennial generation (born 1980-2000) is the largest and most ethnically diverse generation in American history, with 44% identifying as members of a racial minority. They are the most globally connected generation to date: they travel more, speak more languages, and have friends all over the world. They are more progressive in their core values and attitudes than any cohort we’ve seen in at least a century. And they are rising fast: by 2020, they will be outvoting their elders, dominating elections and bringing their own priorities to the table. We can expect the Millennials to launch their first serious presidential candidate in 2020, and elect their first president probably no later than 2024.
Perhaps the most important fact about the Millennials is the sheer size of this generation. They’re the first cohort we’ve seen in the past 40 years that’s actually big enough to swamp their Boomer parents, whose interests and worldviews have dominated American politics ever since the youngest of them hit voting age in the late 1970s. The Boom was the biggest generation in American history, to the point where their sheer size itself was transformative (as they say: quantity has a quality all its own). But the Millennials are even bigger. And between now and 2020, the youngest edge of this generation will finally turn 18 and register to vote. The results stand to be at least as transformative for us as a nation as the moment when the Boomers themselves arrived.
Conservative Millennials? Don’t hold your breath
Any number of GOP pundits have written thumb-sucking articles explaining how this cohort is going to become more conservative as it ages (because every generation does, right?). Feel free to rip those up: it’s not likely to happen, for several reasons — starting with the fact that no, not every generation does. The Boomers did, because from left to right and youth through their approaching old age, they’ve shared a belief in radical individualism — the primacy of the individual over any claims made by society — that fed everything from Evangelicalism and free market fundamentalism on the right to New Age religions and social experimentation on the left. That individualism is the one shining through-line that defines everything that generation has ever embraced. It made them hippies. And it also made them vote for Reagan.
The Millennials are their historical opposite number — a generation raised from babyhood to cooperate, share, include, network, and self-organize. They value conformity (Boomers and Xers are horrified by the “calling out” ritual that Millennials run on each other constantly as they vigilantly police each other’s behavior. We’d have choked on our own spit before telling each other what to say, think or do; and would have rightly expected to be told to fuck off if we tried it), and as this pervades their politics in the coming decades, it’s going to involve a lot of telling other people how they should live. That’s how their GI grandparents created and enforced the great American Consensus of the ’40s, ’50s, and ’60s, and it’s how they’re going to re-create a new consensus about the Next America they’re going to build.
That bred-in-the-bone collectivism is likely to be as durable a lifelong feature as Boomer individualism has been; but it stands in stark opposition to conservatism as it’s currently constituted. It’s possible to imagine another, distinctively Millennial form of conservatism emerging in time — but it would have to be rooted in the idea of a strong social contract, one that obligates individuals to cede some of their desires to the greater good, represented by trusted authorities — and is willing to use social shame as an enforcement mechanism. The GOP is a long way from offering any narratives along these lines now. If they do emerge, it could take another 20 years or more, becoming something today’s Millennials embrace as they age on through their 40s, 50s and 60s.
Other conservatives hold out hope that the all-time-high number of Millennials from immigrant families will benefit them in time, since the usual pattern has been for second-generation immigrants (the first generation born here) to do very well educationally and economically, and to vote more conservatively than either their parents or their third-gen kids. That might be a very plausible scenario — except for the nasty fact that Millennials have already grown up scarred and terrorized by a GOP that has never been able to lay off immigrant-bashing. Again, it’s going to take a radical change within that party — plus another 15 years of over-the-top effort to win even the grudging trust of a generation that’s already marinated in decades of conservative anti-immigrant hysteria — before that’s even remotely likely.
In any event: anybody waiting for the Great Millennial Conservative Revival probably shouldn’t hold their breath. If it comes at all, it’s going to be a very long while indeed. In the meantime, these young adults have a revolution to pull off. And that moment is coming — much sooner than anybody seems ready to think.
Millennials and Elections
Obama, to his credit, was the first candidate to recognize the raw political power and profound unrest of this rising cohort in 2008. Even though fully half of the Millennial generation was still too young to vote, his overt efforts to capture the energy and attention of the half that could was a conscious strategy. The Millennials ended up supplying him with the margin that put him over the top in the election — support he later rewarded by bringing home the troops (most of whom were Millennials) and restructuring the federal student loan program to make over $30 billion more in Pell Grants available and reduce the loan burden on new graduates (both of which were policies I pointed to in my original 2011 report).
But the Millennials want more. They’re looking into a future that most of them understand is a fatal dead end without a radical, rip-up-the-floorboards restructuring of how the entire planet works — how we do everything from energy and money to community and education to transportation and agriculture. This yearning for a different kind of world even has the potential to upend our current understandings of “right” and “left,” as I wrote in my report:
Some research suggests that this generation’s politics lean toward the “independent” and the “centrist.” However, those words don’t mean the same thing to under-30 Americans that they do to older ones. The self-described “independents” also express core values that are deeply collectivist and inclusive, which gives them a strong affinity for progressive ideas and solutions. (Studies by Pew and Barna have even found these same affinities among self-identified conservatives in this cohort.) Likewise, these “centrists” see their generation’s communal focus on a shared future and shared prosperity as a matter of plain common sense. To them, “we’re in this together” is not a radical idea; indeed, it stands at the center of their politics.
The Millennials spurned Hillary in 2008 because they were craving a true change candidate — and Obama promised to be that. But in the end, the change he could deliver wasn’t enough. And that’s why this generation is going, overwhelmingly, for Bernie Sanders, whom they see as sitting entirely outside the corrupt party system that made it impossible for Obama to give them the goods, unbeholden to Wall Street, uncontaminated by party cronyism, unfiltered in the media — someone who seems to be entirely their own. This is what their candidates look like — and are going to continue to look like for the next several election cycles.
Given that the youngest 15% or so of the Millennial cohort is still too young to vote, it’s not clear that the Millennials will get their revolution this year. My prediction above that they’d dominate our elections by 2020 was based on the fact that that’s when the very tail end of the cohort — the ones born in 2000 — will all have reached adulthood, putting them finally at their full political strength. Whether or not they show up for 2016 is also complicated by a few other factors, including:
- How disillusioned the older ones are following their experience with Obama, whom many of them feel very disappointed by — a real problem that surfaced in 2012, when many of them didn’t return to the polls.
- The general tendency of young adults in their 20s to not vote. Voting is a behavior that becomes more reliable with age. By 2020, the oldest Millennials will be 40, and half will be over 30 — which means they should start showing up far more regularly.
- Persistent efforts on the part of the GOP to disenfranchise students, which have large effects in some parts of the country.
- How well Sanders survives the onslaught of conservative attacks that we all know are coming.
It’s safe to say that the Millennials will be a vastly bigger factor in 2016 than they were in either 2008 or 2012 — and that Sanders’ success to date can and should be interpreted as this generation’s announcement of its growing political presence with far louder and more insistent authority than we’ve ever heard from them before.
However, in this election cycle, it’s not at all clear that it will be enough to get them what they want. We are tantalizingly close to a generational tipping point, but have not quite arrived at it yet. By the next cycle, though, that point will almost certainly be well behind us — and from then on, for the next 40 years, our politics will be pretty much entirely dominated, owned, and determined by the Millennials’ collectivist worldviews, interests, desires, and priorities. They will, this time or next, succeed in voting themselves the transformation they seek. It’s not a question of if, but when.
What we’re seeing when we look at the Bernie Sanders phenomenon is a direct window into our own political future. When will it emerge? Maybe not today, and maybe not this November — but it’s coming soon, and it or something like it will be the dominant political reality for the rest of our lives.
Photo: Ian Buck via Flickr
Sara Robinson is a Seattle-based futurist and veteran blogger on culture, politics, and religion. Since 2006, her work (gathered in the Archive section of her blog) has regularly appeared at Orcinus, Our Future, Group News Blog, and Alternet. She’s also written for Salon, Huffington Post, Grist, the New Republic, New York Magazine, Firedoglake, and many other sites.
Robinson holds an MS in Futures Studies from the University of Houston, and a BA in Journalism from the USC Annenberg School of Communication. She was a Schumann Fellow, and also held senior fellowships at the Campaign for America’s Future and the Commonweal Foundation. She currently serves on the national board of NARAL Pro-Choice America.
By Dilip Hiro
Reprinted from TomDispatch.com
Undoubtedly, for nearly two decades, the most dangerous place on Earth has been the Indian-Pakistani border in Kashmir. A small spark from artillery and rocket exchanges across that border might — given the known military doctrines of the two nuclear-armed neighbors — lead inexorably to an all-out nuclear conflagration. In that case the result would be catastrophic. Besides causing the deaths of millions of Indians and Pakistanis, such a war might bring on “nuclear winter” on a planetary scale, leading to levels of suffering and death that would be beyond our comprehension.
Alarmingly, the nuclear competition between India and Pakistan has now entered a spine-chilling phase. That danger stems from Islamabad’s decision to deploy low-yield tactical nuclear arms at its forward operating military bases along its entire frontier with India to deter possible aggression by tank-led invading forces. Most ominously, the decision to fire such a nuclear-armed missile with a range of 35 to 60 miles is to rest with local commanders. This is a perilous departure from the universal practice of investing such authority in the highest official of the nation. Such a situation has no parallel in the Washington-Moscow nuclear arms race of the Cold War era.
When it comes to Pakistan’s strategic nuclear weapons, their parts are stored in different locations to be assembled only upon an order from the country’s leader. By contrast, tactical nukes are pre-assembled at a nuclear facility and shipped to a forward base for instant use. In addition to the perils inherent in this policy, such weapons would be vulnerable to misuse by a rogue base commander or theft by one of the many militant groups in the country.
In the nuclear standoff between the two neighbors, the stakes are constantly rising as Aizaz Chaudhry, the highest bureaucrat in Pakistan’s foreign ministry, recently made clear. The deployment of tactical nukes, he explained, was meant to act as a form of “deterrence,” given India’s “Cold Start” military doctrine — a reputed contingency plan aimed at punishing Pakistan in a major way for any unacceptable provocations like a mass-casualty terrorist strike against India.
New Delhi refuses to acknowledge the existence of Cold Start. Its denials are hollow. As early as 2004, it was discussing this doctrine, which involved the formation of eight division-size Integrated Battle Groups (IBGs). These were to consist of infantry, artillery, armor, and air support, and each would be able to operate independently on the battlefield. In the case of major terrorist attacks by any Pakistan-based group, these IBGs would evidently respond by rapidly penetrating Pakistani territory at unexpected points along the border and advancing no more than 30 miles inland, disrupting military command and control networks while endeavoring to stay away from locations likely to trigger nuclear retaliation. In other words, India has long been planning to respond to major terror attacks with a swift and devastating conventional military action that would inflict only limited damage and so — in a best-case scenario — deny Pakistan justification for a nuclear response.
Islamabad, in turn, has been planning ways to deter the Indians from implementing a Cold-Start-style blitzkrieg on its territory. After much internal debate, its top officials opted for tactical nukes. In 2011, the Pakistanis tested one successfully. Since then, according to Rajesh Rajagopalan, the New Delhi-based co-author of Nuclear South Asia: Keywords and Concepts, Pakistan seems to have been assembling four to five of these annually.
All of this has been happening in the context of populations that view each other unfavorably. A typical survey in this period by the Pew Research Center found that 72% of Pakistanis had an unfavorable view of India, with 57% considering it a serious threat, while on the other side 59% of Indians saw Pakistan in an unfavorable light.
This is the background against which Indian leaders have said that a tactical nuclear attack on their forces, even on Pakistani territory, would be treated as a full-scale nuclear attack on India, and that they reserved the right to respond accordingly. Since India does not have tactical nukes, it could only retaliate with far more devastating strategic nuclear arms, possibly targeting Pakistani cities.
According to a 2002 estimate by the U.S. Defense Intelligence Agency (DIA), a worst-case scenario in an Indo-Pakistani nuclear war could result in eight to 12 million fatalities initially, followed by many millions later from radiation poisoning. More recent studies have shown that up to a billion people worldwide might be put in danger of famine and starvation by the smoke and soot thrown into the troposphere in a major nuclear exchange in South Asia. The resulting “nuclear winter” and ensuing crop loss would functionally add up to a slowly developing global nuclear holocaust.
Last November, to reduce the chances of such a catastrophic exchange happening, senior Obama administration officials met in Washington with Pakistan’s army chief, General Raheel Sharif, the final arbiter of that country’s national security policies, and urged him to stop the production of tactical nuclear arms. In return, they offered a pledge to end Islamabad’s pariah status in the nuclear field by supporting its entry into the 48-member Nuclear Suppliers Group to which India already belongs. Although no formal communiqué was issued after Sharif’s trip, it became widely known that he had rejected the offer.
This failure was implicit in the testimony that DIA Director Lieutenant General Vincent Stewart gave to the Armed Services Committee this February. “Pakistan’s nuclear weapons continue to grow,” he said. “We are concerned that this growth, as well as the evolving doctrine associated with tactical [nuclear] weapons, increases the risk of an incident or accident.”
Strategic Nuclear Warheads
Since that DIA estimate of human fatalities in a South Asian nuclear war, the strategic nuclear arsenals of India and Pakistan have continued to grow. In January 2016, according to a U.S. congressional report, Pakistan’s arsenal probably consisted of 110 to 130 nuclear warheads. According to the Stockholm International Peace Research Institute, India has 90 to 110 of these. (China, the other regional actor, has approximately 260 warheads.)
As the 1990s ended, with both India and Pakistan testing their new weaponry, their governments made public their nuclear doctrines. The National Security Advisory Board on Indian Nuclear Doctrine, for example, stated in August 1999 that “India will not be the first to initiate a nuclear strike, but will respond with punitive retaliation should deterrence fail.” India’s foreign minister explained at the time that the “minimum credible deterrence” mentioned in the doctrine was a question of “adequacy,” not numbers of warheads. In subsequent years, however, that yardstick of “minimum credible deterrence” has been regularly recalibrated as India’s policymakers went on to commit themselves to upgrade the country’s nuclear arms program with a new generation of more powerful hydrogen bombs designed to be city-busters.
In Pakistan in February 2000, President General Pervez Musharraf, who was also the army chief, established the Strategic Plan Division in the National Command Authority, appointing Lieutenant General Khalid Kidwai as its director general. In October 2001, Kidwai offered an outline of the country’s updated nuclear doctrine in relation to its far more militarily and economically powerful neighbor, saying, “It is well known that Pakistan does not have a ‘no-first-use policy.’” He then laid out the “thresholds” for the use of nukes. The country’s nuclear weapons, he pointed out, were aimed solely at India and would be available for use not just in response to a nuclear attack from that country, but should it conquer a large part of Pakistan’s territory (the space threshold), or destroy a significant part of its land or air forces (the military threshold), or start to strangle Pakistan economically (the economic threshold), or politically destabilize the country through large-scale internal subversion (the domestic destabilization threshold).
Of these, the space threshold was the most likely trigger. New Delhi as well as Washington speculated as to where the red line for this threshold might lie, though there was no unanimity among defense experts. Many surmised that it would be the impending loss of Lahore, the capital of Punjab, only 15 miles from the Indian border. Others put the red line at Pakistan’s sprawling Indus River basin.
Within seven months of this debate, Indian-Pakistani tensions escalated steeply in the wake of an attack on an Indian military base in Kashmir by Pakistani terrorists in May 2002. At that time, Musharraf reiterated that he would not renounce his country’s right to use nuclear weapons first. The prospect of New Delhi being hit by an atom bomb became so plausible that U.S. Ambassador Robert Blackwill investigated building a hardened bunker in the Embassy compound to survive a nuclear strike. Only when he and his staff realized that those in the bunker would be killed by the aftereffects of the nuclear blast did they abandon the idea.
Unsurprisingly, the leaders of the two countries found themselves staring into the nuclear abyss because of a violent act in Kashmir, a disputed territory that has been the cause of three conventional wars between the South Asian neighbors since 1947, the founding year of an independent India and Pakistan. As a result of the first of these, in 1947 and 1948, India acquired about half of Kashmir, with Pakistan getting a third and the rest occupied later by China.
Kashmir, the Root Cause of Enduring Enmity
The Kashmir dispute dates back to the time when the British-ruled Indian subcontinent was divided into Hindu-majority India and Muslim-majority Pakistan, and indirectly ruled princely states were given the option of joining either one. In October 1947, the Hindu maharaja of Muslim-majority Kashmir signed an “instrument of accession” with India after Muslim tribal raiders from Pakistan invaded his realm. The speedy arrival of Indian troops deprived the invaders of the capital city, Srinagar. Later, they battled regular Pakistani troops until a United Nations-brokered ceasefire on January 1, 1949. The accession document required that Kashmiris be given an opportunity to choose between India and Pakistan once peace was restored. This has not happened yet, and there is no credible prospect of it taking place.
Fearing a defeat in such a plebiscite, given the pro-Pakistani sentiments prevalent among the territory’s majority Muslims, India found several ways of blocking U.N. attempts to hold one. New Delhi then conferred a special status on the part of Kashmir it controlled and held elections for its legislature, while Pakistan watched with trepidation.
In September 1965, when its verbal protests proved futile, Pakistan attempted to change the status quo through military force. It launched a war that once again ended in stalemate and another U.N.-sponsored truce, which required the warring parties to return to the 1949 ceasefire line.
A third armed conflict between the two neighbors followed in December 1971, resulting in Pakistan’s loss of its eastern wing, which became an independent Bangladesh. Soon after, Indian Prime Minister Indira Gandhi tried to convince Pakistani President Zulfikar Ali Bhutto to agree to transform the 460-mile-long ceasefire line in Kashmir (renamed the “Line of Control”) into an international border. Unwilling to give up his country’s demand for a plebiscite in all of pre-1947 Kashmir, Bhutto refused. So the stalemate continued.
During the military rule of General Zia ul-Haq (1977-1988), Pakistan initiated a policy of bleeding India with a thousand cuts by sponsoring terrorist actions both inside Indian Kashmir and elsewhere in the country. New Delhi responded by bolstering its military presence in Kashmir and brutally repressing those of its inhabitants demanding a plebiscite or advocating separation from India, committing large-scale human rights violations in the process.
In order to stop infiltration by militants from Pakistani Kashmir, India built a double barrier of fencing 12 feet high, with the space between planted with hundreds of land mines. Later, that barrier would be equipped as well with thermal imaging devices and motion sensors to help detect infiltrators. By the late 1990s, 400,000 Indian soldiers stood on one side of the Line of Control and 300,000 Pakistani troops on the other. No wonder President Bill Clinton called that border “the most dangerous place in the world.” Today, with the addition of tactical nuclear weapons to the mix, it is far more so.
Kashmir, the Toxic Bone of Contention
Even before Pakistan’s introduction of tactical nukes, tensions between the two neighbors were perilously high. Then suddenly, at the end of 2015, a flicker of a chance for the normalization of relations appeared. Indian Prime Minister Narendra Modi had a cordial meeting with his Pakistani counterpart, Nawaz Sharif, on the latter’s birthday, December 25th, in Lahore. But that hope was dashed when, in the early hours of January 2nd, four heavily armed Pakistani terrorists managed to cross the international border in Punjab, wearing Indian Army fatigues, and attacked an air force base in Pathankot. A daylong gun battle followed. By the time order was restored on January 5th, all the terrorists were dead, but so were seven Indian security personnel and one civilian. The United Jihad Council, an umbrella organization of separatist militant groups in Kashmir, claimed credit for the attack. The Indian government, however, insisted that the operation had been masterminded by Masood Azhar, leader of the Pakistan-based Jaish-e Muhammad (Army of Muhammad).
As before, Kashmir was the motivating force for the anti-India militants. Mercifully, the attack in Pathankot turned out to be a minor event, insufficient to heighten the prospect of war, though it dissipated any goodwill generated by the Modi-Sharif meeting.
There is little doubt, however, that a repeat of the atrocity committed by Pakistani infiltrators in Mumbai in November 2008, leading to the death of 166 people and the burning of that city’s landmark Taj Mahal Hotel, could have consequences that would be dire indeed. The Indian doctrine calling for massive retaliation in response to a successful terrorist strike on that scale could mean the almost instantaneous implementation of its Cold Start strategy. That, in turn, would likely lead to Pakistan’s use of tactical nuclear weapons, thus opening up the real possibility of a full-blown nuclear holocaust with global consequences.
Beyond the long-running Kashmiri conundrum lies Pakistan’s primal fear of the much larger and more powerful India, and its loathing of India’s ambition to become the hegemonic power in South Asia. Irrespective of party labels, governments in New Delhi have pursued a muscular path on national security aimed at bolstering the country’s defense profile.
Overall, Indian leaders are resolved to prove that their country is entering what they fondly call “the age of aspiration.” When, in July 2009, Prime Minister Manmohan Singh officially launched a domestically built nuclear-powered ballistic missile submarine, the INS Arihant, it was hailed as a dramatic step in that direction. According to defense experts, that vessel was the first of its kind not to be built by one of the five recognized nuclear powers: the United States, Britain, China, France, and Russia.
India’s Two Secret Nuclear Sites
On the nuclear front in India, there was more to come. Last December, an investigation by the Washington-based Center for Public Integrity revealed that the Indian government was investing $100 million to build a top secret nuclear city spread over 13 square miles near the village of Challakere, 160 miles north of the southern city of Mysore. When completed, possibly as early as 2017, it will be “the subcontinent’s largest military-run complex of nuclear centrifuges, atomic-research laboratories, and weapons- and aircraft-testing facilities.” Among the project’s aims is to expand the government’s nuclear research, to produce fuel for the country’s nuclear reactors, and to help power its expanding fleet of nuclear submarines. It will be protected by a ring of garrisons, making the site a virtual military facility.
Another secret project, the Indian Rare Materials Plant, near Mysore is already in operation. It is a new nuclear enrichment complex that is feeding the country’s nuclear weapons programs, while laying the foundation for an ambitious project to create an arsenal of hydrogen (thermonuclear) bombs.
The overarching aim of these projects is to give India an extra stockpile of enriched uranium fuel that could be used in such future bombs. As a military site, the project at Challakere will not be open to inspection by the International Atomic Energy Agency or by Washington, since India’s 2008 nuclear agreement with the U.S. excludes access to military-related facilities. These enterprises are directed by the office of the prime minister, who is charged with overseeing all atomic energy projects. India’s Atomic Energy Act and its Official Secrets Act place everything connected to the country’s nuclear program under wraps. In the past, those who tried to obtain a fuller picture of the Indian arsenal and the facilities that feed it have been bludgeoned into silence.
Little wonder then that a senior White House official was recently quoted as saying, “Even for us, details of the Indian program are always sketchy and hard facts thin on the ground.” He added, “Mysore is being constantly monitored, and we are constantly monitoring progress in Challakere.” However, according to Gary Samore, a former Obama administration coordinator for arms control and weapons of mass destruction, “India intends to build thermonuclear weapons as part of its strategic deterrent against China. It is unclear when India will realize this goal of a larger and more powerful arsenal, but they will.”
Once manufactured, there is nothing to stop India from deploying such weapons against Pakistan. “India is now developing very big bombs, hydrogen bombs that are city-busters,” said Pervez Hoodbhoy, a leading Pakistani nuclear and national security analyst. “It is not interested in… nuclear weapons for use on the battlefield; it is developing nuclear weapons for eliminating population centers.”
In other words, as the Kashmir dispute continues to fester, inducing periodic terrorist attacks on India and fueling the competition between New Delhi and Islamabad to outpace each other in the variety and size of their nuclear arsenals, the peril to South Asia in particular and the world at large only grows.
Dilip Hiro, a TomDispatch regular, is the author, among many other works, of The Longest August: The Unflinching Rivalry between India and Pakistan (Nation Books). His 36th and latest book is The Age of Aspiration: Money, Power, and Conflict in Globalizing India (The New Press).
Copyright 2016 Dilip Hiro
By Robert Reich
Reprinted from Robert Reich’s blog at robertreich.org
A crowning achievement of the historic March on Washington, where Dr. Martin Luther King gave his “I have a dream” speech, was pushing through the landmark Voting Rights Act of 1965. Recognizing the history of racist attempts to prevent Black people from voting, that federal law forced a number of southern states and districts to adhere to federal guidelines allowing citizens access to the polls.
But in 2013 the Supreme Court effectively gutted many of these protections. As a result, states are finding new ways to stop more and more people—especially African-Americans and other likely Democratic voters—from reaching the polls.
Several states are requiring government-issued photo IDs—like driver’s licenses—to vote, even though there’s no evidence of the voter fraud this is supposed to prevent. But there’s plenty of evidence that these ID measures depress voting, especially among communities of color, young voters, and lower-income Americans.
Alabama, after requiring photo IDs, has closed driver’s license offices in counties with large percentages of Black voters. Wisconsin requires a government-issued photo ID but hasn’t provided any funding to explain to prospective voters how to secure those IDs.
Other states are reducing opportunities for early voting.
And several state legislatures—not just in the South—are gerrymandering districts to reduce the political power of people of color and Democrats, and thereby guarantee Republican control of Congress.
We need to move to the next stage of voting rights—a new Voting Rights Act—that renews the law that was effectively repealed by the conservative activists on the Supreme Court.
That new Voting Rights Act should also set minimum national standards—providing automatic voter registration when people get driver’s licenses, allowing at least two weeks of early voting, and taking districting away from the politicians and putting it under independent commissions.
Voting isn’t a privilege. It’s a right. And that right is too important to be left to partisan politics. We must not allow anyone’s votes to be taken away.
ROBERT B. REICH is Chancellor’s Professor of Public Policy at the University of California at Berkeley and Senior Fellow at the Blum Center for Developing Economies. He served as Secretary of Labor in the Clinton administration, for which Time Magazine named him one of the ten most effective cabinet secretaries of the twentieth century. He has written fourteen books, including the best sellers “Aftershock,” “The Work of Nations,” and “Beyond Outrage,” and, his most recent, “Saving Capitalism.” He is also a founding editor of the American Prospect magazine, chairman of Common Cause, a member of the American Academy of Arts and Sciences, and co-creator of the award-winning documentary, INEQUALITY FOR ALL.