Here come the Super Storms

Reprinted from High Country News

By Tim Lydon

Once again, we were all New Yorkers. Watching the heartbreak that continues in Staten Island, parts of Queens and along the pummeled Jersey Shore, our sympathies turned eastward toward the victims of this unusual “Super Storm.” But just how unusual was it?

Sandy’s devastation gives us the opportunity to remember another giant storm that barreled into the Bering coast of western Alaska last year. The two storms are strikingly similar, and both point to a climate destabilized by fossil fuel emissions.

The parallels between Sandy and Alaska’s storm begin with the language used to describe them. On Nov. 8, 2011, in a headline similar to those that would precede Sandy, the Alaska Dispatch newspaper warned, “North Pacific ‘Super Storm’ Bears Down on Western Alaska.” The National Weather Service in Alaska ominously spoke of “a life-threatening storm of a magnitude rarely experienced.” A year later, as Sandy approached the Northeast, the same agency warned of a “historic storm” capable of widespread damage and possible loss of life.

The size of the storms, each stretching over 1,000 miles, awed meteorologists. Their intensity was equally extraordinary, with the barometric pressure plunging to 943 millibars for Alaska’s storm and 940 for Sandy. The readings, more typical of a Category 3 hurricane, were rare at the northern latitudes where the storms occurred.

The storms’ huge size and fierce winds kicked up deadly seas and drove surges that devastated low-lying coastal areas. Along the Bering Coast, waves towered between 30 and 40 feet, similar to the record-setting 32-footer that charged across New York Harbor during Sandy’s peak.

In another similarity, both storms were quickly followed by “normally” powerful storms that brought added suffering to an already stricken area.

But you’ve probably never even heard of Alaska’s Super Storm. There’s an obvious reason for that: population. The Bering Coast supports one of the nation’s sparsest populations, with its largest community, Nome, inhabited by just 3,600 people. Point Hope, Kivalina and Savoonga are among a few dozen other settlements, mostly Alaska Native villages with populations in the hundreds.

In Alaska, there were no headline-making evacuations. After all, people can only evacuate so far when there’s no road out of the village and aircraft flights are halted during bad weather (let alone during Super Storms). So, as gusts approaching 100 mph drove blizzard conditions, ripped away roofs and knocked out power, residents hunkered down in schools and community centers, where the wind’s roar nearly drowned out their voices. After a punishing two days, they emerged to find roads and property ruined by wind and coastal flooding. Damages were in the mere millions, compared to Sandy’s billions, only because there was much less infrastructure to damage.

But the cost to individual people was equally painful. And, just as important, the storm’s connection to climate change was just as plain. For instance, the absence of sea ice, which used to limit big waves and storm surges this time of year, fueled the destructiveness of Alaska’s Super Storm. In recent years, the Bering Sea’s rough weather, combined with the rapidly dwindling ice, has caused massive coastal erosion, forcing villages such as Kivalina to plan for the permanent abandonment of their long-held homes. Here, climate change is a daily reality.

Similarly, Sandy was strengthened by warmer-than-usual ocean water, something that is quickly becoming the norm in the Atlantic. And Sandy’s destruction was worsened by sea-level rise. Research shows that sea-level rise is not uniform around the globe, and that ocean currents and other factors are making it more profound in the Northeast. That’s why we saw water pouring into the Ground Zero construction site and flooding New York’s tunnels and subways. Now, city planners in the Northeast are having the same discussions that occurred in Kivalina a few years ago, before they decided that moving the village was inevitable.

Of course, it’s still true that no single weather event can be directly blamed on human-caused climate change. But disappearing sea ice, warming oceans and rising sea levels are observable phenomena driven by the atmospheric buildup of greenhouse gases.

We can be assured that these are not the last Super Storms. More likely, they’re among the first, and the records they set will be dwarfed by the storms to come. Like the recent fires in the West — 2012 was another record-breaking year for Western wildfires — they will be back. That’s what climatologists predict. Meanwhile, our political leaders sidestep even the mention of climate change and mostly dismiss energy conservation. It’s a strange legacy — along with weird weather — we’re leaving today’s young people.

Tim Lydon is a contributor to Writers on the Range, a service of High Country News. He writes in south-central Alaska.

The Budget Thugs: What Do They Know About the Economy?

Reprinted from the Center for Economic and Policy Research (CEPR) under a Creative Commons License

By Dean Baker

Ed Haislmaier, a senior scholar at the Heritage Foundation, made himself famous in this video where he appears to be assaulting people protesting a conference organized by Fix the Debt. While this act of bad temper may be uncharacteristic of the public behavior of this corporate-sponsored crusade to cut Social Security and Medicare, it does reflect the way in which they hope to bully their agenda through the political process.

The line from Fix the Debt, an organization that includes the CEOs of many of the country’s largest corporations, and allies like the Washington Post is that we better have cuts to Social Security and Medicare because they say so. Note that they did not try to push this line in the elections. Everyone knows that cuts to these programs are hugely unpopular across the political spectrum.

The Fix the Debt strategy was explicitly to wait until after the election. They would then go into high gear pushing their agenda of cutting Social Security and Medicare regardless of who won the elections. Remember, we need these cuts because they say so.

It is worth repeating the “they say so” part, because this is the only way we could know that cuts to Social Security and Medicare are necessary. It is possible to tell stories about countries where a meltdown in financial markets forced sharp budget cuts, but there is zero evidence of that for the United States. Investors are willing to lend the U.S. government vast amounts of money at extremely low interest rates. The only reason we have for believing that financial markets will panic if we don’t have the Social Security and Medicare cuts the Debt Fixers want is because they say so.

For this reason it is worth considering what the Debt Fixers know or don’t know about the economy. That means bringing up a still-fresh wound: why did none of these people see the housing bubble whose collapse wrecked the economy?

It is important to understand that the bubble was not hard to see. Nor did it require much knowledge of economics to realize that its collapse would devastate the economy.

The bubble was an unprecedented nationwide run-up in house prices. For the 100 years from 1896 to 1996, nationwide house prices on average just tracked the overall rate of inflation. In the decade from 1996 to 2006, house prices rose by more than 70 percentage points in excess of the rate of inflation.

How could anyone following the economy miss this? There are reports on house prices released every month; did none of the Debt Fixers ever look at them during the bubble years?

And there was no explanation in the fundamentals of the housing market for this extraordinary run-up in prices. Population and income growth in the last decade were slow, not fast. And there was no corresponding increase in rents. If fundamentals were driving the explosion in house prices, then there should have been some pressure on rents as well. And vacancy rates were hitting all-time highs. How does that fit with a supply-and-demand story driving up house prices?

The fact that the housing bubble was driving the economy was also not hard to see. Typically housing construction is less than 4.0 percent of GDP. It peaked at more than 6.0 percent of GDP in 2005. Couldn’t the Debt Fixers find the GDP data released every month by the Commerce Department?

Housing wealth was also driving a consumption boom as the saving rate fell to nearly zero in the years from 2004-2007. Did the Debt Fixers think that people would keep borrowing against their homes when the equity in their homes disappeared?

The bursting of the bubble meant a loss in annual demand of more than $1 trillion when the construction and consumption boom both collapsed. What exactly did the Debt Fixers think would replace this demand?

Did they think that firms would suddenly double their investment as they saw their markets collapse? Did they think that consumers would just spend like crazy even as their housing wealth vanished? If they have a theory as to how the economy could quickly replace the demand generated by the housing bubble without large government budget deficits, it would be great if they would share.

The reality is that the Debt Fixers and their allied economists and policy wonks saw none of the above. They were completely out to lunch in their understanding of the economy.

The Debt Fixers and their allies will have to explain for themselves how they managed to miss something as huge and important to the economy as the housing bubble. But missing an $8 trillion housing bubble is not a small mistake. It is the sort of thing that in other lines of work gets you fired and sent looking for a new career.

What would Michelle Rhee, the hero of the “school reform” movement, do to a public school teacher whose students all had huge drops in scores from the prior year? The economic experts among the Debt Fixers all fit this failed-teacher description.

This means that when we get a whole bunch of seemingly important and knowledgeable people telling us that we must cut Social Security and Medicare because the markets demand it, we have to remember that these are people who just recently were shown to be completely out to lunch in their economic judgment. If the Debt Fixers expect the country to take their pronouncements seriously, they should be forced to answer one simple question: when did you stop being wrong about the economy?

Dean Baker is the co-director of the Center for Economic and Policy Research (CEPR). He is the author of The End of Loser Liberalism: Making Markets Progressive. He also has a blog, “Beat the Press,” where he discusses the media’s coverage of economic issues.


Winston Churchill on How to Deal With Right-Wing Bloggers

Five Job-Destroying CEOs Trying to “Fix” the Debt by Slashing Corporate Taxes and Cutting Social Security Benefits

Reprinted from Alternet

By Sarah Anderson and Scott Klinger

Let’s name them and shame them.

In poll after poll, the American people say they are far more concerned about the jobs crisis than the “debt crisis.” A powerful coalition of CEOs says they have an answer for both problems.

Give us more tax breaks, they say, and we’ll use the money to invest and create jobs. The national economic pie will expand and Uncle Sam will get plenty of the frothy meringue without having to raise tax rates.

That’s the line of the Fix the Debt campaign. Led by more than 90 CEOs, this turbo-charged PR/lobbying machine is blasting the message that such “pro-growth tax reform” should be a pillar of any deficit deal (along with cuts to benefit programs like Social Security and Medicare).

And it might be a good line — if not for some pesky real-world facts. You see, the same corporations peddling this line have already been paying next to nothing in taxes. And instead of creating jobs, they’ve been destroying them. Here are five examples of job-cutting, tax-dodging CEOs who are leading Fix the Debt.

1. Randall Stephenson, AT&T

U.S. jobs destroyed since 2007: 54,000

Average effective federal corporate income tax rate, 2009-2011: 6.3%

Randall Stephenson presides over the biggest job destroyer among the Fix the Debt corporate supporters, having eliminated 54,000 jobs since 2007. The company also has one of the largest deficits in its worker pension fund — a gaping hole of $10 billion.

Can Stephenson blame all this belt-tightening on the Tax Man? Not exactly. Over the last three years, AT&T’s tax bills have been minuscule. According to the firm’s own financial reports, it has paid Uncle Sam only 6.3 percent on more than $43 billion in profits. If the telecom giant had paid the standard 35 percent corporate tax rate over the last three years, the federal deficit would be $12.5 billion lower.

So where have AT&T’s profits gone? A huge chunk has landed in Stephenson’s own pension fund. His $47 million AT&T retirement account is the third-largest among Fix the Debt CEOs. If converted to an annuity when he hits age 65, it would net him a retirement check of more than a quarter million dollars every month for the rest of his life.

While his economic future is more than secure, Stephenson emerged from a meeting with President Obama on November 28 “optimistic” about the chances of reforming (i.e., cutting) Social Security as part of a deal to avoid the so-called “fiscal cliff.”

2. Lowell McAdam, Verizon

U.S. jobs destroyed since 2007: 30,000

Average effective federal corporate income tax rate, 2009-2011: -3.3%

Another telecommunications giant, Verizon, is close behind AT&T in the layoff leader race, with 30,000 job cuts since 2007. Like its industry peer AT&T, Verizon also has a big deficit in its pension accounts. It would need to cough up $6 billion to meet its promised pension benefits to employees and another $24 billion to meet promised post-retirement health care benefits.

Did the blood-sucking IRS leave Verizon no choice but to slash jobs and underfund worker pensions? Far from it. The company actually got money back from Uncle Sam, despite reporting $34 billion in U.S. profits over the last three years. If Verizon had paid the full corporate tax rate of 35 percent, last year’s national deficit would have been $13.1 billion less. Had that amount been used for public education, it could have covered the cost of employing more than 190,000 elementary teachers for a year.

Verizon’s new CEO, Lowell McAdam, already has $8.7 million in Verizon pension assets, enough to set him up for a $47,834 monthly retirement check. McAdam’s predecessor, Ivan Seidenberg, who has also signed up as a Fix the Debt supporter, retired with more than $70 million in his Verizon retirement package.

3. David Cote, Honeywell

U.S. jobs destroyed since 2007: 4,000

Average effective federal corporate income tax rate, 2009-2011: -14.8%

Honeywell has created 10,000 new jobs since 2007 – outside the United States. At home, the firm has eliminated 4,000 jobs. In July, Honeywell announced it was closing a Metropolis, Illinois, plant, laying off 230 workers, and selling another Illinois property, Sensata Technologies, to Bain Capital, Mitt Romney’s old private equity firm. Bain, another Fix the Debt supporter, immediately announced it was closing the 175-worker plant and shipping the jobs to China.

Are Honeywell’s U.S. job losses the result of Uncle Sam strangling all the life out of the company? Hardly. Over the last three years, the firm has been highly profitable each year, with total U.S. profits of $2.5 billion. And yet Honeywell used deductions, credits, and loopholes to garner refunds totaling $377 million over the last three years – an effective tax rate of negative 14.8 percent. If Honeywell had paid the full 35 percent corporate tax since 2009, the U.S. deficit would have been reduced by $1.3 billion.

Honeywell is not only near the front of the IRS refund line, it is also among the top recipients of government contracts. In 2011, the firm got $725 million in government deals, making it the 35th-largest federal contractor. Tax refunds go in one pocket, while taxpayer-financed government contracts go in the other.

Still, Honeywell’s nearly tax-free status hasn’t kept CEO Cote from being one of the most outspoken advocates for more corporate tax cuts. One of his favorite proposals is a shift to a territorial tax system, which would permanently exempt all foreign earnings of U.S. corporations from U.S. income taxes. Cote was one of 12 big company CEOs who met with President Obama on November 14 to plead for this tax break. Honeywell has more than $8 billion stashed offshore and could receive an immediate tax windfall of more than $2 billion if Congress approved this shift to a “territorial” tax system. According to an Institute for Policy Studies report, Fix the Debt corporations as a whole would stand to gain $134 billion.

Perhaps even more galling is Cote’s demand for cuts to earned benefit programs. Cote has $78 million in his Honeywell retirement accounts, enough to qualify for monthly retirement checks of $428,000 starting at age 65. In contrast, the average retiree receives just $1,237 a month from Social Security.

4. Kenneth Frazier, Merck

U.S. jobs destroyed since 2007: 13,000

Average effective federal corporate income tax rate, 2009-2011: 13.2%

In 2009, Merck merged with Schering-Plough to become the world’s second-largest pharmaceutical company. Less than a year later, Merck slashed 15 percent of its workforce, including closing facilities in Miami Lakes, Florida, and Cambridge, Massachusetts. All told, between 2007 and 2011, Merck destroyed nearly 25,000 jobs, including at least 13,000 in the United States.

Merck’s radical downsizing has little to do with burdensome taxes. Between 2009 and 2011, the drug giant paid just 13.2 percent of its $9 billion of U.S. income in federal corporate income taxes. If Merck had paid the full 35 percent rate over the three-year period, the U.S. debt would have been reduced by nearly $2 billion, enough to pay for college scholarships for more than 250,000 students. But that’s not a low enough tax rate for Merck. As a part of Fix the Debt, CEO Kenneth Frazier is telling Congress the prescription to restore the U.S. economy should include a permanent corporate tax holiday for offshore earnings. If Congress fills that prescription, Merck could receive a $15 billion windfall on the $44 billion it has stashed offshore.

5. Terry Lundgren, Macy’s

U.S. jobs destroyed since 2007: 7,000

Average effective federal corporate income tax rate, 2009-2011: 20.7%

Though Macy’s sales have rebounded from their recessionary slump, the department store owner’s workforce is still down by 7,000 compared to 2007. Meanwhile, Macy’s CEO Terry Lundgren has seen his pay more than double over the period, from $8.7 million in 2007 to $17.6 million last year. Macy’s has also showered Lundgren with generous retirement benefits, currently worth more than $16.7 million. Over the last three years, Macy’s has taken advantage of various tax credits and deductions to lower its federal income tax rate to 20.7 percent. Lundgren also attended the November 28 meeting with President Obama, where Fix the Debt CEOs pushed cuts to Social Security and Medicare.

Another Way to Fix the Debt: End Corporate Entitlements, Demand Big Business Pays Its Fair Share

The five companies profiled here have contributed enormously to the national debt by eliminating livelihoods — together they’ve destroyed 108,000 jobs since 2007 — and through tax-dodging. If they had simply been required to pay the full statutory corporate tax rate of 35 percent, they would’ve paid $48 billion in additional federal income taxes over the last three years. And now these CEOs expect us to fall for their argument that even more tax breaks will be good for American workers.

There is an entitlement problem at the center of the debt debate — it is the entitlement of CEOs with track records of job destruction and tax dodging lecturing the rest of us on how to fix the debt.

Sarah Anderson is Global Economy Project Director and Scott Klinger is an Associate Fellow of the Institute for Policy Studies. They are co-authors of the report “A Pension Deficit Disorder.”

Supreme Court to Review California’s Proposition 8

“The Supreme Court for the first time entered the debate over gay marriage Friday, announcing it would accept cases from New York and California that test the rights of same-sex couples.”


“In the first case, the high court will hear arguments on the Defense of Marriage Act, a 1996 federal law denying benefits to same-sex spouses.”


“The second case involves California’s Proposition 8, a 2008 state measure that barred same-sex marriages in the state. Lower courts have struck down Proposition 8.”

Read the full article here: “Supreme Court to Hear Gay-Marriage Cases.”

Why Our Fascination With Zombies?

Our son-in-law recently asked his Facebook friends for their views on why there’s such a fascination with zombies in our culture.

This is a very tantalizing question, and it apparently hooked me because I’m returning to it after some days of letting it cook or fester or irritate or whatever. I notice that several of his friends refer to zombies as “mindless” or say that they possess the quality of “mindlessness.”

I’d like to suggest a different quality as the reason for our fascination with zombies, one that I’m not particularly qualified to talk about at all, but here goes anyway: “soullessness” … the absence of soul.

In this secular age, the “soul” — whatever that means — is not a particularly fashionable or obvious or even especially credible subject for serious consideration. But I have in mind the life work of the late Jungian psychologist James Hillman, who — if I understand him right — believed that the soul is an optional “organ,” one that we may choose to make the core work of our life, or not.

In his view, the soul is a project we may undertake. Or not. But … it’s a project of deep and “archetypal” importance.

This raises the horrifying possibility that many of us may live out our entire lives failing to do our most important work: building soul.

Hillman once said, “A living sense of world requires a corresponding living organ of soul by means of which a living world can be perceived.”

Here’s the reaction of a couple of Hillman fans to that statement:

“Just think of how much reality is shut off from people because they are part of the walking dead – possessing dead rather than living souls. Not only are they not alive, the world is not alive to them.”


“Zombie is the normal state of a civilized adult human being, I’m afraid. No one alive in society today can survive without the dissociation characteristic of zombies. As Whitehead put it somewhere, ‘Civilization advances by extending the number of important operations which we can perform without thinking about them.’”

Maybe we are all in danger of remaining zombies unless we consciously and deliberately take on the important work of building soul. That’s the horror that always lurks just below the surface.

So, what does it mean for each of us to “build soul”?

No problem: Curly explains it very simply in “City Slickers.”

Should Darwin Have “Occupied” Andrew Carnegie’s Office?

The science of evolution supports the notion that a just society is built on our innate creaturely sense of equality and fairness.

“Fairness is the basis of the social contract,” says Eric Michael Johnson in a fascinating new article in Scientific American.

But hasn’t Darwinism been used in the past to justify unequal rewards for the captains of industry, the “survival of the fittest”?

Isn’t life basically unfair, as many of us have warned our children?

The American industrial tycoon Andrew Carnegie certainly thought so, and today’s economic elite have followed his example. In 1889 he used a perverted form of Darwinism to argue for a “law of competition” that became the cornerstone of his economic vision. His was a world in which might made right and where being too big to fail wasn’t a liability; it was the key to success. In his “Gospel of Wealth,” Carnegie wrote that this natural law might be hard for the least among us but “it ensures the survival of the fittest in every department.”

Johnson describes recent studies done with Capuchin monkeys and with chimpanzees that illustrate this innate sense of fairness:

According to research published in the journal Animal Behaviour (pdf here), fairness is not only essential to the human social contract, it also plays an important role in the lives of nonhuman primates more generally. Sarah F. Brosnan and colleagues conducted a series of behavioral tests with a colony of chimpanzees housed at the University of Texas in order to find out how they would respond when faced with an unfair distribution of resources. A previous study in the journal Nature by Brosnan and Frans de Waal found that capuchin monkeys would refuse a food item when they saw that another member of their group had received a more desired item at the same time (a grape instead of a slice of cucumber). Some individuals not only rejected the food, they even threw it back into the researchers’ faces. The monkeys seemed to recognize that something was unfair and they responded accordingly. This raised the provocative question: can the basis of the social contract be found in our evolutionary cousins?

The biggest surprise coming out of the chimpanzee studies was that some chimps acted from a sense of solidarity with their fellow chimps, refusing a higher value reward if it was offered to them unfairly!

As Johnson explains, “even those who benefitted from inequality recognized that the situation was unfair and they refused to enjoy their own reward if it meant someone else had to suffer.”

Read Johnson’s fascinating article in full here: “The Gospel of Wealth Fails the Inequity Test in Primates.”

Other Resources:

NOVA scienceNOW: “What Are Animals Thinking?” (see especially minutes 6 through 8)


Book: The Age of Empathy, by Frans de Waal

Supreme Failure: Chicago’s Anita Alvarez and the Campaign To Criminalize Citizen Monitoring of Police

Reprinted with permission from the Jonathan Turley Blog

By Jonathan Turley

Cook County State’s Attorney Anita Alvarez ended up empty-handed last week — and all of Chicago can celebrate. Alvarez lost a U.S. Supreme Court mission that would likely surprise most citizens of this progressive city: to strip Chicagoans of their First Amendment rights and to allow her to prosecute citizens for videotaping police in public.

Alvarez’s position was denounced as extremist by a federal appellate court and civil libertarians around the country. However, she refused to yield to the courts, to the Constitution or to the public — making Chicago a leader of a national effort to bar the use of a technology widely considered the single most important deterrent of police abuse. Alvarez is not alone in this ignoble mission, and this threat to the public is not likely to pass with her latest defeat.

It was 21 years ago that a citizen filmed the savage beating of Rodney King by Los Angeles police officers after a high-speed car chase. The most chilling fact in the King case was that, absent the videotape, this would have likely been dismissed as another unsupported claim of police abuse.

Since that time, numerous acts of abuse by police have been captured by citizens — exposing false charges and excessive force often in direct contradiction to sworn statements of officers. These cases have increased exponentially with the explosion of cellphones with videotaping capabilities. Chicago has seen a long litany of such cases.

Last month, the Chicago Police Department settled a case with an alleged gang member who alleged that Officers Susana La Casa and Luis Contreras took him to the turf of a rival gang and allowed Latin King gang members to taunt and threaten him. It is the type of case that would ordinarily be dismissed on the word of the officers, who allegedly gave false statements regarding the case. Lawyers for Miguel “Mikey” Castillo, however, found a videotape from a witness showing the officers laughing as Castillo cowered in their police SUV. It is the type of act that Alvarez argued should have been a crime — not the police harassment (which her office declined to prosecute) but the filming of the police harassment.

The same is true for the still-pending case of Brad Williams, who filed a lawsuit against the Chicago Police Department in 2011 after he claimed to have been beaten by police in response to his filming an officer holding and dragging a man outside his squad car. Williams was told by officers that it was illegal to film police in public — the position advocated by Alvarez.

Loyola University Chicago professor Ralph Braseth was told the same thing in November 2011. Braseth was also videotaping an arrest as a journalist when he was detained and told that he was committing a crime. He was let go but not before a Chicago police officer deleted his video.

There remains a striking contradiction in the policies of Chicago officials. While Alvarez and others are pushing for the arrest of citizens who photograph police in public, Chicago authorities are also pushing for more and more cameras to videotape citizens in public. Thus, an American Civil Liberties Union report estimated more than 10,000 surveillance cameras are linked throughout the city to allow police to monitor citizens while Alvarez is trying to imprison people who monitor police in public.

When the latest case went before the U.S. 7th Circuit Court of Appeals, the panel described as “extreme” Alvarez’s argument that citizens filming police in public is “wholly unprotected by the First Amendment.” Alvarez did not have to adopt such an extreme position, and she did not have to seek a reversal from the Supreme Court. Yet she sought to overturn a decision by Judge Diane Sykes that chastised her for disregarding “the First Amendment’s free-speech and free-press guarantees.”

Alvarez, however, was not without one supporter on the court. Judge Richard Posner admonished the ACLU lawyer who sought to defend the rights of citizens and journalists. In oral arguments, Posner interrupted the ACLU lawyer after just 14 words, stating, “Yeah, I know. But I’m not interested, really, in what you want to do with these recordings of people’s encounters with the police.” He then stated openly what is usually left unstated by those seeking to jail citizens: “Once all this stuff can be recorded, there’s going to be a lot more of this snooping around by reporters and bloggers. … I’m always suspicious when the civil liberties people start telling the police how to do their business.”

Alvarez and others appear to share the same suspicion and hostility. Across the country, police and prosecutors continue to arrest or harass citizens who film police — even after numerous courts have stated that such filming is a protected constitutional right.

The latest such case occurred last week in California. Daniel J. Saulmon was filming an arrest when he was stopped by a police officer demanding his identification and an explanation — neither of which Saulmon was inclined to give since he was engaged in a clearly lawful activity. The officer promptly arrested him, and he was held in jail for four days — ultimately charged with resisting, delaying and obstructing an officer. The video shows Saulmon standing at a distance from the arrest and never resisting in any way.

As a native Chicagoan, I found it distressing to see the Cook County state’s attorney seek to reduce guarantees of free speech and free press. With a crime wave sweeping the city and daily murders recounted in national media, one would think that Alvarez would have a few things more important to attend to than stripping away the rights of the citizens she swore to protect.

Jonathan Turley is a law professor at George Washington University and editor-in-chief of the legal blog

Buddhist Scholar Robert Thurman Calls Norquist’s No-Tax Pledge “Treasonous, Seditious”

Buddhist Scholar Robert Thurman (incidentally Uma’s father) recently produced this short video (now going viral) in which he correctly points out that members of Congress who sign Grover Norquist’s pledge to never raise taxes are acting in direct conflict with their solemn oath of office to protect and defend the Constitution.

He suggests that these Congress members are eligible for impeachment, and should immediately — and en masse — renounce the Norquist pledge.

At the end, Thurman — in a moment of what could most charitably be called wishful thinking — tosses in a few favorable comments about the Tea Party and urges them to oppose corporate power and support government.

As much of a reach as this Thurman video is, interestingly it comes precisely at a time when there are beginning to be some cracks in Congress for what in the past has been nearly universal support among Republicans for the Norquist pledge.

The Archeology of Decline: Debtpocalypse and the Hollowing Out of America

Reprinted from

By Steve Fraser

“Debtpocalypse” looms.  Depending on who wins out in Washington, we’re told, we will either free fall over the fiscal cliff or take a terrifying slide to the pit at the bottom.  Grim as these scenarios might seem, there is something confected about the mise-en-scène, like an un-fun Playland.  After all, there is no fiscal cliff, or at least there was none — until the two parties built it.

And yet the pit exists.  It goes by the name of “austerity.” However, it didn’t just appear in time for the last election season or the lame-duck session of Congress to follow.  It was dug more than a generation ago, and has been getting wider and deeper ever since.  Millions of people have long made it their home.  “Debtpocalypse” is merely the latest installment in a tragic, 40-year-old story of the dispossession of American working people.

Think of it as the archeology of decline, or a tale of two worlds. As a long generation of austerity politics hollowed out the heartland, the quants and traders and financial wizards of Wall Street gobbled up ever more of the nation’s resources. It was another Great Migration — instead of people, though, trillions of dollars were being sucked out of industrial America and turned into “financial instruments” and new, exotic forms of wealth.  If blue-collar Americans were the particular victims here, then high finance is what consumed them.  Now, it promises to consume the rest of us.

Scenes from the Museum

In the mid-1970s, Hugh Carey, then governor of New York, was already noting the hollowing out of his part of America.  New York City, after all, was threatening to go bankrupt.  Plenty of other cities and states across what was then known as the “Frost Belt” were in similar shape.  Yankeedom, in Carey’s words, was turning into “a great national museum” where tourists could visit “the great railroad stations where the trains used to run.”

As it happened, the tourists weren’t interested.  Abandoned railroad stations might be fetching in an eerie sort of way, but the rest of the museum was filled with artifacts of recent ruination that were too depressing to be entertaining.  True, a century earlier, during the first Gilded Age, the upper crust used to amuse itself by taking guided tours of the urban demi-monde, thrilling to sites of exotic depravity or ethnic strangeness. They traipsed around “rag-pickers alley” on New York’s Lower East Side or the opium dens of Chinatown, or ghoulishly watched poor children salivate over toys in store window displays they could never hope to touch.

Times have changed.  The preference now is to entirely remove the unsightly.  Nonetheless, the national museum of industrial homicide has, city by city, decade by decade, grown more grotesque.

Camden, New Jersey, for example, had long been a robust, diversified small industrial city.  By the early 1970s, however, its reform mayor Angelo Errichetti was describing it this way: “It looked like the Vietcong had bombed us to get even.  The pride of Camden… was now a rat-infested skeleton of yesterday, a visible obscenity of urban decay.  The years of neglect, slumlord exploitation, tenant abuse, government bungling, indecisive and short-sighted policy had transformed the city’s housing, business, and industrial stock into a ravaged, rat-infested cancer on a sick, old industrial city.”

That was 40 years ago and yet, today, news stories are still being written about Camden’s never-ending decline into some bottomless abyss.  Consider that a measure of how long it takes to shut down a way of life.

Once upon a time, Youngstown, Ohio, was a typical smokestack city, part of the steel belt running through Pennsylvania and Ohio.  As with Camden, things there started turning south in the 1970s.  From 1977 to 1987, the city lost 50,000 jobs in steel and related industries.  By the late 1980s, the years of Ronald Reagan’s presidency when it was “morning again in America,” it was midnight in Youngstown: foreclosures, an epidemic of business bankruptcies, and everywhere collapsing community institutions including churches, unions, families, and the municipal government itself.

Burglaries, robberies, and assaults doubled after the steel plants closed.  In two years, child abuse rose by 21%, suicides by 70%. One-eighth of Mahoning County went on welfare.  Streets were filled with dead storefronts and the detritus of abandoned homes: scrap metal and wood shingles, shattered glass, stripped-away home siding, canning jars, and rusted swing sets.  Each week, 1,500 people visited the Salvation Army’s soup line.

The Wall Street Journal called Youngstown “a necropolis,” noting miles of “silent, empty steel mills” and a pervasive sense of fear and loss.  Bruce Springsteen would soon memorialize that loss in “The Ghost of Tom Joad.”

If you were unfortunate enough to live in the small industrial city of Mansfield, Ohio, for the last 40 years, you would have witnessed in microcosm the dystopia of destruction unfolding in similar places everywhere.  For a century, workshops there had made a kaleidoscope of goods: stoves, tires, steel, machinery, refrigerators, and cars. Then Mansfield’s rust belt started narrowing as one plant after another shut down: Dominion Electric in 1971, Mansfield Tire and Rubber in 1978, Hoover Plastics in 1980, National Seating in 1985, Tappan Stoves in 1986, a Westinghouse plant and Ohio Brass in 1990, Wickes Lumber in 1997, Crane Plumbing in 2003, Neer Manufacturing in 2007, and Smurfit-Stone Container in 2009.  In 2010, General Motors closed its largest, most modern U.S. stamping factory, and thanks to the Great Recession, Con-way Freight, Value City, and Card Camera also shut down.

“Good times” or bad, it didn’t matter.  Mansfield shrank relentlessly, becoming the urban equivalent of skin and bones.  Its poverty rate is now at 28%, its median income $11,000 below the national average of $41,994.  What manufacturing remains is non-union and $10 an hour is considered a good wage.

Midway through this industrial auto-da-fé, a journalist watching the Campbell Works of Youngstown Sheet and Tube go dark mused that “the dead steel mills stand as pathetic mausoleums to the decline of American industrial might that was once the envy of the world.” This dismal record is particularly impressive because it encompasses the “boom times” presided over by Presidents Reagan and Clinton.

The “Pit” Deepens 

In 1988, in the iciest part of the Frost Belt, a Wall Street Journal reporter noted, “There are two Americas now, and they grow further apart each day.”  He was referring to Eastport, Maine.  Although the deepest port on the East Coast, it hosted few ships, abandoned sardine factories lined its shore, and its bars were filled with the under- and unemployed.  The reporter pointed out that he had seen similar scenes from a collapsing rural economy “coast to coast, border to border”: shuttered saw mills, abandoned mines, closed schools, rutted roads, ghost airports.

Closing up, shutting down, going out of business: last one to leave please turn out the lights!

Such was the case in cities and towns around the country. Essential public services — garbage collection, policing, fire protection, schools, street maintenance, health care — were atrophying.  So were the people who lived in those places.  High blood pressure, cardiac and digestive problems, and mortality rates were generally rising, as were doubt, self-blame, guilt, anxiety, and depression.  The drying up of social supports, even among those who once had been friends and workmates, haunted the inhabitants of these places as much as the industrial skeletons around them.


In the 1980s, when Jack Welch, soon to be known as “Neutron Jack” for his ruthlessness, became CEO of General Electric, he set out to raise the company’s stock price by gutting the workforce.  It only took him six years, but imagine what it was like in Schenectady, New York, which lost 22,000 jobs; Louisville, Kentucky, where 13,000 fewer people made appliances; Evendale, Ohio, where 12,000 no longer made lights and light fixtures; Pittsfield, Massachusetts, where 8,000 plastics makers lost their jobs; and Erie, Pennsylvania, where 6,000 locomotive workers got pink slips.

Life as it had been lived in GE’s or other one-company towns ground to a halt. Two travelling observers, Dale Maharidge and Michael Williamson, making their way through the wasteland of middle America in 1984, spoke of “medieval cities of rusting iron” and a largely invisible landscape filling up with an army of transients, moving from place to place at any hint of work.  They were camped out under bridges, riding freight cars, living in makeshift tents in fetid swamps, often armed, trusting no one, selling their blood, eating out of dumpsters.

Nor was the calamity limited to the northern Rust Belt.  The South and Southwest did not prove immune from this wasting disease either.  Empty textile mills, often originally runaways from the North, dotted the Carolinas, Georgia, and elsewhere.  Half the jobs lost due to plant closings or relocations occurred in the Sunbelt.

In 2008, in the Sunbelt town of Colorado Springs, Colorado, one-third of the city’s street lights were extinguished, police helicopters were sold, watering and fertilizing in the parks was eliminated from the budget, and surrounding suburbs closed down the public bus system. During the recent Great Recession, one-industry towns like Dalton, Georgia (“the carpet capital of the world”), Blakely, Georgia (“the peanut capital of the world”), and Elkhart, Indiana (“the RV capital of the world”) were closing libraries, firing police chiefs, and taking other desperate measures to survive.

And no one can forget Detroit. Once, it had been a world-class city, the country’s fourth largest, full of architectural gems.  In the 1950s, Detroit had a population with the highest median income and highest rate of home ownership in urban America.  Now, the “motor city” haunts the national imagination as a ghost town. Home to two million a quarter-century ago, its decrepit hulk is now “home” to 900,000.  Between 2000 and 2010 alone, the population hemorrhaged by 25%, nearly a quarter of a million people, almost as many as live in post-Katrina New Orleans.  There and in other core industrial centers like Baltimore, “death zones” have emerged where whole neighborhoods verge on medical collapse.

One-third of Detroit, an area the size of San Francisco, is now little more than empty houses, empty factories, and fields gone feral.  A whole industry of demolition, waste-disposal, and scrap-metal companies arose to tear down what once had been. With a jobless rate of 29%, some of its citizens are so poor they can’t pay for funerals, so bodies pile up at mortuaries.  Plans are even afoot to let the grasslands and forests take over, or to give the city to private enterprise.

Even the public zoo has been privatized.  With staff and animals reduced to the barest of minimums and living wages endangered by its new owner, an associate curator working with elephants and rhinos went in search of another job.  He found it with the city — chasing down feral dogs whose population had skyrocketed as the cityscape returned to wilderness.  History had, it seemed, abandoned dogs along with their human compatriots.

Looking Backward

But could this just be the familiar story of capitalism’s penchant for “creative destruction”?  The usual tale of old ways disappearing, sometimes painfully, as part of the story of progress as new wonders appear in their place?

Imagine for a moment the time traveler from Looking Backward, Edward Bellamy’s best-selling utopian novel of 1888, waking up in present-day America.  Instead of the prosperous land filled with technological wonders and egalitarian harmony Bellamy envisioned, his protagonist would find an unnervingly familiar world of decaying cities, people growing ever poorer and sicker, bridges and roads crumbling, sweatshops a commonplace, the largest prison population on the planet, workers afraid to stand up to their bosses, schools failing, debts growing more onerous, and inequalities starker than ever.

A recent grim statistic suggests just how Bellamy’s utopian hopes have given way to an increasingly dystopian reality.  For the first time in American history, the life expectancy of white people, men and women, has actually dropped.  Life spans for the least educated, in particular, have fallen by about four years since 1990.  The steepest decline: white women lacking a high school diploma.  They, on average, lost five years of life, while white men lacking a diploma lost three years.

Unprecedented for the United States, these numbers come close to the catastrophic decline Russian men experienced in the desperate years following the collapse of the Soviet Union.  Similarly, between 1985 and 2010, American women fell from 14th to 41st place in the United Nations’ ranking of international life expectancy. (Among developed countries, American women now rank last.)  Whatever combination of factors produced this social statistic, it may be the rawest measure of a society in the throes of economic anorexia.

One other marker of this eerie story of a developed nation undergoing underdevelopment and a striking reproach to a cherished national faith: for the first time since the Great Depression, the social mobility of Americans is moving in reverse.  In every decade from the 1970s on, fewer people have been able to move up the income ladder than in the previous 10 years.  Now Americans in their thirties earn 12% less on average than their parents’ generation at the same age.  Danes, Norwegians, Finns, Canadians, Swedes, Germans, and the French now all enjoy higher rates of upward mobility than Americans.  Remarkably, 42% of American men raised in the bottom one-fifth income cohort remain there for life, as compared to 25% in Denmark and 30% in notoriously class-stratified Great Britain.

Eating Our Own

Laments about “the vanishing middle class” have become commonplace, and little wonder.  Except for those in the top 10% of the income pyramid, everyone is on the down escalator.  The United States now has the highest percentage of low-wage workers — those who earn less than two-thirds of the median wage — of any developed nation. George Carlin once mordantly quipped, “It’s called the American Dream because you have to be asleep to believe it.” Now, that joke has become our waking reality.

During the “long nineteenth century,” wealth and poverty existed side by side.  So they do again.  In the first instance, when industrial capitalism was being born, it came of age by ingesting the value embedded in pre-capitalist forms of life and labor, including land, animals, human muscle power, tools and talents, know-how, and the ways of organizing and distributing what got produced.  Wealth accumulated in the new economy by extinguishing wealth in the older ones.

“Progress” was the result of this economic metabolism.  Whatever its stark human and ecological costs, its achievements were also highly visible.  America’s capacity to sustain a larger and larger population at rising levels of material well-being, education, and health was its global boast for a century and a half.

Shocking statistics about life expectancy and social mobility suggest that those days are over.  Wealth, great piles of it, is still being generated, and sometimes displayed so ostentatiously that no one could miss it.  Technological marvels still amaze.  Prosperity exists, though for an ever-shrinking cast of characters.  But a new economic metabolism is visibly at work.

For the last 40 years, prosperity, wealth, and “progress” have rested, at least in part, on a grotesque process of auto-cannibalism — it has also been called “dis-accumulation” by David Harvey — of a society that is devouring its own.

Traditional forms of primitive accumulation still exist abroad.  Hundreds of millions of former peasants, fishermen, craftspeople, scavengers, herdsmen, tradesmen, ranchers, and peddlers provide the labor power and cheap products that buoy the bottom lines of global manufacturing and retail corporations, as well as banks and agribusinesses.  But here in “the homeland,” the very profitability and prosperity of privileged sectors of the economy, especially the bloated financial arena, continue to depend on slicing, dicing, and stripping away what was built up over generations.

Once again a new world has been born.  This time, it depends on liquidating the assets of the old one or shipping them abroad to reward speculation in “fictitious capital.”  Rates of U.S. investment in new plants, technology, and research and development began declining during the 1970s, a fall-off that only accelerated in the gilded 1980s.  Manufacturing, which accounted for nearly 30% of the economy after the Second World War, had dropped to just over 10% by 2011.  Since the turn of the millennium alone, 3.5 million more manufacturing jobs have vanished and 42,000 manufacturing plants were shuttered.

Nor are we simply witnessing the passing away of relics of the nineteenth century. Today, only one American company is among the top ten in the solar power industry and the U.S. accounts for a mere 5.6% of world production of photovoltaic cells.  Only GE is among the top ten companies in wind energy. In 2007, a mere 8% of all new semi-conductor plants under construction globally were located in the U.S.  Of the 1.2 billion cell phones sold in 2009, none were made in the U.S.  The share of semi-conductors, steel, cars, and machine tools made in America has declined precipitously just in the last decade.  Much high-end engineering design and R&D work has been offshored.  Now, there are more people dealing cards in casinos than running lathes, and almost three times as many security guards as machinists.

The FIRE Next Time

Meanwhile, for more than a quarter of a century the fastest growing part of the economy has been the finance, insurance, and real estate (FIRE) sector.  Between 1980 and 2005, profits in the financial sector increased by 800%, more than three times the growth in non-financial sectors.

In those years, new creations of financial ingenuity, rare or never seen before, bred like rabbits.  In the early 1990s, for example, there were a couple of hundred hedge funds; by 2007, 10,000 of them.  A whole new species of mortgage broker roamed the land, supplanting old-style savings and loan or regional banks.  Fifty thousand mortgage brokerages employed 400,000 brokers, more than the whole U.S. textile industry.  A hedge fund manager put it bluntly, “The money that’s made from manufacturing stuff is a pittance in comparison to the amount of money made from shuffling money around.”

For too long, these two phenomena — the eviscerating of industry and the supersizing of high finance — have been treated as if they had nothing much to do with each other, but were simply occurring coincidentally.

Here, instead, is the fable we’ve been offered: Sad as it might be for some workers, towns, cities, and regions, the end of industry is the unfortunate, yet necessary, prelude to a happier future pioneered by “financial engineers.” Equipped with the mathematical and technological know-how that can turn money into more money (while bypassing the messiness of producing anything), they are our new wizards of prosperity!

Unfortunately, this uplifting tale rests on a categorical misapprehension.  The ascendancy of high finance didn’t just replace an industrial heartland in the process of being gutted; it initiated that gutting and then lived off it, particularly during its formative decades.  The FIRE sector, that is, not only supplanted industry, but grew at its expense — and at the expense of the high wages it used to pay and the capital that used to flow into it.

Think back to the days of junk bonds, leveraged buy-outs, megamergers and acquisitions, and asset stripping in the 1980s and 1990s.  (Think, in fact, of Bain Capital.)  What was getting bought and stripped and closed up supported windfall profits in high-interest-paying junk bonds.  The stupendous fees and commissions that went to those “engineering” such transactions were being picked from the carcass of a century and a half of American productive capacity. The hollowing out of the United States was well under way long before anyone dreamed up the “fiscal cliff.”

For some long time now, our political economy has been driven by investment banks, hedge funds, private equity firms, real estate developers, insurance goliaths, and a whole menagerie of ancillary enterprises that service them.  But high times in FIRE land have depended on the downward mobility of working people and the poor, cut adrift from more secure industrial havens and increasingly from the lifelines of public support. They have been living instead in the “pit of austerity.”  Soon many more of us will join them.

Steve Fraser is a historian, writer, and editor-at-large for New Labor Forum, co-founder of the American Empire Project, and a TomDispatch regular. He is, most recently, the author of Wall Street: America’s Dream Palace. He teaches at Columbia University.
