19 December 2011

Militarized Police State


The video of the pepper-spraying of student #Occupy protesters by University of California-Davis campus police in riot gear was a real eye-opener.  When I went to college, the campus police were un-uniformed and as far as I knew un-armed.  Their main function seemed to be ferreting out girls in dorm rooms after curfew.

Now they might be cruising around in armored personnel carriers.  Since 1997 the Department of Defense has given away more than 2½ billion dollars of excess equipment to 17,000+ state and local law enforcement agencies, including body armor, night vision gear, assault rifles, grenade launchers, armored vehicles, helicopters, riverboats and robots.  The Department of Homeland Security has a grant program under which law enforcement agencies can acquire even more equipment.

The Texas Department of Public Safety has its own surveillance drones.

In October, California’s Alameda County Sheriff’s Department hosted Urban Shield, an annual training exercise to coordinate responses to a terror attack or natural disaster in the Bay Area.  The exercise, which included a 50-hour SWAT competition on the UC-Berkeley campus, was attended by police teams from the US and other countries, notably Israel, which as “the Harvard of antiterrorism” has had a major influence on how US police handle civilians.

A month later, Oakland and UC-Berkeley #Occupiers were overpowered by police who had participated in the exercise.

As the police become more militarized, the military becomes more involved in law enforcement.

In 2008, US Northern Command, which was added to the military’s combatant commands after 9/11 to defend the US homeland, acquired its first active army unit, a brigade combat team assigned to the command’s US Army North.  The unit, designated a Consequence Management Response Force, is being trained in “crowd and traffic control equipment and nonlethal weapons designed to subdue unruly or dangerous individuals without killing them.”  Army Times, however, emphasized that the training is only for “war-zone operations, not for any domestic purpose.”

“Not for any domestic purpose” is meant to steer clear of the Posse Comitatus Act.  Enacted in 1878 after Congress ended Reconstruction and withdrew federal troops from the South, the Act prohibits the use of the US Army for law enforcement -- “executing the laws” -- unless “expressly authorized by the Constitution or by act of Congress.”

The major act of Congress that expressly authorizes federal troops to execute federal and state laws is the Insurrection Act of 1807.  Earlier legislation, particularly the Militia Act of 1795, authorized the President to call up State militias (now called the National Guard) in case of rebellion against a State, if the State legislature or governor so requested, or in case US laws could not be enforced.  The 1807 Act allowed the President to use the army and navy in any case where calling up the militia was authorized.

The Civil Rights Act of 1871, aimed at the Ku Klux Klan, extended the power of the President to use militia and federal forces to any “insurrection, domestic violence, unlawful combination, or conspiracy” that deprived people in a State of their constitutional rights or obstructed US laws.

All this legislation, still quaintly referred to as the Insurrection Act of 1807, endured without significant change for more than 130 years until an amendment to it was buried in the Defense Authorization Act of 2007.  With an eye to Hurricane Katrina, recent flu epidemics and 9/11, Congress greatly enlarged the President’s authority to use armed forces for law enforcement by extending it to “natural disaster, epidemic, or other serious public health emergency, terrorist attack or incident, or other condition” leading to domestic violence that local authorities cannot control.

Now, as we speak, a bipartisan Congress is about to complete the militarization of our National Police State by putting the military in charge of terrorism with the authority, say two retired Marine generals, to “indefinitely detain without charge people suspected of involvement with terrorism, including United States citizens apprehended on American soil.”  The Marines, a former Commandant and a former commander of Central Command, urged President Obama to veto the legislation (which Obama has molded to his satisfaction) because it “would extend the battlefield to include the United States – and hand Osama bin Laden an unearned victory long after his well-earned demise.”

We might bear in mind that the terrorists with whom our National Police State is obsessed are mostly nonstate actors who rarely get loose in the US or, if home-grown, are rarely beyond surveillance.  Since real terrorists are scarce, police agencies are honing their new military skills on largely peaceful #Occupiers and students.  Whom are they protecting?

Terrorists have been prosecuted by US civil authorities in accordance with the US Constitution and the Geneva Conventions, but not successfully by the US military.  Turning people whom the President suspects of being terrorists over to the military subjects them to preventive detention under martial law with no right of habeas corpus.  Whom will that protect?

03 December 2011

Peak Denial


When I was 11 or 12, I started worrying about what would happen if the world ran out of natural resources.  I can’t remember which I had in mind, but this was during World War II when availability of resources was talked about.  My father’s response was reassuring: we would invent some substitute, as we invented synthetic rubber during the war when natural rubber supplies were cut off.  Necessity is the mother of invention, as the phrase went.

I can imagine that I might have given my son the same answer -- but not my grandson.

One difference between now and when I was 11 or 12 is world population.  It was 2.4 billion then and 2.5 billion in 1950 when I entered college.  Today it’s seven billion, almost three times larger.

But in the same time period the world economy has grown almost ten times larger.  The Earth Policy Institute tells us that “consumption has begun to outstrip natural assets on a global scale.”  Leaving aside non-regenerative assets like oil and fossil aquifers,

“demand has surpassed the earth’s regenerative capacity.  We are overharvesting forests, overplowing fields, overgrazing grasslands, overdrawing aquifers, overfishing oceans, and pumping far more carbon into the atmosphere than nature can absorb.”

Again leaving out oil, the demand for and growing scarcity of all important commodities in the last decade have led to a surge in prices that erased the benefits of a century of declining prices.  Investment guru Jeremy Grantham believes that rising prices are very probably here to stay, representing a “Paradigm Shift” that is “perhaps the most important economic event since the Industrial Revolution.”
 
“We all need to develop resource plans,” he concludes, “particularly energy policies.  There is little time to waste.”

Tell that to the US Congress, and what do you get?  Policies that encourage overplowing, overdrawing and the pumping of more carbon into the atmosphere than nature can absorb without warming the globe and acidifying its oceans.

By way of explanation (or excuse), Grantham suggests that humans are genetically ill-disposed to dealing with “long horizon issues and deferring gratification,” possibly because “we could not store food for over 99% of our species’ career and were totally concerned with staying alive this year and this week.”  He also thinks humans are “optimistic and overconfident,” traits which may have been important to our survival.  Especially Americans.

In Collapse: How Societies Choose to Fail or Succeed (Viking Penguin 2005), Jared Diamond’s examination of human societies that succumbed to environmental problems and others that survived them, the author said that in planning the book he assumed it would just be about environmental damage.  But later he realized he had to add other factors including the only one that proved significant in every society’s failure or success -- “the society’s responses to its environmental problems.” 

The environmental issues for these societies, chiefly deforestation and soil and water problems, were similar to those we face today.  Global warming, however, is a modern issue, derived from industrialization and the growth in human population that it made possible.  President Johnson was officially advised of it in 1965, but neither he nor any of his eight successors nor any Congress in the almost half-century since has taken action.

Global warming is a long horizon issue, of course, though not so distant that it won’t affect our grandchildren.  It also requires a global response involving other nations.  But we acted with other nations to control the toxic chemicals (another modern environmental issue) that were creating a hole in the ozone layer.  The fault is not in our stars but in our politics.

Since fossil fuel emissions account for most of the increase in greenhouse gases that cause global warming, an obvious first response to reduce and eventually reverse this increase is to raise the price of fossil fuels by taxing them or capping their use, thereby making non-fossil energy more competitive.  But enough members of Congress are indebted to corporate interests that profit from fossil fuels or depend on them to block any such response.

Most of these members claim to doubt global warming.  But considering “the consensus scientific view” that global warming is occurring and that greenhouse gases emitted by human activities are the primary cause, their claim seems more like self-delusion -- a fig leaf we could call Peak Denial, hiding the fact that they represent the corporate interests of the 1% rather than the interests of the 99% among their voters and the public good.

Global warming, left unchecked, will eventually create an environment that’s different from the one to which both we as a species and the food that we grow are genetically adapted.  How the world society responds to it may determine whether the society succeeds or fails.

Congress is consistently choosing to fail.  To make sure that society succeeds, we must choose a Congress that is not dependent on the 1%.

19 November 2011

Peak Food


British undercover economist Tim Harford’s recent article, “Malthus’s ghost and baby number 7bn,” about whether the world can feed seven billion people, calls to mind the controversy over Peak Oil.  In 1956 geophysicist M. King Hubbert predicted that US oil production would peak in 1970.  Although his prediction was derided by oil professionals, it turned out to be right on the money.

Hubbert had observed that when a particular oil field has produced roughly half of what can economically be recovered, production levels off and then declines as producing the remaining half becomes more difficult.  Noting that the discovery of new oil fields in the US had itself peaked in 1930, Hubbert extended his theory to apply to all US oil fields taken as a group.
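
Hubbert modeled cumulative extraction as a logistic (S-shaped) curve, which makes annual production -- the curve’s derivative -- a bell that peaks exactly when half of the ultimate recovery is out of the ground.  A minimal sketch of that relationship (the ultimate recovery U, growth rate K and peak year T0 are illustrative numbers, not Hubbert’s actual fit):

```python
import math

# Illustrative parameters: ultimate recovery U (billion barrels),
# logistic growth rate K, and inflection year T0.
U, K, T0 = 200.0, 0.15, 1970.0

def cumulative(t):
    """Total oil extracted by year t: a logistic curve approaching U."""
    return U / (1.0 + math.exp(-K * (t - T0)))

def production(t):
    """Annual production, the derivative of the logistic: K*q*(1 - q/U)."""
    q = cumulative(t)
    return K * q * (1.0 - q / U)

peak_year = max(range(1900, 2041), key=production)
assert peak_year == 1970                            # production peaks at T0...
assert abs(cumulative(peak_year) - U / 2) < 1e-9    # ...when half of U is extracted
```

Because the curve is symmetric, a peak in discoveries decades earlier foreshadows the production peak -- which is how Hubbert extrapolated from the 1930 discovery peak to a 1970 production peak.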

Peak Oil has now moved to the world stage, where discovery of new oil fields peaked in the 1960s.  When will world oil production level off?  Some say it already has.

The increasing cost and scarcity of post-peak oil production will increase oil prices.  But scarcity caused by the increasing demand of seven billion people will increase prices even more.  That is, Peak Oil’s effect on prices will be trumped by Peak Population.

We were, however, talking about feeding seven billion people, as to which Harford concluded “so far, so good.”  Although the cost of energy used in growing, producing and transporting food is a major component of its cost, food is a renewable resource, not finite like oil and other fossil energy.  Right?

Right -- but the soils in which food is grown can be depleted, and the sources of fresh water to grow it may not be renewable. 

The Green Revolution, powered by high-yield varieties of corn, wheat and rice grown with synthetic fertilizer, fed the surge in world population and livestock after World War II.  But in retrospect, it doesn’t seem so green.  According to a Pew Commission report, it had unwanted ecological impacts “such as aquifer depletion, groundwater contamination, and excess nutrient runoff” precisely because of its reliance on what made it a success -- “monoculture crops, irrigation, application of pesticides, and use of nitrogen and phosphorous fertilizers.”

Now Lester Brown of the Earth Policy Institute raises some startling questions about today’s global economy.  One is whether the US can feed China.  Another is whether China could starve the world.

Following the starvation of 30 million Chinese during Mao Zedong’s Great Leap Forward, Mao refocused China’s agricultural resources on the production of grains.  To create new cropland, China cleared the grasslands of its northwestern provinces.  But overplowing and overpumping of freshwater aquifers in the years since have turned much of its farmland into desert.  Dust storms from northern and western China now envelop Beijing every spring.

The storms are reminiscent of the Dust Bowl in the southern US Great Plains, where homesteaders had replaced the prairie grass and other vegetation with endless acres of wheat.  A decade of drought in the 1930s forced 2½ million people to abandon their farms.  This dryland is again being farmed with irrigation from the deepwater or “fossil” Ogallala Aquifer, but fossil aquifers do not replenish themselves.  If the Ogallala goes dry, agriculture there will revert to lower-yield dryland farming if there is sufficient rainfall, or cease altogether.

Land degradation that reduces food-growing productivity is a worldwide concern.  Surveys in the 1980s identified roughly three percent of US soil as degraded and two percent as severely degraded.  A separate analysis showed two-thirds of the degradation was caused by “agricultural activities” (i.e., farming) and most of the rest by livestock overgrazing.  For the world as a whole, degradation is caused by these two factors plus deforestation in approximately equal proportions.

If degradation is not too severe, soil can be maintained and even restored by sound farming techniques such as reduced tillage, fallow periods, cover crops, crop rotation, manuring and balanced fertilizer application.  But if degradation goes too far, farming will cease altogether.  Soil at that point becomes a non-renewable resource.

That’s the problem China is facing.  To avoid politically unsettling increases in food prices, it will have to import grain.  The US is the world’s largest grain exporter.

Welcome to Peak Food – which, like Peak Oil and almost every other environmental problem you can think of, is trumped by Peak Population.  “For Americans,” Brown says,

“who live in a country that has been the world’s breadbasket for more than half a century, a country that has never known food shortages or runaway food prices, the world is about to change.  Like it or not, we are going to be sharing our grain harvest with the Chinese, no matter how much it raises our food prices.”

Time to husband our agricultural resources and adopt techniques consistent with a seven-billion-person world.  Grow food for food, not fuel.  And keep our fresh water free of pollutants from Canadian oil sands and US shale gas mining.

07 November 2011

Politics of Despair


The New York Times has commissioned Adam Davidson to write a column for its Sunday magazine “to demystify complicated economic issues – like whether anyone (C.E.O.’s, politicians, people running for the presidency) can actually create jobs.”  In case you don’t read the later columns, he tells you how they’ll come out: “The fact is that creating [jobs] in a far-too-sluggish economy is practically impossible in our current capitalist democracy.”

In one of his own columns and also in the New York Review of Books, the Washington Post’s veteran Washington analyst Ezra Klein has come to a similar conclusion about the Obama administration.  Given the opposition of the Republican party and the Democrats’ tenuous control of the Senate, Obama did pretty much all he could have done to stimulate job creation.  His stimulus package helped forestall another 1930s-type depression and lowered the unemployment rate by several percentage points, but now unemployment is stuck at nine percent and nothing further can be done.

The politics of despair has taken over.

Let’s not forget, however, that there is a way to create jobs in a sluggish economy:  government spending.  The New Deal did it in the 1930s.  During its iconic 100-day special session in 1933, the 73rd Congress financed two job programs (the Civilian Conservation Corps and the Public Works Administration) as well as relief for the poor and unemployed and the refinancing of residential mortgages to avoid foreclosure.  After Congress had adjourned, FDR created the Civil Works Administration (CWA) under which the government itself hired four million people for what turned out to be a very bitter winter.

CWA led in 1935 to the more permanent WPA (Works Progress Administration) that over the course of the next eight years spent 11 billion dollars employing 8½ million different people on well over a million projects that rebuilt the country’s infrastructure.  None of these programs cured the unemployment problem, but they did create jobs for workers that the private sector wasn’t hiring.  They put money in the empty pockets of previously unemployed people who promptly spent it, thereby stimulating the private sector to create more jobs to sell the products that the people who were temporarily employed by the government bought.

That, as I understand it, is what Keynesian stimulus is about.  Consider a recession reinforced by reduced spending on the part of insecure and out-of-work consumers.  If the Federal Reserve has lowered short-term interest rates as much as it can -- close to or at the “lower bound” of zero percent -- without inducing the private sector to produce enough output and create enough jobs to restore full employment, the economy is caught in a “liquidity trap.”  The surest way out is government spending, not only to compensate for the inadequacy of private sector spending but also to stimulate the private sector to spend more.  During a liquidity trap, government can borrow at dirt-cheap rates and spend without driving up inflation or “crowding out” private sector spending.  It therefore makes economic sense for government to utilize idle workers to improve the economy’s infrastructure, as WPA did, while seeding the economy’s recovery to full output and employment.
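
The arithmetic behind “stimulate the private sector to spend more” is the textbook spending multiplier: each dollar the government injects is partly re-spent by its recipients, round after round, so total spending sums a geometric series to 1/(1 − MPC), where MPC is the marginal propensity to consume.  A toy sketch (the 0.8 MPC is an illustrative assumption, not an estimate):

```python
def total_output_boost(spending, mpc, rounds=1000):
    """Sum the re-spending rounds set off by an initial government outlay."""
    total, injection = 0.0, spending
    for _ in range(rounds):
        total += injection   # this round's spending becomes someone's income...
        injection *= mpc     # ...of which a fraction mpc is spent in the next round
    return total

# One dollar of government spending with an MPC of 0.8
boost = total_output_boost(1.0, 0.8)
# The rounds converge to the closed-form multiplier 1 / (1 - MPC) = 5
assert abs(boost - 1.0 / (1.0 - 0.8)) < 1e-6
```

The same series shows why stimulus fades if nervous consumers save rather than spend: at an MPC of 0.5 the multiplier drops from 5 to 2.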

Of course, FDR had advantages that Obama didn’t.  One was that the depression was much worse than our current recession and had lasted much longer when FDR took office.  As a result, he had overwhelming Democratic majorities in both chambers of Congress and a strong public mandate to do something – anything – to make things better.  Having served four years as Governor of New York during the depression and several years virtually running the Navy Department during World War I, he had experience and confidence that Obama lacked.  And while FDR was as fiscally conservative by nature as Obama seems to be, his concern for the needs of working people was greater than his conservatism.

FDR capitalized on these advantages and took swift and decisive action.  By the end of the 100 days, although economic conditions had actually gotten worse, “the feeling everywhere was so much better,” one of his close advisors wrote, and “good will [for FDR] spread like a benison over the land.”  Although FDR never got unemployment below ten percent until the huge stimulus of World War II, average Americans came to feel that FDR was trying to do something for them and stopped despairing.  The Democrats’ reward was long-lasting.  Voters kept them in control of the government for 18 of the next 20 years and in control of the House of Representatives for 58 of the next 62 years.

I suspect that the next opportunity for a Newer Deal, if it comes, will not occur before 2016.  By then the 99 percent will have suffered through eight years of despair and stagnant recovery and resistance to change by the one percent.  If they vote, they will control whether the country changes direction or stays the course.

25 October 2011

Wall Street v. the Pentagon


The focus of the original 99 Percenters in the US, who came together in response to a call to occupy Wall Street on September 17, 2011, was the metaphorical Wall Street – the big commercial and investment banks that pumped up the housing bubble to spawn risky home mortgages until the bubble burst and brought on the Great Recession with its enduring legacy of 9+ percent unemployment.

The government rescued Wall Street but not its victims, homeowners who borrowed too heavily against the value of their new or existing homes after being assured by their bank-financed lenders that housing prices always went up.  Borrower beware was the lesson they learned the hard way when housing prices plummeted.

Beware Wall Street Unregulated was the lesson that we as a country relearned one more time.

But we have more special interest groups than Wall Street to be wary of.  Wall Street is prominent because of the havoc that it caused and because of its outsized political contributions to forestall regulation.  At heart, what bankers want is the freedom to exploit other bubbles and other marks without government interference.  They don’t need a government commitment to bail them out because they expect to enrich themselves and their banks, not fail, or at least enrich themselves before their banks fail.

By contrast, another special interest group, centered metaphorically in the Pentagon, could not thrive without a government commitment, in this case a commitment to perpetual war budgets.  War budgets tend to get cut when we have no wars to fight, but since 9/11 we’ve been engaged in a perpetual war against terrorists that is the longest war in our history and second only to World War II in cost.

In his farewell address, President Eisenhower named this special interest group the military-industrial-congressional complex (MICC), although he judiciously deleted the “congressional” portion in the public version of the speech.  MICC consists of a large standing military represented by the top generals and admirals in the Pentagon, a huge defense industry built to support the Cold War and a widespread bipartisan congressional patronage network.

After the Soviet Union stood down the Cold War at the end of 1988 and then liquidated itself at the end of 1991, leaving the US as the world’s lone superpower, we might have expected a significant reduction in the defense budget as a peace dividend.  The defense industry, however, had other plans, and competing for non-defense business in the free market wasn’t one of them.  As one CEO conceded, “sword makers don’t make good and affordable plowshares.”  Ford switched from cars to producing Jeeps and one B-24 Liberator every hour during World War II – and then back to cars after the war – but that was then and this is now.

What the sword makers proved particularly adept at is what a seminal 1990 paper by veteran Pentagon analyst Franklin “Chuck” Spinney called defense power games.  These games, which are “played” with the other MICC members (the Pentagon and Congress) much as adults play games with their young children, involve two moves by the sword maker, front loading and political engineering.  Spinney explains:

“Front loading is the practice of planting [i.e., inducing the Pentagon or Congress to allocate] seed money for new programs while downplaying their future obligations.  This game, which is a clever form of the old-fashioned ‘bait-and-switch,’ makes it easier to sell high-cost programs to skeptics in the Pentagon and Congress.  Political engineering is the strategy of spreading dollars, jobs, and profits to as many important congressional districts as possible.  By making voters dependent on government money flows, the political engineers put the squeeze on Congress to support the front-loaded program once its true costs become apparent.”

The games create a bias for complexity, since the ultimate costs of complex systems are harder to forecast (making front loading easier to get away with) and complex systems involve more subcontracting to spread around the country.  But complex systems are also more expensive to build and to maintain, meaning that costs grow faster than budgets.  Who within the MICC takes the hit?

“The defense power games,” Spinney tells us, “are stratagems for … transferring money from the taxpayer to a central bureaucracy that subsequently disburses the money to a socialistic industry, even if the transfer sacrifices the capabilities of our military forces [emphasis mine].  That is what slashing the operations budget [which supports the military] to save the investment budget [which supports the defense industry] is all about.”

In the Pentagon, defense power games have never been called off.  As a consequence, we have the highest military budgets since World War II, yet our all-volunteer forces are stretched thin fighting the fights that the President says are necessary.  Something is wrong here, and I’m not talking about the budget deficit.

Until that wrong is righted, let’s make sure that the Pentagon ranks high among the objects of our occupation.

19 October 2011

Changing Congress

To get into the subject of this, my first blog post -- changing Congress -- let me run through a few assumptions without stopping here to explain them or defend them.  As expounded in Federalist No. 39, written by James Madison in 1788 as he argued for the adoption of the Constitution, the United States of America is a “government which derives all of its powers directly or indirectly from the great body of the people,” making it (by design) a democracy, and is “administered by persons holding their offices during pleasure, for a limited period, or during good behavior,” making it (by design) a democratic republic.  Members of Congress and the President are administrators elected by the people for a limited period.

But Madison also recognized in Federalist No. 10 that “men of factious tempers, of local prejudices, or of sinister designs, may, by intrigue, by corruption, or by other means, first obtain the suffrages, and then betray the interests, of the people.”  In 2007 Martin Gilens of Princeton concluded from an extensive study that “whether or not elected officials and other decision makers ‘care’ about middle-class Americans, influence over actual policy outcomes appears to be reserved overwhelmingly for those at the top of the income distribution.”  If government of, by and for the great body of the people is (by definition) a democracy, government of, by and for those at the top of the income distribution is a plutocracy.

The US today is a plutocratic republic.  Members of Congress are addicted to nonstop campaign fundraising, and almost all of the funds are raised from those at the top of the income distribution.  As Ken Silverstein pointed out several years ago in Harper’s, “the most lavish benefit of winning a congressional campaign is, ironically enough, the right to keep on campaigning – and therefore to keep raising and spending donor money.”  Although Congress is not the sole cause of our drift from democracy to plutocracy, I agree with Lawrence Lessig and Joe Trippi that we should strike at the root of the problem and fix Congress first.

Fixing Congress means electing members who have “an immediate dependence on, and an intimate sympathy with, the [great body of the] people” (Madison again) rather than just those at the top of the income distribution.  The hard way to get there is to persuade incumbent members of Congress to cure themselves of their addiction.  Lessig believes that we have to go for a Constitutional amendment, which has to be initiated by a two-thirds vote of both the House and the Senate or by the application of two-thirds of the States (34 of 50) for a Constitutional convention.  Approval of an amendment in either case requires ratification by three-fourths of the States (38).  Tough sledding, especially for a Constitutional convention, which has never been tried.

I like a third approach -- putting up candidates who are committed never to accept donations over a certain amount, like 100 dollars per contributor per year.  This is the scheme proposed by the Fair Elections Now Act (FENA), a bill now before Congress to provide public financing for Congressional campaigns.  Getting Congress to pass the bill is also tough sledding, although (perhaps) not as tough as a Constitutional amendment.  A nice feature of FENA is that it forbids candidates who sign on to it from using any money in the campaign -- including personal or family wealth -- that is not raised by personal contributions of 100 dollars or less or by matching contributions under the Act.

Imagining a Congress controlled by members beyond the reach of the plutocrats makes sugar plum fairies dance in my head.  But what really energizes them (the fairies) is another thought: what if the 99 percenters became regular voters?  What if they came to believe that the surest route to economic freedom is through exercising the political freedom that they already have to vote for candidates who don’t accept large donations?

As I indicated in my (also first) tweet today: if only 99 percenters became regular voters in elections and primaries, they could change Congress, 50 States & the World.