Friday, July 18, 2014

Former patriot of the year puts money ahead of country—but isn’t that the American way?

It turns out that Heather Bresch is as much a patriot as she was a student.

Ms. Bresch is chief executive officer of Mylan Inc., a large maker of generic prescription drugs that recently announced it is buying a division of Abbott Laboratories for the purpose of reincorporating in the Netherlands and enjoying lower taxes.

As Andrew Ross Sorkin detailed in a New York Times article titled “Reluctantly, Patriot Flees Home,” Ms. Bresch was the recipient of a “Patriot of the Year” award from Esquire magazine, one of the literally thousands of bogus awards given by nonprofit organizations and the news media to corporate executives every year. Bresch won the award not for acts of valor or self-sacrifice but for having the connections to lobby for the Food and Drug Administration Safety and Innovation Act of 2012, which gives the FDA the authority to collect user fees for the drug and equipment reviews it conducts.

Let’s grant Ms. Bresch the benefit of the doubt and assume that, unlike virtually every other instance of an industry initiative to regulate itself, Bresch’s proposal was not a watered-down version of what would have passed if Congress truly had the best interests of the public in mind. But even making the implausible assumption that she acted altruistically, Ms. Bresch has certainly not behaved as a patriot in her massive tax avoidance scheme.

A true patriot pays taxes when represented, as Bresch so ably is, in part by her father, a Democratic Senator and former Governor of West Virginia.

A true patriot looks at the state of our roads, the state of our education system, the diminishing sums for medical and other research, the high price of college and the staggering poverty in the land of plenty, and then does what he or she, or it in the case of corporations, can do to help. Instead, Ms. Bresch and her company, like Pfizer, AbbVie, Tyco, Walgreens, Medtronic, Chiquita and dozens of other companies, decided to buy a smaller foreign competitor and renounce American citizenship to take advantage of a gaping loophole in the U.S. tax code.

Ms. Bresch is an old hand at not being what she seems. For years her resume said she had earned a Master of Business Administration from West Virginia University. In 2007, the Pittsburgh Post-Gazette called WVU on a routine fact check after seeing a news release announcing Bresch’s appointment as Mylan’s chief operating officer, only to learn that Ms. Bresch did not in fact have the degree. What happened next bordered on low slapstick: WVU said Bresch had not earned her MBA, then called back days later to change its mind. In the interim, the university had awarded Bresch a retroactive MBA even though she was some 22 credits short of a degree that requires 48 credits. To do so, higher-ups gave her grades for courses in which she had received “incompletes” and added six additional courses to her academic record. The Post-Gazette had a field day reporting on WVU cooking the academic books to award a bogus degree and on the university’s subsequent weak attempts to cover it up. Heads rolled throughout the university.

Forgotten in the academic scandal that dominated the news media in West Virginia and western Pennsylvania for months were two things:
1.      There must have been enormous political pressure on WVU for so many of its administrators to behave so unethically. It is unclear where that pressure came from, given that the Governor at the time was Heather’s daddy and that Mylan’s then-chairman was WVU’s largest donor.
2.      Bresch lied about having earned an MBA and continued to lie even after the Post-Gazette called her on it. (Denying the truth may be the modus operandi at Mylan. About two years later, then-CEO Robert Coury insisted that an FDA investigation had ended even after the FDA said it was ongoing.)

Sorkin, who neglects to mention Bresch’s past brush with resume-padding, expresses surprise that the “patriot” acted so unpatriotically, but it makes perfect sense to me. In the United States, we learn that the most patriotic thing to do is to open or run a business. We also learn that a business is supposed to maximize profit for its shareholders in a legal manner, no matter how ethically repugnant that may seem. Lay off thousands of workers so that profit margins increase. Leave communities to chase cheaper labor and laxer environmental regulations. Suspend manufacture of needed pharmaceuticals because the profit margins are too narrow. Buy smaller foreign companies and move abroad to avoid taxes. All of it is “just business,” in the words of fictional businessman Michael Corleone.

The syllogism is perfect:
·         Running a successful business is patriotic.
·         Business rewards amoral if legal conduct, as long as it produces a profit.
·         Abandoning the U.S. and depriving its government of millions of dollars that could be used for safety, education, infrastructure investment, protection of the weak and security is good business.
THEREFORE
·         Abandoning the U.S. is patriotic.

Those who think I’m just joking haven’t followed the past 35 years of the federal government facilitating the globalization of large American businesses to the detriment of U.S. workers and communities. It’s this record that makes me doubt that Congress will hear the cries of “unfair” that many are making and change the law so that any company that makes money in the United States has to pay the U.S. tax rate, no matter how the business structures itself or where it locates its headquarters.


If you want to sell to U.S. consumers, you should have to pay U.S. taxes and at the same rate as domestic companies. That’s only fair, but fairness has nothing to do with business, nor, in the age of the politics of selfishness, does it have anything to do with either governance or tax policy.

Wednesday, July 16, 2014

Anti-tax sentiment in the 17th century was anti-war; today, it’s pro-wealthy

By Marc Jampole

Reading about the 17th century in Geoffrey Parker’s Global Crisis really helps one understand our current challenges. Parker’s thesis is that extreme weather conditions across the world in the 17th century tipped what would otherwise have been normal political disruptions into rapid social, economic and political decay. The Fronde revolt in France, the Thirty Years’ War in Germany, the Great Rebellion that led to the temporary overthrow of royalty in England, the Time of Troubles in Russia, the violent end of the Ming Dynasty and establishment of the Qing in China—these are just some of the major revolutions and wars that clustered around the middle of the 17th century, leading to serious population losses virtually everywhere.

Parker makes a compelling but not airtight case that the famines and extreme weather caused by what historians call “The Little Ice Age” affected human societies enough to worsen social and economic tensions and push many situations to the point of cataclysmic upheaval.

But even if we discount or reject Parker’s climate-change thesis, we can still learn a great deal from Global Crisis that applies to today’s world.

Take the topic of taxes, for example.  Parker shows that throughout the world in the 17th century rulers and their administrators collected and raised taxes for two purposes:
1.      To fund the extravagant lifestyles of royalty
2.      To fight wars of territorial conquest

No wonder there were literally hundreds of major and minor tax revolts throughout the entire world, and especially in Europe, during the middle war-torn decades of the 17th century! Who would want to pay for useless wars and the high life of the nobility?

Tax revolts have a storied and honorable history during the long and bloody era of royalty, including our own revolt against the British. Keep in mind, though, that the American colonies were not opposed to taxes, merely to being taxed without representation.

Fast forward to today and our far more complex post-industrial society.  In light of the strong historical connection between anti-tax revolts and warfare, isn’t it truly bizarre that the only budget item that none of our advocates for lowering tax rates want to cut is the military? In fact, virtually everyone who wants to lower taxes is also in favor of increased military spending.  They will gladly cut back spending on education, unemployment insurance, the space program, medical research, safety inspections, IRS audits and everything else the government does, but not on guns, bombs and ammo.

In the 17th century, tax protestors and rioters were mostly outsiders—peasants, merchants and minor nobles who objected to paying for foreign wars. By contrast, since the ascent of Reaganism and the politics of selfishness, most of those in favor of lowering taxes and against raising them to meet pressing needs have been members of the establishment—rich folk like Pete Peterson, the Koch brothers and executives of large corporations, plus their congressional factotums. And while they talk about lowering taxes as a general mantra, when you look at their tax proposals, they call for lowering taxes only on two groups: the wealthy and corporations.

The rich control the news media, the multitude of think tanks that advise elected officials and the political process itself, which explains why the idea that taxes are bad is now so ingrained in the public consciousness. Anti-tax fever has gotten so bad that Congress cannot even pass an adequate law to fund the repair and upkeep of our highways. Members of Congress either are afraid to pass a higher gas tax or are so adamantly against any tax that they just don’t care how much our roads deteriorate.

No one likes driving through potholes or over bridges that need structural work. Providing adequate permanent funding for our highways creates jobs and will lead to faster and more energy-efficient travel. To the degree that the tax would discourage driving, it may also help clean up the atmosphere. 

Yet no one—not even President Obama—will come out in favor of raising the gas tax or raising other taxes to fund highway repair and maintenance. Elected officials are as afraid of the anti-tax frenzy as they are of the National Rifle Association.

Some may point out that a gas tax assesses everyone and so goes against my basic premise that anti-taxers are really just interested in lowering taxes for the wealthy. Let me explain: the incessant call to lower taxes, which has dominated economic discussions since about 1980, has created an atmosphere in which the default position is to hate all taxes—new, old, general or earmarked. The debate in the marketplace is about all taxes, but the bills that pass typically give all or a large part of the tax breaks to the wealthy and corporations. We could make a new gas tax progressive by giving poor people gas rebates on their income taxes, of course, but first we have to pass a new gas tax. And that’s nearly impossible in the current anti-tax environment. Meanwhile, we keep funding our senseless, goalless wars by borrowing money from the wealthy.

Let me close with a sarcastic shout-out to the New York Times, which found room in its shrinking print pages for an extensive story on a scientist who denies that climate change is occurring. I’m guessing that it’s part of a series of personality pieces on climatologists and that the series will reflect current scientific opinion, so that in each of the next 200 editions, the Times will do an in-depth study of a scientist who supports the reality of global warming. 200 for and 1 against will just about represent the true balance of opinion among scientists.

Or maybe today’s article is the first in a series on scientists who speak against the overwhelming flood of facts on issues that were decided years ago: next week, the Times might feature someone who thinks the sun revolves around the Earth, and then move on to someone who believes in spontaneous generation, someone who still thinks phlogiston causes things to burn, someone who believes that vaccines cause autism and someone who thinks that only gays can get AIDS.

I’m fairly confident, though, that today’s feature about one of the small number of climate-change-denying scientists is not the start of a special series but rather a continuation of the Times’ and the mass media’s decades-long pandering to those advertisers who gain by postponing the changes we as a society will have to make to steer a peaceful and bloodless transition to the much warmer world of the future.

Monday, July 14, 2014

Racism reborn as theories on Western superiority

By Marc Jampole
We’re seeing more theories exploring the reasons why the West dominates the world order or why the West has developed a more advanced culture. A few years back, a scientist tried to show that geography determined when and which cultures would dominate the globe at any given time. In 1997’s Guns, Germs and Steel, Jared Diamond blamed the fact that “the literate societies with metal tools have conquered or exterminated the other societies” on the three items in the title of his book. Several scholars like Bernard Lewis have made a living touting the superiority of Western culture and telling us why. A recent Economist took the reverse gambit, dedicating a long article to why Arab culture has failed—failure, of course, measured as an inability to move toward a Western political and economic model.
Most of these theories define or assume that the following traits constitute Western superiority:
  • A free market system
  • Free trade in goods across borders, with restraints on the movement of labor
  • A representative democracy
  • An industrialized, and now post-industrialized economy
  • A consumer society built on cars, cell phones and disposable clothing
The latest to proclaim and then explain Western superiority is Nicholas Wade, whose A Troublesome Inheritance describes research that found minor differences in the genetic makeup of Asians, Caucasians, sub-Saharan Africans, Native Americans and the original inhabitants of Australia and Papua New Guinea. He assumes without any proof or explanation that these differences explain the differences in the cultures of these peoples and the superiority of white ways.
Too bad that the premise of Wade and of all of these writers—that the West has forged a superior way of life—is false.
One argument against the assumption of Western superiority is to point out the ills caused by Western ways: an epidemic of obesity and diabetes, resource shortages, the mass extinction of species, and human-made global warming.
But I prefer to do the numbers.
We have to start with a measurable standard. I know a lot of readers are going to go for standard of living or gross domestic product per person, but consider this: The only goal in the broadest of all unfolding histories—evolution—is survival. I’m going to assert that the best measurement of surviving is the size of the population.
And the Chinese win, hands down. When population historians analyze every extant population survey of different cultures, countries, continents and parts of the world, they find that at every point in the recorded history of humankind there have been more Han Chinese than any other race, culture and/or nationality. That’s 10,000 years of continual Chinese demographic superiority, even when they seemed to be under the paw of Western Europe militarily and economically.
As obnoxious as the idea of Western superiority is the very notion that we are in some kind of world competition that is culturally or racially based.
The very assumption of Western superiority is inherently racist, as is the sometimes frenzied search for proof that the races are different. It’s true that today the West seems to dominate the world and its cultural aspirations, as Greece and Rome once did for the Mediterranean world and Persia once did among the myriad cultures of the Asian subcontinent. But Asia, the Middle East, the subcontinent and Europe/America have all taken turns being the dominant economic and cultural power over recorded history. To take one moment and call it the endgame of all history didn’t work when Marx tried it, and it didn’t work when Francis Fukuyama tried it. And it doesn’t work when social thinkers say the West is superior just because it may have dominated and forcibly led much of the rest of the world for much of the last 300 years. This phase will pass as surely as did the Tang Dynasty, Genghis Khan’s empire and the Mughal Empire in India.

Saturday, July 12, 2014

Editorial: Next Stop Single Payer



The infamous Hobby Lobby decision satisfied nobody. On June 30, five old men on the Supreme Court decided that corporate owners can overrule physicians and insurance regulators on whether their female employees can receive contraceptive coverage.

The court majority agreed with attorneys for Hobby Lobby Inc. and other bosses who claim that the contraceptive “mandate” imposed by the US Department of Health and Human Services as part of the Affordable Care Act is a “substantial burden” on the religious freedom of their closely-held corporations and violates their rights under the Religious Freedom Restoration Act of 1993.

As Justice Ruth Bader Ginsburg wrote in her dissent, there is no precedent for the majority’s assertion that secular, for-profit corporations can be “persons” under RFRA. “The absence of such precedent is just what one would expect, for the exercise of religion is characteristic of natural persons, not artificial legal entities,” she wrote.

The Hobby Lobby ruling, written by Justice Samuel Alito, seemed to allow closely-held corporations to object to four specific types of birth control — including IUDs and Plan B — because the business owners inaccurately if sincerely consider them to cause abortion. (The contraceptives at issue prevent a fertilized egg from implanting into the lining of the uterus. A woman is not considered pregnant until the developing embryo successfully implants in the lining of the uterus. The only drug approved to induce abortion is RU-486 and it is not on the FDA’s list of approved contraception.)

A day after Hobby Lobby, the Court’s resolve began to crack. On July 1, the court indicated that its ruling also applies to for-profit employers who object to any of the 20 forms of birth control included in the ACA’s contraceptive mandate, not just the four methods at issue in the Hobby Lobby case. The Court ordered three appeals courts to reconsider cases in which they had rejected challenges from corporations that object to providing insurance that covers any contraceptive services at all.

And the Court in the Hobby Lobby case seemed to validate the ultimate goal of providing contraceptives when Justice Alito wrote for the majority that the government had to use the “least restrictive alternative.” That means that if there is a less burdensome way to implement the law, it needs to be used. The majority pointed to a workaround the administration had come up with to accommodate religious nonprofits. If there are objections to a medical treatment, third parties will provide coverage to the employees.

In the case of contraceptives, the nonprofits must fill out a document declaring that paying for any or all of the 20 devices and methods approved by government regulators would violate their religious beliefs. Then their insurers or third-party administrators would take on the responsibility of paying for the birth control and would be reimbursed by the government through credits against fees owed under other parts of the health law.

But many groups still object that filling out the government Form 700 is akin to signing a permission slip for evil activity.

In an unsigned order issued July 3, moments before they adjourned for their summer recess, the Court suggested that the nonprofit workaround might also be unconstitutional. “Overnight, the cure has become the disease,” Dahlia Lithwick and Sonja West wrote in Slate.com (July 4). “Having explicitly promised that Hobby Lobby would go no further than Hobby Lobby, the court went back on its word, then skipped town for the summer.”

In the new case, the Court granted Wheaton College, a Christian college in Illinois, a temporary exemption from the requirement that it use Form 700. The Court said the interim order would not affect the ability of employees and students to obtain, without cost, the full range of FDA approved contraceptives, since the government already knew the college objected.

More than four dozen faith-affiliated charities, colleges and hospitals that oppose some or all contraception as immoral have filed lawsuits to relieve them of the obligation to pay, even indirectly, for birth control.

“Anything that forces unwilling religious believers to be part of the system is not going to pass the test,” Mark Rienzi, senior counsel for the Becket Fund for Religious Liberty, which represents many of the faith-affiliated nonprofits, told the Associated Press. Hobby Lobby Inc. also is a Becket Fund client.

The Supreme Court will be asked to take on the issue in its next term, which begins in October.

The Wheaton College injunction drew a furious reaction from the three female Justices, Sonia Sotomayor, Ruth Bader Ginsburg and Elena Kagan. “Those who are bound by our decisions usually believe they can take us at our word,” Justice Sotomayor wrote in the dissent. “Not so today. After expressly relying on the availability of the religious-nonprofit accommodation to hold that the contraceptive coverage requirement violates [the Religious Freedom Restoration Act] as applied to closely held for-profit corporations, the Court now, as the dissent in Hobby Lobby feared it might, retreats from that position.”

The Court’s action, she added, even “undermines confidence in this institution.”

Justice Sotomayor wrote that the majority, which acted on an emergency application, had not only introduced pointless complexity into an already byzantine set of regulations but had also revised its Hobby Lobby decision.

Justice Sotomayor said the ruling reached beyond Wheaton and could lead to similar results at many other nonprofit religious organizations that have similar concerns. “The issuance of an injunction in this case will presumably entitle hundreds or thousands of other objectors to the same remedy,” she wrote.

“Not everyone was fooled by the majority’s promise that the decision in Hobby Lobby was narrow,” Lithwick and West wrote. “But the speed with which the Court has loosened the dam on this is stunning. While the court has told us that we are not allowed to question the sincerity of corporations’ professed religious beliefs, we remain free to question the sincerity of the Court’s pinky promise that the Hobby Lobby decision would have a limited scope. At the end of this term, many people sighed a breath of relief that the outcome of Hobby Lobby was not as bad as we’d feared. It will be.”

Senate Democrats have floated a bill to reverse the Hobby Lobby decision, but in the face of Republican intransigence they might as well move toward single-payer health coverage so that businesses and religious organizations don’t have to worry about being complicit in the medical procedures their employees get.

There are bills in Congress, including Rep. John Conyers’ long-running HR 676 and Sen. Bernie Sanders’ S 1782, that would expand Medicare to cover everybody, but they are given practically no chance of going anywhere. However, the Affordable Care Act allows states to seek waivers to implement their own single-payer plans starting in 2017. Vermont has enacted such a plan and is working on its Green Mountain Care, with the main challenge being how to pay for the estimated $2 billion cost (which would still be less than the $2.5 billion Vermonters pay in private premiums and out of pocket for health care).

To get the waiver, a state must demonstrate that its plan would provide coverage at least as good, for at least as many people, as the ACA would, and not add costs to the federal budget. The federal government would provide funds to the state equal to what it would spend under the ACA. For Public Citizen’s “Road Map to ‘Single Payer’,” see http://www.citizen.org/road-map-to-single-payer-health-care-report.

Ultimately, with the court teetering on a 5-4 balance, the threadbare Hobby Lobby decision underscores the importance of Democrats keeping control of the Senate this November. The right vacancy on the court in the next two years could clear the way to reverse a decade of bad decisions. But not if Sen. Charles Grassley (R-Iowa) is Judiciary chairman. — JMC

From The Progressive Populist, August 1, 2014

Monday, July 7, 2014

Ranking the presidents since World War II shows what a sorry lot they have been

By Marc Jampole

A recent survey of about 1,300 Americans found that they rank Ronald Reagan as our best president since World War II and Barack Obama as the worst—just nosing out that supreme incompetent George W. Bush, a.k.a. Bush II.

I’m not sure what goes into the thinking of most people, but if we judge the presidents on the good and bad they did, the direction in which they guided the country and the competence with which they led, Reagan should rank as the third worst president since World War II—and alas, also the third worst president ever.

Let’s start with our worst president since Roosevelt and also our worst president of all time—and it’s not even close. Harry Truman earns this dubious distinction by virtue of ordering the dropping of atom bombs on Hiroshima and Nagasaki. People make excuses for these barbarous acts, which led to the slaughter of the largest and second largest numbers of human beings in a day’s time in recorded history. Apologists say that Truman saved more American lives than the bombs took, which is absurd on the surface, since Japan was already reeling and had already proposed virtually the same terms that it accepted at the final surrender. Estimates range from 150,000 to 250,000 killed by the only two atom bombs ever used on human beings. How could subduing Japan with conventional airstrikes on munitions factories and military bases have taken as many lives? The almost smarmy assertion that dropping the bombs saved lives also neglects the fact that the American lives supposedly saved were soldiers, whereas most of those actually killed at Hiroshima and Nagasaki were neither soldiers nor workers in war factories, but innocent civilians.

Outside of dropping the bombs, Truman’s record is pretty shabby: He helped to start the Cold War. He selected nuclear power over solar as the primary energy source for the government to support. He nationalized steel factories to stop a strike. He let Joe McCarthy walk all over the country and tacitly approved the Red Scare.

Let’s move on to Bush II. Rating Bush II as a worse president than Ronald Reagan is a tough call, because they are the two ideologues most responsible for the economic mess we’re in. In a sense, Bush II completed the Reagan revolution.

But Bush II led an incompetent regime that pretty much botched everything it touched. His team was asleep at the wheel when the 9/11 attacks hit. The response included two of the most ill-conceived and expensive wars in history, two wars that destabilized the powder keg that is the Middle East and led to a worldwide loss of trust in and respect for the United States. Bush II established a torture gulag across the globe and a spy state at home. His tax cuts starved the country of much-needed funds to invest in the future and help the needy. His handling of Hurricane Katrina displayed both incompetence and disregard for suffering.

Any discussion of Ronald Reagan should start with the fact that he and his team were traitors who should have been placed on trial for crimes against the United States. I’m referring to the deal with Iran that kept our hostages in captivity for months longer than they had to be, only so Reagan could defeat Jimmy Carter in the 1980 election. What the Reagan Administration did for Iran in return seems unconscionable to a patriot: we sold it weapons of war. And what did Reagan do with the money from arms sales to a country the president said was our enemy? He funded a civil war in Nicaragua.

Even without this treachery, Reagan would still rank among our three worst presidents of all time. He was the leader of the turn in American politics around 1980 that has led us down a disastrous path. The economic plan of Reaganism called for and produced an enormous shift in wealth from the poor and middle class to the wealthy over a 30+ year period that continues. His game plan included all the reasons the rich have so much and the rest of us are struggling: lowering taxes on the wealthy and businesses; weakening laws that protect unions; privatizing government services; cutting social services; and gutting Social Security.

Reagan also asked the country to stick its head in the sand, ostrich-like, and ignore how our fossil-fuel-dependent economy was degrading the earth and threatening our future.

Now that we have disposed of the truly incompetent and/or evil presidents, I want to reverse the order of presentation by naming Lyndon Baines Johnson the best president we have had since FDR. If we take away the Viet Nam War, it’s an easy call—Johnson would rank with Lincoln among our greatest leaders. He passed the Civil Rights Act, Medicare and Medicaid. He started food stamps, work-study, Head Start and a slew of other anti-poverty programs that worked, no matter how much right-wingers want to rewrite history. He passed the most generous education bill and the strictest gun control law in American history. Under Johnson, the space program thrived, and it was only a cruel twist of fate that postponed the first moon landing until early in Nixon’s first term.

Of course there have always been stories afloat about Johnson fixing elections early in his career or practicing crony capitalism (as if any president since Andrew Johnson hasn’t?). But that he was essentially a decent man comes out again and again, and especially in that transcendent moment when he learned that the FBI was spying on Martin Luther King and he hit the roof and ordered it stopped immediately. This ultimate wielder of power knew better than most that power must be restrained in a free society.

Unfortunately, there is the Viet Nam War, which he inherited from Eisenhower and Kennedy and bequeathed to Richard Nixon. Viet Nam crystallized all the contradictions of America’s Cold War policies: imperialism parading as idealism, exaggeration of the threat from the Soviet Union and an inability to view the world from any other perspective except that of large multinational corporations. I don’t mean to absolve Johnson—he made the decisions to escalate and bomb. It was a major flaw that disfigures Johnson as a historical figure and sullies the rest of his accomplishments.

After Johnson, I select two presidents who were pretty mediocre but ruled over good times, made no enormous blunders and led competent administrations that did a fairly good job of running the country on a day-to-day basis and responding to the occasional disaster. If you read the labels most pundits put on these two men, you would think they were miles apart on the political spectrum, but if you instead review their stands, you find them fairly close indeed. Both were centrist on social policy, and both continued the imperialistic foreign policy that has guided the country since Roosevelt. I’m talking about stodgy Republican Dwight Eisenhower and rock-star Democrat Bill Clinton. I personally favor Clinton because he tried to pass universal health insurance and presided over a relative shrinking of the U.S. military and U.S. militarism.

How is it possible that the evil genius of Richard Nixon can rank as high as fourth among recent presidents? His illegal actions in Southeast Asia and extension of the Viet Nam War were disgraceful. His dirty tricks and domestic spying shook the country by being the first visible signs that technology and centralized power could quickly reduce us to a police state. But Nixon also opened China, set wage and price controls, continued Johnson’s poverty and education programs and established the Environmental Protection Agency and the Occupational Safety & Health Administration. He also ran a competent administration that responded with reason and rationality to most challenges, except, unfortunately, the war and Nixon’s political intrigues.

Nixon was a despicable human being by virtually all accounts, so it’s a little painful to rate him above four essentially likable men, none of whom had the competence to pursue their agendas: Carter, Obama, Kennedy and Ford. I find parts of the vision of all four of these men problematic: Carter was in favor of globalization without protections for U.S. workers or the environment. Obama is basically a pro-business, anti-union liberal who shares the consensus view that the United States should have special rights in world affairs. Kennedy was a militaristic cold-warrior who fervently believed in cutting taxes on his economic class—the ultra wealthy. Ford basically was a continuation of Eisenhower and Nixon, a pro-business cold-warrior open to compromise with progressives on social issues.  None of these men had a great impact because none knew how to work the system like Johnson or Nixon.

That leaves us with Bush I, who is to Reagan what Ford is to Nixon-Eisenhower, a continuation. Bush I was a little more effective than Carter or Obama, but his policies kept us down the path to greater inequality.

Here, then, is the OpEdge ranking of presidents since 1945. Of these 12 white males, only three would rank in the top half of all our presidents. Again, I rate the bottom three as the three most disastrous presidencies in American history:
  1. Lyndon Johnson
  2. Bill Clinton
  3. Dwight Eisenhower
  4. Richard Nixon
  5. Jimmy Carter
  6. Barack Obama
  7. John F. Kennedy
  8. Gerald Ford
  9. Bush I
  10.  Ronald Reagan
  11. Bush II
  12. Harry Truman

It’s the times that usually make the man or woman, and not the other way around. These men represented ideas that those with wealth and influence found attractive. Donors, their parties and the think tanks funded by big individual and corporate money shaped their views. It was General Electric money, after all, that helped turn Ronald Reagan from a New Dealer to the symbol of the politics of selfishness. None of these men would have found support if they didn’t buy into the basic premises of American foreign policy over the past century. 

Since World War II we have made three major wrong turns as a country: The first was to create the cold war and continue to assert America’s divine right to intervene anywhere around the world at any time. The second was to ignore the threat of environmental degradation and resource shortages and build our economy on wasteful consumerism powered by fossil fuels. The third was to turn our back on the mixed-model social democracy that we began to establish from 1932-1976 or so and return to economic rules that favored the interests of the wealthy over everyone else’s. We probably would have taken these treacherous paths no matter who we had elected president.

Tuesday, July 1, 2014

After yesterday’s Hobby Lobby decision, don’t forget it’s Ralph Nader’s court as much as Roberts’

By Marc Jampole 

Yesterday’s Supreme Court decision to allow privately held companies to opt out of covering contraception for their female employees pretty much delivers the 2016 presidential election to the Democrats, which probably means Hillary Clinton.

The numbers (which come from the Guttmacher Institute) speak for themselves:
·         99% of all women, aged 15-44, have used at least one type of contraception in their lives.
·         62% of women of child-bearing age currently use some form of contraception, including 77% of married women.

In other words, women, like men, like to have sex, but don’t want to risk pregnancy every time they engage in the act with a member of the opposite sex.  That’s a lot of voters who will find a large aspect of their private lives theoretically threatened by this decision. 

The New York Times is reporting that both Democrats and Republicans are going to make the Hobby Lobby decision a key campaign issue.  The Times article says that the Dems will hope the decision compels more “liberal” Americans to vote in the mid-term election, which pretty much misses the point: contraception is not an issue to liberal women, but to virtually all women, regardless of their political beliefs.

The many fine people who believe that contraception is wrong for religious reasons are entitled to their opinion. They just aren’t entitled to impose their opinion on others—at least not until yesterday’s disappointment.

What about the individual right of the company owner who believes that contraception is wrong, you may ask? Why does he (or she—but we’re talking about mostly men) have to pay for the heathen, immoral ways of employees?

For the same reason that everyone has to pay for health insurance that covers free annual physicals, pap smears, vaccines for children and flu shots. For the same reason that children must be vaccinated to attend school, and that everyone must have certain vaccines before traveling to certain countries. It is in the best interest of society to cover preventive medicine for individuals and to institute preventive public health measures for the entire population.

Birth control prevents unwanted pregnancies, which lead to greater social problems and more poverty, exacting tremendous costs on society. Birth control is also safer and costs less than going through a pregnancy. I’m not saying that pregnancy is bad, only that it can lead to complications that are harmful to women and which no woman should have to suffer unless she chooses to do so.

The greater interests of society and of the overwhelming majority of individuals should overrule the right of corporations run by individuals to distort the marketplace by injecting their religious beliefs into the social contract they have with their employees. But the Roberts Court once again has basically shredded the constitution to give more power to the powerful.

It’s easy to blame the Roberts Court for yet another of many recent decisions that assert that corporations have greater rights than individuals, especially those corporations run by individuals.  But let’s not forget that Supreme Court justices are nominated by presidents and that George W. Bush nominated Roberts and Alito.  Bush, who won 500,000 fewer votes than Al Gore, won the electoral college in the 2000 election because of the substantial turnout of voters for the third-party candidate Ralph Nader in key electoral battlegrounds. Ralph Nader is as much to blame for the current Supreme Court as Karl Rove, the evil genius who decided to cast his political fate with the dunderhead Bush, or Supreme Court Justice Sandra Day O’Connor, the swing vote that decided to hand the Florida electoral votes to Bush instead of getting a recount.  I often condemn this trio—for the wars in Iraq and Afghanistan, the American torture gulag, the Citizens United decision that has unleashed untold corporate dollars to influence elections and the many other depredations of the Bush II-Cheney regime.

Nader was right that there isn’t much difference between the two major parties. Most of the Baby Boomer Democratic candidates are on the left only to the degree that they are left of Republicans. And why wouldn’t the Democrats tend toward the right wing—they feed at the same corporate troughs as Republicans do and those troughs have gotten much bigger since Citizens United. But despite being slightly right of Dwight Eisenhower except on gay and women’s rights, the Democrats still tend to do more to help the poor, elderly, unemployed and disenfranchised than Republicans do; still have more interest in obtaining legal equality for LGBT, women, minorities and immigrants; still exercise some moral constraints on the darker aspects of our essentially imperialistic foreign policy; still tend to support more gun control; and still favor using regulations and taxation to slow down and address the effects of global warming.

The Reagan revolution gradually moved the country to the right over 30 years. Moving it leftward so that we have a European-modeled social democracy may take just as long, if not longer, but the first step unfortunately is to elect the current group of Democrats—mangy and conservative as they are. We can’t get rid of the truly disgracefully right-wing Dems like Andrew Cuomo or Joe Manchin while they can point to their Republican opponents as worse yet.

The current political landscape exists because of two stupid decisions by voters in whose best interest it is to support the progressive agenda. The first stupid decision was to vote for Nader in 2000 and the second was to stay home for the 2010 mid-term elections.  Let’s hope they don’t make those mistakes again. 

Monday, June 23, 2014

End of net neutrality, start of publishing monopoly, big cable merger—we’re approaching de facto censorship

By Marc Jampole

Remember when the growth of the Internet was supposed to level the playing field between large and small companies and between rich and poor individuals and organizations? Sure, the wealthy and large could buy more ads, but the cost to set up a web page or blog—and later to build a network through social media—made it easier for the little guy to compete.  It seemed as if the world could really operate according to Ralph Waldo Emerson’s idealistic notion that “if you build a better mousetrap the world will beat a path to your door,” without the investment of millions into marketing communications.

Inherent in the promise of the web was the principle of a free market for goods, services and ideas, undistorted by size, clout or spending.

But no market is ever absolutely free. The biggest players seem always to make sure of that. Without the constraint of government regulations, over time the large and connected will always crowd everyone else out of the marketplace, whether we are talking about widgets or political views. Large companies once hired children to work in factories until child labor laws stopped them. They sold adulterated food until the Pure Food & Drug Act of 1906 and other laws. They got together to fix prices until the government stepped in. They opposed minimum mileage standards and seat belts in cars until the federal government stepped in.

But when it comes to the Internet and other media of mass communications, it seems that the government only steps in to help the big players.

We currently face three controversies which together could rip to shreds any hope of obtaining the state of grace predicted by Internet utopians. In fact, if the federal government makes the wrong decision in all three of these areas, we may end up living in a de facto state of censorship in which we can exercise freedom of speech but only the largest corporations and the richest people will actually be able to get through to significant numbers of people:
·         The Federal Communications Commission (FCC) proposal to end net neutrality
·         The merger of Comcast and Time Warner
·         The unfair monopolistic actions taken by Amazon.com against Hachette Book Group

Let’s look at what’s at stake when it comes to each of these issues:

Net neutrality is the idea that Internet service providers (ISPs) and governments should treat all data on the Internet equally, not discriminating or charging differently according to user, content, site, platform, application or type of equipment.  Earlier this year, the news media reported that the FCC is considering a new rule that will permit ISPs to offer content providers a faster track to send content for a higher fee. It means that Netflix will be able to pay more to ISPs like Verizon, Time Warner, Comcast, Cox, Frontier, Windstream and others for the right to have its programming delivered faster than other online streamers. The ISPs will be able to charge the Republican and Democratic parties more than smaller political groups to deliver their messages over the Internet. No longer will you wonder why a website is slow to load—it likely won’t be because of a bandwidth problem; no, in all probability the owner of the slow-to-load website couldn’t afford to pay the extra freight for faster delivery of the information.

Gone will be the days of an Internet level playing field.

Gone, too, will be any possibility of diversity in television programming, if the FCC allows the merger of Comcast and Time Warner into a cable TV leviathan that will control one third of all cable viewing. As it is, there is little difference in the offerings of the various cable networks around the country. Wherever you go across the country, you see pretty much the same menu of network offerings, “Law & Order” reruns, reality shows, religious shows, right-wing cant masquerading as news, centrists masquerading as progressives and sports, sports, sports.  But it will only get worse as one company will make the decision for what networks to buy for one third of all cable TV viewers. BTW, the merged Comcast Time Warner will also control 40% of the wired broadband Internet market.

Instead of merging, both Time Warner and Comcast should be forced to split themselves so that no company controls more than two or three percent of cable TV screens. We should return to the days when no company could own more than a very small number of TV and radio stations—five at the most—and no company could own both a TV station and newspaper in the same region. There are now six or seven companies that control most of the mass media, which is why we see a sad lack of diversity of opinion; why most of every daily newspaper looks like the daily newspaper in every other town; why you find Rush Limbaugh and Sean Hannity on so many radio stations.

The Amazon-Hachette situation presents another aspect of the consolidation of the news media.  Amazon controls two-thirds of the market for digital books and about 30% of all books. Amazon and Hachette have been in some tough negotiations: Amazon wants the behemoth French publisher to accept concessions on its revenues from e-book sales, so Amazon can make more and charge the public less. Because Hachette is resisting, Amazon has removed the link on its website that enables customers to preorder Hachette books, slowed down delivery of Hachette books and not restocked popular Hachette titles.  I would think that this kind of pressure from such a large player with no pre-existing editorial policy definitely fits the description of an unfair monopolistic practice, which is illegal. (Now if Amazon claimed to be a Christian bookseller, it could justify not selling a Hachette book that promoted atheism, but Amazon declares itself to be a marketplace for everything.)

Hachette is another media behemoth, but if Amazon gets its way, it will become much harder for both large and small publishing houses to turn a profit. And think of this: today Amazon is messing with a publisher for financial reasons. In the future, it might decide to mess with someone for political reasons, much as Wal-Mart used to make record companies provide special censored versions of CDs with lyrics that did not fit Wal-Mart’s conservative moral stance.

And yet we have heard nothing from the Justice Department on this issue. Maybe they’re too busy rousting immigrants. It doesn’t seem as if any of our Senators or Congresspersons with a special interest in freedom of speech has heard that Amazon is trying to unfairly and illegally push its weight around.  I know that some of them are very busy defending the right of a Duck Dynasty star to make rancid sexist and racist statements. Other publishers and writers are wringing their hands, but there have been few if any calls for a boycott of Amazon.  Although Amazon is easy to use—and you can also order your dish soap and socks at the same time—it is not the only place to buy books or anything else on the Internet.

Imagine the worst case scenario occurring:  the merger goes through, net neutrality ends and Hachette knuckles under. The Internet will become the province of large corporations and moneyed individuals.  The chances of a book from a little publisher becoming a best seller through the Internet grapevine will be negligible. The possibility of an article in a small magazine not backed by big money getting onto the Google News or Yahoo! homepage will shrink to almost nothing. A small record label might not be able to download music directly to customers and be forced to give a cut of its profit to Amazon or some other large Internet merchant. Whether it’s a book or a TV show, the little guy will need more resources to compete and so will be unable to do so. A handful of large corporations will control our public discourse even more than they do now.   

Wednesday, June 18, 2014

As Geoffrey Parker’s Global Crisis shows, governments have waged war instead of helping citizens for centuries

By Marc Jampole

Global Crisis by British historian Geoffrey Parker presents the 17th century as a case history of the devastation that climate change can wreak upon human societies.

The 1600s experienced an enormous number of droughts, lengthy winters, floods, major earthquakes and other extreme weather phenomena resulting from what scientists and historians call “The Little Ice Age.” The Little Ice Age hit human societies hard, leading to famines, plagues and other disasters on every continent, which put pressure on the still-forming nation states throughout the world to go to war to gain or protect their resources. An enormous number of civil wars also broke out, as nobles and/or peasants resisted higher taxes and confiscation of grain and land. The world population was much smaller in 1680 than it had been in 1600, with some regions losing perhaps a third of their population.

Parker isn’t saying that the sudden cooling of the earth in the 17th century caused all the mayhem of the period, but that sudden climate change combined with and exacerbated political instability to push the world into general disaster and decline. I’m only about a third through this 700+ page tome, but I’m already convinced that Parker gives us a roadmap to our future if we don’t slow down global warming: resource shortages, natural disasters and population displacements could plunge most of the world into a living hell of poverty, warfare, epidemics, famine and environmental degradation. 

The topic of today’s OpEdge article is not, however, Global Crisis, but one paragraph on page 34 of the book.  The topic of the paragraph is what Parker calls “indirect” or “opportunity” costs, which refers to the lost opportunity to spend money on something because you have already spent the money on something else. In the paragraph in question, Parker refers to the many positive initiatives that 17th-century governments did not pursue because they had already spent so much fighting wars:
·         Philip IV of Spain, who spent £30 million to finance foreign wars between 1618 and 1648, claimed that he didn’t have the funds to set up a national banking system.
·         Charles I of Great Britain, whose wars between 1625 and 1630 cost £6 million, decided he could not afford to create public granaries for famine relief.
·         After Manchu raiders broke through the Great Wall in 1629, the emperor’s drastic reductions in non-defense spending included closure of one-third of all postal stations.

Parker suggests that these examples represent the tip of an iceberg of societal needs that went unfulfilled in the 17th century because rulers were raising armies to grab or defend land.

Plus ça change, as the French say: the more things change, the more they remain the same.  The United States currently spends more than $680 billion a year on the formal military budget, or about 19% of all federal government spending and 28% of estimated tax revenue. That’s more than we spend on education, highways and bridges, research, job creation, safety inspectors, agencies such as the Centers for Disease Control and the Food and Drug Administration and all other discretionary goods and services.  This enormous number—enough to build 3.4 million new houses a year at $200,000 a pop or to cut the annual college tuition bill by $10,000 for about 68 million people—does not include what we spend to fight the wars in Iraq and Afghanistan, which for some reason Congress and the Bush II Administration decided to keep out of budgetary and deficit discussions. 
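A quick division makes the scale of those trade-offs concrete (a back-of-envelope sketch, using only the $680 billion budget figure and the $200,000-per-house and $10,000-per-student assumptions from the paragraph above):

```python
# Back-of-envelope check of the opportunity-cost arithmetic on the military budget.
military_budget = 680e9  # annual formal U.S. military budget, in dollars

house_cost = 200_000  # assumed cost of one new house
houses = military_budget / house_cost
print(f"new houses per year: {houses:,.0f}")

tuition_cut = 10_000  # assumed annual tuition reduction per student
students = military_budget / tuition_cut
print(f"students receiving a $10,000 cut: {students:,.0f}")
```

Either way the money is sliced, the yearly military budget dwarfs what the country spends on these domestic needs.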

By itself, the United States military budget accounts for 40% of global arms spending, with a budget six to seven times that of China.

No wonder our roads are full of potholes. No wonder federal aid to higher education has been slashed. No wonder our space exploration program is winding down instead of ratcheting up. No wonder there are more outbreaks of food poisoning and food recalls, which safety inspections help to prevent.

I’m not saying that we should do without military expenditures, but I’m fairly confident that if we swore off hegemonic foreign invasions, cut our nuclear force (which could destroy life as we know it on Earth many times over), cut research on new military weapons and significantly reduced our current armed forces, we would be able to invest our tax revenues in more productive ways.

Of course the dirty little non-secret of a capitalist system with few restraints on the market is that it needs war and military spending to provide enough jobs.  Of course, this dirty little secret has its own dirty little secret, which is that a small military budget would not harm a flourishing economy—but it would make a tremendous shift in wealth from military suppliers to suppliers of alternative energy and environmental protection equipment, social programs, highway builders and engineers and other segments of the economy that do not have quite as much lobbying clout as the military-industrial complex. 

Tuesday, June 17, 2014

Jump Street: once again, a movie character portrayed as intelligent is uncoordinated and socially inept

By Marc Jampole 

With “22 Jump Street” one of the most popular movies of the summer, it is instructive to remember how the two bros of this cop bromance met in “21 Jump Street,” the first of the Jump movies.

The Jonah Hill and Channing Tatum characters knew each other in high school, but only as acquaintances. Seven years later at the police academy, Channing—a handsome and tall stud who dominates the physical training—notices that klutzy Jonah earned a 100% on a written exam which Channing flunked. The two become best buds—Jonah’s character helping Channing’s to raise his written test scores to a “C,” while under Channing’s tutelage, Jonah improves his conditioning enough to pass the physical exams, even though he still is more uncoordinated than you would ever want a police officer to be.

Once again in a mass entertainment, a highly intelligent character is presented as physically inferior—out of shape, weak, poorly coordinated, slow.

But it gets worse. We also learn that Jonah’s character is socially inept: the girl he wants to take to the prom rejects him.

So yes, once again mass entertainment creates the myth that if you’re smart, you are likely unattractive to the opposite sex.  We get this myth big-banged into us  in television shows, movies, cartoons and commercials. It seems as if only the superhuman freaks like Ironman can be smart and sexy.

Of course, real life is different. While there are some smart people who flat-line after doing one sit-up or sit shyly and stiffly in a corner at parties, there are many others who play competitive athletics and have an active social life. When I see the highly intelligent portrayed as social pariahs or weaklings, I think of the top 25 chess players at the national chess tournaments in my son’s grade when he was playing youth chess. Among the top 4-6 players, there were kids who looked as pale as a sheet and were either soft and blubbery or very thin—these were the kids who studied chess three or more hours a day, every day. But most of the top-ranked boys (since mostly boys played chess back then) in the country were like my son: muscled and always moving with an athletic grace. And why not—these extremely intelligent kids were usually involved in one or more sports. Another example would be astronauts: these men and women are highly intelligent and athletically gifted; no one has reported that an astronaut ever had a hard time getting a date.

Since it’s mostly very smart people who write, direct and produce movies and TV shows, why do they insist on insulting themselves by telling what amounts to a big lie? My own theory is that the makers of our mass entertainments reflect the views of the people who pay the piper—the very wealthy, who rightfully sense that a true meritocracy would result in their losing their money and position. They also see that computerization has dramatically decreased the number of good jobs around; by denigrating the intelligent, they may hope to discourage the poor from striving in school, thus reducing competition for their offspring.  Both today and in the past, religious figures have denigrated intelligence as inferior to unquestioning faith. The application of intelligence generally involves questioning, and one thing that the sellers of goods and services don’t want is for us to ask questions in the marketplace. Thus, in many ways, promoting intelligence and scholarly activity as a social value upsets the status quo of American consumerism.

Interestingly enough, the Jump movies manifest not only the traditional disdain of intellectuals in American society, but also represent an alarming social trend: the infantilization of adults. More and more adults are maintaining their entertainment habits of childhood, playing with My Little Pony dolls or Legos, reading comic books and Harry Potter novels and spending vacations at theme parks. Many movies in recent years have glorified the life of adults—especially adult men—who remain children and teenagers. 

Add the Jump series to entertainment in which men behave as boys.  The cops are adults who go undercover in a high school and pretend to be average teenagers. In other words, they have formal permission from authorities to behave as children, despite being adults. Their job is to be “not grown up.” The Jump movies therefore show another way for adults to remain kids and retain the predilections and habits of their teen years. 

Saturday, June 14, 2014

Editorial: Let's Clear the Air


President Obama rightly took the initiative in imposing limits on carbon pollution from power plants. Fossil fuel producers are screaming bloody murder, but the consensus of climate scientists is clear: Immediate action is needed to reduce carbon in the Earth’s atmosphere as we attempt to throttle back global warming.

Using authority under the Clean Air Act, the Environmental Protection Agency on June 2 unveiled regulations that offer the states flexibility to get to the goal of cutting carbon pollution from power plants by 30% by 2030, compared with 2005.

In coal country, the new rules are seen as job killers and Obama is vilified for conducting a “war on coal” that threatens coal-mining families.

Paul Krugman noted in the New York Times that the Chamber of Commerce has predicted the cost of reducing carbon dioxide emissions will be more than $50 billion a year between now and 2030. That almost certainly is overstated and it’s supposed to sound like a big deal. But Krugman noted that the US has a $17 trillion economy, which will grow over time. “So what the Chamber of Commerce is actually saying is that we can take dramatic steps on climate — steps that would transform international negotiations, setting the stage for global action — while reducing our incomes by only one-fifth of 1%. That’s cheap!”

He also noted that the chamber’s estimate of costs per household — $200 per year — would be a fraction of 1% of the average American household income of more than $70,000 a year, which also is going to rise over time.

“One more useful comparison: The Pentagon has warned that global warming and its consequences pose a significant threat to national security. (Republicans in the House responded with a legislative amendment that would forbid the military from even thinking about the issue.) Currently, we’re spending $600 billion a year on defense. Is it really extravagant to spend another 8% of that budget to reduce a serious threat?”

Krugman also has noted that back in the 1980s conservatives claimed that any attempt to limit acid rain would have devastating economic effects. “In reality, the cap-and-trade system for sulfur dioxide was highly successful at minimal cost. The Northeastern states have had a cap-and-trade arrangement for carbon since 2009, and so far have seen emissions drop sharply while their economies grew faster than the rest of the country. Environmentalism is not the enemy of economic growth.”

At this point, Krugman noted, coal mining accounts for only 1/16th of 1% of overall US employment. “Shutting down the whole industry would eliminate fewer jobs than America lost in an average week during the Great Recession of 2007-9,” he wrote.

“Or put it this way: the real war on coal, or at least on coal workers, took place a generation ago, waged not by liberal environmentalists but by the coal industry itself. And the coal workers lost.”

Employment in coal mines already has been reduced by coal companies, along with the tops of Appalachian mountains. Automation enabled by strip mines, which take the tops off the mountains and fill in the valleys below, has reduced the number of jobs from 177,848 workers in 1984, when 3,496 mines produced 895 million tons of coal, to about 88,000 jobs at 1,300 mines that produced more than one billion tons in 2012.

Ironically, an EPA crackdown on mountaintop removal in 2009 — which the industry complained was the first shot of Obama’s war on coal — actually forced coal companies to return to more labor-intensive underground mining and put more miners back to work. Jobs increased from an average of 76,470 jobs under George W. Bush to an average of 88,152 under Obama, according to an analysis by Appalachian Voices.

West Virginia was the nation’s second leading producer of coal, and the leader in coal mining jobs, with an average of 22,626 under Obama, up from 17,976 under Bush.

Kentucky was the nation’s third leading producer of coal, and ranked second in coal mining jobs, with an average of 17,168 under Obama, compared with 15,826 under Bush, an increase of 8.5%.

It doesn’t make much sense to sacrifice the environment simply to keep coal mines open. And Central Appalachia’s coal appears to be running out, as many of the thick, easy-to-mine seams have been harvested. The US Energy Information Administration estimates that coal production in eastern Kentucky and West Virginia will soon be half of what it was in 2008, plunging from 234 million tons to 112 million tons in 2015, Brad Plumer reported at WashingtonPost.com in November 2013.

Another big problem for Appalachia’s coal industry is competition from cheaper, low-sulfur coal from the West, particularly from Wyoming’s Powder River Basin. Wyoming is the nation’s leading coal-producing state, producing 388 million tons of coal, nearly 40% of the nation’s coal production, from 18 mines that employed more than 6,500 miners in 2013.

Coal producers also face competition from cheap natural gas from shale fracking as well as renewable energy sources.

Emily Atkin noted at ThinkProgress.org (June 6) that the Chamber of Commerce report is based on a much more aggressive policy than the one the EPA proposed, and it fails to account for new jobs that would be created in the clean energy sector.

The EPA has projected that the coal extraction industry would lose as many as 14,300 jobs from 2017 to 2020 as a result of the new rule. However, EPA said renewable energy construction could increase by up to 19,100 jobs over the same time period. The EPA also estimated that up to 112,000 jobs would be created solely by the energy efficiency sector in 2025. There already are more jobs in the renewable energy sector, which created 78,600 green jobs in 46 states in 2013, according to Environmental Entrepreneurs (E2). Solar energy employs nearly 143,000 total workers, as solar workers installed 4,751 megawatts of new solar photovoltaic capacity, good for roughly 29% of all new US electrical capacity.

Uncertainty over the federal Production Tax Credit slowed wind energy development, and 8,500 jobs created in 2013 were down from 12,600 in 2012, but the wind industry still employs 80,700.

There is substantial reason to believe that replacing more coal-fired plants with renewable energy will result in a net job gain, Atkin noted. Multiple studies over the last 10 years — from EPI to the University of California at Berkeley — show that the renewable energy sector generates more jobs per megawatt of power installed, per unit of energy produced, and per dollar of investment than the fossil fuel-based energy sector.

In terms of solar energy, a 2004 Berkeley study showed that every $1 million of investment in the solar industry generates 5.65 person-years of employment over ten years, while $1 million invested in the coal industry generates only 3.96 person-years of employment over the same time period. For the wind industry, $1 million in investment equals 5.7 person-years of employment, the study showed.
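The arithmetic behind those comparisons is simple scaling of the study’s per-$1-million rates. Here is a minimal sketch in Python (the figures come from the Berkeley study as cited above; the function and its names are our own illustration, not from the study):

```python
# Person-years of employment generated per $1 million invested
# over ten years, as cited from the 2004 Berkeley study.
PERSON_YEARS_PER_MILLION = {"solar": 5.65, "wind": 5.7, "coal": 3.96}

def person_years(sector, millions_invested):
    """Estimated person-years of employment for a given investment."""
    return PERSON_YEARS_PER_MILLION[sector] * millions_invested

# The same $100 million buys roughly 565 person-years of work in
# solar, 570 in wind, but only about 396 in coal.
print(round(person_years("solar", 100)))  # -> 565
print(round(person_years("coal", 100)))   # -> 396
```

The point of the sketch: at these rates, every dollar shifted from coal to solar or wind buys roughly 40% more employment, which is the basis for the net-job-gain argument above.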

Congress should take steps to help workers who already have been displaced from their jobs in coal mines and other industries and help them to transition to renewable energy or other industries. Extension of long-term unemployment benefits would be a good first step. The Department of Labor recently announced it would award $7.5 million to help Kentuckians who used to work in the coal industry find new jobs.

More can be done to address job displacement, but people in coal country should stop denying that the transition is inevitable. And depicting the Obama administration as the enemy of coal country is neither helpful nor very smart politically. We might add that Kentucky could use a Democratic senator such as Alison Lundergan Grimes to represent the state’s interests at the White House, since neither Sen. Mitch McConnell nor Sen. Rand Paul has been able to play well with the Obama administration. — JMC

From The Progressive Populist, July 1-15, 2014


Copyright © 2014 The Progressive Populist
PO Box 819, Manchaca TX 78652

Selections from the July 1-15, 2014 issue


COVER/Sam Stein
Eric Cantor wasn’t right enough?


EDITORIAL
Let’s clear the air with carbon regs


LETTERS TO THE EDITOR 

DON ROLLINS
GOP picks up Florida’s check this time


RURAL ROUTES/Margot McMillen
Organic ‘standards’ isn’t enough


DISPATCHES
Cantor upset may be nail in coffin of immigration reform;
Texas Republicans adopt immigration bashing platform;
Prog congressional candidates win;
Cal’s ‘top two’ primary eliminates 3rd party rivals;
Repubs block student loan refinancing plan;
Dems propose Social Security expansion;
GOP to GI's: We'll get to you maybe;
Nixon aide confirms campaign scuttled Viet peace talks in 1968...


JOHN YOUNG
Obama: True champion of the planet


GRASSROOTS/Hank Kalet
Money corrupts in many ways


BOB BURNETT
Racism: What’s the problem?


WENONAH HAUTER
Fracked gas exports without restrictions


HEALTH CARE/Joan Retsinas
Obamacare will insure felons


SAM URETSKY
Red states need Medicaid patch


WAYNE O’LEARY
Schizophrenic Democrats


JOHN BUELL
Economic lessons from the boys of summer


GENE NICHOL
On being a white people’s party


N. GUNASEKARAN
21st century India in the making?


SETH SANDRONSKY
Labor and the AT&T-DirecTV merger


ROB PATTERSON
Neil deGrasse Tyson gets around ‘Cosmos’


POPULIST PICKS
‘Hunted’; ‘Rad Gumbo’; ‘Party Girl’



and more ...

Monday, June 9, 2014

Latest bad idea from right-wing economists: cure wealth inequality with more policies that make us unequal

By Marc Jampole

Now that conservatives and their academic factotums realize that they can no longer deny that we have become a world in which inequality is growing, they are beginning to fight a rear-guard action by declaring that the way to reverse the trend of greater inequality is to promote the very policies that created it.

In a Wall Street Journal article titled “The Blue-State Path to Inequality,” Stephen Moore, chief economist of the right-wing Heritage Foundation, and Richard Vedder, an economics professor at Ohio University, compare the inequality of income in red states and blue states and find that there is a greater spread in the blue states, which they aver, without proving, have greater social safety nets.

Moore and Vedder cook up a stew of bad math and faulty logic to try to prove their point.  Their reasoning is so laughably inept that I think I’ll refer to them as the “Keystone Profs,” in honor of the Keystone Cops, a fictional crew of incompetent police officers from the silent movie era.

Let’s start with the bad math. To demonstrate that inequality of income is greater in the blue states, the Keystone Profs use the Gini coefficient, a single number cooked up by the Italian statistician Corrado Gini more than a century ago. The Gini coefficient takes a set of raw data and tries to turn it into a single number that can be compared with similar sets of raw data from other populations; the lower the Gini, the less inequality of income exists in the population being measured.
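For readers curious what the coefficient actually computes, here is a minimal sketch in Python of one standard formula for the Gini coefficient of a list of incomes (our own illustration, not anything from the Journal article):

```python
def gini(incomes):
    """Gini coefficient of a list of incomes.

    0.0 means perfect equality (everyone earns the same);
    values approaching 1.0 mean one person holds nearly everything.
    Uses the standard rank-weighted formula on the sorted values:
        G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n
    where i runs from 1 to n over the incomes sorted ascending.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

# Four equal incomes: perfect equality, Gini = 0.0
print(gini([50_000, 50_000, 50_000, 50_000]))  # -> 0.0

# One person holds everything: Gini = (n - 1) / n = 0.75 for n = 4
print(gini([0, 0, 0, 200_000]))  # -> 0.75
```

Notice how the calculation collapses an entire income distribution into one number, discarding its shape along the way, which is exactly why two nearby values tell you so little about how two states actually differ.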

The problem is that the Gini coefficient is highly inaccurate. One of the first things that Thomas Piketty does in his Capital in the Twenty-First Century is to discredit the Gini coefficient as a viable tool for measuring wealth inequality. Even the Keystone Profs admit there are many flaws in the Gini coefficient.

We cannot assume the Gini coefficient sorts out the states accurately. The differences in Gini coefficients between the red and blue states the article references are slight; all are in the .400s. The gap between red state Texas (.477) and blue state California (.482), for example, is certainly within the margin of error of a Gini coefficient comparison. We cannot depend on a Gini ranking of the states to reflect reality. Yet the Keystone Profs persist in using it.

But even if we accept the flawed Gini coefficient as our tool for measuring inequality of income, Moore and Vedder’s argument doesn’t hold water for two reasons. First of all, they assume that the wider social welfare net in blue states causes inequality, when in fact social welfare programs are a response to inequality. Large inequality of wealth developed earlier in the blue states, which industrialized and urbanized first and include those two big-wealth magnets, New York City and California. While the wealthy and ultra-wealthy live everywhere, no one can deny that more of them make their money or end up living in New York City and the state of California. The large 19th and early 20th century fortunes were made in or transferred to New York and Chicago. Today’s high-income professions, such as entertainment, banking and high tech, are concentrated in New York and California. New York and California have always spawned multimillionaires at a higher rate than other states. No wonder blue state communities recognized the problem of inequality earlier than red states and have done more about it.

But while the blue states do more than red states to foster equality of income and wealth, it isn’t much more on the world’s scale. All states are providing less support to public school and university education than they did 30 years ago, and all have put a lid on or cut property and state income taxes. All have suffered from lower federal taxes, from a federal labor policy that has been anti-union or neutral for more than three decades, and from the decline in local jobs generated by the federal government.

The bigger mistake, then, is to limit the comparison to blue and red states. That’s like reciting the alphabet from C to E.

There isn’t that great a difference between what blue and red states do to counteract the tendency of free market capitalism to create wide inequalities of wealth, compared with what governments do in western Europe and Japan, which take more taxes from the wealthy and provide better educational, healthcare and retirement benefits to everyone. While wealth and income inequality have grown in western Europe and Japan over the last 35 years (see Piketty’s book for a great analysis), the populations of these countries still enjoy more income and wealth equality than we do in the United States. By excluding western Europe and Japan from the discussion, the Keystone Profs cook the books.

Vedder and Moore follow Piketty in saying that economic growth reduces inequality, but they advocate policies that are not pro-growth, merely pro-corporation. They assume that unions, minimum wages and high income taxes are bad for economic growth, when in fact the economic history of not just the United States but the entire world shows that high taxes on the wealthy and high incomes for workers lead to high growth. More of the wealth circulates to people who will spend it, instead of accumulating in overvalued assets, which is what the ultra-wealthy do with the extra wealth they keep from lower taxes.

And all you Wall Street Journal subscribers in the audience thought it was the fish bones that stank so putridly when you entered the kitchen this morning. No, it was the newspaper you wrapped them in!