Thursday, February 23, 2012

By saying Obama follows a secular not a religious agenda, Republicans make case for reelecting the President

By Marc Jampole

The Republicans are doing a lot to restore my faith in President Obama.

First Rick Santorum accused Barack Obama of being the most anti-religious president in the nation’s history. Then Mitt Romney said that Obama sought to substitute a “secular agenda” for one based on faith.

What a relief to know we have a secular president.

What a relief, especially after weeks of hearing Santorum, Romney and the other Republicans talk about introducing religion into public policy and political decisions in one way or another.

What a relief to have reaffirmed the fact that President Obama, unlike his immediate predecessor, follows the U.S. Constitution and the wishes of our mostly deist founding leaders and promotes a secular agenda.

We are, after all, a secular country, one in which religion is not supposed to enter into government decisions and government is not supposed to favor one religion over others.

I understand that a good 20-25% of voters think differently. They believe that we are a Christian nation. What’s more, they want to force a set of values on everyone that they associate with Christian practice.

But we need go no further than the issue of birth control to recognize how much the real world diverges from the ideals of the Christian right wing. In the real world, 98% of all women use birth control at some point in their lives. In the real world, the cost of birth control is far less than the cost of an unwanted pregnancy, which means that asking a religious organization to pay for its employees' birth control costs it nothing; in fact, it saves the organization money, since its insurance costs will decline.

To be sure, both Romney and Santorum are playing to the hard-core base that now determines Republican primary elections. But besides pandering to the right-wing “values” voters, labeling Obama as “non-Christian” and “non-religious” also has a subtle impact on other voters. It’s another, harder-to-disprove version of the “Obama wasn’t born in the United States” canard. Wherever we fall on the political spectrum and however devout or non-practicing we are, most of us Americans have a Christian background and live our lives by an ethos we identify as Christian. To say that Obama is not Christian or is anti-religious (which is just another way of saying “anti-Christian”) turns him into the “other” or the “stranger” who has historically been so feared in American culture and politics. The ultimate outsider, of course, is the Black American.

There’s an economic aspect to the accusations, too: that old saw that communists and socialists are godless. For decades, Americans have been used to hearing the words “godless” and “socialist” (or “communist”) pronounced one after the other to describe progressives and liberals. To say that Obama is against religion is also a veiled way of saying that he is against our free market capitalist system.

And yet I think many will share my desire that religion not enter into a president’s decision-making. I think most of us prefer that decisions be based on facts, science, reason, the law and what’s best for the country and its people.

That Romney and Santorum affirm that our current president is following a secular path gives me more confidence in what Obama is doing. That the Republican candidates don’t like Obama’s secular path is scary. Because they could get elected, and that would be trouble. We recently had a faith-based president and it didn’t really work out, unless you like useless, goalless wars, state-sponsored torture, catastrophic environmental change and the largest deficit in American history.

Wednesday, February 22, 2012

Why pick on affirmative action? Why hasn’t anyone sued universities about favoritism to legacies and athletes?

By Marc Jampole

Yesterday’s announcement that the U.S. Supreme Court has accepted an appeal from a rejected white applicant to the University of Texas-Austin reminds me that every time the constitutionality of affirmative action returns to the issues agenda, one question is always left out:

Why didn’t the applicant suing a university for accepting minorities with a less impressive record of academic achievement also or instead sue the university for discriminating in favor of athletes and legacies? Legacies, for those not up on academic admissions parlance, are students whose parents attended the university or have contributed money to it.

The unfairness of lowering standards to accept athletes to an institution dedicated to intellectual achievement and professional training seems fairly obvious. I can understand giving a break on the SAT or grades to a national chess champion or the winner of a science fair, but what do sports have to do with the mission of higher education?

And yet where are the lawsuits claiming that the university acted illegally in preferring a kid with a 1000 on the SATs who can throw a football 70 yards through a tire to someone with 1100 on the college boards who has no extra-curricular activities?

The advantage given to legacies is even more unfair, because it is a major part of the rigidity in the college system that necessitated affirmative action in the first place.

A quick search of “college admissions legacies” will reveal the oft-told history of legacy preferences, which the Ivies and other colleges began to use after World War I, when their objective criteria were leading to the admission of too many Jews. Today, legacies typically make up 10% to 15% of acceptances at Ivy League and other private colleges, and as many as 30% at some schools. Overall, many more students are admitted through legacy preferences than through affirmative action programs. What’s more, polls find that 75% of all Americans are opposed to legacy preferences.

Yet no one sues universities because they are giving preferences to people whose parents graduated from or gave money to the school.

Correction. In his 2010 analysis of everything that’s wrong with legacy preferences in The Chronicle of Higher Education, educational policy guru Richard D. Kahlenberg cites a losing 1970s case filed against legacy preferences at the University of North Carolina-Chapel Hill. The problem was that the plaintiff folded legacies into a hodgepodge of discrimination complaints, including discrimination against out-of-state students. Her SAT score was 850, uncommonly low for an in-state or out-of-state UNC student, both then and now.

Kahlenberg summarizes the compelling legal argument against legacies at both public and private universities, based on the 14th Amendment and the Civil Rights Act of 1866. He also demolishes the argument that universities need legacy admissions to keep the donations rolling into university coffers. Analysis reveals that people give about the same to their alma mater and other universities with or without the legacy factor.

The higher the university on the food chain, the more legacy admissions undercut the ambitions of other competent but less connected candidates. Let’s face it, the more important the job, the more likely it will be filled by an Ivy or Ivy-like (e.g., Stanford or Northwestern) graduate, and if not an Ivy, a public Ivy (e.g., U of Washington or UNC) or other prestigious school. The college educated earn more in general, so no matter how you slice it, legacies come from wealthier families on average than non-legacies at virtually every university.

When you’re better off, you are more likely to have special lessons, more likely to travel abroad, more likely to participate in national youth competitions, more likely to take an SAT prep course and more likely to live the lifestyle behind the cultural assumptions of the SATs. Affirmative action is one of the ways that colleges can level the playing field.

I’m not saying that once legacy admissions are ended we won’t need affirmative action anymore. What I’m saying is that the legacy system reflects the subtle action of institutional racism and is one more reason we need affirmative action. By the way, we’ll know that we no longer need affirmative action when the rate of poverty among African-Americans and the average wage of African-Americans are about what they are for everyone else.

As others have pointed out, the Supreme Court decision to take the appeal is especially disturbing in light of its 2003 ruling upholding affirmative action. In that ruling, the Supreme Court laid down some affirmative action guidelines for universities and suggested that the high court shouldn’t revisit the issue for another 25 years. Of course that was before Roberts and Alito joined the court.

Most of the plaintiffs in these affirmative action lawsuits are middle-class and upper-middle-class whites. That leaves us with the fact that none of these fighters for equality ever thought to take on legacies. There are certainly more legacies than there are affirmative action students, and the legacies tend to include the children of the very people who have taken money from the middle class through the economic and tax policies of the past 30 years.

It’s quite puzzling. The only answer that I have is that it’s another manifestation of the racism that has distorted the politics and social policy of this country since its inception.

Tuesday, February 21, 2012

MIT professor revisits the cultures that bombed Pearl Harbor, destroyed the WTC and dropped A-bombs

I’m reading a fascinating scholarly study called Cultures of War by MIT history professor John Dower. Professor Dower analyzes in detail the similarities in the cultural assumptions, bureaucratic decision-making processes, fascination with technology, religious orientation, use of propaganda and strategic military imperatives of four events that serve as symbolic points in the cultural history of two wars.

Interestingly enough, in all cases the decision to act proved disastrous for mankind. In three, and maybe all four, it was also disastrous for the nation/organization instigating the act:

  1. The Japanese attack on Pearl Harbor

  2. The United States detonation of atomic bombs on Hiroshima and Nagasaki

  3. Al Qaida’s suicide-crash terrorist attacks of September 11, 2001

  4. The United States “war by choice” against Iraq under the false pretext of destroying weapons of mass destruction and disabling Al Qaida.

The obvious symmetry in considering the cultures that produced all four of these actions is that they are paired: in both pairs, the actions of the United States are typically considered to be reactions against horrible deeds, by a nation in one pair and by a terrorist organization in the other.

But Dower carefully calls U.S. defensive motives into question: He recapitulates what we already know about the duplicitous lead-up to the invasion of Iraq by the Bush II Administration. He also reminds us that no one can say for sure that dropping two atom bombs saved more lives than the more than 200,000 the U.S. obliterated in two fairly short bombing raids. We know for a fact, however, that the U.S. wanted to brandish its new weapon for the Soviets and everyone else in the world and wanted to stop potential grumbling at home about the cost of the Manhattan Project. Dower also shows us how much the U.S. wanted to go to war against Japan before Pearl Harbor and how much the Bush II (non)brain trust wanted to attack Iraq before 9/11.

Many would consider it blasphemy and/or treason to equate the moral bearing of Osama bin Laden and the U.S. under Bush II or Roosevelt, but Dower makes a very strong case. Here are some of the similarities:

  • The use of religion as a justification and of religious imagery in manifestos about the events

  • The postulation of a battle between civilizations

  • The belief that your civilization/religion is infinitely superior to the civilization/religion of the foe

  • A justification of killing innocent civilians, politely known as “collateral damage”

  • The focus on technology (in the case of Osama, it was computers, not weapons)

We really did feel threatened by the combined force of the Japanese and Germans, and we really did feel threatened by the terrorist attack. But Dower makes it clear that Osama’s followers, too, felt their civilization threatened by U.S. military activity and economic and social imperialism.

The fact that many of us think that the Japanese and the extreme Islamists were fools or devils to feel that their way of life was superior merely suggests that we are unable to transcend our own cultural imperatives that tell us that our way of life is the best. I’m not saying that Al Qaida was right to launch the 9/11 attack. It was as wrong as we were to drop the atom bombs and to attack Iraq (and to pursue the Viet Nam War for that matter). But they certainly were right to think their civilization was threatened by U.S. military and political actions.

And just as Pearl Harbor united even the most vocal pacifists and isolationists in the United States, and just as 9/11 united us again, so the invasion of Iraq, the Bush II Administration’s declarations of a holy war and our establishment of a world-wide torture gulag helped Osama recruit many new terrorists throughout the Islamic world. In all cases, too, the governments and the terrorist organization embarked on major propaganda campaigns to convince their people that they did the right thing by unleashing death, in one case against soldiers sworn to fight to the death and in the others against innocent bystanders.

Perhaps the most horrible similarity in all the cultures of war that Dower considers in his provocative and easy-to-read book is that in all four attacks, the participants—the military men, the government officials, the scientists and engineers, the soldiers who did the dirty work—were able to forget that they were engaged in killing large numbers of people.

Many factors led to the dehumanization of the people at the receiving end of the bombs, tanks and suicide plane crashes:

  • A bureaucratic language that used euphemisms and passive constructions to conceal horrible realities

  • A focus on the complex challenge of the task at hand (as opposed to the destructive ends)

  • A belief in the inferiority of the victims

  • The self-deception often generated by the constant creation of propaganda for others

  • A religious belief, i.e., that you’re on a religious mission

These factors affected the decisions and actions of the one-party Shinto autocracy, the Christian representative democracy and the Islamic theocracy. Who has the moral high ground here?

There is no threshold for terror, for starting wars of choice or for unleashing weapons of mass destruction.

There is no military justification for bombing and attacking civilians that can offset or override the moral evil involved in killing masses of innocents.

Heinous acts of terror, genocide and lawlessness in a just cause turn that cause to evil and take from the perpetrators any claims of morality or civilization.

I came away from reading Cultures of War more convinced than ever of these things.