Crazy Rich, Are Americans successful because they're nuts?
By Daniel Gross,
Posted Tuesday, Feb. 15, 2005, at 1:50 PM PT
Are Americans rich because they're nuts?
That's the thesis of a new book, The Hypomanic Edge: The Link
Between (a Little) Craziness and (a Lot of) Success in America, by
John D. Gartner, a psychotherapist and clinical assistant professor of
psychiatry at the Johns Hopkins University Medical School. America
may be the dominant force in the global economy because we're a
nation made of somewhat Crazy Eddies—gonzo businessmen and women who may be genetically predisposed to take big-time risks.
It sounds right. Creativity and genius have often been linked to mental
illness. Many virtuoso painters, composers, and architects are a little
kooky. Why not entrepreneurs? Gartner identifies "hypomania" as a
benign form of madness—manic without the depressive. Here's how
they present: "Hypomanics are brimming with infectious energy,
irrational confidence, and really big ideas. They think, talk, move, and
make decisions quickly. Anyone who slows them down with questions
'just doesn't get it.' " They find it hard to sit still, channel their energy
"into the achievement of wildly grand ambitions," feel a sense of
destiny, "can be euphoric," have a tendency to overspend, take risks,
and act impulsively, and with poor judgment. They are "witty and gregarious" and possess a confidence
that makes them charismatic and persuasive. It sounds a lot like Jim Clark, the founder of Netscape and
animating character of Michael Lewis' The New New Thing. Or like President Bush.
Gartner concludes that many of the components of the archetypal American character—optimism,
entrepreneurial energy, religious zeal—fit the hypomanic profile. Perhaps, he posits, this nation of
immigrants has a gene pool of hypomanics. Immigration may select for it. After all, who else would be
eager to embark on a dangerous journey, convinced he could make it in the New World? As a result,
Gartner writes, Americans may be "culturally and genetically predisposed to economic risk."
Gartner sets out to prove his case not through contemporary case studies or the aggregation of vast
quantities of data, but through brief, lively studies of key hypomanics from five different centuries.
Christopher Columbus, the "messianic entrepreneur," had divine ambitions. Some of the first settlers of
the United States were "protestant prophets"—John Winthrop of Massachusetts and Roger Williams in
Rhode Island. Alexander Hamilton fearlessly charged British positions at Yorktown and wrote The
Federalist papers in a series of all-nighters. His hypomania "was an essential ingredient in his
accomplishments. And his accomplishments were an essential ingredient in the creation of America."
Andrew Carnegie, the hypomanic steel baron, feverishly built up an industrial empire and then spent the
rest of his life trying to change the world. More recent hypomanics might include movie-studio moguls
David Selznick and Louis B. Mayer, and geneticist Craig Venter, who founded Celera and set off a race to
decode the human genome.
It's a fun read. But Gartner's diagnosis overlooks the more rational factors that were crucial to the settling
of America and the construction of our unique economic and business culture. The British Protestants
who crossed the Atlantic Ocean in the 17th century came for God, but they also came for the cod. And the
timber and the tobacco. By the time John Winthrop arrived in Massachusetts, non-dissenting settlers—
economic opportunists, not prophets—had been farming and trading in Jamestown, Va., for more than 20
years.
Immigrants like Hamilton, Carnegie, and David Selznick's parents may have been hypomanic. But
whether you were a landless peasant in Ireland in the 1840s, a Jewish cobbler in Russia in 1910, or an
Indian computer programmer in the 1980s, the decision to move to America made profound economic
sense. America had cheap land in abundance. The opportunities—if occasionally overblown—were real.
So was the infrastructure that provided for the rule of law, capital markets, and the protections of minority
rights.
In fact, practicality and realism have coexisted with messianism and utopianism in the American
experiment from the very beginning. Benjamin Franklin was almost certainly hypomanic by Gartner's
reckoning, but he was also one of the most relentlessly practical Americans of the 18th century. The U.S.
economy has been distinguished by hypomanic booms and busts and by the creative destruction that lies
at the heart of entrepreneurial capitalism. But it's also distinguished by durable systems and institutions
that are emblematic of our distinct style of managerial capitalism—the Federal Reserve and the New York
Stock Exchange, our telecommunications networks, and Procter & Gamble. Such institutions are not the
work of flamboyant geniuses but of tons of thoughtful, far-sighted, and average Americans.
To put it another way, hypomanics instigate, but they rarely build institutions that outlive them. The
necessary counterpart to the hypomanic Henry Ford was Alfred P. Sloan, who built General Motors.
Andrew Carnegie could never have monetized his fortune were it not for the constantly rationalizing
financier J.P. Morgan. In every generation, cooler-headed executives and entrepreneurs enter a field after
a burst of creativity and build businesses and fortunes by imposing the discipline and order the creators
may have lacked. Steve Jobs of Apple (who certainly has hypomanic tendencies) is an innovator who has
become a billionaire, made long-term investors wealthy, and helped spread the personal computer
revolution. But the same can also be said of the preternaturally well-adjusted Michael Dell.
Executives who chew the scenery may suck up attention, revolutionize industries, and symbolize
American capitalism. But they're not particularly good for companies' long-term health, as Harvard
Business School professor Rakesh Khurana argues in his book Searching for a Corporate Savior: The
Irrational Quest for Charismatic CEOs. No country produces as many turnaround artists, bankruptcy
specialists, and temporary CEOs as America does.
The late 1990s Internet boom was the golden age of hypomanic CEOs. But how many of their companies
still survive? Perhaps the most prominent and successful Internet executive is eBay's distinctly
unhypomanic Meg Whitman. So, yes, Gartner is right that hypomanic first movers matter a lot, and that
we need a few more. But we shouldn't forget the huge contributions of the more sober-minded folks who
follow behind and pick up the pieces.
http://www.slate.com/id/2113568
Tripping De-Light Fantastic, Are psychedelic drugs good for you?
By John Horgan, Posted Wednesday, May 7, 2003, at 8:44 AM PT
A year ago, hoping to dispel the postpartum gloom that had gripped
me after I finished writing a book, I hiked into a forest near my home
and pitched a tent under some pine trees. I spent that day and evening
listening to the forest, scribbling in my journal, and thinking—all while
under the influence of a psychedelic drug. The next morning I returned
to my wife and children feeling better than I had in months.
What I did that day should not be illegal. Adults seeking solace or
insight ought to be allowed to consume psychedelics such as LSD,
psilocybin, and mescaline. U.S. laws now classify them as Schedule I
drugs, banned for all purposes because of their health risks. But
recent studies have shown that psychedelics—which more than 20
million Americans have ingested—can be harmless and even
beneficial when taken under appropriate circumstances.
Citing this research, some scholars and scientists are proposing that
the prohibitions against psychedelics—or entheogens, "God
engenderers," as believers in their spiritual benefits prefer to call
them—should be reconsidered. This legal issue has recently been
brought to a head by a religious sect in New Mexico that is suing the
United States for the right to drink a hallucinogenic tea called
ayahuasca in its ceremonies. A federal court is expected to rule on the
potentially momentous case any day now.
"There is no doubt that hallucinogens can be used unwisely," says Charles Grob, a psychiatrist at the
UCLA-Harbor Medical Center, who testified in the ayahuasca case and is a leader of the effort to
rehabilitate the reputation of these substances. "But these studies show that they can be used safely
within certain parameters."
After LSD's astonishing effects were discovered by the Swiss chemist Albert Hofmann 60 years ago,
many psychiatrists considered it and similar compounds potential treatments for psychological ailments.
That is why the psychiatrist Humphry Osmond called the drugs "psychedelic," from the Greek root for
"mind-revealing." By the mid-1960s, medical journals had published more than 1,000 papers describing
the treatment with psychedelics of some 40,000 patients afflicted with disorders ranging from
schizophrenia to alcoholism.
One remarkable study from this period, known as the Good Friday experiment, probed the capacity of
psilocybin (the active ingredient of so-called magic mushrooms) to trigger spiritual experiences. On Good
Friday, 1962, the Harvard psychiatrist Walter Pahnke dispensed psilocybin and placebos to a group of 30
divinity students and professors assembled in a Boston church. A majority of those who received
psilocybin reported sensations of profound awe and self-transcendence that had lasting positive effects.
By the early 1970s, the surging popularity of psychedelics among the young—urged by Timothy Leary to
"turn on, tune in, and drop out"—had triggered a backlash. Psychedelics were outlawed, and virtually all
research on them was shut down amid rising concerns about their adverse social and medical effects. In
1971, the Journal of the American Medical Association warned that repeated consumption of
psychedelics would usually result in permanent "personality deterioration."
Further research has shown these fears to be exaggerated, says John Halpern, a psychiatrist at Harvard
Medical School. To be sure, psychedelics can cause acute and sometimes persistent psychopathology,
especially in those predisposed to mental illness. But Halpern maintains that these compounds are
usually harmless when ingested by healthy individuals in appropriate settings.
As evidence, Halpern cites a five-year study he recently completed with a Harvard colleague of members
of the Native American Church, who are permitted by U.S. law to consume the mescaline-containing
cactus peyote as a sacrament. Church members who had taken peyote at least 100 times showed no
adverse neurocognitive effects compared to a control group.
Similar results have emerged from a study of ayahuasca by UCLA psychiatrist Grob and other scientists,
results that Grob describes in the essay collection Hallucinogens. Ayahuasca, a tea brewed from two
Amazonian plants, contains the potent psychedelic dimethyltryptamine, or DMT. Although the tea often
triggers nausea and diarrhea, Indian shamans have prized it for its visionary properties for centuries, and
since 1987 it has served as a legal sacrament for several churches in Brazil. The largest is the Uniao Do
Vegetal, or UDV, which combines elements of Christianity with nature worship and claims 8,000
members.
Grob and his colleagues found that UDV members were on average healthier physiologically and
psychologically than a control group. The UDV adherents also had elevated receptors for the
neurotransmitter serotonin, which has been correlated with resistance to depression and other disorders.
Many of the UDV members told the scientists that ayahuasca had helped them overcome alcoholism,
drug addiction, and other self-destructive behaviors.
These findings emboldened UDV adherents based in New Mexico to sue the U.S. Justice Department for
the right to drink their sacrament. The case dates to 1999, when federal agents seized 30 gallons of
ayahuasca that the UDV group had imported from Brazil. Last August, a federal judge in Albuquerque
ruled in favor of the UDV worshippers. The judge contended that the Justice Department had not shown
that ayahuasca poses enough of a health risk to warrant restricting the UDV members' right to practice
their religion. The Justice Department lawyers appealed, and the case is now before the 10th Circuit Court
in Denver.
Of course, even advocates of entheogens admit that they pose risks. Ayahuasca can cause cardiac
irregularities and other dangerous side effects, Grob notes, when combined with amphetamines,
antidepressants, cheese, red wine, and other common substances. Ayahuasca drinkers generally fast
before sessions to reduce the risks of these side effects.
In the new book The Antipodes of the Mind, an in-depth study of ayahuasca visions, the Israeli
psychologist Benny Shanon recalls that the tea transformed him from a "devout atheist" into someone
awestruck by the wonders of nature and of human consciousness. But he warns that ayahuasca can also
be "the worst of liars," leaving some users gripped by belief in ghosts, telepathy, and other occult
phenomena. Similarly, in Cleansing the Doors of Perception, the eminent religious scholar Huston Smith
recalls that during the Good Friday experiment, in which he participated, one subject became so agitated
that he had to be injected with Thorazine. Smith nonetheless contends that entheogens can serve a
spiritual purpose, if used with reverence; after all, mind-altering substances have played an inspirational
role in many religions, including Hinduism and the Eleusinian cult of ancient Greece.
I have firsthand experience of the double-edged nature of entheogens, which I've taken sporadically since
my late teens. There have been moments of vertiginous anxiety; one particularly bad trip in 1981 left me
with unsettling flashbacks for months. But overall the pros have outweighed the cons. I usually end up
feeling the way I did after my LSD sojourn last summer: existentially refreshed, with a renewed
appreciation of ordinary existence.
Entheogens are far less addictive and toxic than alcohol or tobacco. Why should we continue to be
denied their benefits, in religious or non-religious contexts? Risks could be minimized by making these
substances available only through licensed therapists, who can screen clients for mental instability and
advise them on how to make their experiences as rewarding as possible. Some people might be
prescribed entheogens for a specific disorder, such as depression or alcoholism. And just as drugs such
as Prozac and Viagra are prescribed not just to heal the ill but also to enhance the lives of the healthy, so
might entheogens.
This scenario may not be so far-fetched, given last year's court decision favoring the UDV in New Mexico
and other developments. A sanctioned study of psilocybin's capacity for treating obsessive-compulsive
disorder is now under way at the University of Arizona. And UCLA psychiatrist Grob recently received
FDA approval to investigate whether psilocybin can relieve anxiety in late-stage cancer patients. Maybe
those of us who enjoy an occasional psychedelic sojourn will be able to do so without feeling like outlaws.
Wouldn't that be a trip?
http://www.slate.com/id/2082647
The Weaker Sex, The Kamasutra gets a cold shower.
By Christopher Caldwell,
Posted Tuesday, June 18, 2002, at 10:21 AM PT
Here's some advice for lovelorn men from the world's most consulted—and
certainly its most venerated—sex manual: "If you cut the knotty roots of the
millwort and milk-hedge plants into pieces, coat them with a powder of red
arsenic and sulphur, dry and pulverize the mixture seven times, mix it with
honey, and spread it on your penis, you put your sexual partner in your power."
This is one of the things you'll find if you look for sex tips in the Kamasutra, a
treatise on erotic love written by the Northern Indian sage Vatsyayana
Mallanaga in the third century. But you wouldn't have found it before this
month. The version of the Kamasutra we've been previously reading gives only
a fragment of the original texts. Chicago religion professor Wendy Doniger and
Harvard psychoanalyst Sudhir Kakar have just published an authoritative new
translation. The surprise is that descriptions of sexual positions are a relatively
minor part of the Kamasutra; what really matters are the folk recipes (which are
nuts) and the moral advice (which is vile).
The most popular version of the Kamasutra is the translation that bears the signature of the 19th-century
rake, philologist, and deflowerer of Her Majesty's colonial subjects Sir Richard Francis Burton. It is not just
a poor translation—it isn't even Burton's, most of the work having been carried out by his friend, the
considerably less dashing F.F. "Bunnie" Arbuthnot. What is worst about it is that it's incomplete, focusing
primarily on Chapters 2 (sexual positions) and 6 (drugs). It will come as a crushing blow to adolescent
boys everywhere that the part that was left out bears no resemblance to the part we know. The
Kamasutra we've been reading up to now is all dinner and no bill.
The real Kamasutra is a much more serious enterprise than Americans were led to believe back in 1962,
when it was first published as an actual trade book. Back then, most readers were emerging from the
recreational psychosexual torment that characterized the Freudian 1950s. To such readers, the book
seemed to offer release through dharma. This assumption was based on the well-established fact
that all cultures outside of the United States and Europe do nothing but run barefoot through the grass all
day and laugh and have sex.
It turns out that dharma wasn't quite what free-love advocates (or even Jack Kerouac and Allen Ginsberg)
thought it was. Dharma, the editors tell us, "includes duty, religion, religious merit, morality, social
obligations, the law, justice and so forth." The Kamasutra takes the Indic wisdom that there are three aims
in life—religion, power, and pleasure—and concentrates on the third. Proceeding mostly through lists and
enumerations, the book is to lovers what Clement Wood's Rhyming Dictionary is to poets. "When a man
and woman have not yet made love together," Vatsyayana writes, "they use four sorts of embraces to
reveal the signs of their love: 'touching,' 'stabbing,' 'grinding,' and 'pressing.' " The compendium of
enumerations, by the editors' tally, runs to "12 embraces, 17 kisses, 16 bites and scratches, 17 positions,
5 unusual acts, 16 slaps and screams," etc. There is also a list of the kinds of men who are good with
women, which runs on for half a page and begins with "a man who knows the Kamasutra."
What will most horrify devotees of the old edition is the book's morality. Its advice is explicitly greedy,
sneaky, paranoid, and amoral. It's bad enough that it lays out a list of ways to sleep with other men's
wives. The book even presents (in Book I, Chapter 5) a dozen reasons for doing so, to buck up the faint
of heart. Here's one: "There is no danger involved in my having this woman, and there is a chance of
wealth. And since I am useless, I have exhausted all means of making a living. Such as I am, I will get a
lot of money from her in this way, with very little trouble." (Doniger and Kakar tend not to paint the
adulterer in quite the swashbuckling light that previous translators did.) Here's a second: "Another
woman, whose desire I desire, is in the power of this woman. I will get to that one by using this one as a
bridge." The Kamasutra even tells you (in Book V, Chapter 6) how to break into your rival's house. This is
not a guide to "love," in any sense that we would understand it. It is Machiavelli for satyriacs.
That's why the highlight of this edition is its enormous and erudite introductory essay, which places the
book in context, explains traditional Sanskrit literary forms, and directs the reader to the 20th-century
Indian critic Devadatta Shastri's magisterial commentaries on the book in Hindi, which paint the
Kamasutra as a moral treatise.
Rightly so, the editors think. While Doniger and Kakar deplore the book's nonchalance about rape, along
with its "pretty virulent homophobia," they also think it was a step in the right direction. For one thing,
although it claimed sexuality had a strong element of violence in it, it sought to limit that violence. For
another, though it treated women as sexual objects, it also managed to treat them as sexual actors. That
this was some kind of breakthrough will come as a surprise to modern readers. Today, any person
with a blip of intelligence assumes it's a law of nature, and not just a cultural achievement, that our sex
lives revolve around subjectivity—what he thinks she thinks he thinks she thinks, etc. That it is a cultural
achievement is the key thing one learns from reading a book that antedates it.
One cannot escape, toward the end of their essay, the sense that Doniger and Kakar think the version of
love and sex espoused in the Kamasutra is much shallower, much less erotic, and much less interesting
than our own. The invention of romantic love—which the authors trace to the beginning of the 12th
century—is responsible for this progress. The editors admit that conventions by which a man idealizes a
woman as "an infinitely superior being" have their own problems. But such conventions do "reverse the
accents of the master-servant metaphor of possessive desire." This reversal is missing from the world of
the Kamasutra. The result is not only that the love life Vatsyayana describes is brutal but also that it is
shallow. Indic love of the classical literary period—which the counterculture of 40 years ago thought it was
being so subtle in embracing—is Hugh Hefner's kind of love.
The publication of a full Kamasutra means that future generations will read it differently than past ones
did. In coming years, young men will leaf through the pages of this edition, learn about the sexual mores
of third-century India, and close it thanking God they live now and not back then.
http://www.slate.com/id/2067092
The Cinderella Complex, What do modern fairy tales tell us about our
culture?
By Jill Hunter Pellettieri, Updated Friday, April 23, 2004, at 3:19 PM PT
No matter how sassy, ambitious, and independent a girl might be, her
life is only complete when she's Mrs. Prince Charming. At least that's
the subtext of two "modern" Cinderella stories now in theaters—Ella
Enchanted and The Prince & Me.
Both films are updates of Walt Disney's well-known take on the classic
fairy tale, in which a passive, preternaturally good young woman is
saved from a dismal life as a scullery maid by the arrival of her Prince
Charming. But in hewing closely to this version's old-fashioned
plotline, both films also mark a shift away from the feminist revisions
and critiques of the 1960s and '70s, in which writers like Marcia
Lieberman and Angela Carter rejected the notion that happiness was only to be found by nabbing a man.
Historically, fairy tales have reflected the values of the society in which they were written or revised, mirroring its preoccupations, obsessions, ambitions, and shortcomings. So the question inevitably arises:
What do these updates say about our culture's view of women and marriage? Why do the older versions
no longer ring true?
In Ella Enchanted, Ella of Frell (played by Anne Hathaway), a poor but beautiful young woman desperate
to escape the constraints of her awful stepfamily, sets out to rid herself of a curse she's had since birth—
excessive obedience. (Gail Carson Levine, who wrote the novel Ella Enchanted, set her defiant
protagonist in opposition to Disney's Cinderella whom she saw as blindly obedient.) Along the way, Ella
and the kingdom's much sought-after prince, Prince Charmont ("Char"), fall in love. In The Prince & Me,
Paige, a poor farm girl (played by Julia Stiles) fervently pursues her dream of becoming a doctor by
working her way through college. When she falls in love with "Eddie," the crown prince of Denmark, she is
faced with a choice: Should she attend medical school or stay with her true love?
It's interesting to examine the transformation that occurs in each young woman's life after meeting her
prince. Prior to their courtships, Ella and Paige are role models for what modern women are taught they
should be: smart, confident, career-minded. They are in control of their destinies and undistracted by
men. Paige, for example, makes it clear that she has no time for a boyfriend who might compromise her
goals.
And yet their admirable moxie finds its demise in romance. Pair Ella and Paige with their better halves
and they differ little from their Disney predecessor: They're submissive and invisible, women behind their
men.
Indeed, the responsibility of each woman is to bolster her man's ego
and improve his public image. When Eddie and Char express doubt
and fear about becoming king, Paige and Ella inspire their princes with
an obligatory adoring pep talk, providing them with the confidence to
ascend the throne. In The Prince & Me, much is made of Paige's
positive influence on her fiance (she redirects his lowbrow preference
for the "Girls Gone Wild" type); we witness Paige's transformation from
independent young woman, worthy of praise in her own right, to
caretaker, supporter, muse, and moral compass for her husband.
On one level, of course, Ella Enchanted and The Prince & Me are just
teen entertainments, with the sylphlike stars and escapist plots that Hollywood has recognized have box-office potential. The last several years have seen a flood of such princess tales, movies like The Princess
Diaries series (also starring Hathaway) and Hilary Duff's upcoming A Cinderella Story. America's
increased royal fixation may also explain the success of such films; our fascination has no doubt been
heightened—at least among teenage girls—by Prince William's coming of age. His bachelor status
provides young women with a fantasy that he might feasibly provide them with their very own fairy tale.
But on another level, these films reflect a newly coalesced form of "feminism"—one that shows women as
bright and self-sufficient but doesn't categorically disparage pre-feminist preoccupations with spouse and
family. From a radical feminist perspective, these films seem to be less progressive than the fairy-tale
revisions of their immediate forebears; it might even be argued that such "updates" represent a regression.
Today's tales act out society's dueling fantasies and realities, exposing the deep cultural ambivalence we
have about the role of women in society. We may pay lip service to the idea that a woman should be
educated and capable, but there remains an unspoken belief in our culture that happily ever after only
comes with a prince.
How did we get here? In 1697, French writer Charles Perrault updated an age-old fairy tale about a young
woman named Cinderella to appeal to his contemporaries—French nobility and bourgeoisie. Many early
versions of the tale boasted a resourceful young woman who played an active role in her destiny.
Perrault, however, wrote his Cinderella as a well-mannered, docile, selfless woman who would fit in
seamlessly with the ideal 17th-century upper-class society. It was his version that Walt Disney made
famous in 1950 and to which feminists vehemently reacted in the 1960s and '70s, ultimately co-opting the
story to their own ends. (In her famous poem, "Cinderella," for instance, Anne Sexton mocks the happily
ever after: "Cinderella and the prince/ lived, they say, happily ever after/ like two dolls in a museum case/
never bothered by diapers or dust …")
Today's teenage girls (the intended audience of these films) have been brought up by women who read
Sexton and her peers and who have taught their daughters that they can want it all—marriage, career,
family. But can they have it all? As Ella Enchanted and The Prince & Me demonstrate, women still walk
the tightrope between society's competing expectations: Can a woman maintain her own identity while
falling for her prince?
It's a question our culture has not yet been able to answer. First ladies and wives of presidential
candidates are encouraged to forge their own identity, but they nevertheless feel the cultural imperative to
present a façade of sympathetic, supportive wife to the American public: Think of Hillary Clinton's
transformation from policy maven to domestic doyenne or Judy Steinberg Dean's metamorphosis from
professional-minded doctor to doting 60 Minutes interviewee. Likewise, the women of Sex and the City
were celebrated for being smart, accomplished, and independent, yet they still longed to be swept away
for a fairy-tale ending with the man of their dreams. The Bachelor, perhaps our most horrifically explicit
public enactment of the fairy tale, thrusts its young female contestants into an environment in which
they're all vying to be "chosen" by "the one"—just as the prince in these fairy tales must "choose" his
bride from a pool of eligible maidens.
Yet Ella and Paige do not entirely abandon their individuality. By overcoming her curse of obedience, Ella
saves Char's life and prevents his evil uncle from ruling the kingdom. Paige makes the decision to attend
med school, leaving Eddie to tend to Denmark on his own. But in the end, both women fold their lives into
the lives of their men, not the other way around. Ella becomes Char's wife, and though Eddie tells Paige
he'll wait for marriage until she becomes a doctor, Paige still plans to ascend the throne as queen of
Denmark; it's clear that she'll make some significant sacrifices to be with him—just later, rather than
sooner.
The fictional heroines of Ella Enchanted and The Prince & Me attempt to reconcile the dichotomous
desires of today's women, but the resolution is an uneasy one. Neither of these women quite figures out
how to navigate the conflict, making them an apt reflection of our society's own bewilderment. But we'll
get there eventually: Twenty years from now, there will surely be a new Cinderella incarnation. It's likely
she'll still want her happily ever after, but what that might be is anybody's guess.
http://www.slate.com/id/2099412
Why Blondes Love Sufis, A new novel about America's love for Sufism
gets it wrong.
By Lee Smith, Posted Thursday, Jan. 23, 2003, at 2:33 PM PT
In Pico Iyer's new novel, Abandon, a character laments the poor turnout for a
lecture on Islamic theology. If the lecture's listing had used the word "Sufi," he
thinks to himself, "there'd have been blondes in the back row." Sufism is
Islamic mysticism, and if it hasn't yet hit the Hollywood set the way Jewish
mysticism and the cabala have, its best-known spokesman, Rumi, is
nonetheless an American best-seller. Rumi, a 13th-century Persian-language
poet, whose collections are usually highly visible at your local independent
booksellers, is the central presence in Iyer's novel. He's not a character per se,
but his poetry permeates the book, and, in the spirit of mysticism, a Sufi is
never more present than when he's absent.
In the United States and non-Muslim Europe, Sufism has become a kind of
New-Age amalgam of spiritual practices, but its roots reach back to the earliest
days of Islam. As the early Muslim conquerors took their faith to different lands,
Sufism began to borrow from different traditions, including Greek and Hindu philosophy and Christian
theology. Hence, many of the early Sufis believed that all faiths were equal, and that to privilege one
religion was to deny the existence of the divine elsewhere. As Ibn al-'Arabi, another 13th-century Sufi,
writes in his collection Stations of Desire, beautifully translated by Michael Sells:
My heart can take on
any form:
for gazelles a meadow,
a cloister for monks,
For the idols, sacred ground,
Ka'ba for the circling pilgrim,
the tables of the Torah,
the scrolls of the Qur'an.
I profess the religion of love.
Wherever its caravan turns
along the way, that is the belief,
the faith I keep.
Sufism has always had a powerful literary tradition. Part of Iyer's novel deals with Sufi manuscripts
smuggled out of Iran after the '79 revolution—which is a useful metaphor for Sufism's modern-day
transformation. Paradoxically, fundamentalism has popularized Sufism in the Islamic world while diluting
its more esoteric elements. It's true, for instance, that in Egypt during the last 20 years the ranks of the
Sufi orders have swelled. But that's not because of a renewed interest in mystical theology as such.
Rather, everyday Muslims looking for an alternative, any alternative, to militant Islam have found it only in
Sufism. Conversely, throughout Islamic history, fundamentalism has often arisen as a corrective to what it
perceives as the excesses of Sufism.
Historically, there are a couple of things about Sufism that rankle the fundamentalists. One is that Sufism,
many feel, encourages a kind of fatalism and withdrawal from the real world. The second is that Sufism
looks a little like Christianity. Sufis believe in intercessors (in Arabic wali or auliya')—people with special
spiritual access, who can help a person's prayers be heard by Allah. Mainstream Islam, never mind
fundamentalism, rejects intercessors, since it holds that every Muslim is equal before God. (Even the
prophet Muhammad is not prayed to but prayed for.) Furthermore, the tombs of important Sufi figures and
intercessors—such as that of the mystic and poet Ibn al-'Arabi in Damascus—have become shrines for
Muslim pilgrims. But contemporary mainstream Islam asserts that there are only three major shrines—
Mecca, Medina, and Haram al-Sharif in Jerusalem—and that revering others is tantamount to idolatry,
which is forbidden in Islam. That Sufism does resemble Christianity in many regards may be one reason
that it has attracted so many Western spiritual adventurers. Sufism holds that all men embody the
divine—a notion that liberal humanists presumably find at least as appealing as the creed that one man,
Jesus Christ, is a part of God. Other attractions are Sufism's poetic legacy, of which Rumi and Ibn al-'Arabi are the premier examples, and its rich tradition of music and dancing.
In Abandon, the two main characters find themselves at a Sufi dance ceremony in a remote California
spot and are deeply moved by both the frenetic energy of the dervishes and the still, quiet center of the
spectacle. Iyer's novel somewhat idealizes Sufism by highlighting the noblest aspects of its intellectual
history and doctrine of self-abnegation. Unfortunately, in the contemporary Islamic world Sufism is seldom
as philosophically generous as the major figures in its past—or even as the spiritually restless Americans
whose Sufism is an avenue into another culture. I recently got a sense of this in Cairo when I spent some
time with a Sufi group, or tariqa (a word derived from "way" or "path"), as part of an ongoing quest for a
genuinely moderate or liberal trend in Islam.
"Sufis don't believe in jihad," my friend Ahmed, a philosophy student writing on Sufism, reassured me.
"We're different from other Muslims." Sufis interpret the Quran much more liberally than other Muslims,
indeed sometimes metaphorically. Thus unlike the vast majority of Muslims who hold that jihad is both a
holy war fought to defend Islam and an offensive one to spread its reach, Sufis believe that jihad is only
about the internal battle that all human beings must wage against their own worst impulses.
"Sufis want peace with everyone," said the leader of the tariqa, Sheikh Mokhtar. "Sufis don't want war
with Israel."
Sufis don't usually express political opinions; as this opinion doesn't challenge the government's peace
treaty with Israel, it's safe. Sufis are generally focused on their own spiritual betterment, which has its
good and bad points. It's good, on the one hand, that the Sufis don't want to destroy Israel; bad, on the
other, that in Egypt, a country with very high rates of poverty, unemployment, and illiteracy, the Sufis feel
little obligation to help others. In a sense, Sufis have turned an inward philosophical stance into a
justification for ignoring the ills of their society. Or, as one Sufi told me when I asked why more Muslims
didn't practice Sufism: "Thank God they don't. We don't want heaven to be as crowded as Cairo."
This is the kind of sentiment that has made it easy for the fundamentalists to win wide popular support.
Egypt's professional elite, which makes up many of the Sufi groups, by and large despises the
underclasses for their ignorance and poverty; the Muslim Brotherhood, which is Egypt's and the rest of
the Arab world's best-known fundamentalist organization, does social work in poor neighborhoods.
Sufism as self-improvement tends, like many religious enthusiasms, toward competition. Secret
knowledge is a significant part of all mystical traditions and Sufi wisdom, issuing as it does from the heart
and not the mind, comes in stages. Not everyone is equally advanced. I was told that as a guest I wasn't
going to get any "secret knowledge." I said that as an outsider I probably wouldn't be able to distinguish
what was secret and what wasn't anyway—still, what kind of secret things did they mean? One member
of the tariqa told me that Sheikh Mokhtar, a roundish man in his early 50s, knew everything. He's a wali.
That fact didn't quite register with me until after the meeting, when I saw Muslim men reaching out like
Romans to touch the sheikh's blue suit. Ahmed and I got to meet with him briefly.
Ahmed excitedly told the sheikh that he was writing his philosophy dissertation about the differences
between Sufi tariqat. Sheikh Mokhtar was furious. "There are no differences between Sufis!" he shouted.
(We learned later that Sheikh Mokhtar was in the middle of a fight with his brother, another Sufi wali who
led a tariqa of his own. Their argument had degenerated so far that the two brothers had accused each
other of apostasy—a charge that in Islam frequently leads to a death sentence.) Ahmed was red with
public shame. The sheikh, still in a rage, told me that if I wasn't Muslim, there was little for me in Sufism.
I wasn't expecting anything from Sufism. It was worse for my crestfallen Muslim friend, who had wanted to
abandon himself to something like the philosophy of his ancestors, Ibn al-'Arabi and Rumi, who had built
a part of that world.
http://www.slate.com/id/2077467
Tune In, Turn On. A new documentary about Ram Dass shows him to be
more than your average hippie.
By Siân Gibby Posted Tuesday, April 20, 2004, at 8:24 AM PT
In 1971, a long-haired former Harvard professor known as Ram Dass first told the world to "Be here now." These words encapsulated his philosophy of existing in the present moment and would become the seminal mantra for the burgeoning New Age movement. And it was with this simple
message, as well as his yoga and meditation teachings, that Ram Dass (formerly Richard Alpert) ushered
in a new era of consciousness-raising.
Thirty years later, being here now isn't what it used to be. In the '70s, Ram Dass' audience was mostly
middle-class white kids interested in expanding their minds by whatever means—including drugs. Now
those original disciples are aging, and their bodies are failing. But the proto-New Age guru remains
relevant to this audience, especially now that he is aging himself.
In 1997, the then 65-year-old Ram Dass had a stroke that left him partly paralyzed and with vastly
diminished speech capacities. A new documentary, Ram Dass Fierce Grace (PBS, April 20, 10 p.m. ET;
consult your local listings), tells his story and claims that this phase may actually be the most spiritually
enriching and valuable one of his eventful life. Ram Dass, the film asserts, can now help people deal with
aging, infirmity, and disability by pointing the way to a higher understanding of these mysteries. The film
uses extensive footage from the 1960s, as well as interviews with Ram Dass, his family, and disciples to
show his life's arc: from precocious child to brilliant academic, from countercultural guru to the physically
broken but still lively man he is today.
The documentary professes the worthy aim of illustrating how Ram Dass' own pain has lifted him to a
new level of awareness and increased his compassion (after all, if anyone is going to learn a great lesson
about suffering from physical ailments, wouldn't it be Ram Dass?), but this notion isn't illustrated in any
concrete way for the viewer. Educating the audience, either by providing practical techniques of the sort
Ram Dass employs for transcending illness and aging or by placing his message and spiritual work in a
historical context, takes a back seat to showcasing the guru's considerable charisma and the pathos of
his situation. The end result, sadly, is a moving portrait of an old, ill man, but little else.
To understand the importance of Ram Dass' work, it's necessary to know the history of how he became
one of the country's most celebrated spiritual practitioners. Ram Dass was born Richard Alpert, the bright
and personable scion of a wealthy, influential Jewish family. Alpert taught at Harvard in the '60s, joining
his colleague Timothy Leary in "consciousness-raising" experiments using LSD (for which they were soon
fired). Unsatisfied by that research, Alpert then traveled to India where he met a man, Neem Karoli Baba,
who would become his Hindu guru. Alpert stayed on to study Hinduism with this master, was renamed
Ram Dass, and subsequently came back to the States to spread word of new techniques for spiritual
practice based on yoga and transcendental meditation (meditation using a repeated mantra, such as
"Om").
In the late '60s, America's youth was ripe for anything that would allow for the exploration of human
consciousness. Ram Dass' methods—which seemed revolutionary at the time but by today's standards
are simply practical techniques for increasing awareness—made him enormously popular. His book Be
Here Now sold more than a million copies. Since then, Ram Dass has written a number of spiritually
themed books; he has continued to teach; and in 1996, he started plans for a radio show, a project that
he was forced to set aside after the stroke.
The film lingers too long on historical footage of the hippie era and on by-now clichéd sound bites of
Leary talking about "turning on" and dropping acid. Long sequences of traipsing Hare Krishnas, swathed
in marigold garlands and singing their familiar chant, pass by without explanation. All of this, it seems, is
there to provide background, but the images raise more questions than they answer. How did the
experience of LSD and transcendental meditation compare, for instance? Were the drug experiments
worthwhile as an introduction to opening oneself up to a different level of awareness? The film fetishizes
the countercultural movement, lumping Ram Dass in among the hippie free-for-all.
The film also rather mystically (and crudely) assumes that the guru's newly truncated speech has taken
on a poetic, spiritual force. (I guiltily found myself wondering if the massive doses of drugs he had taken
in his youth might have played a part in his brain condition. This doesn't get addressed.) Though he has
lost his ability to articulate, his cryptic, sometimes baffling statements are supposed to be oracular,
infused with a wisdom born of suffering. This may be true, but it's hard to tell. We are shown scant
footage of Ram Dass talking at the height of his powers of articulation, so we can't know how diminished
his capacities are, or whether he has indeed become more insightful.
In failing to illustrate how his stroke has focused (or diminished) Ram Dass' mind, and thus his message
and teachings, the film basically just shows us an old guy, famous among the counterculture youth of
yesteryear, who is still cool and sweet and charming and who may now be more lovable than ever—
because he's suffering.
In spite of its frustrating omissions, Ram Dass Fierce Grace is worth watching. Witnessing the guru's
struggles—especially in light of his warmth, goodwill, and buoyant ego—is sobering and moving. It's also
unusual to see the frankness with which some of the people interviewed for the film speak about Ram
Dass and their own spiritual and physical pain. For example, a couple whose daughter's violent death
years ago nearly destroyed their family recall with breathtaking candor how a sympathetic letter from Ram
Dass helped them begin the healing process. The agony that suffuses their faces and voices is
shockingly fresh and raw.
Unlike the standard, sentimentalized portraits of grieving or the "empowering" survivor tales we usually
see on television, this movie shows the face of aging with a great deal of honesty and grace. If only it had
done a better service to Ram Dass himself, by placing him in historical perspective instead of in
distractingly anachronistic surroundings. There is much to say about the way in which Ram Dass'
message affected our culture: He helped usher in the New Age movement, which, notwithstanding the
outworn jokes of crystals and pyramid power, is a worthwhile development in an otherwise spiritually
bereft society. Outside the often strangling parameters of organized religion, there isn't much room in
America for the soul's questing; what scope there is was imported to us by people like Ram Dass. Indeed,
the spiritual tools he introduced to us—for centering oneself and embracing our present experience—
have stood the test of time. And now, in our culture of cynicism and violence, we need his perspective
and ecumenical outreach more than ever.
http://www.slate.com/id/2099202
Front Page Horror, Should newspapers show us violent images from
Iraq?
By Jim Lewis, Posted Monday, April 5, 2004, at 3:32 PM PT
(To see some of the graphic photos that occasioned this article—which argues against the use of
such images—click here.)
I spent a few weeks last summer in the Ituri region of the eastern
Congo, writing about the civil war there. One morning I set out with a
French wire-service reporter and a photographer for a place called Nizi
in the hills outside Bunia, where we had heard there had been a
massacre. When we arrived a few hours later, we found a mud hut
village that some Lendu tribesmen had attacked the day before.
Everyone who had lived in the village had fled, and the attackers, too,
were gone. On the ground there were piles of bodies, mostly women
and children who hadn't run quickly enough. There were 22 of them in
all, and they had been butchered with machetes; in some cases they
bore patches of bloody red and fatty orange, where their genitalia had
been hacked off or their internal organs cut out. I had heard that Ituri was the worst place in the world.
Faced with evidence of just that, and with no one left alive to interview, I walked to the edge of the village
and back, I wrote down some notes, and after that I stood around helplessly for a little while. And then, in
a daze, I took a lot of pictures.
When I came home I had the pictures developed, but then I couldn't figure out what to do with them. I
showed them to my editor, and I showed them to a few friends, some of whom looked at them carefully,
and some of whom immediately turned away. I looked at them myself, often, and I wanted very badly for
other people to see them, for a number of reasons, not all of which I'm proud of. But part of it was simply
to try to convey to people what it was like: This is what happened, and there they were, lying just like that.
As it turned out, I had to explain what was happening in Ituri anyway; and in time the pictures just seemed
to get in the way, as indeed they would if I linked to them here. I still have them, and I still look at them
from time to time, but I rarely try to show them to anyone else.
A few months before I went to the Congo, I'd had a discussion here on Slate with Luc Sante, during which
I argued that American news venues had not just the right but the duty to publish photographs of
atrocities. At the time I had, of course, seen those sorts of pictures, but I'd never taken them. Now that I
have, I'm not so sure. It's not that the public deserves to be spared such things, because they don't. It's
just that I no longer think that what happens when horrifying pictures are published has anything to do
with journalism.
The subject of photography in wartime bears revisiting because last week, for the first time, truly terrible
pictures from the Iraq war were published in the American press. To be sure, we've seen many images of
bloodshed and grief, but the pictures of the burned and mutilated bodies of American military contractors
hanging from the bridge in Fallujah crossed a boundary. I don't remember seeing anything quite so
shocking in a newspaper in my lifetime, with the exception of one or two 9/11 photographs, printed almost
by accident in the days immediately following the attacks.
Now, presumably, the news media have had thousands of photographs of the war in Iraq that were
equally graphic, if not more so, and they have forsworn publishing them. So why did editors choose to
use these? The answer, I think, is quite simple: It was convenient, and it was time. A newspaper could not
have published similar pictures of American soldiers, because there remains a taboo on showing explicit
images of our own military dead. And it would not have published similar pictures of dead Iraqis, because
there remains a disinclination to show detailed images of the effects of our own military campaign. But
these victims were American civilians working in the private sector and therefore unprotected by such
sanctions. What's more, support for the war is waning, and we're approaching a moment of transition, so
it's more in keeping with the public mood, and more revealing of the problems that lie ahead, to show
evidence that things aren't going well.
In general, the argument in favor of publishing raw and grisly photographs of war—and as I say, I once
endorsed it myself—is that they're necessary to bring home to people what's at stake, the real and
ferocious damage that combat does to cultures and to human bodies. Photographs, according to this
position, are more immediate and convincing than words. But what, really, did these pictures show? That
the people of Fallujah don't want Americans occupying their city? We knew that. That Iraqis are capable
of appalling savagery? We knew that, too. And besides, so are the members of any nation, given the right
circumstances. In fact, the Fallujah pictures looked to me very much like old pictures of white Americans
lynching blacks: in the background, the bodies mutilated and hung; in the foreground, the ecstatic crowd.
But what happened last week in Iraq was not, of course, a lynching. History, context, and culture all make
the events in Fallujah very different, but history, context, and culture are precisely what a picture can't
show, at least not one picture alone. So what the photographs tell us, most clearly, is what the press
thinks its audience can stand, and hence, how, in general, the war is going. On a literal level they show
almost nothing of any value whatsoever, except perhaps that something gruesome happened; and any
reporter could tell you that, as indeed many of them have. The pictures may seem to make the death of
those Americans more vivid, but in the end, they make it a little harder to see.
By coincidence, the same day that the New York Times ran the Fallujah pictures on the front page and
above the fold, it also ran a story about a Bronx man who had shot himself in the head in the lobby of an
apartment building. A police security camera caught the suicide on video, and somehow the footage
ended up on a Web site that specializes in "free video clips that include shocking moments, brutal
stupidity, and a healthy dose of hardcore sex." The man's foster mother was trying to find out how this
had happened: how the death of someone she loved had been turned into a kind of pornography.
That's a good question. The answer, I think, or part of it, is that pictures of extreme violence are always a
kind of pornography: There is an outrageous event. There is the fact that someone was present with a
camera to record it. There is the fact that someone had the means and the will to publish it. And there is
the fact that you and I are looking at it. The more surprising it is that these four things should come to
pass, the less the event we're looking at retains any of its original meaning. So pictures of horror, at a
certain point, no longer function as news; they become, themselves, the news. The papers that have run
the photographs from Fallujah have been getting scores of letters from readers, some appreciative, some
not. But that's just the problem: It's a fine thing when readers think about what their local papers do, but
on the whole, one would rather they were thinking about Iraq.
There's a certain kind of vaguely puritanical humanism that suspects pictures of being inherently
sensationalistic and manipulative—the idiom of the rabble, which must be kept subordinate to the natural
calm reason of words. I don't believe that. But shocking photographs become horror-porn very quickly
and very easily, more quickly and easily than language does. A picture can do many things that a
paragraph can't, but enlightening us about the horrors of war isn't one of them.
Last month I was working with a photographer in Pakistan, and one night I asked him over dinner if he'd
ever taken a picture that no one would run. Yes, he said: It was of two soldiers walking down the road,
one of them carrying the top half of a child who had been severed in two by a bomb, while the other
carried the bottom half. He told me which conflict of the many he had covered had produced this image,
but I don't remember where it was, and I suspect that the fact that I don't remember goes some way
toward explaining why no one would publish it—and why, with all due respect to this man, who is deeply
ethical, professional, and skilled, I don't think I would publish it either. Because shock overwhelms
information every time. I realized as much when I came back from the Congo and discovered that I
couldn't get my own photographs to show what, in the end, I wanted them to show: nothing more or less
than the fact that these poor people had died.
http://www.slate.com/id/2098283
Hell, No! We Weren't Wrong To Not Go!
Mickey Kaus, Posted Saturday, May 5, 2001, at 11:52 AM PT
One implication of the Kerrey revelations is so simple and obvious that it has somehow gone more or less
unacknowledged in the controversy. It's this: What Kerrey did in Vietnam (even in the least damning
version of events) doesn't just remind us what was wrong about the war. It gives retrospective moral
support to the decision by most of Kerrey's generational peers--myself included--to avoid combat service
in Vietnam.
New York Times columnist William Safire anticipated this point when, a few days after the Kerrey story
broke, he wrote a pre-emptive piece caricaturing and condemning the expected resurgence of anti-war
sentiment: "The American elites that ducked the draft were right to refuse to get involved in somebody
else's civil war, goes this voice." But it turned out Safire was trying to hoot down a voice that was never
heard. I haven't read all the Kerrey commentary, but I've read a lot of it--and I haven't seen anyone claim
that the incident at Thanh Phong shows why they were right to avoid fighting in the war.
It's not hard to explain the absence of this particular I-told-you-so. There may well have been a period,
during the '60s, when returning Viet vets were shunned as "baby-killers"--though I suspect that
phenomenon has been exaggerated, and Kerrey's example suggests that at least some of the vets may
have been projecting their self-condemnation onto others. In any case, the tables turned very quickly. As
early as 1975, the year the war ended, journalist James Fallows published an influential article in which
he condemned his own draft-dodging, noting that while he was a student the boys of "Chelsea and the
backwoods of West Virginia" were sent off to die. The moral course for those who opposed the Vietnam
War, he argued, was to go to jail.
Fallows initially advocated such civil disobedience as the most effective way to stop the war (it would
have produced a huge parental outcry). But that anti-war rationale soon fell away in the general desire to
honor those who served and, implicitly, condemn people who avoid their duty to the nation. Millions of
baby-boomers who ducked the war began to feel guilty, even envious of those who'd fought it. Bill
Clinton's escape from Vietnam-era service became a major liability, and not just because of the
manipulative way he did it. In this atmosphere, who wants to risk being charged with dishonoring veterans
(or scorned as a cowardly, elitist Boomer) by noting that Kerrey's admission casts the actions of men like
Clinton in a more favorable light?
Just because a course of conduct saves your ass doesn't necessarily mean it's wrong. I passively
avoided service in Vietnam (accepting a student deferment before escaping the draft in the lottery). I did it
mainly because, like the 90-plus percent of my draft-age peers who also avoided service, I didn't want to
get killed. But I also did it because I didn't want to kill women and children (or other men) without a better
reason than I'd been given. Kerrey's story reminds me that these qualms were justified, and that I
shouldn't be ashamed of my behavior. His description of the tacit rules of engagement in "free-fire zones"--in which "standard operating procedure was to dispose of the people we made contact with," even if they were entire peasant families whose main crime was failing to obey our order to move out of their villages--is enough. (See the discussion of this point here and here.)
To be sure: I'm not saying Kerrey wasn't also a hero. I'm not saying it was wrong to go to Vietnam, or
dishonorable, or that those who served weren't obviously more courageous than those who avoided
serving. I'm not saying dodging service was courageous at all, or in any way more honorable. I'm not
judging Vietnam vets, including Kerrey (assuming his version holds up). I'm not saying the war wasn't
based on an honorable anti-communist goal, or that this wasn't a goal worth fighting for. I'm not saying
that many of us on the Left who opposed the war weren't grievously mistaken about the nature of the
Vietnamese and Cambodian communists. I'm not saying it was right that those who went to Vietnam
tended to be poorer and less well-connected than those who avoided service. I'm not saying it was right
that people like me escaped any form of national service, even civilian service. I'm not saying Fallows'
original point--that if I really opposed the war I should have gone to jail--doesn't have merit.
I am saying that the Thanh Phong story reminds us that avoiding serving in Vietnam had an honorable
and realistic ethical basis (in addition to its realistic selfish basis). Patriots like Kerrey himself, after all,
quickly came to the conclusion that the Vietnam War was an unjustifiable moral hell. We listened to them.
http://slate.msn.com/id/1007621
Dr. Hofmann's problem child turns 58
It started causing trouble as a teen and has never really stopped.
We can't name names, but its initials are LSD.
By Chris Colin
April 16, 2001 | A hundred and one years ago today, the first
book of postage stamps was issued in the United States. Exactly
43 years after that, on a Friday afternoon -- April 16, 1943 -- Swiss chemist Albert Hofmann took the world's first LSD trip.
We now know a lot. We know about the epiphanies, and the paranoia and the visions. We know that the
CIA conducted secret LSD experiments on U.S. citizens, the military and its own agents for years. We
know that there was briefly talk at the CIA of testing LSD on unsuspecting subway riders. We know that
the public got its hands on the drug in the '60s and reconfigured it as a countercultural apparatus. We
know that some users really did believe, at inopportune times, that they could fly. We also know LSD's
dangers were often exaggerated by the government, and then the media.
We know acid's popularity waned in the late '70s and '80s, then picked up in the '90s. In 1993, 3.2 million
Americans said they'd used it, according to the Drug Enforcement Administration. We know it makes
police and drug enforcement agents see red -- since 1981, they've busted only a handful of labs, and the
majority of their arrests have been of mere middlemen.
Hofmann called LSD his "problem child." Soon enough, it was America's, and as with all problem children,
we've yet to hone a good response.
On March 30, San Francisco artist Mark McCloud was acquitted of LSD charges in Kansas City, Mo.
McCloud, 47, makes blotter art -- the prints found on sheets of LSD tabs -- minus the LSD, according to
McCloud and a Kansas City jury.
Police seized 33,000 sheets of blotter paper, plus some framed samples, from McCloud's home,
according to the Kansas City Star. He was accused of conspiring to distribute and distributing the drug
near a school. But McCloud's lawyer pointed out that the blotter paper was untreated and therefore legal -- never having been impregnated with d-lysergic acid diethylamide, it was like a pot pipe with no pot in it.
In the past, McCloud has joked that collecting blotter art is tricky, since it's tempting to eat it. A small,
personal stash of the drug was found in his apartment during the raid. He says he hung out with Timothy
Leary and tripped with '60s LSD pioneer Augustus Owsley Stanley III. McCloud likes acid. Nevertheless,
given the well-established history of LSD hysteria -- recall, for example, the rumor that evil acidheads had
wiped LSD on the buttons of pay phones across the country -- the prosecution argued with all the
paranoia of a bad trip.
"Mark McCloud was the head of an LSD conspiracy that gained a new generation to the cause of LSD,"
assistant U.S. attorney Mike Oliver said. "It is a cause that creates in people the capacity to make life-threatening and life-stealing decisions."
When McCloud's reputation in the art world was trumpeted -- an editor from Paper magazine flew in to
testify on the two-time National Endowment for the Arts grant winner's behalf -- Oliver turned it around,
saying the artist used this reputation as a cover to disguise his countrywide distribution operations.
"This was about the circulation of blotter art as an art form," McCloud said, according to the story in the
Star. "Thank God the people of Kansas City can tell the difference between art and LSD."
Not so long ago, McCloud wasn't that keen on people telling the difference.
"Art is not a big enough word for it," the one-time art professor told the Los Angeles Times about his
collection back in 1995. "It's magic."
But with a life sentence looming, "art" became the perfect word for it. McCloud had a decent lawyer, and
the "magic" argument didn't surface once in the two-week trial. When it became clear that the prosecution
could link him to the empty blotters but not the LSD, McCloud watched the case unravel.
Even when they're not facing prison time, people don't stand up for LSD like they used to. Its current
renewed popularity notwithstanding, the days of earnestly arguing the drug's virtues are gone. Any
discussion about the chemical as a legitimate mechanism for exploring consciousness, treating terminally
ill patients or working with habitual criminals has long since been mocked into farce by either those
opposed to it or those now embarrassed by their youthful naiveté.
Before LSD was criminal and foolish or visionary and therapeutic, it was an accident that startled a young
chemist. Albert Hofmann had been researching ergot alkaloids at the Sandoz (now Novartis) company's
pharmaceutical-chemical research laboratory in Basel, Switzerland, when, during the purification and
crystallization of lysergic acid diethylamide, he noticed a strange sensation. He described it for a
colleague later in a widely quoted note, inaugurating the illustrious and indulgent genre of the acid trip log:
Last Friday, April 16, 1943, I was forced to interrupt my work in the laboratory in the
middle of the afternoon and proceed home, being affected by a remarkable restlessness,
combined with a slight dizziness. At home I lay down and sank into a not unpleasant
intoxicated-like condition, characterized by an extremely stimulated imagination. In a
dreamlike state, with eyes closed (I found the daylight to be unpleasantly glaring) I
perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with
intense, kaleidoscopic play of colors. After some two hours this condition faded away.
Three days later, Hofmann dosed himself deliberately, swallowing 250 micrograms. From that day comes
the famous bicycle ride through Basel, and his subsequent glimpse of the true powers of the drug, which
included terror:
A demon had invaded me, had taken possession of my body, mind, and soul. I jumped
up and screamed, trying to free myself from him, but then sank down again and lay
helpless on the sofa. The substance, with which I had wanted to experiment, had
vanquished me ... I was seized by the dreadful fear of going insane. I was taken to
another world, another place, another time. My body seemed to be without sensation,
lifeless, strange. Was I dying? Was this the transition? ... I had not even taken leave of
my family (my wife, with our three children, had traveled that day to visit her parents, in
Lucerne). Would they ever understand that I had not experimented thoughtlessly,
irresponsibly, but rather with the utmost caution, and that such a result was in no way
foreseeable?
In the years that followed, Hofmann and others explored LSD's potential as, among other things, an aid to
psychotherapy. While the military conducted awful (and long-denied, and at least once fatal) experiments,
the medical community wrote volumes about the drug's potential benefits. (Before LSD became illegal, it
was tested on more than 40,000 subjects by psychologists alone, according to Martin A. Lee and Bruce
Shlain in their book "Acid Dreams: The CIA, LSD and the Sixties Rebellion.")
Research extended beyond the unsuspecting to the rich and famous. Anaïs Nin, André Previn and Clare
Boothe Luce all participated in LSD studies. For a while, Cary Grant couldn't say enough about his
treatment. Leary claimed that, contrary to popular opinion, it was Grant who turned him onto acid. In
1959, Grant raved about his LSD therapy:
I have been born again. I have been through a psychiatric experience which has
completely changed me. I was horrendous. I had to face things about myself which I
never admitted, which I didn't know were there. Now I know that I hurt every woman I
ever loved. I was an utter fake, a self-opinionated bore, a know-all who knew very little. I
found I was hiding behind all kinds of defenses, hypocrisies and vanities. I had to get rid
of them layer by layer. The moment when your conscious meets your subconscious is a
hell of a wrench. With me there came a day when I saw the light.
But acid's popularity took off just as the '60s did, and Leary, his research partner, Richard Alpert (aka
Ram Dass), and others got loud about it. Eventually it didn't matter whether or not they were right.
"He was a very intelligent man, and quite charming," Hofmann said of Leary in a 1995 interview.
"However, he also had a need for too much attention. He enjoyed being provocative, and that shifted the
focus from what should have been the essential issue. It is unfortunate, but for many years these drugs
became taboo."
Ironically, it's the taboo that gave McCloud his hobby -- if the acid proponents had succeeded, his
collection would be commonplace. They didn't, and it isn't; McCloud has three decades' worth of blotter
art and shows no signs of stopping.
"These are symbols of a secret society," he told the San Francisco Weekly in 1995. "And I am the mule
that has dragged the cultural evidence to town."
To McCloud, who emigrated from Argentina as a boy, blotter art confirms the integrity of the acid
pantheon. The variety of prints, he says, isn't essential; if the LSD industry were just about money, the
manufacturers wouldn't bother prettying up their product. (Maybe, but the significance of blotter art has
also been exaggerated. Aficionados allude to the impact an image will have over a trip: Ganesha will
frame your experience in mysticism, while Daffy Duck's mug will give you a silly trip. Of course in reality,
while LSD does vary from sheet to sheet, the different images have nothing to do with what will happen
next.)
The Kansas City acquittal wasn't the first for McCloud -- in 1993, he was released on similar charges.
Since then, he claims, the drug enforcement agencies have been after him.
"An Interpol agent came to me with fake blotter paper made at great expense by Scotland Yard. They
were trying to trap me," he once told the British magazine Loaded. "Said they wanted to raise money for a
friend of mine on a similar charge. I could see through them straight away. When you do what I do, you
learn to be careful."
There are McCloud fans out there who hope he'll be more careful now. Those who like him love him.
Indeed, his collection is occasionally beautiful, and is certainly impressive in scope. As folk art, nothing
around captures this particular sensibility better. Yet at the same time, the Kansas City trial brought out
something sad in his art. LSD culture, often vapid and sometimes heroic, has been reduced to its vessels.
For two weeks, lawyers argued about the blotter paper that LSD comes on. This, it would seem, is what's
left. Hofmann, the other eloquent scientists, the activists and the fools have all been lumped into one
kooky old pile. With acid's marginalization, all that remains in the spotlight is some overwrought, stamp-size nostalgia. Maybe LSD was never not on the margin, but the margin didn't use to seem so far away.
Still, taking a page from Hofmann, Leary, Ken Kesey, Abbie Hoffman and half a dozen other
psychonauts, McCloud has managed to scrape together his share of charm.
"Mr. McCloud, I told you I didn't want any reaction in the courtroom," U.S. District Judge Gary Fenner
scolded when the just-acquitted defendant jumped up to greet the jury.
According to the Kansas City Star, McCloud shrugged.
"I sure would have loved to have given that jury a hug," he said. "Where I come from, you thank the
people who set you free."
salon.com
http://archive.salon.com/people/feature/2001/04/16/hoffman/index.html
Teen Spirit, What was so important about the Beatles' appearances on The
Ed Sullivan Show?
By Fred Kaplan, Posted Friday, Feb. 6, 2004, at 1:09 PM PT
It may be impossible for anyone who wasn't living at the time to grasp how much the country changed 40
years ago this Sunday. On Feb. 9, 1964, at 8 p.m. ET, the Beatles appeared on The Ed Sullivan Show.
Everyone knows the rough outlines: the Fab Four mop-heads from Liverpool,
their journey to America, thousands of teenagers screaming in the streets, the
subsequent "British invasion," and the transformation of rock 'n' roll.
But Americans under, say, 40 have had to take the historic importance of
these events on faith. Listening many years after the fact to those early
Beatles songs ("I Wanna Hold Your Hand," "She Loves You," "Please Please
Me," and so forth), they must have wondered—must still wonder—what the
fuss was about. These are fairly sappy tunes, compared with what followed,
from not only the Beatles but other bands.
A new DVD clears up the generational mystery. The two-disc set is called The Four Complete Historic Ed
Sullivan Shows Featuring the Beatles, an inelegant but meticulous title. It contains all four Sullivan shows
on which the Beatles made live appearances—the three Sundays in a row in February '64 (Feb. 9, 16,
and 23) and their return on Sept. 12, 1965—not just the parts with the Beatles but each hourlong program
in its entirety, commercials included.
Not only is it a fascinating time capsule (and a teary piece of nostalgia for those of us who well remember
the broadcasts from our youth), it also provides an unvarnished picture of the popular culture of the era—
and, thus, the impact that the Beatles had on it.
To a degree that young Americans couldn't comprehend today, The Ed Sullivan Show was American
popular culture. More than 50 million Americans—over half of the TV-viewing audience at the time—
tuned in to it on CBS every Sunday night. (More than 70 million watched on the night of the Beatles'
debut.) It was a variety show like no other, with animals, acrobats,
puppets, plate-twirlers, stand-up comics, nightclub singers, scenes from the latest hit musicals and
ballets—all the acts personally selected by this odd-looking, odd-talking, otherwise untalented ex-gossip
columnist.
As John Leonard put it in his wonderful essay, "Ed Sullivan Died for Our Sins," "Never before and never
again in the history of our republic would so many gather so loyally, for so long, in the thrall of one man's
taste." In an age when televisions had only three channels, Leonard noted,
Ed Sullivan was a one-man cable television system with wrestling, BRAVO and comedy
channels, Broadway, Hollywood and C-SPAN, sports and music video. We turned to him
once a week in our living rooms for everything we now expect from an entire industry
every minute of our semi-conscious lives.
Watching these shows now on DVD, we reach one conclusion very quickly: Most of the stuff on
"Sullivan" was crap. And the stuff that wasn't bad (and some of it wasn't) was, for the most part, very
old—not old in the sense of having been aired 40 years ago, but old hat, even at the time, relics of the
Borscht Belt: hack impressionists, dreary puppets, lame parlor magicians, and mediocre starlet-singers.
(Who remembers that Mitzi Gaynor—or "Hollywood's delightful Mitzi Gaynor," as Ed introduced her—had
such a lousy voice?)
Everybody liked this stuff back then. I remember liking it, too. That's all there was. There was no concept
of an alternative.
That's why the Beatles were such a big deal. From the moment they strummed those electric chords,
wagged their mops of hair, and smiled those beaming, ironic, isn't-this-cool-but-also-a-bit-absurd smiles,
we all knew it was something from a different galaxy. (And, given how rarefied foreign travel was then,
England might as well have been in a different galaxy.)
A slew of clueless scholars and columnists have mused, over the decades, that the Beatles caused such
a sensation because they snapped us out of the gloom brought on by the Kennedy assassination, which
had taken place the previous November. This is silly sociology. Look at these DVDs or at any footage of a
Beatles concert or a Beatles mob. It's extremely doubtful that any of these teenage girls were cheering,
screaming, palpitating, even crying with joy as some sort of catharsis to their anguish over Lee Harvey
Oswald's deed in Dallas. Meanwhile, their parents, who were the ones more likely traumatized by the
death of the president, remained tellingly immune to Beatlemania.
The Beatles took hold of our country and shook it to a different place because they were young, because
their music had a young, fresh feel, and because—this is the crucial thing—our parents didn't get it.
Ed Sullivan didn't entirely get it, either—and why should he have? He was even older than our parents.
Legend has it that, on a trip to England a few months earlier, Ed saw the commotion the Beatles were
causing and thought he'd book the lads on his show as a novelty act—until their manager, Brian Epstein,
insisted on top billing. You can imagine Ed thinking: Top billing for these kids? Above Frank Gorshin,
Myron Cohen, Gordon and Sheila McRae? Above Hollywood's delightful Mitzi Gaynor?!
The day after that Sullivan show, every boy came to school with his hair combed down as far as he could
manage (which, in most cases, wasn't very far). Some went out and bought Beatle wigs. Or saved up to
buy a guitar and then got together with friends to form a band. And this was OK, as long as you didn't play
too loud. The Beatles' rebelliousness was playful, not menacing. (Ed frequently praised them, in his
introductions, as "fine youngsters.") Their sexuality had an androgynous element—that long hair and such
pretty faces (except Ringo, the funny mascot of the group). They were a palatable transition to the truly
menacing figures to come—the Rolling Stones (who weren't booked by Sullivan till 1967 and, even then,
were forced to change the lyrics of "Let's Spend the Night Together" to "Let's spend some time together"),
later punk rock, and beyond.
The timing of the Beatles was perfect. 1964 marked the emergence of the Baby Boomers as a social
force—and the Beatles were the vehicle for their ascendance as a cultural force. What records were the
No. 1 hits on the pop charts before the Beatles took over the slot and stayed there for years to come?
Bobby Vinton's "There! I've Said It Again," the Singing Nun's "Dominique," and Dale & Grace's "I'm
Leaving It Up to You." The Beatles changed the charts forever. You can draw a line in the historical sands
of popular culture at 1964. A lot of pop music that came after that point still sounds modern today. Almost
all the pop music that came before that point sounds ancient.
On Feb. 9, 1964, The Ed Sullivan Show was the stage on which this change was dramatized. The
Beatles were the young and the new; almost all the other acts were the old and the stale. That night, at
least to every kid I knew, the future looked clear, happy, and ours.
http://slate.msn.com/id/2095079
Tales of Hoffman, Steal This Movie! turns Abbie Hoffman into the civil
liberties hero he never was.
By Jared Hohlt, Posted Monday, Aug. 21, 2000, at 6:30 PM PT
Abbie Hoffman always wanted a movie made out of his life. Before becoming a
political activist, he briefly ran an art-house cinema in his hometown of
Worcester, Mass. Once famous, he hung out with actors and directors in Hollywood, and he titled his
1980 autobiography Soon To Be a Major Motion Picture (Universal optioned and then dropped the book).
Now, 11 years after Hoffman's suicide at the age of 52, Steal This Movie! (a reference to Hoffman's
famous manual on how to rip off the system, Steal This Book) has arrived.
Hoffman's longtime lawyer Gerald Lefcourt (played by Kevin Pollak in the movie) served as associate
producer, and Hoffman's ex-wife Anita consulted on the script and visited the set. (She died of breast
cancer in 1998.) Not surprisingly, the authorized-by-proxy biopic celebrates Hoffman as an anti-war
protest leader who promoted countercultural values and was harassed by the U.S. government. But not
until Hoffman goes underground following his 1974 arrest for cocaine dealing does Steal This Movie!
depart seriously from the facts. In a wholly fictional subplot, the movie teams the fugitive Hoffman with an
investigative reporter who helps him expose the FBI's ongoing covert war against the left.
The film begins with the investigative reporter visiting Hoffman (Vincent D'Onofrio) in the underground,
and his questions set off a series of extended, true-to-life flashbacks: Hoffman disrupts the New York
Stock Exchange in 1967 by throwing dollar bills to eager brokers on the floor; he attempts to "levitate" the
Pentagon along with other anti-war protesters later that year; he leads the Yippies (the Youth
International Party), the politicized group of hippies he co-founded, during their demonstration at the 1968
Democratic National Convention in Chicago. In the film, as in life, Hoffman and compatriots nominate a
pig named Pigasus for president and also end up in a bloody, nationally televised fight with police over
control of a city park. Steal This Movie! quotes heavily from the court transcripts of the subsequent
Chicago Seven trial (1969-70), at which the ill-behaved, judge-defying defendants—Hoffman, Jerry
Rubin, Tom Hayden, and others—were convicted for conspiring to cause a riot and were also sentenced
for contempt of court. (The Chicago Seven trial was the Chicago Eight trial until the judge separated
Black Panther Bobby Seale's case from the others. Seale is in the celluloid courtroom, but this bit of
history doesn't make it into the film.) As Steal This Movie! notes, the convictions were ultimately
overturned.
Other minor historical inaccuracies: The film depicts Hoffman scaling a wall during the Pentagon
demonstration and getting clubbed by military police. Jonah Raskin, Yippie minister of education and
author of For the Hell of It: The Life and Times of Abbie Hoffman, says in an interview that the assault
never happened. The movie pretends that the Chicago Seven defendants were united when they weren't.
According to Raskin, Hoffman joked that it would be "cruel and unusual punishment to be in the same cell
as Tom Hayden," and Hayden "wanted the trial to be about the war in Vietnam, [while] Abbie wanted it to
be about the cultural revolution." The movie also ignores Hoffman's role at 1969's Woodstock, where he
took the stage during a Who performance to make a political speech but was driven off by a knock to the
head from Pete Townshend's electric guitar. The Vietnam War fades from the movie's view after the
Chicago Seven trial, although it remained the country's most volatile political issue even after the
drawdown of U.S. troops began in 1969. Nor does the movie recount Hoffman's attendance at the 1972
Democratic National Convention in Miami. "Abbie and Jerry [Rubin] were greeted as heroes," says his
friend Jay Levin in Larry Sloman's oral biography of Hoffman, Steal This Dream.
The film then charts Hoffman's fall from radical favor, blaming it on government disinformation. Indeed,
the government did plot against him, but Hoffman hadn't been a hero to everyone on the left. Feminists
and others were fed up with his macho, look-at-me agitprop antics. In the radical newspaper Rat (1970),
Robin Morgan wrote: "Goodbye to the notion that good ol' Abbie is any different from any other up and
coming movie star … who ditches the first wife and kids. … Women are the real left." (Before coming to
New York, Hoffman was married and had two kids with his wife Sheila, whom he later divorced. A chapter
of his autobiography is titled "Faulty Rubber, Failed Marriage." The Sheila period isn't in the movie.)
The film doesn't gloss over Hoffman's numerous romantic liaisons, as wife Anita (Janeane Garofalo)
acknowledges his extramarital adventures in the film. But it does downplay his drug habit. At one point
the movie Anita tells Jerry Rubin, "You know Abbie didn't have a drug problem." But a friend quoted in
Steal This Dream remembers that "Abbie really loved coke, it was his drug of choice."
Although Raskin says that Hoffman dealt cocaine for about two years before his 1974 arrest, Lefcourt
alleges that Hoffman's bust was the unfortunate result of a "lark": Hoffman, perhaps with an eye toward a
book, wanted to explore some outlaw activity before New York state's new strict Rockefeller drug laws
went into effect. In the film, Anita also says the drug deal was part of the research Hoffman was doing for
a book and that the FBI "used drugs as an excuse to bust Abbie for his politics."
In both the movie and the true story, Hoffman skipped bail after his arrest, though he remained in contact
with Anita and their son, america. He was also diagnosed as manic-depressive and put on lithium. And
he fell in love with Johanna Lawrenson (Jeanne Tripplehorn), settling with her in upstate New York,
where, under his alias Barry Freed, he became a leader in the successful enviro campaign to save the St.
Lawrence River from winter dredging.
The movie distorts the record by mostly presenting Hoffman's life underground as one of isolation and
deprivation, lived in cheap motels while working short-order-cook jobs. He did live that way. But he also
lived in Mexico for a while, where he had a house with a swimming pool and horses; he and Johanna
went to Europe, where they pretended to be food critics so they could eat at fancy restaurants; and he
spent time in Los Angeles with people like Jack Nicholson and producer Bert Schneider. As a 1980
Washington Post article noted, "While underground, he would call the Associated Press and the local
gossip columnists if he did not like a story. He had interviews in Playboy and New Times magazines. He
breezed into New York to have a birthday party with friends at an expensive Chinese restaurant and to
autograph his new book in a bookstore downtown." Reporters had no trouble reaching him. The Post
reported, "Local leftwing operators always had his underground area code and dial information. Fifteen
minutes or two days later, Hoffman would get in touch."
According to Steal This Movie!, Hoffman broke his five-year silence underground to single-handedly
expose COINTEL, the FBI's secret campaign to disrupt the American left, with the aid of investigative
reporter David Glenn (Alan Van Sprang). As the cinematic Hoffman prepares to leave the underground
and take his chances with the law, he spies a newsmagazine with "COINTEL" splashed across the cover
and says, "Glenn came through after all. COINTEL is going to die."
This scene, set in 1979 or thereabouts, is preposterous. David Glenn is a purely fictional creation, and
Hoffman played no role in exposing COINTEL, which came to light in 1974 and was the subject of
congressional hearings in 1975. Clearly, the FBI was still monitoring Hoffman and his friends while he
was underground, though there is some debate as to how heavily. But by 1979, there was little to nothing
to uncover about COINTEL. By adding a COINTEL "exposé" to Hoffman's résumé, the movie gives
counterfeit political purpose to his underground years.
After surrendering to authorities in 1980, Hoffman spent a few months in prison and a little less than a
year in work-release. He remained politically active until his suicide—protesting U.S. involvement in
Central America, speaking on college campuses, and campaigning for the environment. The movie
doesn't dwell on the '80s, but ends with Hoffman successfully defending himself in court after he and
some Amherst students are arrested for protesting the CIA's actions in Nicaragua. (The suicide is
mentioned in a postscript.)
Steal This Movie!'s most egregious distortion is one of omission: It largely fails to capture Hoffman's
comic anarchism. In real life, he was always engaging in put-ons: He once made a speech saying he'd
"fucked" Spiro Agnew's daughter (a claim the FBI actually bothered investigating), and during the 1968
convention he threatened to spike Chicago's drinking water with LSD. (Skeptical police were forced to
guard the reservoirs.) While underground, Hoffman would send taunting letters to the FBI on stationery
from various hotels. Hoffman also loved to tell the story of how, as a fugitive, he'd pretended to be a
tourist and gone on a guided visit of the FBI headquarters. We'll probably never know whether or not that
story is true.
http://slate.msn.com/id/88471
Talking 'Bout His Generation, Paul Berman reconsiders 1968.
By Alan Ehrenhalt, Posted Tuesday, Aug. 13, 1996, at 12:30 AM PT
A Tale of Two Utopias: The Political Journey of the Generation of 1968, By Paul Berman
In a logical world, the number of books written about a political movement would have some rough
connection with its ultimate importance. The noisy crusades that became historical footnotes would all be
literary footnotes as well.
But we are living in a different sort of world, one in which it is possible, for example, to fill up the better
part of a library with volumes written by, and about, the leftists in Manhattan in the 1930s. Anything you
want to know about the sectarian ideological warfare of these quarrelsome people--Socialists, Stalinists,
Trotskyites, Shachtmanites, Lovestonites--you can find in masochistic detail.
That's not because their arcane disputes held any fascination for most Americans at the time. Nor is it
because of their impact on the decades that followed. I have never heard anyone suggest that the key to
understanding American postwar politics lay in the arguments among the competing factions at CCNY in
1937 or their subsequent recreations on Riverside Drive. The truth is that 1930s leftism became a literary
genre because those who indulged in it enjoyed writing books, and were also willing to buy them.
It would be a mistake to conclude that, as those leftists depart from the scene, the tradition of political
verbosity will die out. Rather, a new generation of aging leftists--the campus radicals of 1968--is
emerging, and off to an appropriately windy start. If you happen to be interested in the internal
factionalism of SDS in 1965 or Tom Hayden's Port Huron statement of 1962, a large collection of books is
available to help you. Or you can purchase a two-volume account of the Sorbonne uprising in Paris,
compressed into a tidy 1,311 pages.
Now there is another book, A Tale of Two Utopias, by Paul Berman, a veteran of the 1960s leftist wars
who writes for The New Yorker, the New Republic, and other organs of the nonmilitant journalistic
mainstream. The difference between Berman's volume and most others I have seen on the subject is that
his is an entertaining book, readable and unpretentious, a little softhearted toward his old comrades, but
more than willing to take them to task for their absurdities.
Of all those absurdities, one stands out in high relief, and also links the
leftists of the 1960s to those of the Depression years. It is their consistent overestimation of their own
significance. That the student activists of Berman's generation believed a political revolution to be within
their grasp may seem a little puzzling from the vantage point of 30 years, but is perhaps understandable
as a product of youthful exuberance: Berman himself lapses into a romantic nostalgia when he describes
what it felt like to be in the vanguard of something new and magnificent--in politics, in culture, in the way
human beings treat each other.
"A utopian exhilaration swept across the student universe," he writes on the first page. "Almost everyone
in my own circle of friends and classmates was caught up in it. ... Partly it was a belief, hard to remember
today, that a superior new society was already coming into existence. And it was the belief that we
ourselves--the teenage revolutionaries, freaks, hippies and students--stood at the heart of a new society."
This was, as Berman readily concedes, an illusion. But what can it claim for its legacy, a generation later?
A Tale of Two Utopias is a lucid, three-pronged attempt to answer that question.
The first of Berman's quests for a larger meaning--in the experiences of the Americans who led the
Students for a Democratic Society and the other leftist political organizations of 1968--is not a pretty one.
By the end of 1969, the idealism of the SDS had degenerated into personal bickering and a flirtation with
Maoist terrorism that could be written off as mere goofiness had it not resulted in the loss of innocent lives
on both sides of the Atlantic.
That the radical enthusiasms of 1968 left no permanent political imprint upon the Western democracies is
a conclusion that a simple electoral history of the ensuing decades, the decades of Reagan, Thatcher,
and Kohl, would seem sufficient to prove. Berman sadly acknowledges it. But the politics of the 1960s
were a challenge to culture as well as politics. Berman locates his second search for meaning in the "gay
awakening" that had its symbolic start in the 1969 protest at the Stonewall bar in Manhattan. The students
who the year before had marched and chanted and proclaimed "it is forbidden to forbid" had been arguing
for liberation of the oppressed of virtually every kind, and when it comes to the sexually oppressed,
Berman says, the last two decades can be seen as a victory. "There is reason to think that on the matter
of homosexuality," he writes, "some small but important aspect of human personality has begun to
change."
That may be true, but the connection between the gay awakening and the student rebellion of 1968 is an
oblique one. Gay liberation, like feminism, is central to the whole individualist ethos of the last two
decades. But it doesn't exactly count as a trophy from the barricades. The world would have conceded
the claims of both gays and women without the student occupation of a single lecture hall anywhere in the
world.
There is one more search for meaning, and it takes Berman across the
ocean. Adam Michnik was arrested in a demonstration in Warsaw in
February of 1968, went on to become the leading theorist of the
Solidarity protest movement of the 1980s, and survived to take up a
role as middle-aged statesman in the Polish political world that
succeeded the collapse of communism in 1989. Václav Havel was in
New York in the spring of 1968, participated in the student strike at
Columbia, joined Alexander Dubcek in the short-lived liberal uprising in
Prague that summer, and became the president of Czechoslovakia in
1990. Shortly after taking office, he entertained Frank Zappa on a state
visit.
Perhaps, Berman speculates, Havel and Michnik represent the true
legacy of the worldwide revolutionary sprouting of 1968. Unlike the
students of Columbia or the Sorbonne, they persevered, and they won--something. But what?
The peaceful revolutions in eastern Europe in 1989 were indeed a
defeat for the forces of illegitimate authority. That they were the incarnation of the dreams of 1968 seems
a stretch, to say the least, even if some of the important players came out of the original cast. In the end,
Berman does not make such a claim. He ponders the hopeful predictions of Francis Fukuyama that
Western civilization is moving gradually, if fitfully, toward greater individual freedom. But he closes with
the darker speculations of André Glucksmann, a leader of the Sorbonne rebels of 1968 and a later
convert to conservatism, that the utopian enthusiasm of those years simply did not take sufficient account
of the fundamental reality of evil in the human psyche.
And so, at the end of his odyssey, Berman finds a bit of meaning, but little reassurance. How does the
world feel as the 20th century comes to a close? "The world feels this," Berman declares: "humble,
skeptical, anxious, afraid, shaken."
Maybe so. On the other hand, it is an oddly grandiose conclusion to an appealingly modest book. Berman
presents almost no evidence about the way the world is feeling. What he tells us is how the student
rebels of his own generation are feeling, and that they are not feeling very well. In magnifying the global
significance of their discontents, Berman ultimately succeeds in reinstating his lifetime membership in the
class of 1968.
http://slate.msn.com/id/2947
The Apes of Wrath, The radical political history of Planet of the Apes.
By Alex Abramovich, Posted Friday, July 20, 2001, at 5:30 PM PT
The tagline for Tim Burton's upcoming ape movie is "Rule the Planet,"
but given the cavalcade of products pegged to the film's release—
PotA action figures, baseball caps, binders, board games, comic books, fanny packs, key chains, lunch
boxes, scooters, skateboards, stickers, trick-or-treat bags, and temporary tattoos—"Rule the Playground"
might be more appropriate. Doubly appropriate, since the new iteration of Planet of the Apes started not
as a movie but as a marketing pitch. "A no-brainer," according to one of the remake's original producers.
"A can't-miss hit. A merchandising tidal wave."
Judging from the previews, Fox ended up with a shadowy space-opera whose flash libretto was drawn up
in crayon. It's a neat reversal of the pattern established by five previous Ape flicks: There, the production
values were comically shoddy but the story lines strikingly mature. Sammy Davis Jr. considered the first
film, which premiered in 1968, the best allegory on race relations he'd ever seen. Newsweek's reviewer
thought it caught the audience "at a particularly wretched moment in the course of human events, when
we are perfectly willing to believe that man is despicable and a good deal lower than the lower animals."
Peter Singer aside, humanity has a higher opinion of itself these days, and the old story lines seem so
radical it's a wonder the series—which is based loosely on a novel by Pierre Boulle—got made in the first
place.
As it happens, Boulle's own life would make a pretty good movie. Employed, at the outbreak of World War
II, as an overseer of a Malaysian rubber plantation, the French expatriate served as a guerrilla and
intelligence operative in China, Indochina, and Burma, was captured by Vichy loyalists on the Mekong
River, and escaped from a POW camp in Saigon—experiences which earned him the Croix de Guerre,
the Medal of the Resistance, and an appointment to the Legion of Honor, and informed his best-known
novel, the epic Bridge Over the River Kwai.
Published six years later, in 1963, Monkey Planet was no less epic and a good bit darker. Stagnation was
the metaphor Boulle was working with—it must have seemed a fitting subject for a Frenchman writing
after the war—with apes forging a civilization not out of humanity's rubble (as in the American adaptation)
but out of its ennui. Boulle's apes were more advanced than their American counterparts—they drove
cars, lived in cities, and, aside from the convenience of having four prehensile thumbs, resembled people
in most of their particulars. Boulle's point was that, for all the progress, ingenuity, and enterprising spirit
he saw in his fellow men, monkeys would do just as well.
But in an America rocked by race riots and sinking deeper into the quagmire of Vietnam, Rod Serling set
out to make a different point: Maybe the monkeys could do better. Serling's experience with The Twilight
Zone had taught him that "it was possible to have Martians say things that Democrats and Republicans
can't say." His fantastically misanthropic treatment for a Planet of the Apes film shouted those things from
the rooftops. The archetypal Aryan hero, played by Charlton Heston, is bound and gagged, caged and
(nearly) castrated, and finally made to stand trial for humanity's crimes against the earth. "He is the
perfect American Adam to work off some American guilt feelings of self-hatred on," Pauline Kael wrote in
her review of the film, catching an early whiff of what, as the series progressed, became an orgy of self-loathing.
Planet of the Apes was anything but subtle. Its metaphors—for race relations, imperialism, and the Cold
War—were broad enough to reach a wide audience, and the audience responded wildly; the film grossed
$100 million (in today's dollars), spawned four sequels, a TV series, a Saturday-morning cartoon, a
traveling theater troupe, and a slew of comic books (each of which, if childhood recollections serve me
right, came with its own 45 rpm record). But instead of marginalizing politics as it progressed, the franchise
moved them to the fore.
Taken together, the films amount to a grand, if sometimes campy, tour of revolutionary and reactionary
propaganda. "The only good human is a dead human," a Gestapo-like gorilla says, by way of explaining
Ape Power. "It is our holy duty … to kill our enemies—known and unknown—like so many lice."
[Correction: This line is from the screenplay and never made it to the final film.] Other films feature slave
auctions, selection tables, concentration camps, and villages modeled on My Lai. The index for one study
of the series—Eric Greene's Planet of the Apes as American Myth—references anti-Semitism, Attica,
David Ben-Gurion, Stokely Carmichael, the Coleman Report, The Confessions of Nat Turner, George
Armstrong Custer, Moshe Dayan, Frederick Douglass, Medgar Evers, the House Committee on Un-American Activities, Kent State, Loving v. Virginia, Malcolm X, Manifest Destiny, Rosa Parks, Dred Scott,
Watergate, Watts, and White Citizens Councils. This isn't simply the work of an overheated academic
pop-culturalogist; watch the films again, and you'll see that none of these interpretations are much of a
stretch. Paul Dehn, who took over from Serling as the series' screenwriter, boasted that "they're all terribly
like Bertrand Russell, my chimpanzees," noted in the margins of his scripts that slave apes were "to be
whipped as the Negro slaves were," and explained that the ape insurrection in Conquest of the Planet of
the Apes was based on the Watts riots.
Not surprisingly, the films' identification of blacks and the underprivileged with apes could be problematic.
Take the Watts-like ape revolt. Here, Caesar, the chimpanzee destined to lead his fellow apes out of
bondage, is aided in his escape by MacDonald, a human administrator at the crypto-fascist Ape Control
agency. The following exchange, which consists of alternating close-ups of MacDonald (who happens to
be black) and Caesar, is typical:
MacDonald: I wish there was some way that we could communicate, so you'd understand that
I—
Caesar: I understand, Mr. MacDonald. Yes, I'm the one they're looking for.
MacDonald: I never believed it. I thought you were a myth.
Caesar: Well, I'm not. But I will tell you something that is—the belief that human beings are kind.
MacDonald: No, Caesar. There are some—
Caesar: Oh, a handful perhaps, but not most of them. No, they won't learn to be kind until we
force them to be kind. And we can't do that until we are free.
MacDonald: How do you propose to gain this freedom?
Caesar: By the only means left to us. Revolution.
MacDonald: But it's doomed to failure!
Caesar: Perhaps. This time.
MacDonald: And the next.
Caesar: Maybe.
MacDonald: But you'll keep trying?
Caesar: You above everyone else should understand. We cannot be free until we have power.
How else can we achieve it?
The audience is clearly intended to sympathize with Caesar, and the implicit endorsement of
revolutionary black power—coming, as it does, from a major Hollywood studio—is striking. (Contemporary
reviewers pointed out that black audiences drowned out much of the dialogue by cheering "Right on!"
right through the screenings.) Still, it's unpleasant to see chimps telling black men that they, of all people,
should understand the monkey's lot. The implication is that human rights, having trickled down from
whites to blacks, will continue to work their way down the evolutionary ladder. No wonder placards
reading "NAACP: Planet of the Apes" became fixtures at white supremacist rallies. And no wonder the
executives behind Burton's remake shied away from the original's racial and political overtones.
But the Planet of the Apes movies succeeded, in no small measure, because they tapped into the terrific
pressures of their time. Given that history, it'll be interesting to see which analogies Burton embraces and
which he ignores. If he turns away from all of them, very little of the original movies—aside from hairy
prosthetics—may be left.
http://slate.msn.com/id/112241
Culture War. Stop blasting Hollywood and the '60s culture for senseless
killings.
By Herbert Stein Posted Thursday, July 29, 1999, at 12:00 AM PT
After the Columbine tragedy, one faction held guns responsible for the slaughter, and another blamed the
culture. Since the guns question has been debated into the ground, I confine myself here to the culture
question.
The culture that was blamed for Columbine was never clearly defined. Its nature was suggested by terms
such as "the '60s," "liberal," "permissive," "sex and violence," and "Hollywood, television, and video
games," and it was the media that bore most of the complaint. Hollywood was told to clean up its act, and
theater owners were urged to enforce the ratings system, to avoid exposing young people to sex and
violence.
Then along comes the Midwest assassin, who earlier this month killed a black man and a Korean-American man and wounded nine other blacks, Asians, and Jews (coming out of a synagogue). This time
around, nobody is blaming the culture--at least, not the culture that supposedly caused Columbine.
The Midwest killings were categorized as "hate" crimes. Hate crimes include crimes not only against
blacks, Jews, and Asians but also against gays, "the government" (Oklahoma City), and the technological
age (the Unabomber). They are not crimes against an individual known personally to the perpetrator or
against whom he has a grievance. (If a man shoots his brother-in-law, that is not called a hate crime,
although there is probably hate involved.) They are crimes of protest against the culture, intended to
make a statement of hostility toward it.
Of course, nobody can blame the '60s culture or Hollywood for these crimes. The '60s' slogan was
"Make love, not war," and the Hollywood culture is a culture of acceptance--of blacks, Jews, and gays. In
fact, it is that very culture of acceptance that infuriates these madmen.
The Columbine killers, on the other hand, shot people they knew, some of whom they had real or
imagined grievances with. But they also fancied themselves as Nazis. They were making a deadly
statement against the American dream, of respectable, middle-class, suburban life. To that degree
Columbine was also a hate crime.
According to one poll, in the two weeks bracketing the Columbine incident, the percentage of Americans who thought
that the country was on the wrong track rose from 49 percent to 60 percent. This remarkable swing is
commonly attributed to shock over the shooting. While I can understand the national bewilderment the
event caused, I cannot understand why it should be interpreted as a judgment against the way the
country is going. Two estranged young men who acquired Nazi attitudes--which they certainly did not get
from Hollywood--made a deadly protest against the way the country is going. This is not a sign that
something is going wrong, except for the ready availability of guns.
If you compare the murders linked to hate crimes with the murders linked to street crimes, the most
obvious thing you notice is how the number of street killings dwarfs that of hate killings. But despite this,
street killings do not cause revulsion against the way America is going, though they may more legitimately
raise a question about the media than hate killings do because they more resemble Hollywood depictions.
Most cinemaland murders are committed by bad guys whose motives are pragmatic, not symbolic. Also,
these murders are marked by indifference: The perpetrator does not value life, and he feels neither guilt
nor glory at having killed someone. The hate killer values life and thinks he is committing a great deed
and making a grand statement when he kills.
Despite these similarities, one should be cautious about assigning Hollywood responsibility for the
culture in which street crime flourishes. The screen is not the only place--and probably not the most
influential place--where young people acquire ideas of what is acceptable behavior. They learn these
ideas at home, in school, at the shopping mall, on the street corner, and everywhere else where they
observe life and people. Young people in the ghetto don't have to go to the movies to hear shooting. What
they see on the screen seems real to them because it conforms to what they see in life. Otherwise it
would have no more effect on them than seeing the feud and swordplay between the Montagues and the
Capulets.
When I was a boy in Detroit we used to go to the movies on Saturday afternoon and watch the violence
between cowboys and Indians. We knew it wasn't real. Though we played cowboys and Indians in the
street, we did not kill any real Indians. The reaction of Native Americans to those movies may have been
different.
There is something in American attitudes that condones or glorifies murder, at least more so than in
other countries. If we want to change that attitude, Hollywood should not be the main place we look.
(Remember, most of the rest of the world watches Hollywood movies without engaging in orgies of
violence.) We have to try to do something about the real world in which children are growing up. The
crucial part of that world is the home where parents relate to children. What to do about that, I don't know.
Probably there is little that public policy can do. But the fixations on the media and on the '60s culture do
not help in the search for remedies.
It would also help our thinking if we could avoid the "sex and violence" mantra. Sex on the screen, or the
abundance and explicitness of it, has only a distant connection, if any, with the homicides that worry us.
Context isn't everything, but it's worth noting that the TV channel that shows the most violence is the
History Channel, with its endless replaying of World War II: I have not heard anyone say that is an
encouragement to crime.
http://www.slate.com/id/32288
Bob Dylan's 1963 generational anthem "The Times They Are A-Changin'"
Come mothers and fathers
Throughout the land
And don't criticize
What you can't understand
Your sons and your daughters
Are beyond your command.
There's a battle
Outside and it's ragin'
It'll soon shake your windows
And rattle your walls...
For the times they are a-changin'!
Jefferson Airplane's 1967 hit "White Rabbit,"
One pill makes you larger
And one pill makes you small
And the ones that mother gives you
Don't do anything at all.
Go ask Alice
When she's ten feet tall.
...Remember what the dormouse said
Feed your head.
Former Sen. Bob Kerrey
Is it time for a Vietnam truth
commission?
Suppressed atrocities haunt victims, perpetrators
and politics alike. That's why unshrouding the secret
history of Sen. Bob Kerrey and the Vietnam War is
imperative.
By Bruce Shapiro
May 2, 2001 | A few weeks ago, I gave a university
lecture about journalism and human rights. I began
by talking about investigative reporter Seymour
Hersh's famous November 1969 exposé of the My Lai massacre -- when American troops killed 300
civilians in a remote Vietnamese hamlet. After my talk, a young graduate student introduced himself and
explained that he had attended a military college, where students were required to study My Lai in depth.
He asked whether I thought such an atrocity represented a common occurrence in Vietnam or an
aberration.
I was startled because that student's question seemed so distant and remote, as open and objective as if
he were inquiring about Antietam or Bull Run. I thought of that student again this week: how it may be
impossible for him, or anyone born after the last American helicopters left Saigon in 1975, to fully grasp
the unexpectedly raw emotions unleashed -- visible on op-ed pages and talk shows -- by the revelation
that recently retired Sen. Bob Kerrey, as a young Navy SEAL lieutenant, participated in a massacre of 13
unarmed Vietnamese women and children in February of 1969.
When the film "Saving Private Ryan" opened, veterans' hospitals reported an upsurge in elderly World
War II soldiers seeking treatment for post-traumatic stress disorder for the first time in their lives.
Something like that has happened with the story of the encounter between then-Lt. Kerrey's eight-man
commando squad and the people of Thanh Phong.
After decades in which many politicians have done their best to gloss over the lingering damage done by
the war in Vietnam, the Kerrey story demonstrated that the books are far from closed, either in the private
realm of emotion or the public balance sheet of moral accountability.
The story of Thanh Phong and the squad who called themselves Kerrey's Raiders proves something that
the ancient Greeks would have understood: that an atrocity unacknowledged and papered over festers
like a body unburied, erasing time and space between the original event and revelation of the terrible
secret. In the scales of the Vietnam War Thanh Phong is a tiny and obscure incident, yet this Sunday's
New York Times Magazine story, like a post-traumatic flashback, instantly opened up yawning divisions
over the war, sending a surge of once-familiar language through the national adrenaline system of the
media: hooches and body counts and strategic hamlets and VC.
That's certainly true for Kerrey's Raiders, who after 32 years are for the first time actively revisiting the
events of that night, beginning to sort out facts with what appears to be the full gamut of human
expression from profound "shame" (Kerrey's word) to defensiveness to anger. It appears to be true in
Thanh Phong, where one surviving villager named Pham Tri Lanh told a CBS camera crew the story that
32 years earlier village elders tried to communicate to U.S. Army investigators -- who promptly covered up
what military records called the "alleged atrocities."
That abrupt upwelling of emotion goes beyond the immediate participants and survivors. Looking at the
press the last few days you'd think the war had never ended -- the terms of debate thrust back decades.
In the New York Times, William Safire -- President Nixon's speechwriter at the time of Thanh Phong -- excoriates postwar American self-flagellation and defends the war as nobly motivated; Time magazine
worries over the consequences for American soldiers if they are sent into battle without proper
justification, a framework unchanged since 1975.
On the surface, the question the Thanh Phong story presents is simple: Whose version do you believe?
By the account of Bob Kerrey -- today president of New York's New School University -- his SEAL
commando squad entered Thanh Phong in search of a Viet Cong official they planned to abduct or
assassinate, came under fire (or at least thought they did), shot back and afterward discovered they had
killed 13 women and children.
By the account of fellow commando Gerhard Klann, Kerrey's Raiders never came under fire at all. In
Klann's version, the SEALs rounded up the women and children while they searched the village, then
shot them at point-blank range to make sure they could leave the village unscathed. It is Klann's account
that is bolstered by village survivor Pham Tri Lanh.
There's a good argument to be made that Klann's account is the more credible. It's a story he apparently
confided to at least one SEAL commander later in his career, and he confirmed it to the Times magazine
only reluctantly. A CBS videographer who had never heard Klann's version of events gleaned Pham Tri
Lanh's corroborative interview. Kerrey's version, on the other hand, leaves unexplained the most salient
question of the raid: how a random nighttime firefight could have left a cluster of 13 civilian corpses lying
together. The heated denial from all the remaining members of Kerrey's Raiders other than Klann came in
a collective statement after publication of the Times story, which offered far more certitude than Kerrey
himself had provided that there had been an extensive firefight.
Yet the debate over Klann's vs. Kerrey's version of events is of secondary relevance. For one thing, the
one central and uncontested fact remains, regardless of whether Kerrey or Klann is more credible: those
13 dead women and children in the center of a Vietnamese village. Kerrey himself understands this, and
does not give himself a moral pass: "Basically you're talking about a man who killed innocent civilians," he
told the Times magazine. But most Americans, obsessed with Vietnam as our own "national tragedy,"
have always had a hard time acknowledging the depth of Vietnamese suffering during the war.
Commentary on the Thanh Phong story has already sidelined those dead Vietnamese. The collective
statement by Kerrey's squad barely even acknowledges Thanh Phong's civilian dead: "We regret the
results of this night," is as close as it gets, a kind of chilly bureaucratese suggesting that some members
of the team are still working hard to keep those events in the back drawer of memory.
At the same time, the story of Thanh Phong raises another question: How many more Kerrey's Raiders
are out there, Vietnam veterans bearing the particular burden of a terrible secret? This is an
uncomfortable question for a country obsessed with "closure." The war forced combatants into awful
choices that are now practically the stuff of cliché: The inability, sometimes, to distinguish between
civilians and Viet Cong; the frequent knowledge that the orders of superiors would lead to disaster; the
decision about whether to intervene to prevent or halt singular acts of brutality (as one helicopter pilot
finally did at My Lai). With combatants of the Vietnam era approaching late middle age, the fallout from
those choices is sure to resurface as men and women who have numbed themselves with work suddenly
find time to reflect.
One of the most powerful and original books on the war, "Achilles in Vietnam: Combat and the Unmaking
of Character" by psychotherapist Jonathan Shay, relates the story of one sergeant haunted by atrocities
in which he participated in the war. He is sickened by his medals, and dreams -- though emotionally
shattered -- of someday returning to Vietnam to make restitution in some fashion with his labor. That
sergeant's voice echoed last week in Kerrey's comments when he was asked about the Bronze Star
awarded for Thanh Phong. "I've never worn that damn medal," Kerrey said in an interview. "I never
campaigned and said, 'Vote for me; I'm a hero.' If they want to take it away, I don't care."
The story of Thanh Phong haunts because this small, nearly forgotten incident so neatly captures not the
aberrations of Vietnam but the war's central logic. If Kerrey's Raiders targeted civilians that night, they did
so in the shadow of carpet bombing and the strategic hamlet program that routinely devastated Vietnam's
civilian population. If Bob Kerrey has lived for 32 years with what he describes as "shame" at his actions
and at his unearned Bronze Star, it is because of the daily corruption of the body count, the routine
papering over of civilian dead, endorsed from the White House on down.
Whatever happened at Thanh Phong, there is plenty of blame to go around. And while the Pentagon's
own post-Vietnam syndrome involves blaming the press for losing the war, the fact is that until very late -- indeed, through the very period in which Kerrey's Raiders landed in that village -- the press was fully
complicit in covering up American attacks on Vietnamese civilians.
Indeed, it took Seymour Hersh's My Lai exposé, months after Thanh Phong, before correspondents
began to admit that the systematic and indiscriminate killing of large numbers of civilians by American
troops was old news.
Until My Lai, most American reporters in Vietnam simply had not deemed human-rights atrocities
newsworthy. Frank McCullough of Time magazine, for instance, covered the war for four years without
ever reporting on atrocities. But after My Lai broke, he recalled seeing Viet Cong prisoners pushed from
airplanes by American troops, shot with their hands tied behind their backs and devoured by Dobermans
unleashed by interrogators. Many other reporters told similar stories: It was as if the floodgates had
opened, as if the press suddenly had official sanction to report a previously suppressed government
secret.
The Vietnam correspondent Neil Sheehan -- to whom Daniel Ellsberg entrusted the Pentagon Papers -- eventually described the thinking of American reporters in those years. It bears listening to now, because
Kerrey's high-profile confession may have an impact similar to My Lai's in releasing bottled-up atrocity
stories.
Sheehan recalled that in 1966, three years prior to the events in Thanh Phong and My Lai, he personally
witnessed American troops wipe out five fishing villages, killing as many as 600 Vietnamese civilians. The
raids "seemed unnecessarily brutal," but "it did not occur to me that I had discovered a possible war
crime." He went on: "I had never read the laws governing the conduct of war, though I had watched the
war for three years in Vietnam and written about it for five ... The Army field manual says it is illegal to
attack hospitals. We routinely bombed and shelled them ... looking back, one realizes the war crimes
issue was always present."
"The war crimes issue" is also what makes the Thanh Phong story so troubling -- and yet so relevant -- today. We live in an era drenched with investigation of atrocity, past and present: Holocaust assets,
Chilean death squads, Bosnian ethnic cleansing; there are war crimes tribunals for the Balkans and
Rwanda, a South African Truth and Reconciliation commission and similar bodies in El Salvador and
Guatemala. In Northern Ireland, an independent commission is raking over British Army behavior on
Bloody Sunday of 1972.
Some of these efforts are tribunals aimed at bringing the worst perpetrators to justice; others, at just
setting the record straight. What all those nations realize is that suppressed atrocity haunts not just its
victims and shadows not just its perpetrators, but distorts the political life of entire societies.
When it comes to the Vietnam War, this is a realization that has yet to strike senators like Massachusetts'
John Kerry, Nebraska's Chuck Hagel and Georgia's Max Cleland -- all Vietnam vets with varying views of
the war -- who have taken pains to assure their former colleague that there will be no investigation of
Thanh Phong. Time to move on. (Sen. John McCain, while speaking supportively of Kerrey, has taken a
more cautious approach, simply saying that whether or not to investigate is the Pentagon's decision.)
I am not saying Congress or the Pentagon should initiate an inquiry into this single incident, or hold some
kind of war crimes trial for Kerrey -- who after he was maimed on a subsequent mission came to oppose
the war, and was an honorable voice in the Senate against nuclear proliferation and military adventurism.
But memory, especially the memory of a traumatic event, does not move on. Gerhard Klann and Bob
Kerrey have different accounts of events in Thanh Phong, but both describe the same motive for coming
forward with them: to "cleanse my conscience," in Klann's words.
That same graduate student who asked about My Lai asked me another question: Weren't there atrocities
committed on both sides in Vietnam? You hear that kind of moral-equivalence argument a lot now, as a
way of putting the Vietnam War on a safe, comprehensible shelf. I answered testily that only one side
backed up its atrocities with B-52s. But the story of Thanh Phong suggests a different way of thinking
about the war -- not about moral equivalence in battle, but about how lingering traumatic memory unites
American war survivors and their Vietnamese counterparts. While I don't think that Kerrey's version of
events adequately explains the events of Feb. 24, 1969, there's no reason to doubt the sincerity of his
effort -- once prodded by a reporter -- to scrape away the decades of silence and denial.
"This is in the early stages. I'm just trying to get a private memory public," he said at a press conference.
The story of Thanh Phong is part of the hidden history of the Vietnam War -- a history written not in
diplomatic cables or Pentagon memoranda, but upon the memories and bodies of the war's survivors. If
the story of Kerrey's Raiders and Thanh Phong resonates, it is because that hidden history is still carried,
unacknowledged and unratified, by thousands on both sides of the Pacific Ocean.
http://archive.salon.com/news/feature/2001/05/02/kerrey/index.html