James Frey's admission last week that he made up details of his life in his
best-selling book "A Million Little Pieces" - after the Smoking Gun
Web site stated that he "wholly fabricated or wildly embellished details
of his purported criminal career, jail terms and status as an outlaw 'wanted in
three states' " - created a furor about the decision by the book's
publisher, Doubleday, to sell the volume as a memoir instead of a novel.
It is not, however, just a case about truth-in-labeling or the misrepresentations
of one author: after all, there have been plenty of charges about phony or
inflated memoirs in the past, most notably about Lillian Hellman's 1973 book
"Pentimento." It is
a case about how much value contemporary culture places on the very idea of
truth. Indeed, Mr. Frey's contention that having 5 percent or so of his
book in dispute was "comfortably within the realm of what's appropriate
for a memoir" and the troubling insistence of his publishers and his
cheerleader Oprah
Winfrey that it really didn't matter if he'd taken liberties with the facts
of his story underscore the waning importance people
these days attach to objectivity and veracity.
We live in a relativistic culture where television "reality shows"
are staged or stage-managed, where spin sessions and spin doctors are an
accepted part of politics, where academics argue that history depends on who is
writing the history, where an aide to President Bush, dismissing reporters who
live in the "reality-based community," can assert that "we're an empire now, and when we act, we create our own
reality." Phrases like "virtual
reality" and "creative
nonfiction" have become part of our language. Hype and hyperbole
are an accepted part of marketing and public relations. And reinvention and
repositioning are regarded as useful career moves in the worlds of
entertainment and politics. The conspiracy-minded, fact-warping movies of Oliver
Stone are regarded by those who don't know better as genuine history, as
are the most sensationalistic of television docudramas.
Mr. Frey's embellishments of the truth, his cavalier assertion that the
"writer of a memoir is retailing a subjective story," his casual
attitude about how people remember the past - all stand in shocking contrast to
the apprehension of memory as a sacred act that is embodied in Oprah Winfrey's
new selection for her book club, announced yesterday: "Night," Elie Wiesel's devastating 1960 account of his
experiences in Auschwitz and Buchenwald.
If the memoir form once prized authenticity
above all else - regarding testimony as an act of bearing witness to
history - it has been evolving, in the hands of some writers, into something
very different. In fact, Mr. Frey's embellishments and fabrications in many
ways represent the logical if absurd culmination of several trends that have
been percolating away for years. His distortions serve as an illustration of a
depressing remark once made by the literary theorist Stanley
Fish - that the death of objectivity "relieves
me of the obligation to be right"; it "demands only that I be
interesting."
And they remind us that self-dramatization (in Mr. Frey's case, making
himself out to be a more notorious fellow than he actually was, in order to
make his subsequent "redemption" all the more impressive) is just one
step removed from the willful self-absorption and shameless self-promotion
embraced by the "Me Generation" and its culture of narcissism.
"A Million Little Pieces," which became the second-highest-selling
book of 2005 (behind only "Harry Potter and the Half-Blood Prince"),
clearly did not sell because of its literary merits. Its narrative feels
willfully melodramatic and contrived, and is rendered in prose so self-important and mannered as to make the likes of
Robert James Waller ("The Bridges of Madison County") and John Gray
("Men Are From Mars, Women Are From Venus") seem like masters of
subtlety and literate insight.
The book sold more than two million copies because it was endorsed by Ms.
Winfrey, and because it rode the crest of two waves that gained steam in the
1990's: the memoir craze, which reflects our obsession with navel gazing
and the first person singular; and the popularity of recovery-movement
reminiscences, which grew out of television-talk-show confessions (presided
over by Ms. Winfrey, among others) and Alcoholics Anonymous testimonials.
These two phenomena yielded the so-called "memoir
of crisis" - a genre that has produced a handful of genuinely
moving accounts of people struggling with illness and personal disaster but
many more ridiculously exhibitionistic monologues that like to use the word
"survivor" (a word once reserved for individuals who had lived
through wars or famines or the Holocaust) to describe people coping with
weight problems or bad credit.
They also coincided with our culture's enshrinement of subjectivity - "moi" as a modus operandi for processing the world.
Cable news is now peopled with commentators who serve up opinion and
interpretation instead of news, just as the Internet is awash in bloggers who
trade in gossip and speculation instead of fact. For many of these people, it's
not about being accurate or fair. It's about being entertaining, snarky or
provocative - something that's decidedly easier and less time-consuming to do
than old-fashioned investigative reporting or hard-nosed research.
Around the same time, biographies became
increasingly infected with personal agendas. There was biography as
pretentious exercise in deconstruction (Wayne Koestenbaum's
"Jackie Under My Skin"), biography as spin
job (Andrew Morton's "Diana: Her True Story"), biography as
philosophical manifesto (Norman
Mailer's "Portrait of Picasso as a Young Man") and biography as
feminist polemic (Francine du Plessix Gray's
"Rage and Fire: A Life of Louise Colet"). While some of these authors
were candid about what they were up to, Edmund
Morris's ludicrous 1999 book "Dutch: A Memoir of Ronald
Reagan" - which recounted Ronald Reagan's life through the prism of a
fictional narrator, who liked to talk about himself - was sold as "the
only biography ever authorized by a sitting president."
Equally egregious was "The Last Brother," Joe McGinniss's
speculative portrait of Senator Edward
M. Kennedy - a book in which the author acknowledged that he'd
"written certain scenes and described certain events from what I have
inferred to be his point of view," despite the fact that he did not even
interview the senator for the book. "This is my view, and perhaps mine
alone," Mr. McGinniss wrote, "of what life
might have been like for Teddy."
While books like these were further blurring the lines between fact and
fiction - a development that had begun years before with the rise of the new
journalism, which appropriated the techniques of fiction without assuming its
prerogative of invention - academics were questioning the very nature of
reality.
By focusing on the "indeterminacy" of texts and the crucial
role of the critic in imputing meaning, deconstructionists were purveying a
fashionably nihilistic view of the world, suggesting that all meaning is
relative, all truth elusive. And by focusing on the
point of view of the historian (gender, class, race, ideology, etc.), radical
feminists and multiculturalists were arguing that history is an adjunct of
identity politics, that all statements about the past are expressions of power
and that all truths are therefore political and contingent.
Variations on these arguments were used by Janet Malcolm, who disingenuously
suggested in "The Silent Woman," her highly partisan 1994 portrait of
Ted
Hughes and Sylvia Plath, that all biographers share her disdain for
fairness and objectivity.
The dangers of such relativistic theories are profound. As Deborah Lipstadt, the author of "Denying the Holocaust: The
Growing Assault on Truth and Memory," has argued, the suggestion that no
event or fact has a fixed meaning leads to the premise that "any truth can
be retold." And when people assert that there is no ultimate historical
reality, an environment is created in which the testimony of a witness
to the Holocaust - like Mr. Wiesel, the author of "Night" - can
actually be questioned.
In her 1994 book "On Looking Into the
Abyss," the historian Gertrude Himmelfarb argued that historians have always known "what
postmodernism professes to have just discovered" - that any historical
work "is necessarily imperfect, tentative and partial." Yet
postmodernists do not merely acknowledge the obstacles that stand in the way of
objectivity but also celebrate those obstacles, elevating relativism into a
kind of end in itself. They strive to be imaginative,
inventive or creative, instead of accurate and knowledgeable.
This, in a sense, is what Mr. Frey has done on a petty scale in "A
Million Little Pieces." But he's not the only one to indulge in this sort
of relativism or these sorts of situational ethics. President Clinton famously
answered a question, during the Monica
Lewinsky scandal, with the words "it depends on what the meaning of
the word 'is' is." And members of the current
Bush administration, as Franklin Foer has written in The New Republic, have
promoted "the radically postmodern view that 'science,' 'objectivity' and
'truth' are guises for an ulterior, leftist agenda," arguing that
experts (be they experts on the environment, Medicare or postwar Iraq)
"are so incapable of dispassionate and disinterested analysis that their
work doesn't even merit a hearing."
The Bush White House has used similar arguments to try to discredit the
mainstream press and its watchdog role, suggesting that there is no such thing
as truly independent reporting or even a set of mutually agreed upon facts,
that there are no distinctions between willfully partisan hacks and reporters
who genuinely strive to deliver the best obtainable truth.
This relativistic mindset compounds the public cynicism that has hardened in
recent years, in the wake of corporate scandals, political corruption scandals
and the selling of the war against Iraq on the discredited premise of weapons
of mass destruction. And it creates a climate in which concepts like
"credibility" and "perception" replace the old ideas of
objective truth - a climate in which the efforts of nonfiction writers to be
as truthful and accurate as possible give way to shrugs about percentage points
of accountability, a climate in which Ms. Winfrey can declare that the
revelation that Mr. Frey made up parts of his memoir is "much ado about
nothing."
IT'S been 18 years since Tom Wolfe heralded the
arrival of the "Me Generation," and 15 years since Christopher Lasch anatomized "the culture of narcissism"; in
those intervening years the self has become, in Robert Hughes's words,
"the sacred cow of American culture." In literature, in philosophy,
even in the writing of history, subjectivity has gained a new ascendancy.
Postmodernists now place quotation marks around words like
"reality," insisting that the old notion of objective knowledge has
become obsolete. Multiculturalists argue for new curriculums not on the basis
of factual accuracy, but on the basis of "self-esteem." And
politicians and their spin doctors use pseudo-events and photo-ops to market
virtual-reality versions of themselves to the public.
Throughout our culture, the old notions of "truth" and
"knowledge" are in danger of being replaced by the new ones of
"opinion," "perception" and "credibility."
On television, on radio, in the pages of newspapers and magazines, on the
sidewalks and in taxicabs, we are pelted with opinions. There are reviews of
books, movies and plays; and there are reviews of hamburgers, spas and
cosmetics. Waiters editorialize about the menu selections in restaurants. And
polls are conducted daily, weekly, annually on everything from the state of the
nation to the state of one's marriage. Never mind that respondents sometimes
offer opinions on matters they know nothing about: in one famous poll, students
cheerfully dispensed opinions on three completely fictional nationalities (the Pireneans, the Danireans and the Wallonians).
At the same time, critics, columnists and opinion-makers seem to have
acquired a new visibility on the public stage. In a bizarre news conference
last week, Bobby Ray Inman largely blamed newspaper columns for his abrupt
decision to withdraw his name from nomination as Secretary of Defense. Two
radio-show loudmouths, Rush Limbaugh and Howard Stern, recently shared a cover
of Time magazine titled "Voice of America?" And two new series featuring
fictional pundits recently made their debut on television: Fox has begun
broadcasting a sit-com called "Monty," starring Henry Winkler as a
right-wing talk show host, while this week ABC introduced a new cartoon series
called "The Critic," which comes from the creators of "The
Simpsons" and features a pudgy, balding movie critic who reviews pictures
like "Home Alone 5," "Rabbi P.I." and "The Bad News
Mets."
On Oprah's and Geraldo's public confessionals,
on Larry King and C-Span and Court TV, people
stand up or call in to volunteer their thoughts on Michael Jackson and the
Buffalo Bills, the future of NATO and the existence of U.F.O.'s. This week,
President Clinton's State of the Union Address was not only followed, on
assorted networks, by spin sessions from Democrats and Republicans, but also by
commentary from journalists and think-tank observers, and instant polls of the
American public. A handful of selected citizens offered their personal
reactions to the speech on CBS, while hundreds of others called the White House
public comment line.
People who prefer to passively consume the opinions of professional pundits
have also sent new books by Mr. Stern ("Private Parts") and Mr.
Limbaugh ("See, I Told You So") to the top of the best-seller lists.
The virtue of each man, their mutual book editor, Judith Regan, recently told
Time magazine, is that he "says what's on his mind." Their fans don't
necessarily agree with their views; rather, Mr. Stern and Mr. Limbaugh are
celebrated for their outrageousness, their vitriol, their ability to combine
show business with social aperçus.
Bully Talk Shows
Similar dynamics are at work on some of those roundtable talk shows that pit
journalists (and the occasional politician) against one another in rancorous
debate. Reason and knowledge aren't the deciding factors in these shows;
aggressiveness, sarcasm and élan often are. John
McLaughlin, the bullying host of "The McLaughlin Group," goads his
fellow panel members to make snap predictions about recent political
developments on a scale of 0 to 10 ("Ten being metaphysical confidence;
zero being metaphysical doubt"), while members of "The Capital
Gang" cockily present their picks for "Outrage of the Week."
In "Sound and Fury: The Washington Punditocracy
and the Collapse of American Politics," Eric Alterman,
a senior fellow of the World Policy Institute in New York City, argues that the
sound bites offered on such shows are "designed to
trigger the emotions without disturbing the mind."
"Unlike print journalism," Mr. Alterman
writes in an essay, "television does not hold itself to recognized
journalistic standards of truth-telling and fact-checking.
Name calling, wimp-baiting and liberal-bashing prove more exciting to viewers.
The result of the rise of this punditocracy has been
an increase in invective, oversimplification and militarization of the
Washington political dialogue."
Accompanying this development has been the proliferation of a phenomenon
Daniel J. Boorstin identified in his 1962 book
"The Image: A Guide to Pseudo-Events in America": that is,
nonspontaneous, carefully planned happenings "planted primarily (not
always exclusively) for the immediate purpose of being reported or
reproduced." Since the publication of "The Image," the
pseudo-event -- in the form of news conferences, photo-ops, ceremonial
appearances -- has permeated political culture, as has the art of the spin. In
fact, in the current media environment, a lot of politics takes place along a
Möbius strip of perception: focus groups (carefully interviewed sets of voters)
are used to hone a politician's image, which is further polished -- and
assaulted -- by spin doctors on either side who, along with commentators,
reinterpret a candidate back to the voter.
The net effect of all this spinning and sculpturing can be an air of
unreality: in Mr. Boorstin's words, the
displacement of "the language of ideals" by "the language of
images." Interpretation comes to overshadow facts.
Improving History
It's a phenomenon visible throughout our culture today. Film makers like
Oliver Stone feel the license to refashion historical events, even recent ones,
into stories of their own making, while more and more biographers choose to put
their own spin on their subjects' lives, in some cases turning the story
they've supposedly researched into a parable meant to illustrate feminist or
other ideological points; in others, interpolating their own thoughts,
feelings and musings into the narrative itself. The trend culminated, most
distressingly, last summer with the publication of "The Last Brother,"
Joe McGinniss's highly speculative portrait of
Senator Edward M. Kennedy, which was filled with phrases like "it must
have seemed" and "it was entirely possible."
What's more, as reality comes to seem increasingly artificial, complex and manipulable, people tend to grow increasingly cynical,
increasingly convinced of the authenticity of their own emotions and
increasingly inclined to trust their ideological reflexes, a development
reflected by polarized public reaction to recent controversies in the news.
During the Anita Hill-Clarence Thomas hearings, for instance, completely
contradictory portraits of Ms. Hill emerged. While some witnesses depicted her
as demure and passive, others portrayed her as ambitious and calculating; her
supporters and detractors outside the hearing room similarly clung to highly
contradictory views. The recent Bobbitt and Menendez trials have also elicited passionate and sharply divided perceptions, which often have
more to do with innate sympathies and instincts than actual testimony and
facts.
Realism, R.I.P.
Over the last few decades, literature, too, has turned toward the knowable
world of the self. In a much-talked-about 1989 essay, Tom Wolfe lamented the
demise of the realistic novel, the
sort of novel once written by the likes of Balzac, Zola, Dickens and Thackeray
and played out against the big, noisy public world of society at large.
"Young writers," Mr. Wolfe observed, "are constantly told,
'Write what you know.' There is nothing wrong with that rule as a starting
point, but it seems to get quickly magnified into an unspoken maxim: The only
valid experience is personal experience."
Not only has there been an efflorescence of fiction centered on the private
realm of domestic difficulties and existential woes, but there are also more
and more stories that focus on characters' inability to connect, what the
critic Benjamin DeMott has called their
"interpersonal blankness."
Earlier generations of writers, Mr. DeMott noted
in an essay, "were not yet tyrannized by the notion, spawned by the sexual
revolution, that the claims of public reality and long-term human connection
were less significant than those of private ego and merely personal will."
As the ambition to depict a larger social world has eroded, the
old-fashioned omniscient narrative has begun to lose ground to stories delivered by a single first-person narrator, or a
series of first-person narrators. The latter technique, pioneered by
William Faulkner in "As I Lay Dying," has become highly popular in
recent years, used by novelists as varied as Louise Erdrich,
Michael Dorris, Francis King and Julian Barnes to emphasize the relativity of
perception and the fragmentary nature of truth.
Other writers, like Alice Hoffman and Fay Weldon, have appropriated the
techniques of magical realism -- pioneered by writers in places like Latin
America and Eastern Europe to grapple with those regions' surreal political
realities -- for purely personal ends. In their hands, techniques that once
conveyed the absurdities and disjunctions of the world at large have devolved
into charming bits of whimsy, meant to add sparkle to their characters'
domestic dilemmas.
Too Much to Know
Why, in this age of the "information highway," have people rushed
to embrace the principle of subjectivity? No doubt the exponential growth of
information is partly to blame. As the world grows more complicated and
knowledge becomes more specialized, it becomes difficult for anyone to have
a genuinely comprehensive overview of the world. On many subjects, people
either volunteer a largely uninformed opinion of their own, or rely upon the
opinions of advisers, specialists or so-called experts.
Meanwhile, the new accessibility of facts made possible by computer networks
and 24-hour television news has been matched by an awareness of how easily
those facts can be manipulated.
In her forthcoming book, "Tainted Truth: The Manipulation of Fact in
America," the Wall Street Journal reporter Cynthia Crossen
argues that biased polls and questionable statistics are routinely used by
lobbyists, advertisers and special-interest groups for their own ends.
"The proposition that there is no such thing as objective reality is
easier on researchers than on consumers of information," she writes.
"For researchers, it means the standard of proof is whatever can be
defended, and absent outright fraud, there is no real way for anyone to
disprove any particular set of findings. For the rest of us, it means that we,
who lack the tools, expertise and confidence to judge most information, must
delicately weigh results against motives before we believe anything. We are
left adrift in doubt, losing our grip on one of our most potent tools:
numbers."
On Campus
At the same time, in academia, the rise of postmodernism and deconstruction
has further galvanized the assault on objectivity. As far as
postmodernists are concerned, it's the critic, not the author, who counts. As
they see it, all texts (that is, all novels, all pronouncements, all historical
events, etc.) are indeterminate, subject to endless interpretation and
re-interpretation. In terms of history, the historian Gertrude Himmelfarb writes in her new book, "On Looking Into
the Abyss: Untimely Thoughts on Culture and Society," this view suggests
"a denial of the fixity of the past, of the reality of the past apart from
what the historian chooses to make of it, and thus of any objective truth about
the past."
Postmodernist history, she adds, "recognizes no reality principle,
only the pleasure principle -- history at the pleasure of the historian."
To complicate matters further, the idea of objective knowledge has also come
under assault from radical multiculturalists, who see history as an
ideological battleground for control of a nation's memory. As the
historians Joyce Appleby, Lynn Hunt and Margaret Jacob observe in their new
book, "Telling the Truth About History":
"The more extreme multiculturalists celebrate the virtues of
fragmentation. History for them has become an adjunct to 'identity politics,'
which seeks to realign political forces according to voters' ethnic, social,
and sexual identities. Some insist that since all history has a political --
often a propaganda -- function, it is time for each group to rewrite history
from its own perspective and thereby reaffirm its own past."
The Implications
Few would deny that all historical studies are, to some degree, tentative
and fragmentary. Nor is subjectivity, in and of itself, a bad thing. As the
basis for a philosophical outlook, it was a product of the Romantic movement, evolving as an antidote to the arrogant
rationality of the Enlightenment. In recent years, however, the virtual
enshrinement of subjectivity in both academia and popular culture has serious
implications.
The postmodernists' celebration of discontinuity and contradiction,
combined with their willful determination to view everything through an
ideological lens, suggests that the very search for truth (or truths) is an
absurd and irrelevant end. It suggests that the democratic idea of consensus is
futile. And it suggests that everyone, in Tocqueville's words, is confined
"entirely within the solitude of his own heart."
It also postulates a completely relativistic universe in which all theories
-- however reprehensible or preposterous -- must be accorded respect, a
universe in which truths are replaced by opinions. As the literary theorist and
law professor Stanley Fish once ominously declared, the death of objectivity "relieves me of the obligation to be right"; it
"demands only that I be interesting."
The Books
"THE CULTURE OF NARCISSISM: AMERICAN LIFE IN AN AGE OF DIMINISHING
EXPECTATIONS." By Christopher Lasch. W. W.
Norton & Company. $10.95. "THE IMAGE: A GUIDE TO PSEUDO-EVENTS IN
AMERICA." By Daniel J. Boorstin. Vintage Books.
$12. "ON LOOKING INTO THE ABYSS: UNTIMELY THOUGHTS ON CULTURE AND
SOCIETY." By Gertrude Himmelfarb. Alfred A.
Knopf. $24. "THE PURPLE DECADES: A READER." By Tom Wolfe. Farrar,
Straus & Giroux. $17.50. "SOUND AND FURY: THE WASHINGTON PUNDITOCRACY
AND THE COLLAPSE OF AMERICAN POLITICS." By Eric Alterman.
Harper Perennial. $12. "TAINTED TRUTH: THE MANIPULATION OF FACT IN
AMERICA." By Cynthia Crossen. (Forthcoming from
Simon & Schuster in June.) $23. "TELLING THE TRUTH ABOUT
HISTORY." By Joyce Appleby, Lynn Hunt and Margaret Jacob. (Forthcoming
from W. W. Norton & Company in April.) $25.