In my novel Nira/Sussa, there’s a scene in a Midwestern Japanese restaurant with holes in the floor so patrons can sit in what they perceive to be Japanese style without, you know, actually having to do so.
The novel uses this as a metaphor for how its customers want the experience of something different but don’t really want the often unsettling challenge of an encounter with the different.
Real difference, of course, isn’t comfortable. It can be disturbing. It can even shake your perceptions of reality, especially things you hadn’t examined before.
Diversity isn’t supposed to be comfortable. We know the “melting pot” model is bunk — it always was, and that term originated in a fictional work mocking the idea. Instead, we often talk of “multiculturalism” — meaning different cultures existing side by side, respecting one another.
That’s a concept that’s under attack. Indeed, David Cameron, the current British prime minister, has pronounced the failure of multiculturalism as a model — a sure sign of danger ahead. White Europe’s long been concerned with a perceived failure of immigrants to acculturate. The same may be seen in the U.S., where the GOP has long seemed to be rushing to demonize Mexicans, and more recently Arabs and Muslims (two quite different groups), as somehow the cause of America’s problems.
But while multiculturalism is threatened by the right, it’s also long been softened by the left. The mantra there seems to be that “group X is the same as you and me.” And of course, that’s largely true: in general, human beings breathe oxygen, want to avoid being poor or being killed, and want their children to have lives at least as good as their own, if not better, with more opportunity. Whether one turns to the stirring words of John F. Kennedy or of Shakespeare’s Shylock, we are far more alike than different — a fact now confirmed through scientific, genetic analysis. But sometimes, in making this case, the left tends to fall back into the melting-pot trap of pretending no differences exist. And those differences, whether cultural or genetic, are vital and improve everyone’s quality of life.
We most easily see the advantages of diversity in colleges, which have long been criticized as a playground for affirmative action. But having multiple viewpoints and experiences enriches our understanding, whether it’s of the literature being discussed or even of how we practice science. Because even in science, which is often held aloft as abstract and objective, the same statement can be made in different ways, from different points of view. And it’s often those who see differently who have the most profound insights.
This same distinction, between (1) the urge to support tolerance by eliding differences and (2) the urge to embrace differences despite the challenges they present, can be seen in how we deal with those with “disabilities.” I’m not a fan of political correctness, and I can’t stand euphemisms. But even if it’s mostly employed as a euphemism, there is truth to the phrase “differently abled.” Because “disabilities” can also allow for different perspectives.
For example, America is now rife with people promising to “cure” autism. But while we certainly need more treatment — and study to determine various treatments’ efficacy — we now understand that the unique perspective of those on the autism spectrum has been key to major insights from which we all benefit. Temple Grandin’s case is perhaps the most famous, but several of the highest achievers in economics, math, and writing have had Asperger’s Syndrome.
The often reviled feminist-whom-feminists-love-to-hate, Camille Paglia, has written brilliantly about how the desire to promote the acceptance of gays led, beginning in the 1980s, to a presentation of homosexuals as monogamous and inoffensive. In order to win acceptance, any challenge was systematically stripped from the equation. The public image of homosexuality became a smiling, clean-cut Ozzie and Harriet, only with one partner’s sex changed. But if you look at the origin of the gay movement (at the Stonewall Riots, for example), this wasn’t at all the case. Part of what Paglia found so intellectually stimulating about gayness was this radical difference, which challenged basic ideals such as monogamy.
Obviously, no one should want to force gays back into bathhouses and ass-less chaps, any more than gays should be made to conform to a whitewashed, cookie-cutter ideal of the 1950s American family. And of course, gays are deserving of the civil right of marriage, if the law is to recognize marriage for anyone.
But there’s a sad truth underneath all the conventional jokes (often made by gays) about how not having to worry about marriage used to be part of what made being gay appealing. It’s hard to argue that something hasn’t been lost, in the name of cultural acceptance.
This loss hasn’t been gays’ alone. As marriage and even monogamy collapse as social norms (if they ever truly were norms), it’s hard not to think that we might be further along in our collective process of dealing with this transition if we had been willing to embrace the truly different, rather than some neatly commodified version thereof.
Not to mention those who find that sadomasochism, or dominance and submission, is part of their sexual identity. That too we present in safe forms, or as fodder for comedy.
This same confusion of sanitized difference for actual difference is also visible in the arts. Since we’ve been talking about homosexuality, consider the “landmark” 1993 film Philadelphia, which addressed AIDS and homophobia. That’s all for the good, and the film did encourage Hollywood stars (such as Tom Hanks and Antonio Banderas, who played a gay couple in the film) to recognize that playing a gay character wasn’t a career-ending choice. In fact, it could lead to acclaim.
But homosexuality and AIDS had long been addressed on TV, in both fiction and on the nightly news, as well as in literature. The film’s gay couple was thoroughly straight-acceptable. They were the kind of people affluent straights would want in their condo complex. And many moderate scenes of affection between the couple were cut, lest the straight audience be forced to confront more than the slightest hint at, you know, what gay actually meant.
True, the film’s main gay character confesses he got AIDS from anonymous sex in a porno theater. But he’s not proud of it, and he makes the confession in a courtroom, a setting that represents the triumph of “civilization” over anarchistic, socially dangerous desires. It’s a setting that’s comfortable for the audience and familiar from myriad courtroom dramas. And he makes this confession as part of a lawsuit against the people who unfairly kicked this sympathetic, dying man to the curb. His lawyer, of course, is black (played by Denzel Washington), so that we make the connection between racism and homophobia — again without having to deal with, you know, actual homosexual behavior.
Don’t get me wrong: I’m glad Philadelphia was made, and it represented a cultural net positive. It’s also well-written and well-acted. In fact, all of the above examples of how tame the film is are also evidence of how carefully it’s been written, so as to present its case against homophobia with the broadest possible cultural appeal.
But it was a watershed moment only for Hollywood, which is why it got so much attention.
And it’s an excellent example of the commodification of difference over actual difference. It let well-intentioned straights feel progressive and tolerant… without ever giving themselves anything to be tolerant of.
We see this same syndrome in many racial depictions, where social differences are elided in favor of making the audience feel tolerant.
For example, the 1999 film The Green Mile (also starring Tom Hanks) presents a sympathetic, wrongly-convicted black man on death row. Never mind that he’s executed without any of the (supposedly heroic) white characters trying to stop it. Why, the black man even has a change of heart and asks to die. (He’s also got magical powers, a syndrome known in criticism as the “magic negro” and used to present difference in a non-threatening way.) And while he’s unjustly convicted and the film questions at least the way capital punishment is carried out, the film takes place largely in the 1930s, conveniently removed from the audience’s own, presumably more just time. Consequently, the white audience gets to feel tolerant and racially enlightened while never having to actually do anything.
Even worse is the 2006 film The Pursuit of Happyness, which presents a homeless black man (Chris Gardner) who succeeds in capitalist America by virtue of his own talent and hard work — and while raising a son by himself at the same time! Yeah, the black character’s got a hard life, but his success proves the audience doesn’t need to do anything — like actually changing the circumstances of the homeless or addressing social inequities. (He even champions standard American English! Take that, Ebonics!) The ultra-conservative National Review even named the film one of its “Best Conservative Movies” for its positive portrayal of “the world of finance.” Never mind that Chris Gardner’s memoir, on which the film is based, presents a far more challenging depiction of difference. Gardner’s son was conceived in an affair while Gardner was married to another woman; he briefly sold drugs; he did cocaine, PCP, and pot; and he didn’t know where his son was for four of the months the film depicts. He was also arrested when the boy’s mother accused him of domestic violence, which the film transforms into an arrest for parking tickets. All the challenging edges get sanitized, letting audiences believe they’re experiencing difference when they’re actually being sold a commodified illusion of difference.
Of course, this same syndrome is also seen in “noble savage” stories, such as 1990’s Dances with Wolves, 2003’s The Last Samurai, and 2009’s Avatar — the highest-grossing movie of all time. In these stories, white characters are used as audience identification figures but also act as saviors, without whom a minority population would be lost. Yet despite this helplessness on the part of these minorities, their societies are presented as different only in that they are idyllic and utopian. Here again, any truly challenging differences are whitewashed, even if they’re differences (such as gender discrimination) shared by white societies of the same era. These stories do criticize their dominant societies, but they do so in a safe way that allows members of the audience to feel as if they’re embracing diversity while almost entirely avoiding any of diversity’s challenges.
All of this is fairly well-known, at least among critics and academics. This is so much the case that it’s now possible for people to accept all of these ways in which difference is sanitized and still remain comfortably immune to the underlying point. It’s as if even these various syndromes have themselves been commodified, so that they too can be embraced without challenge.
At its root, this commodification of difference seeks to render difference polite and inoffensive. And with all due respect, few things are as offensive as the expectation that the world should be inoffensive.
Let’s return, briefly, to 1993’s Philadelphia, because there’s a reason it tied tolerance of homosexuality to AIDS. After AIDS was identified in the U.S. in 1981, health-care experts understood a crisis loomed and said so publicly. But because gay men were disproportionately likely to contract the disease, it was labelled a gay problem, not a wider one. This was spurred by the religious right, then a new phenomenon, which used demagoguery against homosexuals to help win elections for Republicans. Jerry Falwell, icon of the religious right and founder of Liberty University, said AIDS was “the wrath of God upon homosexuals.” Pat Buchanan, Reagan’s communications director, agreed, calling the disease “nature’s revenge on gay men.” To even address AIDS sympathetically became tantamount to defending “immoral” behavior. The Reagan administration even instructed its own Surgeon General, C. Everett Koop, not to address the disease.
Near the 1984 Democratic National Convention in San Francisco, over 100,000 marched to support funds for AIDS research — a very polite, inoffensive tactic, in the face of what was not yet being called an epidemic. But these protesters were dismissed, by the right but also by many in the media, as largely being composed of gays who wanted federal funding for their own, private health-care crisis.
President Reagan, who won reelection that year, had never said a public word about the disease. At a press conference in 1985, he briefly broke his silence — to speculate that scientists did not know conclusively how the disease was spread, thus stirring paranoia about it. He further questioned whether children with AIDS should be allowed to attend school with healthy children, stoking the fears and xenophobia of parents everywhere. Homosexuals, you see, might infect your children with this new, incurable, deadly disease.
Later in 1985, Reagan’s friend and fellow actor, Rock Hudson, was hospitalized and revealed to have the disease. Still, Reagan did nothing. In 1986, as social pressure mounted and thousands died, the Reagan administration authorized C. Everett Koop to make a statement. Being a doctor, Koop quite sensibly encouraged the use of condoms as part of a comprehensive plan to address the epidemic. You know, to save lives. But the rest of the administration — and the religious right — was horrified, because this went against their stance that condoms encouraged promiscuity.
In response to the silence on the matter, ACT UP formed and began staging protests. Its first major event was a 24 March 1987 march on Wall Street, and it went on to stage similar events across the nation. It particularly targeted the Catholic Church, which had taken reprehensible — but not entirely surprising — positions against condoms and AIDS education. These acts of civil disobedience were often condemned as rude.
Reagan finally publicly addressed AIDS again — and for the first time with any seriousness — on 31 May 1987, at what was by then the third International Conference on AIDS. By the time he spoke, over 36,000 Americans had been diagnosed with AIDS and 20,849 had already died — many times the number of Americans who died on 9/11. Reagan appointed a commission to study the problem, and it again offended the right by coming to sane conclusions, such as recommending increased drug treatment and AIDS research.
The moral of the story ought to be that our politics is demonstrably as tarnished by our politeness as our art is. Politeness kills more than debates.
If you ask people who support the arts, they’ll often say that they enjoy challenging work, work different from the bland and the normal.
Too often, what they really want is the commodified experience of this. That’s usually what wins the prizes: dainty literary exercises that pretend to offer one foreign experience or another, but are only challenging to the dominant high-culture narratives in the most indirect and polite of ways.
And this in an era in which artistic criticism has been so demeaned that daring to point out that a work is illogical or poorly constructed or anything other than fun can seem frighteningly counter-cultural. Alien, even, if not actually from Mars. In a time in which art grows slicker and slicker, yet has grown so vapid as to confuse simple themes with serious artistic interrogations, the cultural elites have offered little resistance. Nor have they generally sought to wed this new slickness with meaningful literary intelligence. It sometimes seems as if they’ve holed up into a regressive, self-affirming community every bit as much as the racists and the reactionaries.
And this brings us back to Nira/Sussa. I hope I can be forgiven such self-reference, but it expresses and encapsulates over a decade of evolving thought, so it’s perhaps not surprising that I turn to it at this moment.
In the novel, those holes in the floor of the Midwestern Japanese restaurant offer a metaphor for the absence at the heart of the way the Other is presented to us. It’s a metaphor extended to the narrator’s sexuality and relationships and — appropriately — his writing. It’s the hole at the heart of our relationships, in our pretense at understanding one another, but it’s a hole that’s also at the heart of the literary establishment.
It’s a metaphor for the difference between paying lip service to a thing, or playing at that thing, and actually practicing it, challenges and all.
Nira/Sussa doesn’t sanitize its portrayal. We’ve had entirely too much of that, when it comes to the novel’s emphasis on gender and sexuality. No, what’s needed — what’s so desperately, desperately needed — is a vigorous and frank discussion that simply cannot occur within polite confines.
It’s a novel that commits fully — even to the point of risking self-destruction — to the idea that we’re lying to ourselves to avoid these uncomfortable realities. And that doing so is manifestly and obviously damaging to the human psyche in ways we have yet to culturally admit.
No, it’s not polite. But only an idiot, or at least an anti-intellectual, could confuse such a thing with shock for the sake of shock.
Although, of course, shock does in itself serve a social function, not only by affirming the First Amendment but by recognizing that societies use the notion of politeness, or “proper” behavior, to reaffirm their current power structures and to preempt and prevent change.
Martian Lit isn’t political, in the sense that it’s focused on any given political outcome. But one shouldn’t ignore that bold art is always political in the broader sense.
Bold and challenging work only looks irresponsible to those who can’t distinguish a hole in the floor from the real thing. Or civil disobedience from dirty, anarchist hordes trying mindlessly to tear down everything.
In fact, when such work — whether artistic or political — is done intelligently, it is the very height of responsibility. The very embodiment of the values the “polite” world claims to hold dear.
And we must be mindful that a word like “shocking” is usually applied to that which “polite” society finds threatening.
But “polite” society so obviously needs threatening.
Polite art does too.