The School for Stupefaction
SPIEGEL: Can you explain to us why the Americans actually elected President Bush a second time?
[PHILIP] ROTH: It had to do with the war and with political stupidity.
Der Spiegel, 6/2008, 131-32.
Stupefaction. The word is closely related to stupor and stupidity, and means something like the production of the state or condition of stupor. Yet the more positive word stupendous also derives from stupeo, stupere, to be struck by amazement, struck perhaps to the point of stupefaction. No doubt this ambivalent root stup-, cognate with the Greek τυπ-, will return to haunt us, but for the moment it is the stupid side of stupefaction that concerns me. In any case, to be expert in the workings of stupefaction would not be so fortunate a thing. Imagine the comments I might overhear in the hallway as I am leaving:
—Now, there’s a man who clearly knows whereof he speaks! What a perfect confluence of theory and practice, of the constative and the performative! He doesn’t simply talk about it, he incorporates stupefaction through and through.
Let me hope, then, for a relative lack of success. The questions I want to raise begin with a conversation I had some years ago with a normally congenial and yet at that moment very angry man. He was a fruit merchant from Trieste, on holiday and staying at my pension. He had explained to me, before he became angry, that he did not own the fruit stand in the central market where he worked: he was a simple man—he carried the crates, not the money bags. We struck up one of those holiday evening conversations that meander here and there, touching on the state of the world in general. The time was shortly after the reelection of George W. Bush to a second term as President, even after the abominations of the first term. The fruit merchant paused and shook his head. His voice lowered in discouragement and disgust.
—The American people, he said, must be the stupidest people who have ever lived on the Earth.
He then realized that he was addressing one of them and he began to apologize. I assured him that I took no offense, he needn’t apologize. After all, I was a philosopher, and I knew that his assertion was empirically unverifiable and transcendentally vacuous. Upon what standard could the stupidity of peoples be measured? Neither a transcendental deduction nor an empirical test could be devised. The American people were safe.
And yet. Is it not remarkable that millions upon millions of people all over the globe, from Australia to Labrador, from South Africa to Canada, and from New York to San Francisco, were thinking that very same thought? By the time of the reelection, indeed, they had been thinking that thought for a period of years. Note that they were not thinking about the stupidity of the President himself. Nor will I be speaking about that today. Some years ago there was a hot national debate on the topic of whether the President could actually be as stupid as he seemed, but that debate has cooled. No one is really interested anymore. The editor of one of the many book collections of “Bushisms,” that is to say, of those creative additions to the American tongue that the President—usually unwittingly—has coined, remarked astutely that it was not the question of actual stupidity that deserved our attention but the fact that the President took such obvious pride in his obtuseness, whether genuine or feigned. His, argued the editor, was a cultivated cretinism. Stupidity, however, may be like the madness that someone feigns, only to discover that the pretense cannot be dislodged, and that in fact one is as mad—or as stupid—as one believed one was pretending to be. At all events, the serious question is this: for whom, and for how many, was the President successfully cultivating and marketing his stupidity? Was he the headmaster—or at least the mascot—of a new school, the school for stupefaction?
I tried to reflect on this. I reasoned that in the case of the nation at large it could not be a matter of I.Q., the famous “intelligence quotient.” My fruit merchant and the other millions were making a judgment of quality, not quantity, concerning the voting public in the United States. One could of course make jokes about I.Q., inasmuch as Americans love to wear their I.Q. score on their tee-shirts as a badge of “genius,” as though the declared number were enough to compensate for their behavior. In spite of those proud numbers, could it be that, qualitatively speaking, a new kind of stupidity rides the land? Has the United States become for all the world a school for stupefaction? Is the equation for our time: globalization = Americanization = stupefaction?
Stupor and stupefaction, along with stupidity, but also that uncannily positive word stupendous, were the words that intrigued me. I began to teach a course at the university called “The School for Stupefaction.” In addition to the philosophical texts we were reading, Heidegger’s Was heißt Denken? and Hannah Arendt’s “On Thinking and Moral Considerations,” we studied a historical text by Richard Hofstadter entitled Anti-Intellectualism in American Life.1 It is an altogether remarkable book. Although written at the end of the Eisenhower era and on the threshold of the brief Kennedy years, Hofstadter’s work is more relevant today than ever; it teaches us so much about the history of American culture—and about the leading role that a willful stupidity has always played in that culture. Certainly no book has helped me more with my topic, and I urge those of you who are interested in U.S.-American stupefaction to study it.
Perhaps most impressive in Hofstadter’s treatise are the sections on religion and education. With regard to the latter, Hofstadter is able to show how social adjustment rather than learning and critical inquiry has always dominated the U.S.-American curriculum; he demonstrates that intellectualism—that is, whatever lies beyond the mere calculative use of reason and mind, whatever it is that urges us to question and challenge fundamentals—has been spurned as “elitism,” condemned as “un-American.” It is disconcerting to see how much the validation of social status is still the goal of our educational systems from kindergarten through university. More striking still is Hofstadter’s analysis of evangelism in American religion. Whereas the Puritan tradition of the eighteenth century retained considerable respect for intellect and for learning—Jonathan Edwards and his “inward journey” do not scorn but appeal to the mind—the atavistic evangelism that spread through the American heartland during the nineteenth century, allying itself from the start with the political right, vilified all things intellectual as the work of the devil. The rhetoric of evangelist preachers, indistinguishable from the rhetoric of the Grand Wizard of the Ku Klux Klan, struck me and my students as the selfsame rhetoric that emanates from the White House today. Yet this suggests that stupefaction has always been at work in America, and has always enjoyed success; it implies also that the continuous movement to the right in politics, in support of the most recidivist forms of chauvinism, xenophobia, and flag-waving militarism, would hardly indicate a qualitative difference between past and present.
Furthermore, most American writers—with the exceptions of Gore Vidal, who has written recently quite brilliantly on the decline of the American empire, Philip Roth, whose recent novel, Exit Ghost, contemplates the moral and intellectual nadir to which the nation has sunk, and of course the late great Kurt Vonnegut—most American writers, I say, would tell me that I was and am placing too much emphasis on the new. Nathanael West, author of The Day of the Locust, would say that I clearly knew nothing of the 1930s in the United States. And the most “American” of our writers, Samuel Langhorne Clemens (Mark Twain), author of those remarkable late essays, “The War Prayer,” “To the Person Sitting in Darkness,” and others, would remark rather wryly that clearly I had overlooked the entire nineteenth century. Stupidity in popular culture and in politics is obviously an august tradition in the United States. Even so, I persist in believing that the contemporary United States has risen—or descended—to a new level.
Some aspects of stupor, stupidity, and stupefaction struck my students as new. Most remarkable today, they asserted, is the almost complete inability to distinguish between image and reality. The television “reality shows” are successful because reality has already become a show. My students marveled at the paradox of a highly efficient calculative intellect at work in the production of these images and the blatant stupidity of the images themselves. A word about one of the class’s assignments: these were first-year students, and knowing how familiar they would be with television, I asked a group to make a presentation on the topic, “My Favorite Stupid TV Reality Show—and the Intelligence Behind It.” The idea was to demonstrate how knowledgeable and calculatingly clever the producers and directors of such shows had to be. (Perhaps I was thinking along the lines of analysis that Herbert Marcuse set out in One-Dimensional Man.) The group chose a weekly show called “The Swan.” The concept of the show? Each week the production team presented the dramatic make-over of an ugly duckling into a swan. They scoured the land in search of the homeliest and most miserable women—women alone, it seems, could be ugly ducklings, perhaps because ninety percent of the viewership, the group told me, was female. For a period of about six weeks each ugly duckling was pared down, beefed up, tucked in, turned out, sent off to the dentist, the hair salon, and the silicone factory, and at the end of that period, totally remade, displayed to a marveling audience—who had meanwhile been shown film clips of the horror that once was the ugly duckling. At the same moment, the neophyte swan was—presumably for the first time—revealed to herself: there she stands, gasping before a gigantic mirror, the radiant swan herself in ecstatic disbelief.
In order that the shock of misrecognition not be too overwhelming, the producers provided two full weeks of psychotherapy for the nascent swan; the inner ugly duckling would now be as svelte as the swan on the outside, something that Socrates too wished for himself, as you will recall from his prayer to Pan at the end of Phaedrus, “O friend Pan, and you other gods who haunt this region, grant that I may become beautiful within, and that what I have on the outside become a friend to what is within” (279b-c). Two full weeks of psychotherapy! Freud promoted infinite analysis, and it apparently took Socrates a lifetime of prayer, but reality TV works faster.
The students had a good time with this “reality” TV show, as you can imagine. I was in shock, but they had a good time. After a full discussion of the stupidity and vulgarity of the program, during which one student remarked that the show actually involved the transformation of interesting-looking individuals into the identical Forty-Second Street prostitute, a young woman in the back of the room commented, quite shyly, almost inaudibly,
—I watch this show every week. I can see how stupid it is. But probably I’ll watch it again next week.
Her peers jeered, but this brave young woman showed us that the calculative intelligence of the producers and directors should never be underestimated. Her confession leads me to make two disclaimers about stupidity. The first is that where erotic matters are involved—and the Eros of “The Swan,” while Fellini-esque and carnival-like, could not be entirely denied—we have to be forgiving of even extreme stupidity. The professor who fails to be forgiving in this regard will be Professor Unrat, that is, Professor Filth, as played by Emil Jannings in Der blaue Engel. He or she will be a clown, nothing more. For Eros and Momus are too closely related for comfort. The second disclaimer is that U.S.-American popular culture, even at its most banal and miasmic level, from Britney Spears to Paris Hilton to whichever jock they are hanging onto this week, garners the attention of the world for reasons beyond the fact that vulgarity and stupidity sell. It is the vitality of U.S.-American popular culture, its astonishing variety and often very sophisticated production and design levels, that is arguably the most fascinating and admirable facet of the country. For every Paris Hilton there is a Hedwig, eloquent and passionate over her Angry Inch; for every Britney Spears there is a Cat Power or a Rufus Wainwright; for every kitschy musical on Broadway there are a dozen wonderful plays off-off-Broadway.
The only thing that troubles me about the imperial success of image-making thought in the United States is that it is driving out every other kind of thinking. Another anecdote, if I may, this one closer to home. After a prolonged absence from the United States (I had been teaching and living in Germany) I returned for a family reunion. One evening my brothers and sisters were talking about local and national political figures, most of whose names were unknown to me, and I had to ask who they were. Yet every now and then, when I asked whether this Jones fellow was a senator, my family would laugh and say, no, he was a character on a TV drama. This happened so often that I eventually realized that political figures and fictional characters on television were somehow interchangeable. Not that my brothers and sisters did not realize that politicians and actors practice two different professions; it was that the difference seemed to be diminishing almost to the zero-point. That probably explains how a B-actor could have become a C-minus-President, and why the next President may well be an even more minor TV actor, presuming of course that along the way he does not get Terminated.
Allow me to insert here two more earnest notes on the quality of news reportage in the United States. The last U.S. newspaper with even a hint of critical journalism, namely, the weekly edition of The Washington Post, recently ran as one of its lead articles a piece that described how President Bush was inviting theologians and philosophers to the White House in order to learn where he went wrong. (So far, I have not been called, and I wonder whether any of you have been chosen. Few have been called, and fewer still will be chosen, and that is probably for the best.) Whether the story is true or not is hardly the issue; one has to ask what sort of readership is being wooed by such a story. Clearly, a readership that wants to grant this catastrophic era a Disney-like happy ending, a touch of sentimental, tragicomic heroism, as the discredited leader withdraws into solitary contemplation. In earlier days, a monastery and a confessor would have been in the script. Now it is theologians and philosophers in the West Wing. The stupefaction of such a desired readership resists the powers of our imagination: we Americans have not even begun to look into the mirror of the catastrophe of this past decade. Years ago, during the Vietnam era, the comic strip character “Pogo,” a likable possum, pronounced at the end of one of his strips,
—We have met the enemy, and he is . . . us.
We today are not that far along. We today are better schooled in stupefaction.
A second note. What used to be “public television” in the United States has become a corporate-sponsored and controlled medium—something very much like CNN, which as you know stands for “Commerce, Not News.” The format of the main public television news program requires that the news team discuss with experts the principal stories of the day, in order to provide penetrating analysis. Not long ago, one of the principal stories involved the release by the U.S. Army of a troubling, indeed shocking, set of statistics: they revealed the percentage of current military personnel at all levels who approve of every form of torture and who affirm the slaughter of civilians, including women and children, in wartime. I will not try to recall to you the precise percentages, since my aim is not to shock; I believe you would find them hardly exceeded in even the most horrific military hordes of the past. And what did the expert commentators have to say afterwards? Not a word. The story was “buried,” as they say. Clearly, the corporate sponsors knew what a whirlwind of outrage and despair the story would have raised—outside the country, if no longer within it.
Now that Murdoch has bought The Wall Street Journal, he will soon doubtless take over The Washington Post, so that we will not have even the Bambi of critical journalism; clearly, he already owns what used to be public television. Indeed, the era of critical journalism has passed. Our journalists are now in bed with the Pentagon and the White House—“embedded journalism” indeed—so much so that not even the Pentagon stories will be brought to the attention of the consumers. The only politician who has addressed this collapse of the media is Al Gore, with his recent book, The Assault on Reason. His title understates the matter. “The School for Stupefaction” is perhaps more apt.
But again, what kind of stupidity and stupefaction am I talking about here? How can its measure be taken? As a boy and young man, from high school through college, I worked during the summers on the grounds crew of a Midwestern college, cutting grass, painting curbstones, and mopping floors. My sister taught at the college, and that’s how I got the job. Most of the other workers there came from the local state institution for the mentally retarded. My favorite coworker was Laurel, who, as though he were Zarathustra himself, every morning greeted the solar star:
—Good morning, Mr. Sunshine, shine your mighty light on me!
Laurel’s mortal enemy was Boiler Jim, a very fat man who was not mentally retarded and who demonstrated his superiority by teasing Laurel in the crudest and cruelest ways. Laurel would turn silent and shrink into himself whenever Boiler Jim approached. Boiler Jim would say his worst, give me a wink of complicity, and waddle off to fail at his next task.
Laurel never struck me as being stupid. Boiler Jim never struck me as being anything else. There is a German word for the stupidity I am trying to think about, the word Borniertheit, from the French borné, meaning an arrogantly complacent narrow-mindedness. It is this obtuseness that takes pride in its own limitations that I mean by stupidity. The school for stupefaction would be the training ground for such smug narrow-mindedness—which the German again beautifully describes as the possession of a pinched forehead, Engstirnigkeit. Is this an intellectual or a moral failing? One can by this time see the issue falling on both sides of the traditional philosophical distinction between theory and practice, metaphysics and ethics. A colleague and friend, Walter Brogan, with whom I discussed this “school for stupefaction,” suggested that it was not so much stupidity that characterized the nation as a total selfishness, a blithe and unquestioning belief in U.S.-American exceptionalism, ultimately a stubborn and indefatigable form of avarice. He agreed with me that in our time such avarice was simply stupid; I agreed with him that such stupidity avidly fed on itself like any other vice. A narrow-mindedness that feeds on itself, a self-augmenting, self-solidifying stupidity: this would be the alma mater of the school for stupefaction.
Philosophy, whether theoretical or practical, has preferred to talk about reason and intellection, rather than stupidity and stupefaction. Yet there are many moments in the history of philosophy when stupor, stupefaction, and stupidity rear their jack-o’-lantern heads. Allow me to refer to five such moments, though only briefly, and not very systematically.
1. Leibniz uses the words étourdissement, confusion, and Chaos throughout the Monadology, and especially in sections 21-24, when it is a matter of that which opposes intelligence and perspicacity. Étourdissement is that dullness of wit, that torpor or stupor, that we mean when we use the word stupidity. In section 60, he describes animals’ perception as borné, limited, though surely not in the sense of a culpable Borniertheit. At all events, these references certainly have a pejorative tone, the tone that humans adopt when they want to keep the animals at bay—our terror and repugnance before poor Pogo the Possum! Yet in the Nouveaux essais, if my memory serves me right, the famous petites perceptions that Leibniz discusses have a more intrinsic and more intimate connection with what he elsewhere derides as animal stupefaction or bedazzlement. It is almost as though Leibniz, at the moment he approaches what will later unabashedly be called the unconscious, discovers that perspicuity and stupor are no longer opposites. This requires a much closer reading—but I hasten on.
2. Leibniz is at least initially influenced by what Descartes calls hébétude, from the Latin hebesco, to be besotted. Hebetude, for Descartes, is the dull-wittedness of beasts; more dangerously, it is also the occasional stupor of the animal spirits themselves. When the petite glande that is their seat becomes excessively tracked by perceptions and rutted by memories, hebetude is the result. Indeed, all theories of perception and memory that rely on the notion of an imprint, engram, or type—from Plato’s Theaetetus to Freud’s 1895 Project Toward a Scientific Psychology—are haunted by hebetude. Theaetetus is of course the classic source of the wax tablet model of the mind. Though useful as a model for retention, the wax tablet is a quagmire for recall, so that the classic model of mind guarantees stupidity a place. Socrates says:
But when a person has what the poet in his vast wisdom commends as a “shaggy heart,” or when the wax slab is muddy or made of impure wax, or oversoft or hard, then matters stand in this way: the people with soft wax are quick to learn, but forgetful; those with hard wax, the reverse. Where it is shaggy or rough, a gritty kind of stuff containing a lot of earth or filth, the traces obtained are indistinct; as they are when the stuff is hard, for they have no depth. Impressions in soft wax also are indistinct, because they melt together and soon become blurred. And if, besides this, they overlap through being crowded together into some narrow little soul, they are still more indistinct. All such persons are likely to opine falsely. When they see or hear or think of something, they cannot quickly assign things to their particular places. Because they are so slow and sort things into the wrong places, they constantly see and hear and think amiss, and we say that they deceive themselves with regard to beings and are incorrigibly stupid. (Theaetetus 194-95; tr. F. M. Cornford, with alterations)
By the time of Descartes, Hobbes, and Locke, matters are much worse: even the best wax will not serve if the tracks and ruts are excessive in density and intensity. In all these cases, though especially in Plato—note the mocking reference to Homer’s hairy-chested heroes, the equation of earth with filth, and the incorrigibility of narrow little souls—intellectual insensitivity and impressionability are associated with moral failings. Indeed, one may say that the very violence of the typographic models for perception and memory, from Democritus’s and Descartes’ punch-card to Freud’s effraction or breaching (Bahnung), makes it impossible for us to separate metaphysics from morals, especially when it comes to stupefaction. As I tried to show many years ago in a book called Of Memory, Reminiscence, and Writing: On the Verge—forgive my quoting myself, but this topic has clearly rutted my brain—epistemology and morals coalesce: “Waxen glands, for all the marvels they produce, are prone to moral turpitude, lassitude, lethargy, and benumbment.”2 I’ll return to this notion of benumbment in a moment. For now, note only that it is quite impossible to rescue the typographic model’s optimal functioning from decrepitude—the overcrowding, scarring, and eventual unreceptivity of the tablet. The more we perceive, the more perceptive we are; the more we learn, the more learned we are; the more we think, the more sage we are. And yet the greater the number of imprints we have, the graver the likelihood of overcrowding, laceration, scarification, and hebetude. The typographic model of sensation, perception, memory, and mind provides the official seal of the school for stupefaction. For the Greek τύπος, the imprint, from the transitive verb τύπτειν, “to strike a blow,” becomes the Latin stupeo, “to suffer stupefaction.”
3. Schelling, in his astonishing unpublished magnum opus, The Ages of the World, does not say that God is stupid. Yet one of the consequences of the split in essence between God’s existence and the ground of his existence, a ground that is in him but not of him and which therefore takes on a life of its own, to wit, the life of nature, is that the being of the divine is characterized not by majesty or glory but by deprivation and squalor. Not the sovereignty of self-thinking thought and perfect self-possession, but languor and languishment, a mournful longing (Sehnsucht) and a certain poverty and wretchedness (egestas, Schmachten), mark that essence. Arguably, the anxiety that arises within the first potency of divinity dazes or bedazzles God, stuns and stupefies her—for at this point, God is wrathful, and the God of wrath is a woman. It is simply that God, in his stupor, does not yet realize this.
4. Heidegger, during the years 1926 to 1930, uses the words Benommenheit and Benommensein—a being dazed or stunned, bedazzled or stupefied, in a word, benumbed—in four different ways, ways that at least at first do not seem to cohere. First, in Being and Time, he takes Benommensein to characterize human existence in its fallen mode, its everydayness. Dasein is “relucent,” he had said in earlier lecture courses; that is, it tends to interpret itself in the mirror of all the “handy” things that surround it. We are bedazzled and benumbed by beings, opaque to the being-here (the Da-sein) that we are. Second, in his 1929-30 lectures on theoretical biology, Heidegger announces that the animal is poor in world—poor Pogo indeed! True, the animal is richer than the worldless stone, but it is more indigent than the world-shaping human. An abyss separates the human being from the animal, inasmuch as the essence of animality is Benommensein, “benumbment.” For those who had read Being and Time, it was as though everyday Dasein now exists on the level of the beast. And yet “handy” Dasein, precisely when most absorbed in the world of equipment, would seem to be world-shaping and not poor in world, not “benumbed.” Third, in a review of Ernst Cassirer’s Mythic Thought, the second volume of his Philosophy of Symbolic Forms, Heidegger says that for the so-called “primitive” human being the response to mana—that is, the response to the overpowering and instantaneous (augenblickliche) revelation of being—could be captured in the word Benommensein. By this Heidegger does not mean that the so-called “primitive” human being is poor in world, bedazzled by beings, oblivious to being, and hence closer to the animal than to the properly human world. Quite to the contrary, the so-called “primitive” is more open to the being of beings than is the man of technique—or even the neo-Kantian philosopher. To understand this paradox, we have to return to Being and Time for a fourth sense of Benommensein.
For, remarkably, this is the very word Heidegger uses in Division Two of Being and Time to describe the impact of the moment of insight, der Augenblick, when Dasein catches sight of its mortal existence, its Sein zum Tode. Incredibly, Benommensein means both our being bedazzled by everyday banalities and our being stunned by the confrontation with our own most proper, most appropriate existence. There are clearly two strains or strokes of benumbment, and the second somehow serves to counteract the first.
5. While still a young man I had the good fortune to know and to work with Hannah Arendt—during what turned out to be the final years of her life. In the middle of one of our conversations, exercising the privilege of youth, which is another phrase for stupidity, I gave Arendt a sort of assignment, saying,
—The nineteenth century demonstrated the cunning of reason, Hegel’s List der Vernunft, while the twentieth century is all about the cunning of unreason, the victory of Unvernunft. You’re the person who is best placed to write about this!
She gave me a long, squinty look through the haze of smoke that was killing her, and in that inimitable bass-baritone voice, all gravel, she replied,
—Leave me alone. I’m old. They say you’re only as old as you feel. You want to know how old you are? Count the candles! Count the candles!
And so the world never got her book on the cunning of unreason. If I could capture her ear now I would be just as impetuous, but this time I would ask her, yes, to complete The Life of the Mind, but then to begin a book on the life of stupefaction. It would be an expansion of her wonderful essay to which I alluded earlier, “On Thinking and Moral Considerations.” In that piece, which I first heard as a lecture in Manhattan, one of her last, she takes up the theme of Heidegger’s Was heißt Denken? a title we have to translate as both What is it that we call thinking? and What is it that calls on us to think? In her lecture she argues that thoughtfulness—not mere considerateness but what she may have been thinking of as Bedachtsamkeit or Besonnenheit, the capacity to think, the habit of lucid thinking—is sufficient to prevent the worst from happening. This is a positive pendant to her thesis on the banality of evil. Banality expresses itself in the Borniertheit of thoughtlessness. It is thoughtlessness that commits crime after crime, blinking innocently all the while, like Nietzsche’s “last human being,” who insists, “But we have invented happiness.” Thoughtfulness would shatter the complacency of the “last human” and prevent its worst violence.
Let me return now to the apparent confusion of Heidegger’s terminology, whereby Benommensein is predicated of both our being bedazzled by everyday preoccupations and our being called to confront anxiously our being toward death—that anxious and benumbing, momentary and momentous confrontation which is nothing less than the human capacity for thoughtfulness. Stupefaction would characterize both the wretched complacency of a besotted humanity and the tragic enlightenment of the thinking person. Here the deepest insight of the fundamental ontology of Dasein accords with ordinary language analysis—a marriage one would not have thought possible. It is time to take up that uncannily positive side of stupere, in that uncanny word stupendous. When a French man or woman leaves the theater elated by what he or she has just experienced, the exclamation we hear could well be:
—O là là! c’était absolument stupéfiant.
He or she does not mean to say that the play was stupid. Rather, that it was astonishingly good, striking, full of marvels, wonderful, dazzling to the point where words fail. When Aristotle speaks of those ἀπορίαι from which philosophy takes its start, he does not mean the narrow straits that prevent the narrow-minded from thinking. To think is to allow προβλήματα, problems, to do what that word says, namely, to thrust themselves to the fore over and over again. To think is to be thunderstruck, to be dazed by problems over and over again. Such bedazzlement need not always produce edifying results. I knew a great mathematician who was always depressed and who died young.
—My life is stupid, he said to me during one of our last conversations. I spend years on a problem, and when it is finally solved I see that it was a stupid problem. How could it have taken me years? I’m depressed about this for at least six months. And then the worst happens: I get sucked into another problem.
Philosophy can be like that. There is a dark side to Benommensein, especially if one's everydayness consists in doing philosophy. Yet Aristotle would still insist that if there is no bedazzlement, no being stunned and stopped in one's tracks, there is no wonder and no philosophy. Wonder, thaumazein, is Plato's heraldic word for philosophy, but we cannot be sure that it applies only to Diotima gazing out over the vast sea of beauty-in-itself, and not, for example, to Meno, who can't figure out what the slave boy is up to, much less Socrates. In order to appreciate the stupendous wonder of Socrates and Plato, we have to comprehend better than we usually do the stupefaction of all the interlocutors in the Dialogues, including Socrates. When Euthydemus and Dionysodorus dazzle us with their silly puns, how different are they from Socrates in Cratylus, who admittedly gets carried away with his etymologies? Of course there is a difference, but it is a difference in the mode and duration of stupefaction, Socrates tells us, not a difference in kind. True, some of Socrates' interlocutors are portrayed as anything but stupid: Callicles and Thrasymachus could both be working in the Vice-Presidential wing of the White House, so clever are they, and so devoted to power at any cost, so free of scruple. Yet if we are true devotees of Socrates, we have to say that even the calculatingly clever Callicles and the realpolitisch well-informed Thrasymachus belong to the school for stupefaction—as clever as they are, their lives are devoted to stupidity. In short, we need to think a lot more about amathia, the incapacity to learn, which characterizes not only the self-righteous and pompous Euthyphro, who is after all just another thick-pated lawyer, but even the august Timaeus, who is both astronomer and philosopher—though perhaps also, in his own words, a birdbrain.
We have to wonder whether the Stranger of Plato's Sophist is a god of dialectic or a dunderhead who makes spurious distinctions—who pretends, for example, that the making of likenesses, eikastikē, can be anything other than a trafficking in phantasms, phantastikē. Everything hangs on the possibility of this distinction within image-making, eidōlopoiikē, everything from Socrates' personal fate to the history of what we call "Platonism," and that means the history of philosophy.
Does it all boil down to the quality, mode, and duration of our stupefaction, and to our capacity or incapacity to choose between kinds of stupor—the stupor of stupidity and the stupor of the stupendous? Recall that Being and Time inserts its entire inquiry into Plato’s Sophist precisely at the point where the word being has left everyone, including the Stranger, stupefied and bereft of sense—benumbed, one might say, by this truly stupendous problem.
In the early 1950s Heidegger asserted that the most thought-provoking thing in our thought-provoking time is that we are still not thinking. In the early 1970s Luce Irigaray argued, brilliantly and unforgettably, that what calls on us to think is la différence sexuelle. For Derrida at that time it was différance itself, spelled with an a to designate the verbal sense of differencing and deferral; later on in his life it was the phantasm of absolute self-knowing or sovereign self-possession that put out the call. What has called on me to think during my last years in the United States—although for me to name Heidegger, Irigaray, Derrida and myself in one breath is the epitome of stupidity—is the apparently totalizing form of stupefaction at work there, a stupefaction that is not the threshold to stupendous wonder but the unending horizon of a boundless complacency. Heidegger called it Bedürfnislosigkeit, a needlessness that produces heedlessness.
Henry David Thoreau seems to have been worried about needlessness and heedlessness a long time ago, and in the United States. Near the end of Walden he employs the phrase, "As if there were safety in stupidity alone."3 At one time I hoped to drape these words as a banner across the White House gates. I wanted to be the Betsy Ross of stupefaction. Then I realized I'd have to prepare the same banner for Capitol Hill and for the Supreme Court building, and where would it end? Thoreau himself, however, is talking not merely about politics but about the presumption, in the English-speaking world, that every assertion has to be reduced to a univocal meaning. What he proposes instead is that every piece of writing be heard as music, demanding more than one hearing and calling for multiple possible interpretations. He writes:
It is a ridiculous demand which England and America make, that you shall speak so that they can understand you. Neither men nor toad-stools grow so. As if that were important, and there were not enough to understand you without them. As if Nature could support but one order of understandings, could not sustain birds as well as quadrupeds, flying as well as creeping things. . . . As if there were safety in stupidity alone. I fear chiefly lest my expression may not be extra-vagant enough, may not wander far enough beyond the narrow limits of my daily experience, so as to be adequate to the truth of which I have been convinced. Extra vagance! it depends on how you are yarded. The migrating buffalo, which seeks new pastures in another latitude, is not extravagant like the cow which kicks over the pail, leaps the cow-yard fence, and runs after her calf, in milking time. (ibid.)
What I admire most in Thoreau is this call for extravagance, extra vagance, that is, the search for meanings that lie afar, somewhere over the cow-yard fence; such a search has to respect the vagaries and the vagabondage of thinking, even if that sounds like stupefaction. It means that philosophy can no longer sound like the empty egotistical quarrels of a Cambridge debating society, in which the ruling desire is to score points against one’s opponents and to prove oneself the cleverest in the room. When I look over my own books and articles in philosophy, they disappoint me in that they are not extra-vagant enough. Indeed, there can never be “enough” of vagance, even if that word should rhyme with vagrance.
There is much that I could say about stupefaction in the university, among both students and faculty. But that would be another paper altogether, and it is time to draw to a close. At all events, my complaint is not with the university but with a government that remains untouched by the spirit of inquiry and that cuts university funding in order to pay for its own stupid and violent schemes. Will things change for the better in the school for stupefaction? We have to be satisfied with an ambiguous situation.
There is, Nietzsche says, a virtuous stupidity, eine tugendhafte Dummheit, which human beings need if they are to survive.4 The thinker, poet, and artist put their own survival at risk because they take positive pleasure in madness, that is, in dancing to the allegro of unreason. Yet there is no safety either in stolid intellectual work. Toward the end of his career, in Beyond Good and Evil (no. 227; KSA, 5:162-3), Nietzsche writes that if intellectual integrity, Redlichkeit, is our virtue, the danger is that it may become "our vanity, our façade and our fashion statement, our boundary, our Dummheit." He continues: "Every virtue inclines to stupidity; 'stupid to the point of sanctity,' they say in Russia—let us take care that our intellectual integrity not turn us into saints and bores! Isn't life a hundred times too brief—for us to be bored with it? . . ." (ibid.). Recall that Merleau-Ponty, at the end of his beautiful book Eye and Mind, warns himself and us not to employ the pompous expression philosophical interrogation when referring to "our state of perpetual stupor."5
Now that I have mentioned Nietzsche, I have to report an objection that I think he would make to me. I’ve been complaining about the confusion of images with reality, and Nietzsche would simply say to me, “You inveterate Platonist! You still believe that philosophers are the ones who have access to being, while everyone else wallows in the mud of images? This is the oldest grumble and gambit of philosophy—the distinction between image and original is the philosophical stupidity par excellence.” I do not have an answer to Nietzsche’s objection. Large parts of my paper seem to rest on the assumption that virtue is knowledge, or if not knowledge then the capacity for knowledge, in any case something counter to stupidity. Yet this would be the moral-metaphysical error par excellence. Nietzsche is probably right to say that stupidity begins at home, and that one should look in one’s own back yard before criticizing others. Yet I wonder if it is still tenable to assert that we can be lost among images, icons, and phantasms, and that it is nonetheless possible for us to sense that we are lost even if we have no access to the idea, the “true,” for purposes of navigation.
Is there a qualitatively new power of stupefaction at work in America today? I believe there is, in spite of the lack of adequate or apodictic evidence, and in spite of the fact that a pure and unambiguous idea of stupefaction eludes me. As one of the most insightful German Americanists of our time, Ulrich Halfmann, put it to me recently, "Something there [in the United States] has changed, and changed fundamentally." My only hope is that we can in some small measure transform the stupor that is proud to resist all questioning into the stupendous power of inquiry. My holiday friend, the Triestine fruit vendor, will remain skeptical about this hope, as will I, most days of the week. Yet whether he knew it or not, his claim stunned me and stopped me in my tracks with an aporia. Whether my own stupefaction has merely dazed and confused me, or has enabled me to take a few halting steps in the direction of interrogative wonder, is something that you, not I, will be able to judge.
Perhaps we can agree that—at least in terms of the linguistic heritage of stupeo and its Greek cognate—only the smallest gap separates the sustained stupefaction we undergo in interrogative, philosophic wonder from the totalizing and paralyzing stupefaction of complacency, prejudice, and blind violence. Nietzsche warns us, through Zarathustra, that the smallest gap is the most difficult to bridge. Yet let us try to bridge it with the guilelessness of Laurel greeting the morning sun. If Boiler Jim should object that the effort itself is stupid, let us reply that stupefaction clings as stubbornly to the philosophic mantle as the philosophic mantle clings to us, even when we think we have removed it.
Postscriptum. Nothing more obvious than the school for stupefaction, one must say, especially after the past eight terrible years, but nothing less obvious than the word obvious, which means both open to the path that leads to change and shut off from it. Much of the fervor surrounding the election campaigns of these past few months has to do with our desire to suppress and repress the monstrous stupidities of the Bush-Cheney-Rove era, the desperate hope that we, as a nation, cannot really be as stupid as we surely have been. And as if that were not enough: someday, were I to live on, I'd like to write a piece about stupefaction in the university, with all its fads and foibles, all the -isms, ideologies, and idiocies that mar it left and right, right and left. Above all, the waxing self-righteousness and moralizing, the aggression, self-promotion, and mean-spiritedness of administration, faculty, and students. It seems that we too, for all our self-proclaimed resistance and opposition, have succumbed to so much Bushism. Trickle-down violence. Nothing is less obvious than what is so obvious. But, as Hölderlin's Hyperion says, "More later."
David Farrell Krell is professor of philosophy at DePaul University in Chicago and Guest Professor at the University of Freiburg. Recent books include The Tragic Absolute: German Idealism and the Languishing of God (Indiana University Press, 2005), The Purest of Bastards: Works of Mourning, Art, and Affirmation in the Thought of Jacques Derrida (Pennsylvania State University Press, 2000), and Contagion: Sexuality, Disease, and Death in German Idealism and Romanticism (Indiana University Press, 1998). Recently published by SUNY Press is his translation and critical edition of Friedrich Hölderlin, The Death of Empedocles: A Mourning-Play.