T. David Gordon
When discussing issues that concern the public, it is not unusual for some conversations to become partisan. When this happens, it is also not unusual that the partisan discussions sometimes become somewhat shrill. Ordinarily, however, the discussions only become shrill when there are obvious personal consequences. The early feminists were perceived to be a tad shrill, for instance, but they had personal reasons to be: they were effectively denied many of the freedoms and opportunities our Constitution guaranteed to all. Similarly, the homosexual community has often been perceived as shrill, but again, the matter affects them personally. I have wondered, then, why it is that technophiles are often so shrill, since one wouldn't think that the candid examination of the pros and cons of various technologies (or clusters thereof) would be a personal issue, the way gender or sexuality issues are.
As I use the terms, a "technophile" is a person who regards technological innovation as good, without having specific grounds for doing so. A "technophobe" is a person who regards technological innovation as bad, without having specific grounds for doing so. That is, they are mirror images of each other; and I regard them both as "technofools" for the same reason: they hold their opinions, often quite passionately, without grounds, and tend to resist reasoning or evidence for alternative opinions.
We have few technophobes today; the Amish, for instance, may come close, because they object to machinery that is run by fuel or electricity. And I recall once, in the early nineties, a colleague of mine had a wife who would not permit a computer in the home, due to some rather curious speculations of a metaphysical nature. But such technophobes are today an insignificant minority.
The technophiles, on the other hand, well, their name is "Legion," for they are many. For the technophile, newer is always better (despite the numberless Windows "upgrades" that crash as disconcertingly as their predecessors). I recall when the first color screens appeared on laptop computers, and my colleagues at Gordon-Conwell urged me to buy a new one, with color. I attempted to explain that teaching Greek was, alas, something of a black-and-white matter, and that displaying the aorist tense in, oh, let us say, magenta, would not likely have any pronounced effect on the students' understanding of how the tense worked. Similarly, lecture outlines for a course on Paul's understanding of the Law (of Moses) would hardly be improved by the addition of color. Oh, I suppose I could have displayed Paul's "negative" statements about the Law in red and his "positive" statements about the Law in green (for stop and go; get it?), but any student so addlebrained as not to be able to distinguish positive from negative comments is beyond the aid of a color palette. So, I simply waited until my black-and-white laptop died a natural death, and then, since there was no longer an option, purchased a replacement with a color monitor.
What struck me as curious then, and strikes me as curious now, is why so many people were so eager for me to spend my money to "upgrade" my computer to a model that wasn't actually an upgrade for my line of work. It wasn't curious then, and isn't curious now, that they may have had perfectly good uses for color in their work; and therefore I never questioned them about why they had made their choice. But they were obviously troubled, in some measure, by the fact that I had not upgraded as they had. A similar thing happened when substantially faster processors were engineered, which permitted the use of real images (does the reader recall the primitive days of MacDraw?). Again, I was urged to get a newer machine, because of its "superior graphics." My reply was, by then, predictable: I wasn't teaching my students illustrated manuscripts of the Greek New Testament. Though some of the late-medieval manuscripts were indeed "illuminated," I was not teaching medieval text-criticism. I was teaching exegesis and interpretation of the original text, not aesthetic appreciation for the beautiful illuminations of some rare fourteenth-century manuscripts. Had I been a professor of art criticism, a graphics-capable monitor might have been an aid to my instruction, but it simply wasn't necessary for my arena of instruction.
Why was it so unsettling to others that I did not run out and purchase, at considerable expense, the first generation of color monitors, or the first generation of true graphics-capable computers? Other tools do not provoke this response. None of my colleagues ever inquired whether I split my firewood with "the latest" maul, or whether I climbed "the latest" ladder to clean my gutters (though improvements in both areas were not uncommon). Electronic technologies differ from other technologies in the unquestioning, at times rabid, loyalty they seem to produce. ETs are virtually talismanic today; it is existentially important to have (or covet) "the latest."
Why? Why would a colleague in another field care if I remained "behind the times," as it were, teaching an ancient text in a fairly ancient manner? I wrote a little apology several years ago for why I don't use PowerPoint in my classes. It was an abbreviated explanation, with substantial indebtedness to Edward Tufte, for why the particular educational goals of the particular courses I teach would not be assisted by using this tool. I expressly said in the opening paragraph that it was an apology for my choice and not a criticism of choices made by others who taught different courses than I. Nonetheless, several of the people who came across it took great offense, and explained to me that they worked very hard on their PP presentations and employed them wisely, a matter that I had never questioned or even raised.
Why did these people become defensive about my own apology for my own pedagogical choices? Why did they even care? Would the same defensiveness have attended a discussion of the merits of white chalk versus yellow chalk? Well, these "why" questions are rarely answered completely. We grope about, knowing what little we do about human nature or human society, and attempt partial explanations, and I'll do so here, with an emphasis on the "partial." My explanation only partly explains why part of the population behaves as it does part of the time. Here goes.
Contemporaneity is probably one of the most pervasive of values in our culture. For a variety of reasons, there has probably never been a generation more infatuated with itself, or more clueless about previous generations. There are many causes for this. Commercial forces, for instance, sell us their wares by first "selling" the idea that newer is better. Political forces that could not make a cogent argument for their political theory sell their platform by "selling" the idea of change. Electronic media separate us from the past; if you know the world via television, for instance, you know a world that is only a half-century old. All of these, and others, contribute to the pervasiveness of contemporaneity as a value in our culture. But they are not alone responsible.
I believe it is entirely possible that the first of the deadly sins in the medieval world is still the first sin of our world: pride. Pride takes many forms, of course, both individually and culturally, but its cultural forms appear to be the most difficult to notice, and the most intractable. The parable of the Emperor's new clothes reminds us of the social complicity of folly; Reinhold Niebuhr reminded us of the same; Socrates thought nothing was more dangerously entrenched than the received tradition of a culture; and Jesus was concerned that some people practiced even their religion in order to be seen (and approved) by men. Something in us desires to believe, in the face of all evidence to the contrary, that we, as a group, are more enlightened than others, nobler than others; to put it simply, superior to others. One of the conceits by which post-Industrial cultures do this is by regarding previous cultures (if only recently previous) as "primitive."
The torrent of new technologies, as the second millennium moved into the third, fed this we're-superior-because-others-were-primitive attitude. We not only have telephones and televisions that some other generations did not have, but we have cellphones, laptops, PDAs, iPhones, GameBoys, and the World Wide Web. We needed such bolstering at precisely this time, because otherwise the evidence was far from flattering. The twentieth was the greatest killing century of all time, by any measure. Not only did humans kill the greatest absolute number of their fellow humans in that century, but humans killed a higher percentage of the planet's population than ever before. Worse, this was true even if we discounted the wars. By genocide alone, we killed a higher percentage of fellow humans than ever before. And there were also the wars, some of which did not enjoy a consensus of public support (Vietnam, Iraq). Our "Shock and Awe" campaign shocked everyone (including us) and awed no one, because awe requires a more self-evidently moral cause. One of our wars ended only with the unleashing of a weapon of such destructive power that we have lived in the frightening shadow of its mushroom cloud ever since. Thus, by any fair observation, we must conclude that the twentieth century was the most barbarous of all.
And so, perhaps at some deep level of our corporate psyche, there is some need for us to "save face," as it were, in light of our recent, blood-stained history. And here, the conceit of technological progress saves the day, does it not? We may be morally inferior to previous generations, but we are, at least, technologically superior. We may not be so smart, but our tools are. We have smart phones, smart cars, even smart bombs, so we can afford to be a little stupid about our wars and genocides. We can do almost anything faster than any previous generation could do it, from balancing a checkbook, to doing our taxes, to writing a letter. The fact that so much of what we have done is so unspeakably wicked and/or so breathtakingly inhumane (The primary leisure activity of our culture is watching television? How can we look at ourselves in the mirror?) can be overlooked since we do our wickedness efficiently (Hiroshima, Nagasaki) and cultivate our inhumaneness so effortlessly ("Laverne & Shirley" trumped War and Peace). If a person has only a flicker of memory, only a rough remembrance of the last century or so, such a person knows that the last century was nothing, as we used to say, "to write home about." And so, we find face-saving solace in the only possible arena where we could plausibly declare our superiority with a straight face: technology.
Technology will save us. Of course, it won't save us from our wickedness or from our banality (nothing is more banal than television). But it will permit us to save face. It permits us to overlook that our family lives are disintegrated (four people in a car on a trip, each "podded up" alone), our "art" is trivial, our leisure time is wasted, our governors are unworthy of our honor, our international policies are often indefensible and self-defeating, our business "experts" are utter imbeciles, and our killing is boundless. We have "the latest," and for us, the latest is the greatest. And this technological superiority, at least psychologically, covers a multitude of sins. So if anyone, even a middle-aged professor of an obscure language at an obscure college, questions whether the latest, at any moment, is necessarily the greatest, then he must be silenced. The Emperor's new garb must be honored.
If, on the other hand, tools are merely tools, then the only measure of them is the purposes to which they are put, or the skill or wisdom with which they are employed. We have rarely put our ETs to humane purpose, though perhaps we will in the future. We surely do not ordinarily employ them wisely (we permit face-to-face communication to be interrupted by a cellphone, or we permit email or IM to interrupt our reading of Socrates or Frost). But our tools are not mere tools; they are symbols of our last-gasp effort at saving face, talismans of our only alleged superiority to previous generations. And because they serve such an important (albeit perverse) social-psychological function, we get a little hysterical when they are regarded as what they are: mere tools, nothing more, nothing less.
Many of us, in other words, are personally invested in our ETs. We can be no more dispassionate about them than feminists or gay-rights activists can be, because they are etched into our psyches. We need them to provide us with self-congratulation; we need them to assure us that we are making progress (in light of the overwhelming evidence of considerable regress); we need them to erase our memory of the recent past; we need them to assure us that our whirling rat-race of mindless, ill-considered, and ill-fated multi-tasking is actually taking us somewhere good. And since nothing else, nothing in our actual human accomplishments in the present or recent past, will provide us with such assurance, we cling to our technological creed: "Better tools make better people." And we shriek, somewhat hysterically, at those who challenge our creed.
1. My ideal human, on this score, is "techno-wise," willing candidly to understand both the benefits and the costs, the assets and the liabilities, the pros and the cons, of specific technologies.
2. Even this is not quite technically accurate, since a horse-drawn buggy is fueled by oats or hay. Indeed, while the Amish are quite insightful cultural anthropologists, correctly observing that the tools a culture uses shape it in many unforeseen and unintended ways, there is also an apparent arbitrariness to their decision to embrace certain technologies but not others.
3. I call them ETs (electronic technologies); it is not only an abbreviation, but also reminds us of their somewhat alien and mysterious nature.
4. Edward Tufte, The Cognitive Style of PowerPoint: Pitching Out Corrupts Within (Cheshire, CT: Graphics Press, 2003).
5. But watch carefully, and you will observe some blood pressures rising despite this triple qualification.
6. The best proof of this is the so-called History Channel. There is almost never any history on the History Channel. There are programs about anticipated "mega-disasters" in the future, programs about people driving trucks on ice roads today in Alaska, and programs about contemporary inner-city gangs; but there is almost never any history on the History Channel, and what little there is is often mere sensationalism about a serial murderer from the 1980s. Just as Tom Cruise could not "handle the truth" in A Few Good Men, television cannot "handle the past."
7. Niebuhr's most relevant work on this point was Moral Man and Immoral Society: A Study of Ethics and Politics (New York: Charles Scribner's Sons, 1932); Socrates made this case in his Apology. Jesus's words are found in Matthew 6:2.
8. Lewis M. Simons, "Genocide and the Science of Proof," National Geographic 209, no. 1 (January 2006): 28.
T. David Gordon is a minister in the Presbyterian Church in America serving as Professor of Religion and Greek at Grove City College, Grove City, Pennsylvania. Ordained Servant Online, December 2011.