PART 1

The Digital Revolution


1

Technological Revolutions I Have Known

  EDWARD L. AYERS



Historians are trained to see things in the context of change, but even a historian might find it hard to gain a sense of perspective on the technological changes sweeping over us these days. The machinery itself is evolving with astonishing speed, and the larger culture seems obsessed with the evolution. Articles on the latest high-tech stock miracle fill the business pages, while advertisements for automobiles and sports leagues bear their World Wide Web addresses like badges of honor. Books and magazines for and against the new media pepper the bestseller lists, and how-to books on computing dominate new sections of bookstores.

      The effects of the new technology in the classroom receive their share of attention as well. While the computer companies and politicians fall over one another with promises and proposals to equip classrooms with as many machines as possible, ominous voices warn that such innovations are ushering in the end of real education. Teachers will be replaced with machines, they warn, human interaction supplanted by keyboards and screens. Teachers view the changes warily, eager for the stimulation and excitement computers can bring yet leery of inflated expectations and skewed funding. Higher education is, if anything, even more confused and ambivalent than its primary and secondary counterparts. There, some scholars and teachers eagerly innovate with the newest media while others hold them in open contempt.

      Educators at every level have been burned before, when gadgets ranging from filmstrips to overhead projectors to televisions were ballyhooed as the saviors of the American classroom. The computers that have occupied corners of classrooms for the last decade have made some impact on rote learning but have not lived up to their earlier billing. Our classrooms still mainly involve the scraping of one rock against another, chalk on blackboard. Has the time finally arrived when the big changes will be felt? Have we achieved critical mass? Are we on the verge of a fundamental change in the boundaries and possibilities of the classroom? If so, what role might those of us in research universities play?

      Many academics who came of professional age some time between the late 1960s and early 1980s have already experienced what felt like—at the time, at least—three electronic revolutions. As a historian attracted to the potential of computing since the 1970s, I have seen these changes at close range. Like others of my generation, I confronted mainframe computers before personal computers. I remember how impressive it felt at the computer center: the heavy metal machinery, the hard math done automatically, the promise of being freed from uncertainty and imprecision. True, I had to copy records from dusty originals to coding sheets and then to brittle punchcards, but the excitement when all the cards were ready for batch processing was worth it. One made a ritualistic sacrifice of the cards to the priestly attendant behind the glass wall, then waited in a room where it always seemed to be a fluorescent-lit 3 a.m. Eventually reams of paper began to pour out, perhaps the findings on which so much depended. After proudly bearing the impressively large stack of paper through the rows of computer science graduate students, the humanist eagerly opened the stack to see what revolution in historical understanding might be revealed in the columns and numbers. Unfortunately, seeing the entire stack of paper filled with one message repeated 2,789 times—error number 17—was not as edifying as one had hoped. Eventually, though, I figured out the machinery enough to get some reasonable-looking numbers for a dissertation.

      Looking back, the incongruity in this first computer revolution is obvious. The same dissertation that drew on a computer the size of a 747 for its data manipulation had to be translated into English with a used Adler Satellite portable electric typewriter. It was a tactile experience, with the grinding little motor and belts, the keys interlocking tenaciously, the pockmarked surface where wet correction fluid had been typed over. And it was intellectually challenging as well, for it was not always easy to find another nine-letter word for, say, "lassitude" when, against all odds, "lassitude" appeared in consecutive paragraphs. Despite such obstacles, I managed to write enough of a dissertation with such a machine to get a job.

      That particular kind of challenge came to an end with the word-processing revolution. In 1981 the machines took over what had been our little faculty lounge. My department decided that dedicated Wang word processors were the wave of the future, and it certainly seemed so in the evenings, when professors could gain access to the two machines in what quickly became known as the Wang Room. There, we simply could not get over the fact that we could delete words we had written many pages back. We could delete "lassitude" every time it appeared, even replacing it with a five-letter word if we chose! It was a miracle, even better than number-crunching, because these were the humanists' familiar and beloved words that could be manipulated so easily.

      Nothing was perfect, of course. A flicker of lightning in the next county often triggered a complete breakdown of the machines; the daisy wheel printer ate ribbons and daisy wheels the way the computer in graduate school ate paper; the disks, the size of small pizzas, seemed to erase themselves in the filing cabinet drawer; the cutting-edge Wang format soon proved to be a dead end in word-processing evolution. Nevertheless, once one had processed words, there was no going back.

      In 1985, thanks to the beneficence of my university, I got my very own machine at home and became a part of the Internet revolution. To my delight and the envy of my friends, it had a color monitor rather than the murky green of the Wang, its own actual hard drive able to hold 10 entire megabytes, and—shades of the future—a 2,400-baud modem. Trying to live up to such a machine, I learned to use yet another mainframe computer interactively, punchcards having been thrown on the computing trash heap along with the Wang. I taught myself multiple regression analysis and other things contrary to my character and abilities. But the real excitement came in the discovery of electronic mail. The combination of written language, informality, efficiency, and, in the mid-1980s, the feeling of being among the information technology elite proved surprisingly satisfying. And when the university's card catalog came online I thought we had approached the limits of technological progress.
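      For readers who never endured such a run, a minimal sketch may make the number-crunching concrete. The example below is purely illustrative—the counties, variables, and figures are hypothetical, not drawn from the author's data—showing the kind of multiple regression quantitative historians once submitted to SPSS, rendered here in modern Python:

```python
# A hypothetical sketch of the multiple regression analysis mentioned
# above -- modern Python standing in for a 1980s SPSS batch run.
# Variables and figures are invented for illustration only.
import numpy as np

# Hypothetical county-level observations.
farm_pct = np.array([62.0, 45.5, 71.2, 38.9, 55.4, 49.1])   # % farmland
literacy = np.array([70.3, 82.1, 65.8, 88.4, 74.0, 79.6])   # % literate
turnout  = np.array([54.2, 61.5, 50.9, 66.3, 57.8, 60.1])   # % voting

# Ordinary least squares: turnout = b0 + b1*farm_pct + b2*literacy.
# The leading column of ones fits the intercept b0.
X = np.column_stack([np.ones_like(farm_pct), farm_pct, literacy])
coefs, *_ = np.linalg.lstsq(X, turnout, rcond=None)

print(f"intercept = {coefs[0]:.2f}")
print(f"farmland coefficient = {coefs[1]:.3f}")
print(f"literacy coefficient = {coefs[2]:.3f}")
```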

      By then, the first two computer revolutions had been completely domesticated. The computer had become an appliance, about as exciting—and as essential—as the coffee maker that made my day as productive as possible. That was all the contact I wanted with electronic machinery. I had had enough of number crunching and SPSS runs. I was planning a new, nonelectronic project, a project that would take me back to ground level, a local study in which I knew the names of people. It seemed clear to me that my mild interest in computers made me something of a dinosaur in the age after the linguistic turn. I wanted to do the sort of highly inflected, nuanced, individualized history that had attracted me to social history in the first place. I wanted to write a narrative of human scale. And I could see no place for computing in that.

      But the machines, like so many cyborgs, tracked me down one more time. In an attempt at token balance, my university appointed me—a humanist—to occupy some space on its computing oversight committee, a body dominated by scientists, physicians, and engineers. At first I was befuddled by the lingo, but something interesting soon came up: IBM was interested in helping computing at the university and wanted our committee to suggest something. We batted it around for a while until I timorously noted that many of us in the humanities and social sciences had no computers on our desks at all. There was some good-natured joshing among the physicists and medical imaging specialists about aid to the third world of computing, but what could we do for such backward people, who showed so little interest in helping themselves? Almost all my humanist colleagues seemed content with what little computing they had. No clamor of discontent arose from the quiet offices where pens still scratched on paper.

      Slowly, though, some forward-thinking people in computer science began to think that maybe computing in the humanities might be the most exciting frontier of all. What if we really could use computers to help make sense of the great store of human knowledge and striving locked away in archives and books? What if computers were just getting good enough for humanists to use, now that they could deal with images as easily as they could with linear numbers and letters, now that they were networked, now that they had enough storage space to hold the vast and messy stuff historians habitually collected?

      With this premise, IBM agreed to donate a number of RISC workstations, a server, and a technical advisor to create something we decided, after much debate, to call the Institute for Advanced Technology in the Humanities. Almost against my will, I was posted on the electronic frontier, armed with networking, digitization, JPEGs, and SGML even before Mosaic and the World Wide Web became household words. For the last several years I have been overseeing a project based in that institute. I converted my small-scale, intimate, handmade community study into a large archive on the World Wide Web and on CD-ROM. It is now known as the Valley of the Shadow Project and it has involved more than twenty people working to fill up several gigabytes of electronic storage with historical data.

      Once again I'm a true believer, eyes burning with fervor, brimming with enthusiasm, just as I was for the big mainframe of 1978, the sleek Wangs of 1981, and the interconnected IBM clone of 1985. We look back on those machines with a mixture of contempt and nostalgia; we will never be as innocent as we were before. We know from painful experience that today's miracles will be tomorrow's embarrassments or, if they succeed, mundane paper clips. We have learned from those earlier revolutions that revolutions do not always happen at the speed people predict or want. From one point of view, we have seen blistering speed; from another, things have moved slowly. Computing power and storage have increased dramatically, and universities have been wired. Word processing and e-mail have become staples of life for many professors and students. But the failures are pretty obvious, too.

      Skepticism toward, and even resistance to, all things electronic endures and even grows among many humanists and some social scientists; a sort of passive aggression flourishes. There has been too much hype, too many commercials showing dolphins leaping out of computer screens. It is unfortunate that what computers, including networked computers, do best—providing more information, more specialized knowledge—is not particularly valued or necessary right now. If you want stuff, the Web has it. But the Web gives everything equal weight and authority, from conspiracy theorists to the federal government. Fortunately, the Web is too slow to be very satisfying over a modem, negating some of its appeal to the impatient young. The intellectual changes widely predicted to accompany widespread computing have not materialized. The quantitative techniques so confidently forecast as the wave of the future in the 1960s are now almost invisible within the historical profession; they have given way to an even closer attention to, of all things, words and texts. Students still turn to books for authority, still strive to write linear prose, and still print out much of what they discover online. Only a fraction of professors have integrated any form of electronic enhancement into their classes.

      So should humanists get out of the way? Should historians and literary scholars, anthropologists and poets put our energies toward what we already know how to do in traditional media, valuing that work as a humane counterweight to the arcadelike values of this new technology? Some of us should. There is no compelling reason for most teachers and scholars to throw themselves into the gears of the new machine. Search tools and e-mail can be helpful to almost everyone, to be sure, and few writers of nonfiction long for the days before word processors, but hours devoted to integrating things electronic do not always pay off. Thus far, so-called electronic classrooms have offered only limited returns; most multimedia lectures are not worth the investment of money and time they demand.

      Some people have asked whether the Internet and the Web are like the citizens' band radio craze of the 1970s, except that you have to type rather than talk with a countrified accent. But today's technology more closely resembles what began as another technological fad: high fidelity. In the early 1950s one had to be a real nerd to care about woofers and tweeters; the records stamped "HiFi" were designed to show off what full stereo could do. Trains roared through your living room or birds called in the distance. But now high-quality sound reproduction is everywhere, from our cars to our homes to our offices to our malls to our televisions to our pockets. More than likely, that is how computers—or whatever we call them a few years from now—will evolve. Soon, they will be everywhere, taken for granted, boring.

      That is just what we need. As long as the machine itself is a fetish item, it will repel as much as attract, engendering fear as much as affection. As long as the machine is a separate box needing elaborate maintenance and full attention, it will be hard to integrate effectively into teaching. As long as the machine is held up as an alternative to traditional learning, it will be seen as a challenge and an affront to proven ways of sharing knowledge. Not until we find ways to integrate electronic teaching into our established rhythms, strategies, and purposes will the very real potential of the new media begin to be realized.

      Perhaps the first step is to dispense with the idea that the new forms of learning will necessarily displace others. Each kind of interaction between student and teacher accomplishes something unique. It might be useful to think of each form of learning as located on a grid, with group and individual learning at the ends of one axis and active and passive learning at the poles of the other. Americans take it as a matter of faith that learning that is both individual and active is best, and that learning that is group and passive is worst. But even passive learning can be effective. The most passive and isolating mechanism of all, television, has taught millions of people many things, some of them useful. Despite the criticism so often heaped on live lectures, they too accomplish important and valuable purposes. The lecturer dramatizes, embodies the intellectual content and excitement of the material, acting out the appeal and importance of information that could otherwise be conveyed more efficiently in print. Generations of students at every college in the country eagerly compete to get into the best lectures, knowing that they are something more than television and more satisfying than many smaller discussion classes.

      If lectures are at one end of the group-versus-individual axis, reading is at the other. Reading is the most individualized, active, and reflective intellectual activity and as such is the measure for intellectual work in general. Reading can also be passive and boring, with the reader trapped in language, pacing, and organization that hold little appeal and convey little useful information. When critics decry computers' displacement of reading, they tend to judge the computer against the best that reading can be rather than the average. In fact, using a computer is more like reading than like attending a lecture. A person using digital information, like a reader, tends to be alone and actively engaged with the information before him or her. The major difference is that computers do not seem friendly to reflection. The computer, unlike a text, is built for action; it sits there humming, waiting, demanding that you punch some key or click some button. It is distracting, perpetually promising something more interesting than your own unfocused thoughts or the words currently before you on the screen.

      In its demand for interactivity, in fact, the computer bears greater resemblance to a discussion group than it does to reading. Although a discussion, like a lecture, benefits from the physical presence of other people, from body language, it does not necessarily depend on them. Some of the most successful uses of information technology for teaching have been group discussions based on typing into a computer. Students and teachers claim that such discussions bring in a higher proportion of participants than traditional classrooms, that shy students will speak up in ways they would not otherwise, that the discussion tends to be less focused on the professor. Anyone with even a slow modem and a monochrome screen can participate in sequential discussions of themes of common interest. Unlike the World Wide Web, this text-based technology is inexpensive in time and in machinery, both to produce and to consume. It is an incremental technology, partaking of the benefits of reading and writing as well as the benefits of interconnectivity. It involves active, group learning disguised as individual effort.

      Another incremental technology is the CD-ROM. Just a couple of years ago, CD-ROMs were being written off by the cognoscenti as the eight-track tapes of information technology. Unlike information on the Internet, CD-ROMs are physical commodities, bound in plastic, static. On the other hand, unlike information on the Internet, they are fast, fluid, and local. Given the current state of the Internet, CD-ROMs' positive qualities often outweigh their negative ones. Anyone who wants to present large images, create a unique and compelling visual environment, use sound intensively, or employ customized search tools is driven toward CD-ROM. Even those engaged in producing CD-ROMs recognize that they are a transitional technology, but the transition may take longer than anyone expected. Until the networks and the machines at the receiving end can transmit enormous files as easily as television currently does, there will be a place for CD-ROMs. They currently occupy the individual and active quadrants of the learning grid, but recent advances permit users to marry those benefits with those of the World Wide Web: connection, conversation, collaboration, and expandability. That marriage permits students to toggle between individual and group learning, between reflection and activity.

      That toggling may be the major advantage of the new media. They are protean, able to behave like a lecture or a book, able to foster individual or group activity. The new media should not be thought of as alternatives or rebukes to traditional learning but rather as ways to bridge some of the distances between those time-proven ways of teaching.

      The new media are simultaneously in their infancy and in their old age. No one has created a CD-ROM or Web site yet that can hold its own against a really good book or film. The World Wide Web, the most heavily discussed manifestation of the new technology, bears a family resemblance to the original Volkswagen Beetle. It runs, and it can even be spruced up so that it is fun to drive and look at, but it remains a Beetle, wheezing to get up hills, possessing little storage capacity, and threatening serious damage in a crash. Veterans of the computer revolutions of the last fifteen years cannot help but see the Web with the eyes of someone five years from now, simultaneously impressed with Java and embarrassed by being impressed, knowing that soon it will seem as primitive as Pong. To those of us who remember batch jobs, daisy wheels, and monochrome, it still seems a slight miracle that pictures, sound, and video can come over our phone lines. But even that novelty wears off.

      So how do we handle this new medium, so tempting and so ruthless, so postmodern in its simultaneous newness and obsolescence? There are some obvious truths: use standard image formats, remain flexible, and look around. But there are other problems and issues. Perhaps these new media necessitate a compensatory style of writing, one that bends to the problems of nonuniform page sizes, page breaks, and short attention spans, maybe by presenting itself in shorter pieces, maybe by taking on the nonlinearity of hypertext. Or maybe writing in the new media should emphasize its traditional strengths of coherence and continuity. Maybe the computer screen should not be considered a place for serious, sustained writing at all until it is more portable and wirelessly interconnected.

      We need to give users the information and the techniques they need to handle the complexity of large databases, but such information threatens to swell to the size of DOS user's guides. The basic metaphor for the current networks is "surfing," but deep projects require breaking the surface and diving instead of skimming across the top. We need to give people a place to gather what they have learned, a place to assemble their new knowledge into larger and more durable constructs than lists of bookmarks. We need to use machines of great efficiency to generate creative inefficiency. Historians, for example, provide information that is inevitably dirty, contradictory, incorrect, and incomplete in a medium that prides itself on quickness, capaciousness, and attractiveness. Historical evidence was not created for the computer, so it is often an awkward fit between tidy machinery and smeared newspaper type, blurred handwriting, torn photographs, and thousands of sources, none of which were designed to fit together by their original creators.

      To academics who have internalized the conventions of the various forms of scholarly discourse—the article, the review, the monograph, the lecture—the new media can be confusing and even threatening. We know a good book when we read one. Do the new media call for new standards? The intellectual standards seem to be the same: originality, thorough grounding in the field, clarity of expression. But the standards of presentation in the new media are certainly different, whether we want them to be or not. We cannot judge a Web site by its cover—or its heft, its publisher's imprint, or the blurbs it wears.

      Whatever a project's scale and level of complexity, new media should meet several standards to justify the extra effort they take to create, disseminate, and use. We might as well admit that they are not as good as established media for some purposes. They cannot present a linear argument or narrative nearly as well as a book; indeed, they are generally not good for presenting substantial bodies of text. And they cannot convey reactive, personal energy as a good lecture or discussion group can.

      However, the new media can do things that traditional media cannot, and that is what should be emphasized in their creation. New media should be challenging intellectually but not technologically; if you need a user's manual, they are too difficult to use. New media should do things one cannot do with print pages; hypertext links, personal annotation, and effective searching tools are a bare minimum. They should be flexible; if a new media project merely poses a few problems and a few solutions, students cannot be expected to find it very appealing for very long. New media should permit points of accomplishment along the way; a project should not take hours of investment before it pays a dividend. New media should offer opportunities for collaboration; one of the great strengths of network-based projects is that they are open-ended, able to benefit from joint effort and imagination. New media should be cumulative; users can enrich the project, leaving behind a new insight, discovery, or criticism on which others can build. If a new media project can provide these benefits, then the form in which it is currently transmitted will soon cease to be such an issue.

      The lesson of the several minor revolutions we have witnessed over the last two decades is this: The technology will rapidly evolve no matter what we do. We have to decide what purposes we want to accomplish with the current state of the art and plunge in, with the full knowledge that we are chasing something we can never catch. To compensate for that inevitable frustration, we can take pleasure and satisfaction from knowing that we are participating, in however minor a role, in some of the more interesting changes of our time.

   

 

 

 

   
   
