Tuesday, September 18, 2007

Ellen Rose, Author of

USER ERROR
Resisting Computer Culture

In conversation with Ansely Wong, July 2003



Ansely Wong: Who do you see as the audience for User Error?

Ellen Rose: I had a fairly diverse audience in mind as I was writing this book. Certainly, anyone who has ever used a computer or whose life has been affected by computerization-in other words, just about everyone-should be able to relate to the issues User Error raises. And I made it a point to write in a style that would be accessible to the general public, not just to academics or people involved in the information technology industry.


AW: How is User Error different from other books on computers and society?

ER: Most other books on computers and society have a very definite focus on the technology itself. More and more people are becoming computer users-and by this I mean not just users of PCs but also of ATMs, Palm Pilots, computerized home appliances, and so forth-but we continue to give very little thought to the role of the user, or to the way that role is constructed by computer software, user documentation, and so forth. Even books that address the social impacts of computerization are, when you come right down to it, focused on computers, not on the people who interact with them on a daily basis. With User Error, I sought to remedy this situation by offering a detailed exploration of the nature and social implications of computer use.


AW: You use the word "technolust" throughout the book. What does this word mean, and what effect does it have on modern society?

ER: The word is ironic in that it is part of the new vocabulary of computerization-words prefaced by "techno-", "cyber-", or just plain "e"-which is in itself an expression of our society's technolust. Technolust as I use it denotes the social and personal yearning to acquire technology and to be considered, by virtue of a knowledge and use of technology, "smart." It has as much to do with a fear of being left behind as it does with an overwhelming desire to partake of the power of technology. And lust is, I think, an appropriate term, given how little this desire is tempered by a consideration of the consequences of computer use. For example, it's largely technolust that keeps us buying upgrades of Microsoft Word, even though the improvements are dubious at best and the program is actually becoming absurdly complex, bug-ridden, and unwieldy.


AW: You mention that the major players in the industry, including IBM, Compaq, and Apple, use marketing techniques to persuade consumers that computers will "empower" us, and make us "smart." Why do you think this approach is so effective?

ER: There's a whole mythology surrounding computer use. It goes back to that period of history known as the Enlightenment. During the 1700s, people began to reject the ways of knowing that had sustained society until then-for example, religious belief, traditional legends and lore-and to regard science and technology as the only sources of true knowledge. Over the centuries the idea that technological developments are the key to human progress has become entrenched as a mythology, by which I mean that it is a truism we no longer question or even think about. When advertisers tell us that computer use will empower us and make us smart, they are simply tapping into that powerful mythology. They're telling us what we expect and, in a way, want to hear.


AW: The issue of how computers and technology affect society is increasingly becoming a crucial concern as technology becomes more advanced. What specifically propelled you to write this book, and why now?

ER: In many ways, I've been writing this book in my head for several years. I worked in the software development field for about fifteen years, and was therefore immersed in computer culture, but always in a role in which I served as the end user's representative or primary contact. Over the years, having a foot in both camps, I often heard programmers and tech support workers talk about the idiocy of users-in fact, my title comes from a joke a programmer shared with me-while at the same time I often heard users talk about how they felt the software downgraded their sense of being intelligent, competent human beings. I began to understand that these kinds of complaints were related: computer culture is fundamentally about developing and imposing on users solutions to problems caused by "stupid" people, not about inviting potential users to bring their knowledge and skills to the development table. And users have accepted this situation as just the way things are. Eventually, I found that I had to write about all this as a way of bringing it out into the open and, hopefully, encouraging users to take a more active role in directing the development and implementation of technology.


AW: What is "responsible action"?

ER: The easiest way to explain what responsible action is, is to explain first what it is not. It's not the distanced, critical contemplation of technology that technology critics like Neil Postman [cultural critic and professor of Media Ecology at NYU] and Jacques Ellul [social critic and theologian] advocate. I'm a great reader and admirer of their works, but what I've noticed over the years is how little effect their analyses, insightful as they may be, have had on the way our society receives technology. I think that's because technology has become so much a part of our day-to-day existence that it's virtually impossible to stop using computers and other digital devices long enough to think critically about what using them really means. So what we need to do instead is to act responsibly. Responsible action goes hand-in-hand with use. It's about using technology in ways that will make life better or improve the human condition. For example, e-mail is a convenient, easy way to communicate with friends and colleagues, but as a result of irresponsible use, many of us spend an hour or two every day answering e-mail messages, most of which are unnecessary. Sometimes a phone call or, better yet, a walk down the hall can avert the need for three or four e-mail messages. That's responsible action.


AW: So responsible action also addresses a concern for the way in which technology has de-socialized daily activities?

ER: Yes, technology has provided us with the "efficiencies" of automated and online banking, automatic grocery checkouts, voicemail, and e-mail, but at a major cost to human interaction and direct personal communication. True, responsible action may involve some personal sacrifices to help prevent the loss of our sense of self and society. For example, I personally refuse to use automatic teller machines because I don't endorse the replacement of people with machines to save myself a few minutes in my day. And what are most people going to do with the time they save, anyway? It's less likely that they'll spend it with friends and family than that they'll use it to get caught up on that e-mail glut!


AW: You frequently refer to popular culture, including blockbuster movies, television shows and comic strips. Why did you choose to include these?

ER: Cartoons, movies, and TV shows like Star Trek tell stories about technology that most people are familiar with and can relate to. They enliven a discussion of computer culture, but I think that we should also take them very seriously because they're the means by which we express feelings and attitudes about technology that we don't usually communicate in our normal, day-to-day interactions. That's probably why so many people have Dilbert and other cartoons tacked on the wall next to their computers. For example, recently I saw a cartoon taped to a computer user's desk that depicted a user in a phone conversation with a tech support person. The user tells the tech support person he's having trouble with his hard drive. "Did you back up?" asks the tech support person, to which the horrified user replies, "Why? Do you think it's going to blow?" That cartoon is about many of the issues I write about in User Error: the hierarchy of computer knowledge, the miscommunication between those who know and those who don't know, the kind of underlying fear that users have of these mysterious machines. As for movies, I think that films like The Matrix and The Terminator are our modern fairytales. They express otherwise unspoken fears about how technology is taking over, robbing us of power. The success of those kinds of movies suggests the extent to which we want to explore those fears-ironically, in a very high-tech format.


AW: In your book, the popular use of computers and the Internet as entertainment is contrasted to television watching. Unlike watching the "boob tube," society still generally considers any computer use to be intellectually empowering. How is this misconception affecting the increasingly younger generations of computer users?

ER: In my generation, the television was the "electric babysitter" that parents used to keep kids occupied, but there was often an element of guilt involved because TV was seen as offering mindless entertainment. The mythology surrounding the computer, and the marketing of it as an educational tool, means that parents can actually feel good about letting their kids spend hours in front of the computer. Of course, most kids are likely playing games, but we've managed to convince ourselves that even that has benefits, like better hand-eye coordination. And so we now have phenomena like computer camps. Where kids once spent the summer swimming, hiking, camping, and playing games, they now spend it glued to the screen. We've simply swapped one screen for another, and it's difficult to ignore some of the consequences, like the rise in childhood obesity. You have to wonder, too, what the consequences will be down the road, when these kids grow up and look for jobs. After spending all that time glued to the screen, will they have the kinds of interpersonal skills that many employers are looking for?


AW: How do you think this younger generation of computer users, who are generally more comfortable with technology, will adopt "responsible action"?

ER: Young computer users do tend to be very comfortable with technology, to the point that they don't even think about it. Marshall McLuhan once wrote that "one thing about which a fish knows nothing is water," which I think expresses the situation perfectly. For most children, the technological environment they've grown up in is as invisible and unconsidered as water is to a fish. To act responsibly with respect to technology means being able to see yourself as separate from the technology, but it's becoming less and less likely that today's children will be able to achieve this separation as adults. Isn't that the fundamental fear that films about cyborgs express? This is why it's so important that parents and teachers model responsible action for kids now.


AW: What is the danger of accepting our fate with "dummy proof" interfaces and ultimate user-friendliness?

ER: Most of us realize, if only intuitively, that user-friendliness condescends to the user. That's why so many people hate Clippy, the office assistant who pops up occasionally when you're using Microsoft Office. What I show in the book is that user-friendly systems are based on the premise that we are "dummies," incapable of having any kind of meaningful input into determining where technology is going. The danger of accepting this social construction of the user as an idiot is that it reinforces the hierarchy of computer knowledge. It perpetuates a social order in which, as users, we're removed from the planning stages and simply compelled to use software whose underlying rules and logic are a mystery to us. We begin to accept the notion that technology is a speeding bandwagon over which we have no control-all we can do is jump on and let it take us where it will. Of course, there's no bandwagon, just our own willingness to let others have the power to determine where technology is going and whose interests it will serve. And as I discuss in the last chapter of the book, there are lots of techno-elites out there who are vying for the power to "invent the future." But what we have to understand is that, despite all the glitz and hype surrounding research into transhumanism and artificial intelligence and ubiquitous computing, all these possible futures are based on the same premise as user-friendliness, the idea of user incompetence, and so they simply perpetuate our powerlessness. Responsible action is about taking back some of that power.


AW: You reveal in your book that many computer users view "improved" and ever-proliferating technology with a sense of both omnipotence and inevitability. Do you think technology is unstoppable?

ER: Absolutely not. As I said, there is no bandwagon, just the cumulative consequences of human choices and actions. It's people who control technology, not the other way around. That's why the possibility for change lies with the individual computer user's decision to act responsibly-in other words, to make wise decisions instead of relinquishing power to a supposedly runaway technology.
