Message 38:
Date: 8.1.96
From: <nicholas@media.mit.edu>
To: <ls@wired.com>
Subject: Where am I? Are the lights on?
Who's looking at me?
Am I indoors or outdoors?
What is all that noise, or is there any?

Try the following: Close your eyes and plug your ears. Imagine you are your own personal computer. Try it. You can't see you, you can't hear you; you just get poked at now and again.
Not great, is it? No matter how helpful you want to be to you, it's tough going. When deprived of knowing what's happening around you, all the intelligence in the world won't make you into a faithful servant. It would be frustrating for you to be your own personal computer. The backchannels are far too limiting.
Two decades ago, computers were totally sensory deprived, even in the direction of computer to user. Today, the flow of information from computers to people offers a richer experience, with color, motion, and sound. However, the opposite path from people to computers enjoys no such amelioration. Computer inputting today is almost as tedious as it was 20 years ago. In this sense, the interface viewed as a whole has become wildly asymmetrical: lots out, little in. Your personal computer doesn't have the recognition capability of a parrot.
Making computers listen
The primary channel of communication between computers and users during the next millennium will be speech: people talking to computers and computers talking back. Yet statements I made in 1975 to the effect that speech would be the dominant interface in 20 years haven't come true. What went wrong?
Simple: we became lazy; corporate users did not complain. We also underestimated the speed at which computers would become popular with consumers. Remember when Ken Olsen, founder and at the time CEO of Digital, said that he saw no earthly reason to have a computer at home? That was in 1977.
Given that attitude, many computer corporations sat on their digital butts, enjoying a marketplace of corporate purchasing agents: people who bought computers for others to accomplish the tasks outlined in their job descriptions. Under those conditions, users were expected to suffer the indignity of using a computer and to dutifully work around, among other things, its hearing impediment.
Now, suddenly, consumers like you and me are buying more than 50 percent of all PCs to use in our homes, to assist our children, and to entertain. Under these new conditions, a deaf (and dumb) computer is not acceptable.
Change will occur only when manufacturers start taking the word personal in personal computers seriously. By this I mean building speaker-dependent voice recognition (which is so much easier than speaker-independent recognition). Also, manufacturers must focus on highly interactive speech, not transcription, which even humans cannot do properly.
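To make the distinction concrete, here is a minimal sketch (in Python, with illustrative names and toy data) of what speaker-dependent recognition amounts to: the machine is trained on its owner's voice, storing a feature template for each command, and matches new utterances against those templates by dynamic time warping, much as early recognizers did. Real systems would extract the features from audio; that step is assumed away here.

    import numpy as np

    def dtw_distance(a, b):
        # Dynamic-time-warping distance between two feature sequences,
        # each an array of shape [frames, features]. DTW tolerates the
        # speaker talking faster or slower than during enrollment.
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],
                                     cost[i, j - 1],
                                     cost[i - 1, j - 1])
        return cost[n, m]

    class SpeakerDependentRecognizer:
        def __init__(self):
            self.templates = []  # (command, feature sequence) pairs

        def enroll(self, command, features):
            # The owner says each command a few times; store every take.
            self.templates.append((command, features))

        def recognize(self, features):
            # Return the enrolled command whose template warps closest.
            return min(self.templates,
                       key=lambda t: dtw_distance(t[1], features))[0]

    # Toy usage: enroll two commands, then recognize a noisy repetition.
    rng = np.random.default_rng(0)
    rec = SpeakerDependentRecognizer()
    lights_on = rng.normal(0.0, 1.0, (30, 12))
    rec.enroll("lights on", lights_on)
    rec.enroll("lights off", rng.normal(5.0, 1.0, (25, 12)))
    print(rec.recognize(lights_on + rng.normal(0.0, 0.1, lights_on.shape)))

The point of the example is the enrollment step: by restricting itself to one voice, the machine trades generality for reliability, which is exactly what the word personal licenses it to do.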
For those readers who think life will become terribly cacophonous in the presence of machines that talk and listen, let me say that we seem to work just fine with telephone handsets in our homes and offices. And for those of you who feel it is plumb silly to talk to an appliance, recall for a moment how you felt about answering machines not too long ago. No, speech really is the right channel, and it is time, once and for all, to move with resolve.
Pin the tail on the donkey
Imagine computer eyes all over the place: not just stereo, but holovision. This new vision system will have cameras anywhere and everywhere, not just on the computer's front or sides. Computers can leave their eyes festooned everyplace.
In earlier Wired issues I have commented on the coincidence that PC-based teleconferencing systems employ a camera above the display and a speaker below it, resulting in a unit that could serve equally well as a computer's eye and ear. Such a configuration can look at you and know you are there. It can know if you are smiling. That kind of "seeing the user" is important, because today's computers cannot even detect your presence, something a common toilet can do.
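As a hint of how little machinery "seeing the user" actually requires, here is a rough sketch of presence detection by frame differencing on that camera above the display. The OpenCV calls are real; the two threshold constants are illustrative guesses, not tuned values.

    import cv2

    PIXEL_DELTA = 25           # per-pixel change that counts as motion
    PRESENCE_FRACTION = 0.02   # fraction of moving pixels meaning "present"

    cap = cv2.VideoCapture(0)  # the camera above the display
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        delta = cv2.absdiff(prev, gray)  # what changed since last frame?
        _, moving = cv2.threshold(delta, PIXEL_DELTA, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(moving) / moving.size > PRESENCE_FRACTION:
            print("user present")        # the computer now knows you are there
        prev = gray
    cap.release()

A smile is harder, of course; but presence, the thing the toilet manages, is nearly free.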
But let's go a step further. It is not just a matter of removing the computers' blindfolds, but of giving them a new kind of vision: the ability to look into each room in your house, the oven, the closets, the garden, even the traffic on your street. Furthermore, these eyes need not be like ours. They should be able to see infrared and ultraviolet, reaching beyond human range the way bats and radar do with sound and radio. The value of looking at nonvisible light includes such examples as night vision and recognizing a smile by the minuscule change in heat that occurs at the corners of our mouths.
The user-near field
When people relate to one another, they are not simply in one of two states: far away or touching. There is an important near field in human communication. Perhaps a nod occurs before a handshake, or a smile before a kiss. We enjoy gray tones in our proximity to one another.
Computers have none of this. Either you are there (touching them) or you are not. Recall the deaf-mute PC exercise at the beginning of my rant. Now imagine that each communication with a user is like someone sneaking up behind you and yelling "Boo!" At this point you may be rolling your eyes (versus closing them), saying, "Nicholas, you've finally lost it." But think for a moment. How would you function if every interaction were limited to being there or not being there, no forewarning, no near field?
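Here is a sketch of what a near field could look like in code: instead of two states, a proximity reading maps to three zones, and the transition into the middle zone is the machine's nod before the handshake. The distance readings are simulated stand-ins for whatever sensor the machine has, and the zone boundaries are arbitrary.

    from enum import Enum

    class Zone(Enum):
        FAR = "far"            # the machine leaves you alone
        NEAR = "near"          # forewarning: wake gently, signal awareness
        TOUCHING = "touching"  # full interaction

    def zone_for(distance_cm):
        if distance_cm > 150:
            return Zone.FAR
        if distance_cm > 30:
            return Zone.NEAR
        return Zone.TOUCHING

    def on_zone_change(new):
        # The near field is what makes the transition graceful: the
        # machine signals awareness before demanding attention.
        if new is Zone.NEAR:
            print("(screen brightens, a quiet chime: the nod before the handshake)")
        elif new is Zone.TOUCHING:
            print("Hello. Ready when you are.")
        else:
            print("(interface fades back to idle)")

    # Simulated approach: someone walks up to the machine.
    current = Zone.FAR
    for reading_cm in [400, 200, 120, 80, 25, 10]:
        zone = zone_for(reading_cm)
        if zone is not current:
            on_zone_change(zone)
            current = zone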
Backchannels are crucial. It is not simply a matter of making the interface more symmetrical, with as much in as out. It is a matter of including very tightly coupled signals of understanding and appreciation. Without these, talking to a computer will remain as fulfilling as talking to a lamppost.
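By way of illustration only, with a simulated utterance stream, a backchannel is simply an acknowledgment emitted while the user is still talking, at each pause, rather than silence until the end:

    import time

    def listen_with_backchannel(chunks):
        # Consume an utterance chunk by chunk, acknowledging each pause
        # so the speaker knows, continuously, that they are understood.
        heard = []
        for chunk in chunks:
            heard.append(chunk)
            print("user:    ", chunk)
            print("computer: mm-hm")  # the tightly coupled signal
            time.sleep(0.1)           # stand-in for real pause timing
        return " ".join(heard)

    utterance = ["turn the lights", "in the living room", "down a little"]
    print("computer heard:", listen_with_backchannel(utterance))

Trivial as the mm-hm is, it is the difference between talking to a listener and talking to that lamppost.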
Next Issue: The Future of Telephone Companies
Copyright © 1996 Wired Ventures Ltd.
Compilation copyright © 1996 HotWired Ventures LLC. All rights reserved.