Loudspeaker
Early-twentieth-century writer G. K. Chesterton once said: “The moment in history when we had nothing important left to say was marked by the invention of the loudspeaker.” The device makes it possible to listen to a Wagner opera (or any other “unimportant” stuff) while taking a bath, riding the subway or hiking in the forest. We can hear the electronically preserved voices of people long dead, as well as a universe of sounds unlike anything in nature. In a movie theater, loudspeakers surround us with sound and transport us into illusion. From Hitler to Hendrix, the century’s charismatic figures have reached the public through speakers.
Since the loudspeaker came on the scene around 1915, there’s been a constant quest to perfect the illusion. Now that audio recording and storage technologies are so good, loudspeakers are “easily the weakest link in the home audio system,” says William R. Short, Bose fellow at Bose Corp. in Framingham, Mass., and co-inventor of Bose’s Acoustic Wave system. “No way am I going to sit in my living room and imagine that I’m actually in Symphony Hall. It just doesn’t happen, and we really don’t know why yet.”
Contribution to the lexicon: “Pay no attention to the man behind the curtain!”
Touch-Tone Telephone
AT&T introduced “Touch-Tone” push-button phone dialing service in November 1963. By all accounts, practically everyone who tried it liked it better than rotary dialing. Bell Labs researchers went to great lengths to make sure people would accept the new interface: They tested 16 different arrangements of buttons, including crosses and circular patterns. They also considered sizes, shapes and spacing of buttons, springiness when pushed, and even the contour of the surface under the fingertips.
Cutting phone dialing time in half is nice, but from the beginning the intention was to transform the telephone into a remote data entry device, a capability that expanded with the introduction of the “*” and “#” keys in 1968. Though some of the services originally envisioned, such as using a telephone to turn on home appliances, have yet to materialize, the Touch-Tone phone has made possible phone trees, voice mail and a host of other services. Of course, sometimes the best way to get service is still to pretend you have a rotary phone and just stay on the line.
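What the buttons actually do is acoustic: each key sounds two sine tones at once, one from a low-frequency “row” group and one from a high-frequency “column” group, and the switching equipment decodes the pair to recover the key. Here is a minimal sketch of that dual-tone multi-frequency (DTMF) signaling in Python; the frequency table is the standard one, while the sample rate, amplitude and duration are arbitrary illustrative choices:

```python
import math

# Standard DTMF frequency pairs: (row Hz, column Hz) for each key.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(key, duration=0.2, rate=8000):
    """Return audio samples for one key: the sum of its two sine tones."""
    low, high = DTMF[key]
    return [
        0.5 * math.sin(2 * math.pi * low * t / rate)
        + 0.5 * math.sin(2 * math.pi * high * t / rate)
        for t in range(int(duration * rate))
    ]

# Dialing a number is just a sequence of tone bursts (pauses omitted here).
tones = [dtmf_samples(k) for k in "5550123"]
```

Two simultaneous tones were chosen because no single voice or noise on the line is likely to produce both at once, which keeps dialing reliable over an ordinary voice circuit.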
Contribution to the lexicon: “Press 1 for…”
Steering Wheel
The first cars had tillers. Tillers worked, but they transmitted the vibration from primitive roads to the driver’s hand, making it hard to steer. When engines moved to the front of the car, the increased weight made tillers impractical. The steering wheel puts a gear system between you and the car’s wheels, offering a mechanical advantage and isolating you from vibrations. Despite this extra layer of insulation, a good steering wheel manages to give the driver a feeling of intimate contact with the road.
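The gear system’s mechanical advantage is just the steering ratio: the steering wheel turns through many degrees to swing the road wheels through a few, multiplying the driver’s effort accordingly. A toy calculation in Python, assuming a ratio of about 15:1 (a typical figure, though the exact number varies from car to car):

```python
STEERING_RATIO = 15.0  # assumed: ~15 degrees at the wheel per degree at the road wheels

def road_wheel_angle(steering_wheel_angle_deg):
    """Degrees the road wheels turn for a given steering-wheel input."""
    return steering_wheel_angle_deg / STEERING_RATIO

# A quarter turn of the steering wheel (90 degrees) steers the tires ~6 degrees,
# and, friction aside, the driver's torque is multiplied ~15x in the other direction.
print(road_wheel_angle(90))  # -> 6.0
```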
One unforeseen problem with the wheel was that, as cars got speedier, people started getting impaled on steering columns in crashes. In the 1950s, concept cars were developed that did not have steering wheels, but the public wasn’t interested. A car without a steering wheel just isn’t a car.
Contribution to the lexicon: “Take the wheel”
Magnetic-Stripe Card
Machines let us through doors, dole out money, and extend credit. To do these jobs, they read an identity code embedded in a magnetic stripe on a plastic card. Indeed, when you lose your wallet, the biggest concern isn’t the cash; it’s the cards that might enable someone else to abuse your privileges.
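The identity code itself follows a rigid layout: on bank cards, the track a reader typically uses strings the account number, expiration date and a few other fields between fixed sentinel characters. A rough parser for that track-2-style layout (per the ISO/IEC 7813 convention) in Python; the sample data is made up, and the trailing error-check character is omitted for simplicity:

```python
def parse_track2(track):
    """Parse an ISO/IEC 7813 track-2-style string: ;PAN=YYMM<svc><data>?"""
    if not (track.startswith(";") and track.endswith("?")):
        raise ValueError("missing start/end sentinel")
    body = track[1:-1]
    pan, _, rest = body.partition("=")   # '=' separates account number from the rest
    return {
        "account_number": pan,           # up to 19 digits
        "expiration": rest[:4],          # YYMM
        "service_code": rest[4:7],       # 3 digits
        "discretionary": rest[7:],       # issuer-defined data
    }

# Illustrative, made-up card data:
print(parse_track2(";4000001234567899=27011011234567?"))
```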
Part of the reason we’re scared is that we’re so good at abusing our own privileges. In the early 1970s, magnetic stripes on credit cards streamlined the authorization of purchases, making the cards more attractive to retailers; combined with interest charges and new kinds of payment plans, the magnetic stripe helped the credit-card industry gorge America on credit.
Will new incarnations of plastic data in the form of “smart cards” go even further and make mag-stripe cards disappear? Not so fast, says David Warwick, author of Ending Cash. “Chip cards are going to find niches in certain applications,” he says, “but I don’t see them replacing credit cards. No one wants to invest in new terminals.”
Contribution to the lexicon: “Swipe your card”
Traffic Light
When African-American businessman Garrett Morgan patented a traffic signal in 1923, trains had been using automated lighted signals for some time. But trains run on set schedules, in single file, and stopping one is no small task; therefore, the default message from a train signal is “proceed.” Traffic lights for automobiles have a quite different task, and more often than we’d like, it’s to tell us to stop.
We hate being told to stop. Road-rage expert Leon James, professor of psychology at the University of Hawaii, says we link self-esteem to the gas and brake pedals. “If you see a light turning yellow, you have to accelerate. If you have to stop because the light turned red, you feel crestfallen.” James calls the intersection a “psychodynamic zone.” If so, it’s a zone increasingly under the dominion of the superego rather than the id. Some new traffic lights can take pictures of the license plates of cars that run red lights; the offender later receives a ticket in the mail, or a printed driving lesson. Others show drivers their current speed. At a traffic light, says James, “a lot can be done between the city transportation department and the driver. It’s a communication hotspot.”
Contribution to the lexicon: “Give it the green light”
Remote Control
The universal desire to avoid television commercials was the driving force behind the development of the remote control; the president of Zenith, of all people, hated the interruptions. The first remote, developed by Zenith in 1950, ran a cable from the viewer to the set. The first wireless remote, introduced by Zenith in 1955, used light sensors; later models used ultrasonics. Infrared remotes, which came along in the early 1980s, were so cheap that everyone could afford them. Today the remote is standard equipment; 99 percent of television sets and 100 percent of VCRs come equipped for action at a distance. Especially for children who grew up with remote controls, surfing from channel to channel is part of the television viewing experience. Remote controls have been blamed for making us couch potatoes, but that’s an unfair rap; it’s not as if people without remote controls used to get up and change the channel frequently.
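An infrared remote sends each button as a numbered code by blinking an invisible LED in precisely timed pulses. Here is a simplified sketch of pulse-distance encoding in the style of the widely used NEC protocol, in Python; the timings are the protocol’s nominal values, but the address and command numbers are made up for illustration:

```python
# Pulse-distance encoding, NEC-style: every bit starts with a short IR burst;
# the length of the silence after it distinguishes a 0 from a 1.
BURST = 562          # microseconds of 38 kHz carrier
SPACE_0 = 562        # short gap -> bit 0
SPACE_1 = 1687       # long gap  -> bit 1

def encode_nec(address, command):
    """Return (on, off) microsecond pairs for an NEC-style frame."""
    frame = [(9000, 4500)]   # leader: 9 ms burst, 4.5 ms space
    # 32 bits, LSB first: address, ~address, command, ~command.
    bits = (address | (~address & 0xFF) << 8
            | command << 16 | (~command & 0xFF) << 24)
    for i in range(32):
        bit = (bits >> i) & 1
        frame.append((BURST, SPACE_1 if bit else SPACE_0))
    frame.append((BURST, 0))  # final burst terminates the frame
    return frame

# e.g. device address 0x04, "volume up" as command 0x10 (made-up values):
pulses = encode_nec(0x04, 0x10)
```

Sending each byte alongside its bitwise inverse lets the receiver reject garbled frames, which matters when the “wire” is a flicker of light across a living room.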
Contribution to the lexicon: “Channel surfing”
Cathode-Ray Tube
The cathode-ray tube (CRT) made its debut in 1897 in an oscilloscope, developed by German physicist Karl Ferdinand Braun. The “killer app” for the CRT, of course, was television, which appeared in the 1920s but didn’t enter most American homes until the 1950s. Now it’s everywhere. “TV is the main experience of waking life for most people in western industrial nations,” claims Jerry Mander, author of Four Arguments for the Elimination of Television. That may be overstating the case, but not by much; the average American watches several hours of television a day.
Put a computer interface on the screen, however, and we’re not quite so passive: We interact with it, turning the screen display into a means to an end rather than the end itself. But the emergence of terms like “Internet addiction” illustrates that many of us would often rather sit at a CRT than do anything else.
Contribution to the lexicon: “The tube”
Liquid Crystal Display
Television and computer screens convey massive amounts of visual information. The downside? They have traditionally come in a big, heavy box, because they generally require a cathode-ray tube. Liquid crystal displays (LCDs) make graphical displays portable. Although liquid crystals were discovered in 1888 by Austrian botanist Friedrich Reinitzer, they weren’t used for displays until 1971, when Hoffmann-La Roche patented the “twisted nematic” LCD, the kind now found in calculators and watches. The active-matrix LCD, in which every pixel is controlled by its own transistor, arrived in the 1980s, making possible laptop computers, miniature TVs and portable DVD players.
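The point of putting a transistor at every pixel is refresh: the driver circuitry selects one row of the grid at a time, the column lines write values into that row in parallel, and each pixel’s transistor lets it hold its value until the next pass. A schematic sketch of that scan in Python; the tiny grid and the simple “latch” model are simplifications for illustration:

```python
ROWS, COLS = 4, 6   # a toy panel; real panels have hundreds or thousands of lines

# Each pixel's transistor lets it hold ("latch") its value between refreshes,
# so the panel only ever drives one row's worth of pixels at a time.
panel = [[0.0] * COLS for _ in range(ROWS)]

def refresh(frame):
    """Scan the panel one row at a time, as an active-matrix driver does."""
    for row in range(ROWS):
        # select this row; all column drivers write their values in parallel
        for col in range(COLS):
            panel[row][col] = frame[row][col]  # pixel latches until next pass

checkerboard = [[(r + c) % 2 for c in range(COLS)] for r in range(ROWS)]
refresh(checkerboard)
```

A passive matrix has no per-pixel latch, so each pixel is driven only for its brief instant in the scan; the held charge is what gives active-matrix panels their steadier, sharper image.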
Although LCDs still have “issues of video speed and viewing-angle dependence” to be worked out, Webster E. Howard, vice president for technology at FED Corp. of Hopewell Junction, N.Y., predicts that thin, flat, liquid crystal displays will replace the bulky CRT monitors on our desks within five years. If they come into widespread use, he says, LCDs will owe their success to laptop computers: “The need for the portable computer was what made possible the investment in this technology that ultimately led it to be economical and cheap.”
Contribution to the lexicon: “Laptop”
Mouse/Graphical User Interface
“When I started with the mouse, very few were taking seriously that people would want to work online at a computer display,” says Douglas Engelbart, who invented the mouse and graphical user interface (GUI) in the 1960s. His mouse/GUI combination, further developed at Xerox Palo Alto Research Center (PARC) in the 1970s and popularized by Apple in the 1980s, made a computer’s contents visible. Before that, to edit a computer file, you had to remember its name and location. The reduced demand on short-term memory, combined with a visual-spatial environment that users enjoyed, converted the computer display into a workspace. In his book Interface Culture, Steven Johnson says Engelbart’s invention “probably had more to do with popularizing the digital revolution than any other software advance.”
Making everything visual rather than linguistic, however, means that semantically complex commands get left in the dust. With a command-line operating system (remember DOS?), a task such as making a copy of every file ending in “.txt” took a few keystrokes. A GUI offers no shortcut. Engelbart, whose original interface put a mouse in one of the user’s hands and a special “chording” keyset in the other, thinks today’s GUI is awfully primitive: “Here’s the language they’re proposing: You point to something and grunt.” Our cave-dwelling ancestors would have understood.
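In DOS the bulk copy really was a few keystrokes, something like COPY *.TXT A:, with the wildcard doing the semantic work. The same idea expressed in Python (the “backup” destination folder is an arbitrary example):

```python
import glob
import os
import shutil

os.makedirs("backup", exist_ok=True)   # stand-in destination folder
for path in glob.glob("*.txt"):        # one pattern selects every matching file
    shutil.copy(path, "backup/")
```

Pointing and clicking, by contrast, means selecting each matching file by eye; “every file ending in .txt” has no single gesture.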
Contribution to the lexicon: “Point-and-click”
Barcode Scanner
In February 1992, George Bush was given a demo of a supermarket barcode scanner made by NCR. His response? “That’s amazing!” Contrary to news accounts of the incident, however, he wasn’t wowed by the mere existence of scanner technology, which has been around since 1974. He was marveling at a new, improved version that was able to read a barcode torn into seven pieces.
Scanners have come a long way since the first 10-pack of Juicy Fruit gum was scanned at Marsh Supermarket in Troy, Ohio. The initial draw for companies was accuracy of data entry: Barcode readers made a lot fewer errors than cashiers. But lurking in the laser’s capability was the potential to collect vast amounts of information, on what products are selling, and when, and in what combinations. Says Craig Maddox, product line director for barcode scanners at NCR, “It was a good 15 years before the grocery industry started to use the data.” Nowadays, retailers compile terabyte-sized databanks of every transaction in their stores and sell the data back to vendors; barcodes have also sped up communication across the whole supply chain so much, remarks Maddox, that “some stores…don’t pay for the product until it’s already sold.”
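Part of that accuracy is built into the code itself: the twelfth digit of a UPC-A barcode is a check digit computed from the first eleven, so a misread almost always announces itself rather than ringing up the wrong product. The standard UPC-A checksum in Python (the example number is an arbitrary valid one):

```python
def upc_check_digit(first_eleven):
    """Compute the UPC-A check digit for an 11-digit code string."""
    digits = [int(c) for c in first_eleven]
    odd = sum(digits[0::2])    # positions 1, 3, 5, ... (1-indexed)
    even = sum(digits[1::2])   # positions 2, 4, 6, ...
    return (10 - (odd * 3 + even) % 10) % 10

def is_valid_upc(code):
    """True if a 12-digit UPC-A string passes its own checksum."""
    return len(code) == 12 and upc_check_digit(code[:11]) == int(code[-1])

# The scanner flags a misread instead of charging you for the wrong item.
print(is_valid_upc("036000291452"))  # -> True
print(is_valid_upc("036000291453"))  # -> False (check digit doesn't match)
```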
Contribution to the lexicon: “Scan it”