News and Tribune


October 31, 2013

STAWAR: I’m not bad, just designed that way

— Earlier this month, when the federal health care exchange went online and was plagued by technical glitches due to what Time magazine called “lousy design,” I wasn’t surprised. In fact, after surviving three major software installations on the job in as many years, I’m amazed when these things work at all. One program at work has a screen that contains a special button. If you click it, critical information that should never be deleted is irretrievably erased. The button has no legitimate use, and there is no warning. Some staff call it the “suicide button.” It appears that its only reason to exist is to make trouble. I’ve wondered if it’s a design error or a vestigial remnant that served some important purpose in a past life — like when it was the software for a Pac-Man game.

A few years ago we replaced our home stove. I always get the simplest model possible. I figure there is less to go wrong. Like our old stove, the new one has a drawer underneath the oven, where you can store pans, broilers and cookie sheets. If, however, you pull the drawer out just a little too far, it falls off the plastic track and won’t close.

Additionally, the inside edge is as sharp as a razor, so you risk a major laceration every time you have to wrestle it back into position. Once, after cutting myself trying to fix it, I complained to the manufacturer. The company offered to send a repairman to “file down” the sharp edge. It sounded suspiciously like this wasn’t the first time they’d heard the complaint.

It is said that up to 90 percent of accidents are due to human error. Toronto psychologist Marc Green says, “In many cases, the real source of the error is the design rather than the human — someone created a product, facility or situation where safety depends on unrealistic or unattainable standards of behavior.” According to Green, designers often rely upon us users to compensate for poor design. If the stove manufacturer expected me to make up for the poorly designed drawer, they should have provided leather gloves, or perhaps a tetanus shot, as standard accessories. According to Green, “We are surrounded by so much poor design that most people simply take it for granted and then blame themselves for stupidity when they make an error.”

Recently my wife Diane and I were in the checkout line at a grocery store when we heard the cashier apologize to the man in front of us because his receipt came out wrong. It said that he had paid cash when he had actually paid with a check. Rather than blaming herself, the cashier blamed the design of the cash register: the cash payment key was right next to the one for checks, so it was very easy to push the wrong one.

Donald Norman, former chairman of the Department of Cognitive Science at the University of California, San Diego, is the author of “The Design of Everyday Things.” Norman says, “Well-designed objects are easy to interpret and understand. They contain visible clues to their operation.” He contends that “poor design predominates,” resulting in objects that cannot be understood, devices that lead to error, and a world filled with frustration. Norman says there are a lot of people today who can’t figure out how to use their microwaves, cameras, or washing machines, and many (like me) who “habitually turn on the wrong stove burner.”

Poorly designed objects are not only inconvenient; they can also be expensive and even dangerous. Receiving a flawed grocery receipt, ruining a pie due to poorly designed oven controls, or getting off on the wrong floor because the “G” on the elevator button stood for “Garage” instead of “Ground floor” may be frustrating, but those are minor inconveniences compared to, say, a poorly designed control for a jumbo jet’s landing gear.

Seth Porges, a New York-based technology journalist, has described a number of recent tech design flaws: unresponsive phone touchscreens, dangerously sharp laptop cases, hypersensitive page buttons on electronic readers, and slippery video game controllers that, when sweaty, are liable to fly out of your hands and decapitate a family member or crash into your widescreen television.

Human factors is the study of how devices are designed to interact with people. It draws knowledge and techniques from psychology, engineering, and design, as well as many other disciplines. Human factors researchers have identified a number of important general design principles. For example, one basic rule is that people soon stop reading labels on implements they use frequently, so a design should never depend on labels alone to guide behavior or prevent errors.

Another is the concept of “mode errors.” Many modern devices, such as remote controls and digital clocks, operate in multiple modes, and the same controls function differently depending on the mode. Modes save space and money, but they increase the probability of errors, because in addition to deciphering the control, the user must maintain constant awareness of which mode is active. Poor keyboarders like me experience mode errors when we finally look up at what we are typing and discover, to our dismay, that we have been typing for some time in “caps lock” mode.

A related concept is “creeping featurism.” Thanks to electronic advances, it’s easy for manufacturers to pile additional features onto their devices. Although this leads to more mode errors, it is tempting because features are cheap to add and people make buying decisions based on them.

In an article in “Quality and Safety in Health Care,” J.R. Grout of Georgia’s Berry College discusses reducing errors in medical settings through design, an approach he calls “mistake proofing.” Errors in medical settings are common and can have especially dire consequences. Recent studies of medication administration error rates, for example, are sobering. One study found an error rate of almost 25 percent at a large hospital, and an analysis of more than 90 studies yielded a median medication error rate of 19.6 percent. Other research has shown that error rates are even higher at night, on weekends, after interruptions, and for intravenous administrations.

According to Grout, “mistake proofing” should aim primarily at preventing errors that result in injury. He identifies four approaches: 1. Designing the process so that errors simply cannot occur, usually by automating or oversimplifying a task (idiot-proofing). 2. Using a design with a built-in mechanism that allows mistakes to be discovered and corrected immediately. Grout describes the use of radio-opaque sponges during surgery; such sponges can be readily detected inside the patient while they can still be easily retrieved. 3. Designing the process so that if it fails, the outcome is less detrimental. Automobile airbags are an example: the error (a crash) may still occur, but the consequences are somewhat mitigated. 4. Designing a work environment that encourages error prevention. Simplicity, cleanliness, and a lack of ambiguity characterize an environment that minimizes the chance for errors. Grout says, “… small design changes can have a profound impact on human errors. Thoughtfully changing the physical details of health care process design can be very effective in preventing errors or harm.”

Donald Norman concludes that “proper design can make a difference in our quality of life.” He encourages designers, as well as the public, to join in the battle for usability. He urges boycotting unusable designs and complaining to manufacturers and to retailers who carry shoddy products. Finally, Norman says we can support proper design by purchasing well-designed products, even if they cost more.

So the next time I bring home some expensive gadget, I hope Diane realizes that I’m only doing my civic duty.

— Terry L. Stawar, Ed.D., lives in Georgetown and is the CEO of LifeSpring, the local community mental health center in Jeffersonville. He can be reached at Check out his Welcome to Planet-Terry blog and podcast at
