When Design Bombs
Imagine waking up and going about your morning routine when all of the devices in your home go off with this warning.
In case you missed it, this is what residents of Hawaii saw pop up on their smartphones two days ago. If the alert had been accurate, then according to preparedness materials recently distributed to the public by the Department of Defense, residents of Hawaii would have had 15 minutes at most from the time US intelligence picked up the ballistic missile and issued the alert until a nuclear weapon detonated on the islands.
Fifteen minutes at most, which makes it completely understandable that people went into panic mode.
They found out 38 minutes later that the alert was a mistake.
How did this mistake happen? Someone clicked the wrong button during a drill. It was human error. User error. I know, you’re thinking, how dare he (or she) make that kind of horrible mistake! That’s an understandable reaction, except that human error is almost never the fault of the person; it’s the fault of technology that was not engineered with human behavior in mind.
If you’re someone who is still unsure what user experience designers do, here it is: they prevent things like this from happening. Forget whether a system is pretty, modern, sleek, clean, blah blah blah. Does it work for the humans who use it and for the tasks they need to complete? That is design.
Several news outlets reported that the options for sending a ballistic missile test alert and an actual ballistic missile alert sat adjacent to each other in a single drop-down menu. That was an easy flaw to spot. But where one major usability breakdown exists, others lurk. In the case of Hawaii’s Emergency Management Agency, the other shoe to drop was that there was no way to CANCEL the erroneous missile alert, as reported yesterday by the Washington Post.
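The two breakdowns described above, dangerous options sitting next to routine ones and no way to retract a mistake, can be sketched in code. This is a hypothetical illustration only, not the actual emergency-management software; the `AlertSystem` class and its methods are invented for the example.

```python
class AlertSystem:
    """Hypothetical alert dispatcher with two safeguards the real system lacked."""

    def __init__(self):
        self.log = []

    def send_alert(self, message, live=False, confirmed=False):
        # Safeguard 1: a live alert requires an explicit, separate
        # confirmation step, instead of sitting one menu entry away
        # from the drill option in the same drop-down.
        if live and not confirmed:
            raise PermissionError("Live alert requires explicit confirmation.")
        kind = "LIVE" if live else "DRILL"
        self.log.append((kind, message))
        return kind

    def cancel_last_alert(self):
        # Safeguard 2: an erroneous alert can be retracted with a
        # follow-up correction message.
        if not self.log:
            raise LookupError("No alert to cancel.")
        kind, message = self.log.pop()
        correction = ("CORRECTION", f"Disregard previous {kind} alert: {message}")
        self.log.append(correction)
        return correction
```

The point of the sketch is not the specific API but the principle: destructive or irreversible actions deserve friction by design, and every send path should have a matching undo path.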
For all the talk about and corporate investment in automation, machine learning, and technology that replaces the need for human involvement, the majority of software systems are still used by humans to help them do their jobs. Yet so many systems are actually working against us, not for us, because of poor design.
Granted, the stakes are not always so high, but there is always value in ensuring your systems are designed with intention, so that from end to end a user can accurately accomplish what needs to be done.
Remember: all technology is designed by someone, whether they know what they’re doing or not, and regardless of whether they’ve ever spoken with or observed your end users.
The question you should be asking is this: how much money am I leaving on the table from technology that is not optimized by design?