Click, click, nudge, nudge

Commentary: Using "choice architecture" can lead to a more secure user experience.

The carrot and the stick: These are the tools we’ve used, historically, to get others to do what we want. Education, too, can be effective for gaining compliance, giving people the tools to do the job right themselves. And yet, each of these techniques has its limits, as we in cybersecurity well know. We’ve tried them all, yet cyber users continue to engage in risky online behaviors.

Why is this? For one thing, there is no one-size-fits-all incentive. What motivates one person (for example, lower health insurance premiums for quitting smoking) may not work with another (such as a lifelong smoker insured on a spouse’s plan).

Punitive measures can breed anxiety and resentment, and cripple productivity. Education can increase awareness, but people forget, or act impulsively, or shrug off what they know for the sake of expediency.

Fortunately, there is another way. Increasingly, organizations are trying a fourth approach: choice architecture, or “nudges,” using behavioral psychology to guide or influence people to make desired choices. Here are some examples:

  • In Copenhagen, painting sidewalks with green footsteps leading to trash cans reduced littering by 46 percent.
  • In the U.K., changing the wording of a letter to delinquent taxpayers increased the number of people who paid their taxes on time from 68 to 83 percent, generating another $45 million in annual revenue.
  • Hotel guests who were told that most others who slept in their room reused their towels reused theirs more frequently (49 percent) than those who saw a standard “please reuse your towel” sign (37 percent).

Nudge theory is nothing new. The Food Pyramid, the “Uncle Sam Wants You” military recruitment efforts during World Wars I and II, and the D.A.R.E. program were all successful deliberate-design campaigns.

In the for-profit sector, think of department stores laid out in confusing mazes to keep shoppers browsing longer; casinos that eschew clocks and windows so gamblers lose track of time; and supermarkets that pump the scents of warm bread or roasted chicken to prime us to buy more food.

The technique is catching on in the cybersecurity field, as well, envisioned as a way to help prevent user error, the No. 1 cause of data breaches. Here are some principles to keep in mind:

  • Understanding psychology is the key. Gleaned from years of psychological study, nudge theory provides glimpses into the reasons why we do the things we do. Humans are predictable creatures, it turns out — we tend to do what’s easiest and most expedient, as the insights below show:
  • When given options, people tend to go with the default choice. In countries where donating one’s organs is the default but people may opt out, donor rates are as high as 99 percent. How can we make the most secure option the default option? One major breach last year, for instance, occurred in part because the organization’s operating system hadn’t been updated with the latest patches. One idea: automatic upgrades, by default. As an option, users could choose to upgrade at their own convenience or not at all, but they would have to pay a premium for that choice.
  • Contextual cues can make all the difference. Where electronic signs show drivers their actual speed compared to the speed limit, traffic slows down. Likewise, contextual cues show great promise for nudging workers toward more secure behaviors online. In an experiment at Carnegie Mellon University, when devices sent daily nudges telling users how many times apps had shared their location, contact lists, and other information, those users adjusted their privacy settings more frequently.
  • We may think long-term, but we act short-term. Humans often act impulsively, seeking short-term gratification even when it runs counter to their long-term goals. This may be especially true at work. Aiming to get the job done as efficiently — and effortlessly — as possible, employees will choose the easiest and quickest route, perhaps putting their data — and that of their organization — at risk. Instead of spending time hunting for a secure Wi-Fi connection, for instance, employees working off-site might use an open network if it’s easily available. Why not make it easier, then, to spot the secure networks, and to connect to them? In a recent U.K. study, people opted for secure connections when they appeared first on the list of available Wi-Fi networks, as opposed to an alphabetical listing, the current norm. Programming in automatic signups to secure networks, complete with payment if needed, might complete the nudge. A rough sketch of this ordering idea appears just after this list.
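
To make the Wi-Fi ordering nudge concrete, here is a minimal sketch. The Network type and the scan results are hypothetical stand-ins, not any particular platform’s Wi-Fi API; the point is simply that secured networks are listed ahead of open ones, strongest signal first, instead of alphabetically.

```python
# Minimal sketch of a "secure networks first" ordering nudge.
# The Network type and the scan list are hypothetical stand-ins for
# whatever a real Wi-Fi picker would receive from the operating system.
from dataclasses import dataclass


@dataclass
class Network:
    ssid: str
    secured: bool         # True if the network uses WPA2/WPA3
    signal_strength: int  # rough quality score, 0-100


def nudge_order(networks: list[Network]) -> list[Network]:
    """Secured networks first, strongest signal at the top,
    instead of the usual alphabetical listing."""
    return sorted(
        networks,
        key=lambda n: (not n.secured, -n.signal_strength, n.ssid.lower()),
    )


if __name__ == "__main__":
    scan = [
        Network("Airport_Free_WiFi", secured=False, signal_strength=92),
        Network("Corp-Guest", secured=True, signal_strength=71),
        Network("Corp-Secure", secured=True, signal_strength=86),
    ]
    for n in nudge_order(scan):
        label = "secured" if n.secured else "OPEN"
        print(f"{n.ssid:<20} [{label}]")
```

The design choice mirrors the U.K. finding: the secure option becomes the path of least resistance because it is the first thing the user sees.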

A plethora of possibilities

The more we understand about behavioral psychology and nudge theory, the more ideas spring to mind for designing an online user experience where cyber-secure choices naturally feel like the right choices.

Why not direct the user to an approved app store when he or she tries to download an app from an unapproved site? What about devices that automatically type a clicked-on link into the browser window, one character at a time, instead of opening it? Knowing that, against all advice, people tend to create simple, easy-to-remember passwords, why not design a series of questions to generate passwords that are uniquely personal, memorable, and secure?
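As one illustration of the guided-questions idea above, here is a minimal sketch. The prompts are invented, and because answers to personal questions are themselves guessable, the sketch mixes in random separators and digits from Python’s secrets module; a real scheme would need a proper security review.

```python
# Minimal sketch of a question-driven password helper.
# The prompts are illustrative; random separators and digits are added
# because memorable personal answers alone would be too easy to guess.
import secrets
import string

PROMPTS = [
    "A word from a favorite childhood book: ",
    "A street name you like but have never lived on: ",
    "An object on your desk right now: ",
]


def suggest_password(answers: list[str]) -> str:
    """Combine the user's memorable answers with random separators and digits."""
    separators = [secrets.choice("!@#$%&*-_+=") for _ in answers]
    suffix = "".join(secrets.choice(string.digits) for _ in range(3))
    parts = [answer.strip().title() + sep for answer, sep in zip(answers, separators)]
    return "".join(parts) + suffix


if __name__ == "__main__":
    answers = [input(prompt) for prompt in PROMPTS]
    print("Suggested password:", suggest_password(answers))
```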

Far from replacing the carrot, the stick, or the chalkboard, behavioral nudges can enhance our repertoire of techniques for increasing online security. Rather than hammering home the need for caution or trying to coax or convince workers to work safely online, nudges focus on refining and shaping behaviors using insights into human nature — elegant additions, indeed, to the cybersecurity toolbox.

JR Reagan is the global chief information security officer of Deloitte. He also serves as professional faculty at Johns Hopkins, Cornell and Columbia universities. Follow him @IdeaXplorer. Read more from JR Reagan.
