Despite recent advances in increasing computer security by eliminating human involvement and error, there are still situations in which humans must manually perform computer security tasks, such as enabling automatic updates, rebooting machines to apply some of those updates, or enrolling in two-factor authentication. We argue that present bias—the tendency to discount future risks and gains in favor of immediate gratification—could be the root cause explaining why many users fail to take such actions. Thus, we systematically explore the application of commitment devices, a technique from behavioral economics, to mitigate the effects of present bias on the adoption of end-user security measures. Offering users the option to be reminded of such tasks, or to schedule them for a future time, could be effective in increasing the likelihood that they eventually complete them. While some current systems have begun incorporating such commitment nudges into software update messaging, we are unaware of rigorous scientific research demonstrating how effective these techniques are, how they may be improved, and how they may be applied to other security behaviors. In two online experiments with over 1,000 participants in total, we find that both reminders and commitment nudges can be effective at reducing intentions to ignore requests to enable automatic updates (Study 1), and to install security updates and enable two-factor authentication, but not to configure automatic backups (Study 2). We also find that the intentions of Mac OS users are generally more affected by these nudges.
Alisa Frik, Serge Egelman, Marian Harbach, Nathan Malkin, and Eyal Peer. Better Late(r) than Never: Increasing Cyber-Security Compliance by Reducing Present Bias. Workshop on the Economics of Information Security (WEIS), 2018.