Contextualizing Privacy Decisions for Better Prediction (and Protection) (CHI ’18)
Modern mobile operating systems implement an ask-on-first-use policy to regulate applications' access to private user data: the user is prompted to allow or deny access to a sensitive resource the first time an app attempts to use it. Prior research shows that this model may not adequately capture user privacy preferences, because subsequent requests may occur under varying contexts. To address this shortcoming, we implemented a novel privacy management system in Android that uses contextual signals to build a classifier predicting user privacy preferences under various scenarios. We performed a 37-person field study to evaluate this new permission model under normal device usage. Drawing on exit interviews and over 5 million data points collected from participants, we show that this new permission model reduces the error rate by 75% (i.e., fewer privacy violations) while preserving usability. We offer guidelines for how platforms can better support user privacy decision making.
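The paper's actual classifier and feature set are not described in this abstract; as an illustrative sketch only (all class and field names below are hypothetical), a contextual permission model of this general shape might predict allow/deny from a user's past decisions in similar contexts:

```python
from collections import Counter
from dataclasses import dataclass


@dataclass(frozen=True)
class Context:
    """Hypothetical contextual signals for a permission request."""
    app: str        # requesting application
    resource: str   # e.g., "location", "contacts"
    visible: bool   # was the requesting app in the foreground?


class ContextualPermissionModel:
    """Toy sketch: predict allow/deny from past decisions, not the paper's method."""

    def __init__(self):
        self.history = []  # list of (Context, allowed) pairs

    def record(self, ctx, allowed):
        """Store a user decision observed under a given context."""
        self.history.append((ctx, allowed))

    def predict(self, ctx):
        """Majority vote over exact-context matches; fall back to the same
        (app, resource) pair, then to a conservative deny."""
        matchers = (
            lambda c: c == ctx,
            lambda c: (c.app, c.resource) == (ctx.app, ctx.resource),
        )
        for match in matchers:
            votes = [d for c, d in self.history if match(c)]
            if votes:
                return Counter(votes).most_common(1)[0][0]
        return False  # deny by default when no history exists
```

The foreground/background distinction in the sketch reflects a contextual cue the abstract alludes to (requests arriving under varying contexts); a real implementation would learn from many more signals and users.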
SIGCHI Honorable Mention Award!
Primal Wijesekera, Joel Reardon, Irwin Reyes, Lynn Tsai, Jung-Wei Chen, Nathan Good, David Wagner, Konstantin Beznosov, and Serge Egelman. Contextualizing Privacy Decisions for Better Prediction (and Protection). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’18), 2018.