“Won’t Somebody Think of the Children?” Examining COPPA Compliance at Scale (PETS ’18)

Abstract We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps’ compliance with the Children’s Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children’s apps, we found that a majority are potentially in…

Contextualizing Privacy Decisions for Better Prediction (and Protection) (CHI ’18)

Abstract Modern mobile operating systems implement an ask-on-first-use policy to regulate applications’ access to private user data: the user is prompted to allow or deny access to a sensitive resource the first time an app attempts to use it. Prior research shows that this model may not adequately capture user privacy preferences because subsequent requests may occur under varying contexts. To address this shortcoming, we implemented a novel privacy management…

TurtleGuard: Helping Android Users Apply Contextual Privacy Preferences (SOUPS ’17)

Abstract Current mobile platforms provide privacy management interfaces to regulate how applications access sensitive data. Prior research has shown that these interfaces are insufficient from a usability standpoint: they do not account for context. In allowing for more contextual decisions, machine-learning techniques have shown great promise for designing systems that automatically make privacy decisions on behalf of the user. However, if such decisions are made automatically, then feedback mechanisms are…

The Feasibility of Dynamically Granted Permissions: Aligning Mobile Privacy with User Preferences (Oakland ’17)

Abstract Current smartphone operating systems regulate application permissions by prompting users on an ask-on-first-use basis. Prior research has shown that this method is ineffective because it fails to account for context: the circumstances under which an application first requests access to data may be vastly different from the circumstances under which it subsequently requests access. We performed a longitudinal 131-person field study to analyze the contextuality behind user privacy decisions…

“Is Our Children’s Apps Learning?” Automatically Detecting COPPA Violations (ConPro ’17)

Abstract In recent years, a market of games and learning apps for children has flourished in the mobile world. Many of these often “free” mobile apps have access to a variety of sensitive personal information about the user, which app developers can monetize via advertising or other means. In the United States, the Children’s Online Privacy Protection Act (COPPA) protects children’s privacy, requiring parental consent to the use of personal…

Keep on Lockin’ in the Free World: A Multi-National Comparison of Smartphone Locking (CHI ’16)

Abstract We present the results of an online survey of smartphone unlocking (N=8,286) that we conducted in eight different countries. The goal was to investigate differences in attitudes towards smartphone unlocking between different national cultures. Our results show that there are indeed significant differences across a range of categories. For instance, participants in Japan considered the data on their smartphones to be much more sensitive than those in other countries,…

The Anatomy of Smartphone Unlocking: A Field Study of Android Lock Screens (CHI ’16)

Abstract To prevent unauthorized parties from accessing data stored on their smartphones, users have the option of enabling a “lock screen” that requires a secret code (e.g., PIN, drawing a pattern, or biometric) to gain access to their devices. We present a detailed analysis of the smartphone locking mechanisms currently available to billions of smartphone users worldwide. Through a month-long field study, we logged events from a panel of users…

Android Permissions Remystified: A Field Study on Contextual Integrity (USENIX Sec ’15)

Abstract We instrumented the Android platform to collect data regarding how often and under what circumstances smartphone applications access protected resources regulated by permissions. We performed a 36-person field study to explore the notion of “contextual integrity,” i.e., how often applications access protected resources when users are not expecting it. Based on our collection of 27M data points and exit interviews with participants, we examine the situations in which users…

Are You Ready to Lock? Understanding User Motivations for Smartphone Locking Behaviors (CCS ’14)

Abstract In addition to storing a plethora of sensitive personal and work information, smartphones also store sensor data about users and their daily activities. In order to understand users’ behaviors and attitudes towards the security of their smartphone data, we conducted 28 qualitative interviews. We examined why users choose (or choose not) to employ locking mechanisms (e.g., PINs) and their perceptions and awareness about the sensitivity of the data stored…

The Effect of Developer-Specified Explanations for Permission Requests on Smartphone User Behavior (CHI ’14)

Abstract In Apple’s iOS 6, when an app requires access to a protected resource (e.g., location or photos), the user is prompted with a permission request that she can allow or deny. These permission request dialogs include space for developers to optionally include strings of text to explain to the user why access to the resource is needed. We examine how app developers are using this mechanism and the effect…