Actions Speak Louder than Words: Entity-Sensitive Privacy Policy and Data Flow Analysis with PoliCheck (USENIX Sec ’20)

Abstract Identifying privacy-sensitive data leaks by mobile applications has been a topic of great research interest for the past decade. Technically, such data flows are not “leaks” if they are disclosed in a privacy policy. To address this limitation in automated analysis, recent work has combined program analysis of applications with analysis of privacy policies to determine the flow-to-policy consistency, and hence violations thereof. However, this prior work has a fundamental…
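
PoliCheck's core task reduces to classifying each observed data flow against the policy's statements. A minimal sketch of such an entity-sensitive check follows; the flows, ontology, and three-way labels here are invented for illustration and greatly simplify PoliCheck's actual model:

```python
# Toy flow-to-policy consistency check in the spirit of PoliCheck.
# The flows, policy statements, and subsumption ontology are illustrative,
# not PoliCheck's actual data model.

# Observed flows: (entity that received the data, data type).
flows = [("AdMob", "advertising identifier"), ("Facebook", "location")]

# Policy statements: (entity term, data type, discloses?).
# discloses=False models a negative statement ("we do not share location").
policy = [
    ("advertiser", "advertising identifier", True),
    ("Facebook", "location", False),
]

# Tiny subsumption ontology: entity-sensitivity means "we share data with
# advertisers" covers AdMob, while a vague "third parties" covers almost
# everything and is therefore far less precise.
is_a = {"AdMob": {"advertiser", "third party"}, "Facebook": {"third party"}}

def matches(term, entity):
    return term == entity or term in is_a.get(entity, set())

def classify(flow):
    entity, dtype = flow
    hits = [d for (e, t, d) in policy if matches(e, entity) and t == dtype]
    if not hits:
        return "omitted"      # flow never disclosed in the policy
    if all(hits):
        return "consistent"   # disclosed, with no contradicting statement
    return "contradicted"     # a negative statement denies this flow

for f in flows:
    print(f, "->", classify(f))
# ('AdMob', 'advertising identifier') -> consistent
# ('Facebook', 'location') -> contradicted
```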

Disaster Privacy/Privacy Disaster (JASIST ’20)

Abstract Privacy expectations during disasters differ significantly from nonemergency situations. This paper explores the actual privacy practices of popular disaster apps, highlighting location information flows. Our empirical study compares content analysis of privacy policies and government agency policies, structured by the contextual integrity framework, with static and dynamic app analysis documenting the personal data sent by 15 apps. We identify substantive gaps between regulation and guidance, privacy policies, and information flows,…
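
The contextual integrity framework models each information flow as a tuple of parameters, which is what makes the policy-versus-behavior comparison systematic. A small sketch, with field names and values that are illustrative rather than drawn from the study:

```python
# Minimal sketch of a contextual-integrity flow record, as might be used
# to structure the content analysis; field names and values are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender: str                  # e.g. the disaster app
    recipient: str               # e.g. an analytics service
    subject: str                 # whose data it is
    attribute: str               # information type, e.g. "location"
    transmission_principle: str  # condition governing the flow

observed = Flow("disaster app", "analytics SDK", "user",
                "precise location", "background collection")
stated = Flow("disaster app", "service provider", "user",
              "coarse location", "with consent")

# A gap exists when the observed flow's parameters deviate from the
# policy-stated norm, e.g. the recipient or transmission principle differ.
gap = [f for f in ("recipient", "attribute", "transmission_principle")
       if getattr(observed, f) != getattr(stated, f)]
print("parameters deviating from the stated norm:", gap)
```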

The Price is (Not) Right: Comparing Privacy in Free and Paid Apps (PETS ’20)

Abstract It is commonly assumed that “free” mobile apps come at the cost of consumer privacy and that paying for apps could offer consumers protection from behavioral advertising and long-term tracking. This work empirically evaluates the validity of this assumption by comparing the privacy practices of free apps and their paid premium versions, while also gauging consumer expectations surrounding free and paid apps. We use both static and dynamic analysis to…

50 Ways to Leak Your Data: An Exploration of Apps’ Circumvention of the Android Permissions System (USENIX Sec ’19)

Abstract Modern smartphone platforms implement permission-based models to protect access to sensitive data and system resources. However, apps can circumvent the permission model and gain access to protected data without user consent by using both covert and side channels. Side channels present in the implementation of the permission system allow apps to access the data without permission, whereas covert channels enable communication between two colluding apps so that one app…
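
As an illustration of the covert-channel pattern (a deliberate simplification of the shared-storage channels the paper documents, with fabricated paths and values): one app that legitimately holds a permission caches the protected value where a second, permission-less app can read it.

```python
# Illustrative covert channel via shared storage: a Python analogy for
# two colluding Android apps using world-readable external storage.
# The path and the "IMEI" value below are made up.
import json, os, tempfile

SHARED = os.path.join(tempfile.gettempdir(), "shared_channel.json")

def app_with_permission():
    # This "app" holds READ_PHONE_STATE and caches the identifier in a
    # location any other process can read.
    imei = "356938035643809"  # fabricated example value
    with open(SHARED, "w") as f:
        json.dump({"imei": imei}, f)

def app_without_permission():
    # The colluding "app" never requests the permission, yet obtains the
    # identifier simply by reading the shared file.
    with open(SHARED) as f:
        return json.load(f)["imei"]

app_with_permission()
print("obtained without permission:", app_without_permission())
```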

On The Ridiculousness of Notice and Consent: Contradictions in App Privacy Policies (ConPro ’19)

Abstract The dominant privacy framework of the information age relies on notions of “notice and consent.” That is, service providers will disclose, often through privacy policies, their data collection practices, and users can then consent to their terms. However, it is unlikely that most users comprehend these disclosures, which is due in no small part to ambiguous, deceptive, and misleading statements. By comparing actual collection and sharing practices to disclosures in…

Do You Get What You Pay For? Comparing The Privacy Behaviors of Free vs. Paid Apps (ConPro ’19)

Abstract It is commonly assumed that the availability of “free” mobile apps comes at the cost of consumer privacy, and that paying for apps could offer consumers protection from behavioral advertising and long-term tracking. This work empirically evaluates the validity of this assumption by investigating the degree to which “free” apps and their paid premium versions differ in their bundled code, their declared permissions, and their data collection behaviors and privacy…
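
The static half of such a comparison amounts to diffing what the two APKs declare and bundle. A minimal sketch over hypothetical manifest and SDK data (the values are placeholders, not findings from the study):

```python
# Sketch of the static comparison between a free app and its paid
# counterpart: diff declared permissions and bundled third-party SDK
# packages. All example values are hypothetical.
free_permissions = {"INTERNET", "ACCESS_FINE_LOCATION", "READ_PHONE_STATE"}
paid_permissions = {"INTERNET", "ACCESS_FINE_LOCATION"}

free_sdks = {"com.google.ads", "com.crashlytics", "com.facebook.analytics"}
paid_sdks = {"com.crashlytics"}

print("permissions only in free:", free_permissions - paid_permissions)
print("third-party SDKs only in free:", free_sdks - paid_sdks)
# If the sets come out identical, paying removed ads but not the data
# collection itself, which is one of the mismatches this comparison
# is designed to surface.
```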

“Won’t Somebody Think of the Children?” Examining COPPA Compliance at Scale (PETS ’18)

Abstract We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps’ compliance with the Children’s Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children’s apps, we found that a majority are potentially in…
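
The dynamic-analysis step can be sketched as scanning an app's captured network traffic for known persistent identifiers; the requests and identifier values below are fabricated, and the real pipeline is far more involved:

```python
# Sketch of scanning captured network requests from a children's app for
# personal identifiers. Request URLs and identifier values are fabricated.
import re

KNOWN_IDENTIFIERS = {
    "android_id": "a1b2c3d4e5f60718",
    "aaid": "38400000-8cf0-11bd-b23e-10b96e40000d",
    "geolocation": re.compile(r"lat=[-\d.]+&lon=[-\d.]+"),
}

captured_requests = [
    "https://tracker.example.com/collect?android_id=a1b2c3d4e5f60718",
    "https://ads.example.net/v1?lat=37.8716&lon=-122.2727",
]

for url in captured_requests:
    for name, needle in KNOWN_IDENTIFIERS.items():
        # Compiled patterns are searched; plain strings are substring-matched.
        found = (needle.search(url) if hasattr(needle, "search")
                 else needle in url)
        if found:
            print(f"potential COPPA-relevant transmission: {name} -> {url}")
```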

Contextualizing Privacy Decisions for Better Prediction (and Protection) (CHI ’18)

Abstract Modern mobile operating systems implement an ask-on-first-use policy to regulate applications’ access to private user data: the user is prompted to allow or deny access to a sensitive resource the first time an app attempts to use it. Prior research shows that this model may not adequately capture user privacy preferences because subsequent requests may occur under varying contexts. To address this shortcoming, we implemented a novel privacy management…
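
The prediction step can be sketched as a classifier over contextual features. The encoding and training rows below are invented for illustration; the study's actual model and feature set are richer:

```python
# Toy sketch of predicting allow/deny from context. The feature encoding
# and training rows are invented, not the study's data or model.
from sklearn.tree import DecisionTreeClassifier

# Features: (requesting app visible?, screen on?, permission index)
# Permission index: 0=LOCATION, 1=CONTACTS, 2=MICROPHONE.
X = [
    (1, 1, 0), (1, 1, 1), (0, 1, 0), (0, 0, 0),
    (0, 0, 2), (1, 0, 2), (0, 1, 1), (1, 1, 2),
]
y = [1, 1, 0, 0, 0, 0, 0, 1]  # 1 = user allowed, 0 = user denied

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The same permission can warrant different decisions in different
# contexts, which is exactly what a one-time ask-on-first-use prompt
# cannot express.
print(clf.predict([(1, 1, 0)]))  # visible app requesting location
print(clf.predict([(0, 0, 0)]))  # background request with screen off
```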

TurtleGuard: Helping Android Users Apply Contextual Privacy Preferences (SOUPS ’17)

Abstract Current mobile platforms provide privacy management interfaces to regulate how applications access sensitive data. Prior research has shown that these interfaces are insufficient from a usability standpoint: they do not account for context. To allow for more contextual decisions, machine-learning techniques have shown great promise for designing systems that automatically make privacy decisions on behalf of the user. However, if such decisions are made automatically, then feedback mechanisms are…

The Feasibility of Dynamically Granted Permissions: Aligning Mobile Privacy with User Preferences (Oakland ’17)

Abstract Current smartphone operating systems regulate application permissions by prompting users on an ask-on-first-use basis. Prior research has shown that this method is ineffective because it fails to account for context: the circumstances under which an application first requests access to data may be vastly different from the circumstances under which it subsequently requests access. We performed a longitudinal 131-person field study to analyze the contextuality behind user privacy decisions…