Our goal is to understand how users perceive smartphone-related risks, their preferences for how third-party applications should use their sensitive data, and the broader threat landscape, and then to create new user-centric systems that allow users to make more informed decisions.
Smartphones have become the most commonly used computing platform. These devices allow third-party applications to create rich user experiences by granting them access to sensor data (e.g., location, accelerometers) and stored personal information. However, privacy and security problems arise when users cannot make informed choices about how their information may be used.
We’re broadly interested in answering the following questions:
- Under what circumstances do users want to be prompted with information about how third-party applications may be accessing their personal information and/or sensor data?
- What steps do users take to mitigate risks on their devices?
- How can the permission-granting user experience be improved to facilitate informed consent?
Our publications in this area include:
- Disaster Privacy/Privacy Disaster (JASIST ’20)
- The Price is (Not) Right: Comparing Privacy in Free and Paid Apps (PETS ’20)
- 50 Ways to Leak Your Data: An Exploration of Apps’ Circumvention of the Android Permissions System (USENIX Sec ’19)
- On The Ridiculousness of Notice and Consent: Contradictions in App Privacy Policies (ConPro ’19)
- Do You Get What You Pay For? Comparing The Privacy Behaviors of Free vs. Paid Apps (ConPro ’19)
- “Won’t Somebody Think of the Children?” Examining COPPA Compliance at Scale (PETS ’18)
- Contextualizing Privacy Decisions for Better Prediction (and Protection) (CHI ’18)
- TurtleGuard: Helping Android Users Apply Contextual Privacy Preferences (SOUPS ’17)
- The Feasibility of Dynamically Granted Permissions: Aligning Mobile Privacy with User Preferences (Oakland ’17)