Author: serge

It’s Big, It’s Heavy, It’s Filled with Personal Data! Measuring the Logging of Sensitive Information in the Android Ecosystem (USENIX Sec ’23)

Abstract: Android offers a shared system that multiplexes all logged data from all system components, including both the operating system and the console output of apps that run on it. A security mechanism ensures that user-space apps can only read the log entries that they create, though many “privileged” apps are exempt from this restriction. This […]

Security and Privacy Failures in Popular 2FA Apps (USENIX Sec ’23)

Abstract: The Time-based One-Time Password (TOTP) algorithm is a 2FA method that is widely deployed because of its relatively low implementation costs and purported security benefits over SMS 2FA. However, users of TOTP 2FA apps face a critical usability challenge: maintain access to the secrets stored within the TOTP app, or risk getting locked out of […]

Lessons in VCR Repair: Compliance of Android App Developers with the California Consumer Privacy Act (CCPA) (PETS ’23)

Abstract: The California Consumer Privacy Act (CCPA) provides California residents with a range of enhanced privacy protections and rights. Our research investigated the extent to which Android app developers comply with the provisions of the CCPA that require them to provide consumers with accurate privacy notices and respond to “verifiable consumer requests” (VCRs) by disclosing personal […]

Deployment of Source Address Validation by Network Operators: A Randomized Control Trial (Oakland ’22)

Abstract: IP spoofing, sending IP packets with a false source IP address, continues to be a primary attack vector for large-scale Denial of Service attacks. To combat spoofing, various interventions have been tried to increase the adoption of source address validation (SAV) among network operators. How can SAV deployment be increased? In this work, we conduct […]

Privacy Champions in Software Teams: Understanding Their Motivations, Strategies, and Challenges (CHI ’21)

Abstract: Software development teams are responsible for making and implementing software design decisions that directly impact end-user privacy, a challenging task to do well. Privacy Champions—people who strongly care about advocating privacy—play a useful role in supporting privacy-respecting development cultures. To understand their motivations, challenges, and strategies for protecting end-user privacy, we conducted 12 interviews with […]

Deciding on Personalized Ads: Nudging Developers About User Privacy (SOUPS ’21)

Abstract: Mobile advertising networks present personalized advertisements to developers as a way to increase revenue. These types of ads use data about users to select potentially more relevant content. However, choice framing also impacts app developers’ decisions, which in turn impact their users’ privacy. Currently, ad networks provide choices in developer-facing dashboards that control the types […]

Developers Say the Darnedest Things: Privacy Compliance Processes Followed by Developers of Child-Directed Apps (PETS ’22)

Abstract: We investigate the privacy compliance processes followed by developers of child-directed mobile apps. While children’s online privacy laws have existed for decades in the US, prior research found relatively low rates of compliance. Yet, little is known about how compliance issues come to exist and how compliance processes can be improved to address them. […]

Runtime Permissions for Privacy in Proactive Intelligent Assistants (SOUPS ’22)

Abstract: Intelligent voice assistants may soon become proactive, offering suggestions without being directly invoked. Such behavior increases privacy risks, since proactive operation requires continuous monitoring of conversations. To mitigate this problem, our study proposes and evaluates one potential privacy control, in which the assistant requests permission for the information it wishes to use immediately after hearing […]

Balancing Power Dynamics in Smart Homes: Nannies’ Perspectives on How Cameras Reflect and Affect Relationships (SOUPS ’22)

Abstract: Smart home cameras raise privacy concerns in part because they frequently collect data not only about the primary users who deployed them but also other parties—who may be targets of intentional surveillance or incidental bystanders. Domestic employees working in smart homes must navigate a complex situation that blends privacy and social norms for homes, workplaces, […]

Join us

Interested in joining us?

We welcome students from any major and any level of experience — not just CS or EE. Many of our projects involve human-computer interaction (HCI) research, including quantitative and qualitative social science and cognitive science methods. Whether your background is technical or non-technical, if you are interested in conducting research in usable security and privacy, please say hi! Berkeley students interested in getting involved can […]