Enhance User Privacy Controls in Mobile Applications
Problem Summary
Ensuring user privacy and data protection in mobile applications is difficult. Users typically have inadequate control and visibility over how apps and the services embedded in them access their data. Several issues remain only partially addressed, including managing access by third-party libraries, supporting fine-grained permission granting, and offering options for partial consent.
Rationale
Mobile applications face significant privacy control gaps, including limited visibility over third-party library access, binary consent options, and a lack of detailed user controls. These challenges necessitate the development of more sophisticated, user-centric mechanisms for managing privacy. By integrating contextual privacy controls, privacy by design principles, and user-friendly interfaces, users can be empowered to adjust their permissions dynamically, make informed decisions, and better protect their personal data.
Solution
Develop user-centred mechanisms for managing privacy in mobile applications. These mechanisms provide granular control over app permissions and data sharing, offer partial consent options, and use intuitive, privacy-by-design interfaces. Together, they enhance user autonomy and visibility over how apps and third-party libraries access and use personal data, addressing privacy concerns dynamically and effectively.
Chitkara et al. [1] introduced ProtectMyPrivacy (PmP) for Android, which distinguishes between app-native and third-party library data accesses and provides contextual control over data sharing.
PmP leverages Android's permissions system and the Xposed framework for runtime monitoring and control, letting users allow, deny, or fake data accesses. Its backend collects user decisions and app behaviours, enabling privacy management informed by context. By managing accesses from third-party libraries that are common across many apps, PmP reduces both user decision fatigue and unnecessary data exposure.
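A minimal Kotlin sketch of the general hooking technique described above is given here. The hooked method, the library package prefixes, and the "fake" policy are illustrative assumptions, not PmP's actual implementation, which hooks a broader set of privacy-sensitive APIs and applies its own decision logic.

```kotlin
// Sketch of an Xposed-style hook: intercept location reads and fake the result
// when the call appears to originate from a (hypothetical) third-party library.
import de.robv.android.xposed.IXposedHookLoadPackage
import de.robv.android.xposed.XC_MethodHook
import de.robv.android.xposed.XposedHelpers
import de.robv.android.xposed.callbacks.XC_LoadPackage

class FakeLocationHook : IXposedHookLoadPackage {
    // Placeholder package prefixes treated as third-party library accesses.
    private val libraryPrefixes = listOf("com.example.adlib", "com.example.analytics")

    override fun handleLoadPackage(lpparam: XC_LoadPackage.LoadPackageParam) {
        XposedHelpers.findAndHookMethod(
            "android.location.LocationManager", lpparam.classLoader,
            "getLastKnownLocation", String::class.java,
            object : XC_MethodHook() {
                override fun afterHookedMethod(param: XC_MethodHook.MethodHookParam) {
                    // Inspect the call stack to decide whether a library initiated the access.
                    val fromLibrary = Thread.currentThread().stackTrace
                        .any { frame -> libraryPrefixes.any { frame.className.startsWith(it) } }
                    if (fromLibrary) {
                        // "Fake" decision: hide the real location from the library.
                        param.result = null
                    }
                }
            })
    }
}
```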
Bock, Chowdhury, and Momen [2] introduced a mobile application that incorporates a "Maybe" button, offering users a novel partial consent option for data access permissions. This approach lets users grant temporary access to their data and reassess permissions over time. Implemented on Android, the solution explores user acceptance and the practicality of conditional access, aiming to balance functionality and privacy. Rather than restricting users to a binary allow/deny choice, the app provides a nuanced way to manage privacy preferences dynamically, reflecting a more realistic approach to personal data management and control.
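A minimal sketch of how a time-limited "Maybe" grant could be modelled follows; the class and field names, and the one-day default (mentioned in the study as the default setting), are assumptions for illustration rather than the authors' implementation.

```kotlin
// Illustrative model of a time-limited ("Maybe") consent decision.
import java.time.Duration
import java.time.Instant

enum class Decision { ALLOW, DENY, MAYBE }

data class ConsentRecord(
    val permission: String,                        // e.g. "android.permission.READ_CONTACTS"
    val decision: Decision,
    val grantedAt: Instant = Instant.now(),
    val validFor: Duration = Duration.ofDays(1)    // default duration assumed from the study
) {
    /** A MAYBE grant behaves like ALLOW until it expires, after which the app must re-ask. */
    fun isAccessAllowed(now: Instant = Instant.now()): Boolean = when (decision) {
        Decision.ALLOW -> true
        Decision.DENY -> false
        Decision.MAYBE -> now.isBefore(grantedAt.plus(validFor))
    }
}

fun main() {
    val record = ConsentRecord("android.permission.READ_CONTACTS", Decision.MAYBE)
    println(record.isAccessAllowed())                                            // true: within window
    println(record.isAccessAllowed(Instant.now().plus(Duration.ofDays(2))))      // false: expired
}
```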
Ataei, Degbelo, and Kray [3] introduced a user interface (UI) designed for managing location privacy settings. This UI is grounded in privacy theory, privacy by design principles, and general UI design principles. It allows users to control when, with whom, and where their location information is shared through intuitive visual controls and icons. The interface emphasises fine-grained privacy management, enabling users to set time-based, distance-based, and relational privacy controls. Integrating these controls directly into a location-based service (LBS) application aims to enhance user autonomy and privacy, making it straightforward for users to manage their location privacy preferences directly from their mobile devices.
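The sketch below illustrates the kinds of rules this UI exposes: whom to share with, during which hours, and at what spatial precision. The rule shape and field names are assumptions for illustration, not the paper's data model.

```kotlin
// Illustrative location-sharing rule combining relational, time-based, and precision controls.
import java.time.LocalTime
import kotlin.math.roundToInt

data class LocationRule(
    val recipient: String,            // relational control: whom to share with
    val shareFrom: LocalTime,         // time-based control: start of sharing window
    val shareUntil: LocalTime,        // time-based control: end of sharing window
    val precisionDegrees: Double      // precision control: coarsening granularity
)

/** Returns a (possibly coarsened) coordinate pair, or null if sharing is not allowed now. */
fun applyRule(rule: LocationRule, requester: String, lat: Double, lon: Double,
              now: LocalTime = LocalTime.now()): Pair<Double, Double>? {
    if (requester != rule.recipient) return null
    if (now.isBefore(rule.shareFrom) || now.isAfter(rule.shareUntil)) return null
    // Snap coordinates to a coarse grid so the requester only sees an approximate position.
    fun coarsen(v: Double) = (v / rule.precisionDegrees).roundToInt() * rule.precisionDegrees
    return coarsen(lat) to coarsen(lon)
}
```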
Aydin et al. [4] presented a privacy-enhancing framework for Android applications called VisiDroid, focusing on user control over personal data shared with advertising and analytics libraries. It operates in three phases: offline analysis to track data flows and UI contexts, visual configuration allowing users to set privacy preferences on a per-widget basis, and code instrumentation to enforce these preferences by anonymising data before it's sent out.
This enforcement step substitutes actual values with anonymised ones immediately before they leave the device in ad or analytics requests, minimising potential side effects on app functionality from modifying the data those requests use.
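The following sketch shows the general shape of such per-widget enforcement: fields whose values originate from widgets the user marked as private are replaced with anonymised placeholders before the request is built. The widget identifiers, field names, and payload structure are hypothetical, not VisiDroid's actual interfaces.

```kotlin
// Illustrative per-widget anonymisation applied just before an analytics/ad request is built.
object PrivacyPreferences {
    // Per-widget choices captured in a visual configuration phase (assumed shape).
    private val privateWidgets = setOf("edit_age", "edit_city")
    fun isPrivate(widgetId: String) = widgetId in privateWidgets
}

/** Called by instrumented code; payload maps field name -> (originating widget id, value). */
fun enforcePreferences(payload: Map<String, Pair<String, String>>): Map<String, String> =
    payload.mapValues { (_, widgetAndValue) ->
        val (widgetId, value) = widgetAndValue
        if (PrivacyPreferences.isPrivate(widgetId)) "ANONYMIZED" else value
    }

fun main() {
    val outgoing = mapOf(
        "age" to ("edit_age" to "34"),
        "screen" to ("main_activity" to "home")
    )
    println(enforcePreferences(outgoing))   // {age=ANONYMIZED, screen=home}
}
```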
Quay-de la Vallee, Selby, and Krishnamurthi [5] proposed a permission management assistant that helps users manage permissions post-installation. To save users' time and attention, the PerMission Assistant sorts installed apps by their worst-rated permissions, so the most concerning issues are addressed first. It guides users to manage app permissions through their device settings, because operating-system security restrictions prevent it from adjusting other apps' settings directly.
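A small sketch of the triage idea, ordering installed apps by their worst-rated permission so the most concerning app is reviewed first; the rating scale and app data here are made up for illustration.

```kotlin
// Illustrative triage: sort apps by their worst-rated permission (1 = worst, 5 = best).
data class InstalledApp(val name: String, val permissionRatings: Map<String, Int>)

fun triageOrder(apps: List<InstalledApp>): List<InstalledApp> =
    apps.sortedBy { app -> app.permissionRatings.values.minOrNull() ?: Int.MAX_VALUE }

fun main() {
    val apps = listOf(
        InstalledApp("Flashlight", mapOf("CAMERA" to 4, "READ_CONTACTS" to 1)),
        InstalledApp("Weather", mapOf("ACCESS_FINE_LOCATION" to 3))
    )
    triageOrder(apps).forEach { println(it.name) }  // Flashlight first (worst rating: 1)
}
```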
Fawaz and Shin [6] proposed LP-Guardian, a system for enhancing location privacy on Android smartphones. It leverages cloaking and obfuscation techniques to protect users' location data from unauthorised tracking, profiling, and identification, allowing users to keep the functionality of location-based services while significantly reducing the risk of privacy breaches. LP-Guardian operates in real time, dynamically adjusting the level of location privacy based on context and user preferences: it decides when and how to cloak or obfuscate location data so that disruption to app functionality is minimal, and it is designed to be user-friendly, requiring little user intervention.
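The sketch below only conveys the general shape of a "Hide Me" versus "Do nothing" decision by perturbing the released coordinates; LP-Guardian's actual mechanisms are more sophisticated and context-dependent, and the noise magnitude here is an arbitrary illustrative value.

```kotlin
// Illustrative obfuscation: release the true location, or a perturbed one, per user choice.
import kotlin.random.Random

enum class Choice { DO_NOTHING, HIDE_ME }

fun releaseLocation(lat: Double, lon: Double, choice: Choice,
                    noiseDegrees: Double = 0.01): Pair<Double, Double> =
    when (choice) {
        Choice.DO_NOTHING -> lat to lon
        // Add bounded random noise (roughly 1 km at this magnitude) before releasing the fix.
        Choice.HIDE_ME -> (lat + Random.nextDouble(-noiseDegrees, noiseDegrees)) to
                          (lon + Random.nextDouble(-noiseDegrees, noiseDegrees))
    }
```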
Platforms: mobile devices
Example
Overview of PmP app [1].
Interface of the prototype app using partial consent as in [2].
The user interface for managing location privacy settings, from left: (a) Main UI; (b) UI for whom to share; (c) Adjusting time restriction; (d) Visualised feedback on an activated time restriction [3].
The VisiDroid configuration interface [4].
The Permission Assistant home page [5].
Left: When the app first accesses the location, LP-Guardian prompts the user to set the app's anonymisation rule with two options: 'Do nothing' or 'Hide Me.' Choosing 'Hide Me' activates the appropriate anonymisation mechanism. Right: When an app attempts to access the location from a new place, LP-Guardian prompts the user for a decision [6].
Use cases
- Facilitating dynamic and flexible permission management.
- Allowing users to control and minimise what information is shared based on their comfort levels.
- Empowering users with customisable permission granting.
- Enhancing privacy management and data protection in mobile environments.
- Aiding users to better control permission granting in mobile apps.
Pros
- Offers users better insights into how their data is accessed and used, potentially increasing privacy awareness. The solution has been deployed and evaluated with real users, demonstrating its practicality and effectiveness in reducing privacy leaks to third-party libraries. Additionally, by focusing on popular third-party libraries common across many apps, PmP reduces the number of privacy decisions users have to make, simplifying the user experience. It provides granular control over privacy-sensitive data, distinguishing between app and library accesses [1].
- The "Maybe" button introduces a nuanced approach to permission granting. It allows users more control over their data by providing temporary access to apps and enhancing privacy management. Many participants in a user study expressed interest in using the "Maybe" button on their private devices, indicating a demand for more flexible privacy controls beyond the binary allow/deny options. It was most frequently used for permissions like memory and contacts, highlighting its potential effectiveness for managing access to more sensitive data types. Participants also valued control and privacy protection over user-friendliness, indicating a strong preference for privacy-enhancing features, even if they might complicate the user experience slightly. The study also found that the majority of participants did not perceive recurring requests for partial consent as annoying, suggesting that users are open to more frequent interaction if it means better privacy control [2].
- In a user study, the UI for fine-grained control of location privacy settings was well-received, meeting users' expectations for managing location privacy. Users expressed a strong desire to control whom they share their location with, especially unknown companies [3].
Cons
- Despite its advantages, 80.6% of participants did not use the "Maybe" button, indicating challenges in adoption and a need for more user education. The default one-day setting might have influenced its use, indicating that customisable duration options could improve its utility [2].
- Solutions like PmP require 'rooted' devices (devices on which administrator permissions have been obtained), limiting their use to technically experienced users and introducing security risks [1]. Others, like the PerMission Assistant, cannot directly edit other apps' settings due to operating-system security restrictions, requiring users to adjust permissions manually through the device settings [5].
- Time- and space-based sharing restrictions were used less frequently than expected, indicating a need to improve their presentation or user understanding. Additionally, the lab setting and hypothetical scenarios may not fully reflect real-world behaviour, limiting the findings' generalisability [3].
Privacy Choices
Considering the design space for privacy choices [7], this guideline can be applied in the following dimensions:
- Contextualised
This is particularly relevant to solutions like the one in [3] that offer granular control over data sharing based on the app's context or the user's situation. For example, allowing location data access only when the app is in use or providing temporary permissions fits this category. These solutions adopt the principle of contextual integrity by enabling users to make privacy decisions that are sensitive to the context, enhancing their ability to align data-sharing practices with their privacy expectations.
- Binary choices
Most mobile app permissions initially follow a binary choice model, where users either grant or deny permission for data access (e.g., access to location or contacts). Some solutions refine this model by offering more nuanced controls but still involve some binary decision-making, such as opting in or out of specific data processing activities.
- Multiple choices
Solutions like PmP [1] and VisiDroid [4] expand beyond the binary model by offering users multiple choices for managing their data. For instance, allowing users to decide the granularity of location data shared with apps, or to manage permissions based on the app usage context, introduces a multi-option privacy setting that aligns with this subdimension.
- Just in time
Many solutions propose providing privacy choices when a specific data practice is about to occur, such as when an app is about to access location data or other sensitive information. This approach is particularly evident in solutions that offer granular control over app permissions or introduce mechanisms to inform users about the implications of granting access to their data.
- On-demand
Users can actively seek out and modify privacy settings as needed, supported by solutions that provide comprehensive application privacy management interfaces. This allows users to make informed decisions on their schedule rather than being limited to initial setup or just-in-time notifications.
- Context-aware
Some solutions offer context-aware privacy choices based on the user's current situation or the application's context. For example, privacy settings can be adjusted based on the type of data being accessed or the third-party libraries involved. This approach aligns with delivering privacy choices that are more meaningful and tailored to the specific privacy risks present in a given context.
- At Setup
While not the primary focus of these solutions, some aspects of privacy control can be initially configured at setup, such as during the installation of an application or the first time a feature is accessed.
- Visual
This guideline presents solutions that rely heavily on visual interfaces to communicate privacy options and controls to the user. This includes using text to describe permissions and privacy choices, icons for representing different data types or permissions, and graphical user interfaces to configure privacy settings.
- Combined
Users might initially configure their privacy settings using visual interfaces, but underlying these choices could be machine-readable configurations that systems use to enforce privacy preferences automatically. Furthermore, considering future extensions or integrations, these solutions could incorporate auditory or haptic feedback for alerts or reminders related to privacy choices, thereby enhancing user awareness and control.
- Feedback
The discussed solutions apply the Feedback subdimension to ensure that users know the status of their privacy choices. In PmP [1], users receive immediate feedback on their privacy choices through the app interface. In the partial consent solution [2], users choosing the "Maybe" option for data access permissions in this framework are informed about the temporary nature of their consent and how long it will be valid.
In [3], the user interface for managing location privacy settings offers visual feedback on the privacy options selected by the user.
VisiDroid [4] provides feedback to users by showing how their privacy settings alter the behaviour of apps in real time.
- Enforcement
Some of the discussed solutions also deal with enforcing users' privacy decisions. In the case of PmP [1] and VisiDroid [4], the systems enforce privacy decisions by blocking or modifying the data accessed by apps and third-party libraries based on user preferences.
- Presentation
Privacy choices always have a presentation: the system must give users clear and easily understandable information about potential data practices, the available options, and how to communicate privacy decisions. Presentation often involves multiple components, integrates with related privacy notices, and requires careful consideration of design dimensions such as timing, channel, and modality [7]. The solutions discussed in this guideline display privacy controls in an understandable manner through user-friendly interfaces.
- Primary
The solutions involve direct interaction between the user and the system (e.g., mobile applications) where privacy choices are embedded. For instance, the privacy management tools integrated within apps or operating systems allow users to control permissions and privacy settings directly through the app's or OS's interface. This integration ensures that privacy choices are part of the user's immediate interaction with the system, enhancing contextual relevance and ease of use.
Control
This guideline presents solutions that primarily address the Control privacy attribute [8]. All the discussed solutions aim to enhance users' control over their data by allowing them to manage permissions more granularly and make informed decisions about data sharing. Other related privacy attributes:
- The discussed solutions often address the trade-off between app functionality and privacy, ensuring that users do not compromise their privacy for functionality.
- All discussed solutions aim to increase transparency regarding how apps access and use personal data, providing users with more information and insights into data processing practices.
References
[1] Saksham Chitkara, Nishad Gothoskar, Suhas Harish, Jason I. Hong, and Yuvraj Agarwal (2017). Does this App Really Need My Location? Context-Aware Privacy Management for Smartphones. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3, Article 42, 22 pages. https://doi.org/10.1145/3132029
[2] Sven Bock, Ashraf Ferdouse Chowdhury, and Nurul Momen (2021). Partial Consent: A Study on User Preference for Informed Consent. In: Stephanidis, C., et al. (eds) HCI International 2021 - Late Breaking Papers: Design and User Experience. HCII 2021. Lecture Notes in Computer Science, vol 13094. Springer, Cham. https://doi.org/10.1007/978-3-030-90238-4_15
[3] Mehrnaz Ataei, Auriol Degbelo, and Christian Kray (2018). Privacy theory in practice: designing a user interface for managing location privacy on mobile devices. Journal of Location Based Services, 12(3-4), 141-178. https://doi.org/10.1080/17489725.2018.1511839
[4] Abdulbaki Aydin, David Piorkowski, Omer Tripp, Pietro Ferrara, and Marco Pistoia (2017). Visual Configuration of Mobile Privacy Policies. In: Huisman, M., Rubin, J. (eds) Fundamental Approaches to Software Engineering. FASE 2017. Lecture Notes in Computer Science, vol 10202. Springer, Berlin, Heidelberg, 338-355. https://doi.org/10.1007/978-3-662-54494-5_19
[5] Hannah Quay-de la Vallee, Paige Selby, and Shriram Krishnamurthi (2016). On a (Per)Mission: Building Privacy Into the App Marketplace. In Proceedings of the 6th Workshop on Security and Privacy in Smartphones and Mobile Devices (SPSM '16). Association for Computing Machinery, New York, NY, USA, 63–72. https://doi.org/10.1145/2994459.2994466
[6] Kassem Fawaz and Kang G. Shin (2014). Location Privacy Protection for Smartphone Users. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security (CCS '14). Association for Computing Machinery, New York, NY, USA, 239–250. https://doi.org/10.1145/2660267.2660270
[7] Yuanyuan Feng, Yaxing Yao, and Norman Sadeh (2021). A Design Space for Privacy Choices: Towards Meaningful Privacy Control in the Internet of Things. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3411764.3445148
[8] Susanne Barth, Dan Ionita, and Pieter Hartel (2022). Understanding Online Privacy — A Systematic Review of Privacy Visualizations and Privacy by Design Guidelines. ACM Comput. Surv. 55, 3, Article 63 (February 2022), 37 pages. https://doi.org/10.1145/3502288