Implement Contextual Privacy Controls for Enhanced User Data Protection

Problem Summary

It is unrealistic to expect users to specify their privacy preferences without considering the specific context in which these decisions are made. Users need contextual information to make informed and appropriate privacy choices. Without this context, they may struggle to understand the implications of their decisions. Systems should analyse these contexts to provide relevant privacy settings that align with users' current situations and preferences.

Rationale

By implementing contextual privacy controls, users can have more relevant privacy settings based on their specific circumstances and activities, i.e., the context in which privacy decisions are made. Contextual controls can dynamically adjust privacy settings, reducing the burden on users to manually manage their preferences continually. By considering context, the system can better align privacy settings with user expectations and behaviours, leading to higher user satisfaction and trust.

Solution

Develop systems that provide dynamic, personalised privacy recommendations based on contextual factors. Such systems use machine learning and data analysis to take account of the user's social context, location, time, activities, and relationships when making precise, relevant privacy decisions. The goal is to enhance user data protection by keeping privacy settings continuously aligned with the user's current context and preferences.

Misra and Such [1] highlight the importance of social context in forming access control policies, ensuring privacy settings align with the user’s social interactions and relationships. They propose PACMAN, a personal assistant agent that recommends personalised access control decisions based on the social context of information disclosure. It incorporates communities generated from the user's network structure and draws on profile information along with the specific content to be shared.
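As a rough illustration of this idea, the sketch below recommends an access control decision from the viewer's community membership and a per-content-type policy. The community labels, policy table, and function names are hypothetical; PACMAN's actual model is richer, combining network communities with profile attributes and learned user preferences.

```python
# Illustrative PACMAN-style recommendation: decide whether a viewer may
# see a piece of content based on the social context (community membership).

# Communities detected from the user's network structure, e.g. by running
# a community-detection algorithm over the friend graph (hypothetical data).
communities = {
    "close_friends": {"amy", "ben"},
    "colleagues": {"cara", "dan"},
    "family": {"ellen"},
}

# Elicited or learned policy: which communities may see which content types.
policy = {
    "party_photo": {"close_friends"},
    "work_update": {"colleagues", "close_friends"},
    "holiday_plan": {"family", "close_friends"},
}

def recommend_access(viewer, content_type):
    """Recommend 'allow' if the viewer belongs to any community the
    policy permits for this content type, otherwise 'deny'."""
    allowed = policy.get(content_type, set())
    viewer_communities = {name for name, members in communities.items()
                          if viewer in members}
    return "allow" if viewer_communities & allowed else "deny"

print(recommend_access("amy", "party_photo"))   # allow
print(recommend_access("dan", "party_photo"))   # deny
```

The key design point, mirroring the paper, is that the decision is driven by *who* the audience is (their community) and *what* is being shared (the content type), rather than by a single global setting.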

Lee and Kobsa [2] developed a machine-learning model that predicts user privacy preferences in IoT environments from various contextual factors. The model incorporates parameters such as location, time, monitoring entity, and purpose to capture the context of data collection and usage, together with individual users' privacy preferences gathered through surveys, experiments, or historical data. Users with similar privacy preferences are grouped into clusters, which aids in understanding and predicting the privacy needs of different user groups.
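The pipeline can be illustrated with a deliberately simplified sketch: a trivial two-way clustering of survey answers, and a nearest-case majority vote standing in for the decision-tree classifier that [2] trains on contextual features plus cluster membership. All feature names, data, and thresholds here are hypothetical.

```python
# Simplified sketch of the approach in [2]: cluster users by stated privacy
# preferences, then predict an allow/deny decision from contextual factors
# plus cluster membership.
from collections import Counter

# Each user's survey answers: 1 = comfortable sharing, 0 = not (made-up data).
user_preferences = {
    "alice": [1, 1, 1, 0],
    "bob":   [1, 1, 0, 0],
    "carol": [0, 0, 0, 0],
    "dave":  [0, 0, 1, 0],
}

def cluster_users(prefs, threshold=2):
    """Trivial two-cluster split: users who answered 'comfortable' at least
    `threshold` times are 'permissive', the rest 'conservative'."""
    return {u: ("permissive" if sum(v) >= threshold else "conservative")
            for u, v in prefs.items()}

# Historical labelled cases: (cluster, location, purpose) -> decision.
history = [
    ("permissive",   "campus", "safety",      "allow"),
    ("permissive",   "campus", "advertising", "deny"),
    ("permissive",   "home",   "safety",      "allow"),
    ("conservative", "campus", "safety",      "allow"),
    ("conservative", "campus", "advertising", "deny"),
    ("conservative", "home",   "safety",      "deny"),
]

def predict(cluster, location, purpose):
    """Majority vote over the most similar historical cases -- a stand-in
    for the decision-tree classifier used in [2]."""
    scored = [((c == cluster) + (loc == location) + (pur == purpose), label)
              for c, loc, pur, label in history]
    best = max(score for score, _ in scored)
    votes = Counter(label for score, label in scored if score == best)
    return votes.most_common(1)[0][0]

clusters = cluster_users(user_preferences)
print(clusters["alice"])                               # permissive
print(predict(clusters["alice"], "campus", "safety"))  # allow
print(predict(clusters["carol"], "home", "safety"))    # deny
```

The point of including cluster membership as a feature, as in the paper, is that two users in the same context may still want different outcomes; the cluster captures that individual disposition.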

Sanchez et al. [3] propose a model that uses contextual factors and user traits to dynamically and personally recommend privacy settings in the fitness domain. The proposed system incorporates user traits and feedback to refine and personalise privacy recommendations continuously. This dynamic adjustment ensures that privacy settings remain relevant to the user's current context, aligning with the guideline’s goal of enhancing user data protection through contextual privacy controls.

Bernsmed et al. [4] propose a Privacy Advisor system based on Case-Based Reasoning (CBR) to assist users in making informed privacy decisions. The system leverages past cases to provide personalised privacy recommendations, adapting solutions from similar previous instances to fit the current context. The Privacy Advisor continuously learns from user feedback to improve its recommendations over time, ensuring they align with individual user preferences and situational factors. The authors validated their model through a prototype implementation and a focus group study, demonstrating its effectiveness in helping users navigate complex privacy settings.
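A minimal sketch of the retrieve-reuse-retain cycle behind such an advisor follows, with hypothetical case attributes and a naive attribute-overlap similarity; the actual Privacy Advisor in [4] uses a more sophisticated case representation and revision step.

```python
# Minimal Case-Based Reasoning cycle for a privacy advisor in the spirit
# of [4]: retrieve the most similar past case, reuse its decision as a
# recommendation, and retain the user's actual decision as a new case.

def similarity(a, b):
    """Fraction of shared attributes with matching values."""
    keys = a.keys() & b.keys()
    return sum(a[k] == b[k] for k in keys) / len(keys)

class PrivacyAdvisor:
    def __init__(self):
        self.case_base = []  # list of (situation, decision) pairs

    def recommend(self, situation):
        """Retrieve + reuse: return the decision of the closest past case."""
        if not self.case_base:
            return None  # no experience yet; the user must decide unaided
        _closest, decision = max(
            self.case_base, key=lambda case: similarity(case[0], situation))
        return decision

    def retain(self, situation, user_decision):
        """Retain: store the user's actual decision for future reuse."""
        self.case_base.append((situation, user_decision))

advisor = PrivacyAdvisor()
advisor.retain({"service": "fitness", "data": "location", "retention": "1y"}, "deny")
advisor.retain({"service": "fitness", "data": "steps", "retention": "30d"}, "allow")

# A new policy closest to the second stored case yields "allow".
print(advisor.recommend({"service": "fitness", "data": "steps", "retention": "30d"}))
```

Because every user decision is retained, the recommendations drift toward the individual's own history, which is the learning-from-feedback property the paper emphasises.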

Bilogrevic et al. [5] propose the Smart Privacy-aware Information Sharing Mechanism (SPISM), which uses machine-learning techniques to make semi-automatic decisions about whether to share information and at what level of detail. SPISM considers a wide range of contextual features, such as the time of day, current location, and identity of the requester, to make informed sharing decisions. The decision-making process draws on personal and contextual features as well as past user behaviour. If SPISM can confidently make a decision based on the user's past choices, the request is processed automatically; otherwise, the user is notified and asked to decide, with the option to postpone, and the decision is stored for future reference. SPISM uses active learning to continuously adapt and reduce the amount of user input required over time.
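The semi-automatic loop can be sketched as follows. This is a minimal illustration, not the paper's actual classifier: the frequency-based confidence estimate, the `min_evidence` cut-off, and the context tuple are assumptions made for the example.

```python
# Sketch of SPISM-style semi-automatic sharing [5]: when past behaviour in a
# context gives high confidence, decide automatically; otherwise prompt the
# user and store the answer for future requests (the active-learning loop).

class SpismLikeAgent:
    def __init__(self, threshold=0.8, min_evidence=3):
        self.threshold = threshold        # confidence needed to auto-decide
        self.min_evidence = min_evidence  # answers needed before auto-deciding
        self.history = {}                 # context -> (times_shared, times_asked)

    def _confidence(self, context):
        shared, total = self.history.get(context, (0, 0))
        if total < self.min_evidence:
            return 0.0, None  # not enough evidence to decide alone
        p_share = shared / total
        return max(p_share, 1 - p_share), p_share >= 0.5

    def handle_request(self, context, ask_user):
        """Decide automatically when confident; otherwise prompt the user
        and remember the answer."""
        confidence, likely_share = self._confidence(context)
        if likely_share is not None and confidence >= self.threshold:
            return "share" if likely_share else "withhold"
        decision = ask_user(context)  # prompt: True means the user shares
        shared, total = self.history.get(context, (0, 0))
        self.history[context] = (shared + int(decision), total + 1)
        return "share" if decision else "withhold"

agent = SpismLikeAgent()
ctx = ("evening", "office", "colleague")
prompts = []

def ask(context):            # stands in for notifying the user
    prompts.append(context)
    return True              # the user always chooses to share here

for _ in range(5):
    agent.handle_request(ctx, ask)

# After three consistent answers the agent stops prompting in this context.
print(len(prompts))                    # 3
print(agent.handle_request(ctx, ask))  # share
```

The `min_evidence` guard illustrates why such systems ask the user early on: acting on one or two observations would risk exactly the over-sharing bias noted in the Cons below.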

Platforms:

Related guidelines: Leverage Personalised Recommendations for Enhanced User Management of Privacy Settings

Example

The SPISM mobile application interfaces allow users to register and log in, check other users’ current locations, view nearby devices and their availability, and access features such as past activity records and contacts lists [5].

Use cases

  • Implementing contextual privacy controls to enhance user data protection across various domains and applications.

Pros

  • Combining relationship-based attributes, communities, and profile attributes to represent both the "who" and the "what," effectively capturing the social context of information disclosure [1].
  • The simplicity of the privacy profiles is an advantage, as they are easy to understand, unlikely to overfit the data, and likely to generalise to other scenarios, providing a convenient shortcut to help users with their privacy decisions while still allowing for manual adjustments [3].
  • The Privacy Advisor provides intelligent privacy support by identifying and comparing privacy policies with similar practices, as demonstrated by a focus group study indicating commonalities in user perceptions of privacy practices [4].
  • The study demonstrates that the type of information requested and the requester's social ties are influential factors in users' sharing decisions, highlighting the importance of these features in the decision-making process [5].
  • The decision tree models built using contextual information and cluster membership achieved 77% accuracy in predicting users' binary privacy decisions for IoT scenarios. This indicates the effectiveness of the model in understanding and forecasting privacy preferences based on identified contextual factors [2].

Cons

  • The tool's usability, the alignment of its recommendations with users' own understandings, and the ease with which users can provide feedback on their privacy decisions all need further evaluation [4].
  • The study participants were predominantly students aged 18-25, introducing a sampling bias that may limit the generalisability of the results. Additionally, previous cluster analysis on a different demographic (Amazon MTurk workers) yielded slightly different outcomes, indicating the need to consider demographic information when building machine learning models for predicting privacy preferences [2].
  • The manual selection of a privacy profile might be difficult for users, which also requires further evaluation [3].
  • The system has a slight bias toward over-sharing information, which can be mitigated by penalising over-sharing errors more heavily [5].

Privacy Choices

Privacy choices give people control over certain aspects of data practices. Considering the design space for privacy choices [6], this guideline can be applied in the following dimensions:

  • Contextualised
    The guideline directly supports contextualised choices by adjusting privacy settings based on context-specific factors such as time, location, purpose, and social interactions.
  • Personalised
    The guideline considers individual user contexts and preferences and supports personalised timing for presenting privacy choices, empowering users to make informed decisions.

  • Context-aware
    The guideline inherently supports context-aware timing by delivering privacy choices at moments when specific contextual factors (such as entering a new location or engaging in a particular activity) are detected.
  • Just in time
    Contextual privacy controls can present privacy choices just as a specific data practice is about to occur, ensuring that users make privacy decisions in the most relevant context.

  • Combined
    Utilising multiple modalities ensures that users receive contextual privacy information in the most accessible and effective manner.
  • Machine-readable
    Contextual privacy controls can be encoded in a machine-readable format, enabling automated systems and privacy agents to manage privacy settings on behalf of users based on contextual data.
  • Auditory
    Auditory cues, such as voice alerts or spoken instructions, can be used in smart home or IoT environments to notify users of privacy choices relevant to their current context.
  • Visual
    This guideline can provide a visual presentation of privacy choices, including icons, text, and notifications, and can help users understand and manage their privacy settings in a contextual manner.

  • Presentation
    Privacy choices always have a presentation that involves a system providing clear and easily understandable information to users about potential data practices, available options, and how to communicate privacy decisions [6].

  • Secondary
    When primary channels are not available, secondary channels (such as mobile apps or websites) can provide contextually relevant privacy settings.
  • Primary
    Contextual privacy controls should be integrated directly into the primary interaction channel (e.g., app, website) to provide seamless and relevant privacy choices within the use context.

Control

The guideline focuses on dynamically adjusting privacy controls based on contextual factors such as location, time, social interactions, and specific activities. This aligns with the Control [7] attribute, as it emphasises empowering users to actively manage and influence their privacy settings, ensuring that their preferences are respected in varying contexts. Other related privacy attributes:

The guideline can support the Purpose attribute since, by providing context, users can be informed about the specific purposes for which their data is used.

By dynamically adjusting privacy controls based on contextual factors, the system must provide users with clear information on how these adjustments are made and why, ensuring that users understand and trust the process.


References

[1] Gaurav Misra and Jose M. Such (2017). PACMAN: Personal Agent for Access Control in Social Media. In IEEE Internet Computing, vol. 21, no. 6, pp. 18-26, November/December 2017. https://doi.org/10.1109/MIC.2017.4180831

[2] Hosub Lee and Alfred Kobsa (2017). Privacy preference modeling and prediction in a simulated campuswide IoT environment. In IEEE International Conference on Pervasive Computing and Communications (PerCom), Kona, HI, USA, 2017, pp. 276-285. https://doi.org/10.1109/PERCOM.2017.7917874

[3] Odnan Ref Sanchez, Ilaria Torre, Yangyang He, and Bart P. Knijnenburg (2020). A recommendation approach for user privacy preferences in the fitness domain. User Modeling and User-Adapted Interaction, 30, pp.513-565. https://doi.org/10.1007/s11257-019-09246-3

[4] Karin Bernsmed, Inger Anne Tøndel and Åsmund Ahlmann Nyre (2012). Design and Implementation of a CBR-based Privacy Agent. In Seventh International Conference on Availability, Reliability and Security, Prague, Czech Republic, 2012, pp. 317-326. https://doi.org/10.1109/ARES.2012.60

[5] Igor Bilogrevic, Kévin Huguenin, Berker Agir, Murtuza Jadliwala, Maria Gazaki and Jean-Pierre Hubaux (2016). A machine-learning based approach to privacy-aware information-sharing in mobile social networks. Pervasive and Mobile Computing, 25, 125-142. https://doi.org/10.1016/j.pmcj.2015.01.006

[6] Yuanyuan Feng, Yaxing Yao, and Norman Sadeh (2021). A Design Space for Privacy Choices: Towards Meaningful Privacy Control in the Internet of Things. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3411764.3445148

[7] Susanne Barth, Dan Ionita, and Pieter Hartel (2022). Understanding Online Privacy — A Systematic Review of Privacy Visualizations and Privacy by Design Guidelines. ACM Comput. Surv. 55, 3, Article 63 (February 2022), 37 pages. https://doi.org/10.1145/3502288