The design spaces presented here are conceptual spaces. A design space can be understood as the range of design choices and configurations that are possible for a particular problem, allowing practitioners to explore and analyse a broad range of design options [1][2]. With an understanding of the design space, it is possible to evaluate trade-offs, optimise for specific criteria and arrive at an optimal or satisfactory solution [1][2].
Privacy Notices

The purpose of a privacy notice is to inform users or customers about how personal data is handled within a system or company [3].

The Timing dimension refers to when privacy notices are shown to users, taking into account factors such as the task the user is engaged in and the delay between the notice and the related privacy decision [3].

  • Context-dependent
    Refers to showing a context-dependent privacy notice, like, for example, in a change of location or other situational parameter [3].
  • On demand
    Refers to showing privacy notices when users request them, for example by providing links to the privacy notice or contact information [3].
  • Persistent
    Refers to privacy notices that are persistently shown when information is being collected or transmitted, like a non-intrusive status bar showing that the location is being shared [3].
  • Periodic
    Refers to showing privacy notices periodically according to some predetermined frequency, for example reminding users of the data collection performed by a health device [3].
  • At Setup
    Refers to showing privacy notices at first use of a system or service [3].
  • Just in time
    Refers to showing a privacy notice at the moment of a data practice, like when the information is being collected, used or shared [3].
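
As an illustration only (not taken from [3]), the following TypeScript sketch models the timing options above as a simple type and shows how a system might pick a delivery moment for a notice; all names are hypothetical.

```typescript
// Hypothetical sketch of the timing dimension from [3]; names are illustrative only.
type NoticeTiming =
  | "at-setup"
  | "just-in-time"
  | "context-dependent"
  | "periodic"
  | "persistent"
  | "on-demand";

interface DataPracticeEvent {
  practice: string;        // e.g. "location-sharing"
  firstUse: boolean;       // true if the system is being used for the first time
  contextChanged: boolean; // e.g. the user's location or situation changed
}

// Decide when to surface a notice for a given event; a real system would
// typically combine several timings rather than pick exactly one.
function chooseTiming(event: DataPracticeEvent): NoticeTiming {
  if (event.firstUse) return "at-setup";
  if (event.contextChanged) return "context-dependent";
  return "just-in-time"; // default: notify at the moment of the data practice
}

console.log(chooseTiming({ practice: "location-sharing", firstUse: false, contextChanged: true }));
// -> "context-dependent"
```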

The Control dimension refers to privacy notices that include control options (privacy choices) [3]. The authors of the design space for privacy choices [4] consider that this dimension has its merits but does not cover all the considerations needed for meaningful privacy choices.

  • Decoupled
    Refers to privacy notices that are decoupled from the actual system or device, being provided, for example, on a website or in a manual [3].
  • Non-blocking
    Refers to privacy notices that are non-blocking, providing controls that do not force user interaction, for example by relying on privacy decisions made in previous interactions [3].
  • Blocking
    Refers to privacy notices that are blocking, requiring the user to make a choice before proceeding, as in consent situations [3].
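
To make the contrast between blocking and non-blocking notices concrete, the sketch below, a hypothetical illustration rather than an implementation from [3], shows a flow that waits for a consent decision versus one that falls back to a stored decision.

```typescript
// Illustrative contrast between blocking and non-blocking notices; names are hypothetical.
type Decision = "allow" | "deny";

// Blocking: the flow cannot continue until the user decides (e.g. a consent dialog).
async function blockingNotice(ask: () => Promise<Decision>): Promise<Decision> {
  return await ask(); // execution waits for the user's choice
}

// Non-blocking: show the notice, fall back to a previously stored decision,
// and let the user change it later without interrupting the task.
function nonBlockingNotice(previousDecision: Decision | undefined): Decision {
  return previousDecision ?? "deny"; // conservative default when nothing is stored
}

// Usage sketch: simulate a user who immediately answers "allow".
blockingNotice(() => Promise.resolve("allow")).then(d => console.log("blocking:", d));
console.log("non-blocking:", nonBlockingNotice(undefined)); // -> "deny"
```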

The Modality dimension refers to the different modes of delivering privacy notices, chosen depending on what the privacy notice strives to achieve [3].

  • Visual
    Refers to visual privacy notices that can be text, images, icons or a combination of them [3].
  • Haptic and other
    Refers to privacy notices that are delivered through haptic or other sensory channels, for example a smartphone vibration that notifies users [3].
  • Auditory
    Refers to privacy notices that are delivered via spoken words or sounds [3].
  • Machine-readable
    Refers to privacy notices that are machine-readable and can be communicated to other devices or systems where the information can be rendered as a privacy notice [3].
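
Since machine-readable notices are essentially a data format, the following sketch shows one possible shape such a notice could take and how a receiving device might render it visually; the field names are assumptions, not a standard referenced in [3].

```typescript
// Hypothetical machine-readable notice; the fields are illustrative and do not
// follow any particular standard.
interface MachineReadableNotice {
  controller: string;    // who processes the data
  practice: string;      // what is collected
  purpose: string;       // why it is collected
  retentionDays: number; // how long it is kept
}

// A receiving device can render the same information in whatever modality it supports,
// here as a short visual/textual summary.
function renderAsText(notice: MachineReadableNotice): string {
  return `${notice.controller} collects ${notice.practice} for ${notice.purpose}, ` +
         `retained for ${notice.retentionDays} days.`;
}

const notice: MachineReadableNotice = {
  controller: "ExampleCam",
  practice: "video footage",
  purpose: "home security",
  retentionDays: 30,
};
console.log(renderAsText(notice));
```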

The Channel dimension refers to the channels that can be utilised to deliver privacy notices [3].

  • Secondary
    Refers to delivering the privacy notice through a channel other than the primary one, typically because of some limitation such as constrained interaction capabilities on the primary device [3].
  • Primary
    Refers to delivering the privacy notice in the same platform or device that the user is interacting with [3].
  • Public
    Refers to public channels used to deliver privacy notices, especially to secondary or incidental users, such as public signs about video surveillance [3].
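
As a rough illustration of channel selection (hypothetical, not prescribed by [3]), the sketch below falls back to a secondary or public channel when the primary device cannot deliver the notice itself or when non-users are affected.

```typescript
// Hypothetical channel-selection sketch; a real system may use several channels at once.
type NoticeChannel = "primary" | "secondary" | "public";

interface DeviceCapabilities {
  hasScreen: boolean;       // can the primary device render a notice at all?
  reachesNonUsers: boolean; // does the data practice affect passers-by / incidental users?
}

function chooseChannel(device: DeviceCapabilities): NoticeChannel {
  if (device.reachesNonUsers) return "public";   // e.g. a sign about video surveillance
  if (!device.hasScreen) return "secondary";     // constrained interaction capabilities
  return "primary";
}

console.log(chooseChannel({ hasScreen: false, reachesNonUsers: false })); // -> "secondary"
```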
Privacy Choices

The authors of [4] constructed a design space for effective privacy choices, arguing that there was not enough guidance on how to design them.

The Type dimension refers to the types of privacy choices available to users.

  • Privacy rights-based choices
    This type of privacy choice supports privacy rights such as access, rectification, erasure and portability. These choices usually go beyond what binary or multiple choices can express and may require additional interactions between the user and the system [4].
  • Contextualised
    Contextualised privacy choices are rooted in the contextual integrity framework and involve context-specific decisions that often combine binary and multiple options. They address data privacy in complex systems by weighing contextual attributes such as time, location and purpose against data collection practices. Such choices aim to align with diverse privacy preferences, but they face challenges in system support, data collection trade-offs and potential user burden, which may be mitigated by software privacy agents. They are presented in context-specific privacy situations, for example asking the user to allow or deny location sharing when a potential privacy violation is about to happen [4].
  • Multiple choices
    Refers to providing users with more than two options to choose from, for example the combination of multiple binary choices [4].
  • Binary choices
    Binary choice means users have to choose one out of two options [4].
    • Opt-in/out
      Opt-in/opt-out choices do not necessarily affect system access but may impact data control and functionality. The default value is crucial when designing such choices: if users are opted in by default, certain data practices are assumed to be allowed unless the user specifies otherwise [4].
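
The type dimension can be pictured as a small data model. The sketch below is an illustrative assumption rather than anything defined in [4]; it encodes the four types as a discriminated union so that a system can handle them uniformly.

```typescript
// Hypothetical model of the "type" dimension; field names are illustrative only.
type PrivacyChoice =
  | { kind: "binary"; practice: string; defaultEnabled: boolean }
  | { kind: "multiple"; practice: string; options: string[] }
  | { kind: "contextualised"; practice: string; context: { time?: string; location?: string; purpose?: string } }
  | { kind: "rights-based"; right: "access" | "rectification" | "erasure" | "portability" };

// Example: a binary location-sharing choice whose default matters (see above).
const locationChoice: PrivacyChoice = {
  kind: "binary",
  practice: "location-sharing",
  defaultEnabled: false, // an opt-in choice: the practice stays off unless the user enables it
};

console.log(locationChoice.kind); // -> "binary"
```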

The Timing dimension considers the appropriate time to deliver a privacy choice, since timing can affect how effectively users engage in privacy decision-making. The delivery of a privacy choice can also be integrated with, decoupled from or mediated by the corresponding privacy notice (see the Timing dimension of privacy notices), since the availability of a privacy notice can affect, for example, the user's perception of risk [4].

  • On-demand
    On-demand privacy choices can be integrated with or decoupled from privacy notices and are usually presented when users actively seek out available privacy choices [4].
  • At Setup
    Privacy choices are commonly presented to users who interact with the system for the first time, often during initial setup. This allows users to review privacy notices and make privacy choices before actively using the system, albeit with the potential drawback that users may simply accept whichever choices enable them to use the system.
    This timing approach is commonly associated with consent-based privacy choices, such as software license agreements during installation or terms of use during account signup [4].
  • Just in time
    Just in time privacy choices are also usually integrated with privacy notices and presented when a data practice is about to happen, like in the case of the "ask on first use" model of app permissions [4].
  • Personalised
    Personalised privacy choices can be integrated with, decoupled from or mediated by privacy notices and involve tailoring the timing mechanisms to individual preferences [4].
  • Periodic
    Periodic privacy choices can be integrated with or decoupled from privacy notices and are shown to users multiple times, for example when a data practice changes or when there is a legal requirement to present privacy choices periodically [4].
  • Context-aware
    Context-aware privacy choices are also often integrated with privacy notices, and they comprise leveraging the context the user is in (temporal, spatial, social) to present the privacy choice [4].
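
As a hypothetical illustration of context-aware timing (the attributes and the rule are assumptions, not taken from [4]), the sketch below decides whether to surface a choice based on the user's current context.

```typescript
// Hypothetical context-aware trigger: present a privacy choice when the current
// context (place, audience) suggests a decision is needed.
interface UserContext {
  location: "home" | "work" | "public";
  sharingAudience: "none" | "contacts" | "everyone";
}

// Return true when the context makes the pending data practice privacy-sensitive
// enough to surface a context-aware choice.
function shouldPromptChoice(ctx: UserContext): boolean {
  return ctx.location === "public" && ctx.sharingAudience === "everyone";
}

console.log(shouldPromptChoice({ location: "public", sharingAudience: "everyone" })); // -> true
console.log(shouldPromptChoice({ location: "home", sharingAudience: "contacts" }));   // -> false
```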

The Modality dimension is similar to the Channel dimension and refers to the different modes for the two-way communication of privacy choices between systems and users, considering factors such as the nature of privacy decisions, system capabilities and user engagement [4].

  • Visual
    Privacy choices are typically presented visually, using text, images, icons, or a combination of these elements. Visual privacy choices often include clear descriptions of available options, allowing systems to obtain affirmative privacy decisions from users when necessary [4].
  • Auditory
    In this category, privacy choices are delivered primarily via spoken words and sounds [4].
  • Machine-readable
    In this modality, a system's data practices and available privacy choices are available in a machine-readable format. This allows communication of privacy options to other systems automatically as well as serves as the foundation for software agents [4].
  • Combined
    In this category, modalities are combined, such as machine-readable privacy options processed by privacy software agents together with haptic or visual notifications alerting users when they have to make a decision about some privacy choice [4].
  • Haptic and other sensory
    When available, haptic and other sensory modalities are best combined with other modalities, since they are difficult to use for conveying large amounts of information [4].
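
The combined modality can be sketched as a simple privacy agent that consumes machine-readable choices and alerts the user only when a decision is genuinely needed; everything below is illustrative and not an API from [4].

```typescript
// Hypothetical "combined" modality: a software agent reads machine-readable choices and
// alerts the user visually (and, where hardware allows, haptically) when a decision is needed.
interface MachineReadableChoice {
  practice: string;
  requiresUserDecision: boolean; // false if the agent can decide from stored preferences
}

function notifyUser(choice: MachineReadableChoice, canVibrate: boolean): string[] {
  if (!choice.requiresUserDecision) return []; // the agent handles it silently
  const alerts = [`visual: decision needed for ${choice.practice}`];
  if (canVibrate) alerts.push("haptic: short vibration");
  return alerts;
}

console.log(notifyUser({ practice: "microphone access", requiresUserDecision: true }, true));
// -> ["visual: decision needed for microphone access", "haptic: short vibration"]
```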

The Functionality dimension captures the functionalities needed to support the exercise of privacy choices.

  • Enforcement
    Enforcing users' privacy decisions requires a system equipped with functionalities to implement the different privacy choices that users submit. This involves authentication, automated or human-mediated enforcement actions, and the capability to record and enforce changes to users' privacy decisions, with particular technical challenges for complicated choices such as contextualised ones [4].
  • Feedback
    Timely and accurate feedback is vital for the system to support ongoing privacy choices. When users make privacy decisions that can be immediately implemented, the system should promptly notify them of the updated privacy settings [4].
  • Presentation
    Privacy choices always have a presentation component: the system must provide clear and easily understandable information to users about potential data practices, the available options, and how to communicate their privacy decisions. The presentation often incorporates multiple components and integrates with related privacy notices, requiring careful consideration of design dimensions such as timing, channel and modality [4].
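
To tie presentation, enforcement and feedback together, the following hypothetical sketch records a user's decision and immediately confirms the resulting setting; the storage and message format are assumptions, not part of [4].

```typescript
// Hypothetical end-to-end flow covering the three functionalities above:
// enforce the user's decision and give immediate feedback on the updated setting.
interface ChoiceRecord {
  practice: string;
  decision: "allow" | "deny";
  decidedAt: Date;
}

const decisions: ChoiceRecord[] = []; // stand-in for persistent storage

function enforceDecision(practice: string, decision: "allow" | "deny"): string {
  // Enforcement: record the decision so later data flows can be checked against it.
  decisions.push({ practice, decision, decidedAt: new Date() });
  // Feedback: confirm to the user what setting is now in effect.
  return `Your choice was applied: ${practice} is now ${decision === "allow" ? "enabled" : "disabled"}.`;
}

console.log(enforceDecision("personalised ads", "deny"));
// -> "Your choice was applied: personalised ads is now disabled."
```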

The Channel dimension refers to the various channels that can be used to present privacy options to users and to communicate their privacy decisions back to systems; for privacy choices, communication is two-way between systems and users [4].

  • Public
    Public channels can be used to inform users about available privacy choices by pointing to other channels, to communicate with non-users such as passers-by or incidental users (for example, people broadcasting their preference not to be recorded or tracked), and so on [4].
  • Secondary
    Secondary channels are used when there is some limitation with the primary channel, for example when the primary channel is not available or has a limited user interface [4].
  • Primary
    A primary channel refers to the platform or device through which users interact with the system [4].
References

[1] Biskjaer, Michael Mose; Dalsgaard, Peter; Halskov, Kim. A Constraint-Based Understanding of Design Spaces. In: Proceedings of the 2014 Conference on Designing Interactive Systems (DIS '14). New York, NY, USA: Association for Computing Machinery, 2014. p. 453–462. DOI: https://doi.org/10.1145/2598510.2598533

[2] Mustaquim, Moyen Mohammad; Nyström, Tobias. Information System Design Space for Sustainability. In: Donnellan, B.; Helfert, M.; Kenneally, J.; VanderMeer, D.; Rothenberger, M.; Winter, R. (eds.) New Horizons in Design Science: Broadening the Research Agenda (DESRIST 2015). Lecture Notes in Computer Science, vol. 9073. Cham: Springer, 2015. DOI: https://doi.org/10.1007/978-3-319-18714-3_3

[3] Schaub, Florian; Balebako, Rebecca; Durity, Adam L.; Cranor, Lorrie Faith. A Design Space for Effective Privacy Notices. In: Eleventh Symposium on Usable Privacy and Security (SOUPS 2015). USENIX Association, 2015. p. 1–17. Available at: https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf

[4] Feng, Yuanyuan; Yao, Yaxing; Sadeh, Norman. A Design Space for Privacy Choices: Towards Meaningful Privacy Control in the Internet of Things. In: CHI Conference on Human Factors in Computing Systems (CHI '21), May 8–13, 2021, Yokohama, Japan. New York, NY, USA: ACM, 2021. 16 pages. DOI: https://doi.org/10.1145/3411764.3445148