Communicate Privacy Risk with Colour-Coded Privacy Indicators
Problem Summary
For users to make privacy-informed decisions, privacy information must be presented in an easily understandable way: communicated visually, with direct feedback, and building on concepts users already know, so that users can align their behaviour with their concerns.
Rationale
Visual privacy risk indicators that are informative, simple, and easy to understand at decision points (information disclosure, app permission granting, and so on) can help users make privacy-informed decisions.
Solution
Provide a colour-coded privacy indicator that summarises the privacy risk level. The colour code is typically green (low risk), yellow (medium risk) and red (high risk) [1][2][4][5][6], or a gradation from dark green through light green, yellow and orange to red [3].
The intent is to leverage a metaphor that is already familiar to users, such as traffic lights.
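As an illustration, the level-to-colour mapping can live in a single lookup so that every part of the product renders risk consistently. The level names and hex values below are assumptions made for this sketch, not values prescribed by the cited works:

```typescript
// Hypothetical colour scales for a privacy risk indicator.

// Three-level traffic-light scale [1][2][4][5][6] (assumed hex values):
const trafficLight = {
  low: "#2e7d32",    // green
  medium: "#f9a825", // yellow
  high: "#c62828",   // red
} as const;

// Five-level gradation from dark green to red [3] (assumed hex values):
const gradation = {
  veryLow: "#1b5e20",  // dark green
  low: "#66bb6a",      // light green
  medium: "#fdd835",   // yellow
  high: "#fb8c00",     // orange
  veryHigh: "#c62828", // red
} as const;

type RiskLevel = keyof typeof gradation;
```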
A colour-coded privacy indicator presupposes an underlying calculation or assessment of privacy risk. There is no universal solution: the works referenced by this guideline [1-6] took different approaches, including taking context into account [2][3][4][6] and building mockups with hardcoded coloured indicators to assess users' perceptions of privacy risk [3].
Paturi, Kelley and Mazumdar [1] introduced three privacy categories they called 'privacy granules': user location, identity and search queries. To represent threats to these privacy granules, they used the colours red (danger), green (safety) and yellow (unsure).
Tucker, Tucker and Zheng [2] created a list of privacy threat categories, linked Facebook's requested permissions to these categories, and assigned each a risk value. Bal [4] used researchers' expertise to assess various privacy-related factors, evaluated each app's potential to extract specific private information, and aggregated these assessments into an overall privacy rating per app. Kumar et al. [5] designed a dynamic probabilistic model based on Contextual Integrity, in which privacy risk is calculated as a combination of instantaneous, longitudinal and cross-platform privacy risk.
Jackson and Wang [6] calculated privacy risk from permission requests combined with user privacy concerns collected with the Mobile Users' Information Privacy Concerns (MUIPC) scale. In short, the indicator needs a mechanism to serve as the basis for the colour coding: a knowledge base, expert ratings, mathematical models, crowdsourcing and so on.
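None of the cited models is reproduced here, but a minimal sketch under strong simplifying assumptions shows the general shape shared by the expert-rating approaches [2][4]: give each requested permission a risk weight, aggregate, and bucket the total into a level. The permission names, weights and thresholds are illustrative, not taken from the references:

```typescript
// Illustrative, expert-assigned risk weights per permission (assumed values).
const permissionRisk: Record<string, number> = {
  location: 0.9,
  contacts: 0.8,
  camera: 0.6,
  notifications: 0.2,
};

type Level = "low" | "medium" | "high";

// Average the weights of the requested permissions and bucket the result
// into a traffic-light level. Real systems would also factor in context
// [2][3][4][6] or stated user concerns such as MUIPC scores [6].
function riskLevel(requested: string[]): Level {
  const weights = requested.map((p) => permissionRisk[p] ?? 0.5);
  const avg = weights.reduce((a, b) => a + b, 0) / Math.max(weights.length, 1);
  if (avg < 0.4) return "low";
  if (avg < 0.7) return "medium";
  return "high";
}

console.log(riskLevel(["location", "contacts"])); // "high"
```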
Since users' understanding of the meaning conveyed by icons is paramount, and there is no universally accepted set of privacy icons, two works selected their icons through a user validation study [3][6].
Bock and Momen [3] found that participants favoured the label bar with written descriptions, stating that it "can also be considered as achromatopsia-friendly". The assessed label bars are similar to Bootstrap's pill badge component. The authors opted for an odd-numbered five-point scale, from 'very low' to 'very high' risk, instead of the three-point 'low, medium, high'; the number of gradations will depend on the use case. Example using Bootstrap:
[Badge examples: a three-level scale (low / medium / high) and a five-level scale (very low / low / medium / high / very high).]
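A sketch of how such label bars could be rendered: the pill shape uses Bootstrap's badge classes, while the colours are an assumed dark-green-to-red gradation [3]. Keeping the written description inside the badge is what makes the bar achromatopsia-friendly:

```typescript
// Render written, pill-shaped risk labels in the spirit of Bootstrap's
// badge component. Labels follow the five-point scale; colours are assumed.
const levels: Array<[label: string, colour: string]> = [
  ["very low", "#1b5e20"],
  ["low", "#66bb6a"],
  ["medium", "#fdd835"],
  ["high", "#fb8c00"],
  ["very high", "#c62828"],
];

function riskBadge(label: string, colour: string): string {
  // "badge" and "rounded-pill" are standard Bootstrap 5 classes.
  return `<span class="badge rounded-pill" style="background:${colour}">${label} risk</span>`;
}

document.body.innerHTML = levels
  .map(([label, colour]) => riskBadge(label, colour))
  .join(" ");
```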
Platforms: personal computers, mobile devices
Related guidelines: Promote User Awareness and Decision-Making on Permission/Authorisation Requests, Enhance Privacy Awareness by Communicating Privacy Risks
Example
The privacy discrepancy interface (left) highlights the discrepancy between user privacy concerns and the riskiness of each app's permission request [6]. The Privacy Pal interface (top right) helps users visualise the privacy and security risks of granting permissions to third-party apps [2]. Summary interfaces for privacy granules (bottom right) [1].
Use cases
- Nudging user behaviour in contexts where users need guidance or reminders to consider their privacy preferences.
- App discovery/download/installation/update/details interfaces.
- Any permission or authorisation request interfaces.
Pros
- Leverages colour-coded schemes familiar to users and can promote better alignment of user privacy behaviours with their concerns. It also provides direct feedback, so users can modify permissions to match their privacy concern levels [6].
Privacy Notices
The main purpose of a privacy notice is to inform users about personal data handling. Considering the design space for privacy notices [7], this guideline can be applied to the following dimensions; a short sketch after the list illustrates a few of the timing dimensions:
- On demand
The proposed guideline can be used to present a privacy notice to users when they actively seek privacy information, for example, in privacy dashboards or privacy settings interfaces.
- Context-dependent
This guideline can be used to present a privacy notice to users triggered by a change in context, like a location or data sharing change.
- Just in time
The proposed guideline could be used to present a privacy notice to users when a data practice becomes active (data being collected, used or shared, for instance). According to the authors of the design space for privacy notices, just-in-time notices are shown shortly before data collection, often near input fields or as summary dialogues, which limits user interruption.
- At setup
The proposed guideline can be used to present a privacy notice to users when they are using the system for the first time or about to engage in a data-sharing situation, so they can be aware of the risk level.
- Decoupled
This guideline can be applied to privacy notices decoupled from privacy choices.
- Non-blocking
This guideline can be coupled with non-blocking controls (privacy choices [7]), providing control options without forcing user interaction.
- Blocking
This guideline can be paired with blocking controls (privacy choices), requiring users to make decisions or give consent based on the information in the notice.
- Visual
This guideline is for a visual notice, using visual resources such as colours, text and icons.
- Secondary
This guideline can be applied to secondary channels if the primary channel does not have a user interface or has only a limited one.
- Primary
This guideline can be applied to the same platform or device the user is interacting with.
- Public
This guideline can be applied to public notices.
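To make the timing dimensions concrete, here is a small, hypothetical wiring of one indicator to three of the timings above; the function and type names are invented for this sketch and are not from [7]:

```typescript
type Timing = "at-setup" | "just-in-time" | "context-dependent" | "on-demand";
type Level = "low" | "medium" | "high";

// Hypothetical rendering hook: a real app would place the colour-coded
// badge next to the relevant UI element (permission dialog, input field,
// settings page) instead of logging.
function showIndicator(timing: Timing, level: Level): void {
  console.log(`[${timing}] privacy risk: ${level}`);
}

// At setup: first run, before the user engages in data sharing.
showIndicator("at-setup", "medium");
// Just in time: right before a data practice becomes active.
showIndicator("just-in-time", "high");
// Context-dependent: triggered by a change such as a new location.
showIndicator("context-dependent", "medium");
// On demand: the user opens a privacy dashboard or settings screen.
showIndicator("on-demand", "low");
```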
Transparency
Transparency [8] is the main privacy attribute, since this mechanism involves the proactive distribution of information to users, communicating the privacy risk level in a visually accessible way and helping users make privacy-informed decisions. Other related privacy attributes:
- Control: communicating the privacy risk level supports control by allowing users to make self-determined decisions about the sharing of their personal data.
References
[1] Anand Paturi, Patrick Gage Kelley, and Subhasish Mazumdar. Introducing privacy threats from ad libraries to Android users through privacy granules. In: Proceedings of the NDSS Workshop on Usable Security (USEC '15). Internet Society, 2015. http://dx.doi.org/10.14722/usec.2015.23008
[2] Rachel Tucker, Carl Tucker, and Jun Zheng. Privacy Pal: Improving permission safety awareness of third party applications in online social networks. In: 2015 IEEE 17th International Conference on High Performance Computing and Communications, 7th International Symposium on Cyberspace Safety and Security, and 12th International Conference on Embedded Software and Systems. IEEE, 2015, pp. 1268–1273. https://doi.org/10.1109/HPCC-CSS-ICESS.2015.83
[3] Sven Bock and Nurul Momen. Nudging the user with privacy indicator: a study on the app selection behavior of the user. In: Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society, 2020, pp. 1–12. https://doi.org/10.1145/3419249.3420111
[4] Gökhan Bal. Designing privacy indicators for smartphone app markets: A new perspective on the nature of privacy risks of apps. In: Proceedings of the 20th Americas Conference on Information Systems (AMCIS 2014), 2014. https://aisel.aisnet.org/amcis2014/MobileComputing/GeneralPresentations/6
[5] Abhishek Kumar, Tristan Braud, Young D. Kwon, and Pan Hui. Aquilis: Using contextual integrity for privacy protection on mobile devices. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(4), pp. 1–28, 2020. https://doi.org/10.1145/3432205
[6] Corey Brian Jackson and Yang Wang. Addressing the privacy paradox through personalized privacy notifications. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(2), pp. 1–25, 2018. https://doi.org/10.1145/3214271
[7] Florian Schaub, Rebecca Balebako, Adam L. Durity, and Lorrie Faith Cranor. A design space for effective privacy notices. In: Proceedings of the Eleventh Symposium on Usable Privacy and Security (SOUPS 2015), 2015, pp. 1–17. https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf
[8] Susanne Barth, Dan Ionita, and Pieter Hartel. Understanding online privacy: A systematic review of privacy visualizations and privacy by design guidelines. ACM Computing Surveys, 55(3), Article 63, 37 pages, 2022. https://doi.org/10.1145/3502288