GD19 - Communicate Privacy Risk with Colour-Coded Privacy Indicators
Problem Summary
For users to make privacy-informed decisions, information must be provided in an easily understandable way: privacy properties should be communicated visually, feedback should be direct, and familiar concepts should be leveraged so that users can align their behaviour with their concerns.
Rationale
The rationale is that providing visual privacy risk level indicators that are informative, simple, and easy to understand in decision-making situations (information disclosure, app permission granting, and so on) can help users make privacy-informed decisions.
Solution
A colour-coded privacy indicator summarises the privacy risk level. The colour code is typically green (low risk), yellow (medium risk) and red (high risk) [1][2][4][5][6], or a gradation from dark green through light green, yellow and orange to red [3].
The intent is to leverage a metaphor that is familiar to users, such as traffic lights.
The colour-coded privacy indicator leveraged by this guideline assumes an underlying calculation or assessment of privacy risk. However, there is no universal solution: the works referenced by this guideline [1-6] took different approaches, including considering context [2][3][4][6] or building mockups with hardcoded coloured indicators to assess users' perceptions of privacy risk [3].
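As a minimal sketch, assuming the privacy risk has already been assessed and normalised to a 0-1 score, the mapping from score to indicator can be a simple thresholded traffic-light scale. The thresholds, names and colours below are illustrative assumptions, not values taken from the referenced works.

```typescript
// Minimal sketch: map an already-computed privacy risk score to a
// traffic-light indicator. The 0-1 scale and the thresholds are
// illustrative assumptions, not values from the referenced works.
type RiskLevel = "low" | "medium" | "high";

interface PrivacyIndicator {
  level: RiskLevel;
  colour: string; // colour shown to the user
}

function toIndicator(riskScore: number): PrivacyIndicator {
  // Clamp to the assumed 0-1 range to tolerate out-of-range inputs.
  const score = Math.min(1, Math.max(0, riskScore));
  if (score < 0.33) return { level: "low", colour: "green" };
  if (score < 0.66) return { level: "medium", colour: "yellow" };
  return { level: "high", colour: "red" };
}

// Example: an app whose assessed risk score is 0.7 is shown in red.
console.log(toIndicator(0.7)); // { level: "high", colour: "red" }
```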
Paturi, Kelley and Mazumdar [1] introduced three privacy categories they called 'privacy granules': the user's location, identity and search queries. To represent threats to these privacy granules, they used the colours red (danger), green (safety) and yellow (unsure).
Tucker, Tucker and Zheng [2] created a list of privacy threats, linked Facebook's requested permissions to these categories, and assigned each a risk value. Bal [4] used researchers' expertise to assess various privacy-related factors, evaluated each app's potential to extract specific private information, and aggregated these assessments into an overall privacy rating per app. Kumar et al. [5] designed a dynamic probabilistic model based on Contextual Integrity, in which privacy risk is calculated as a combination of instantaneous, longitudinal and cross-platform privacy risk.
Jackson and Wang [6] calculated privacy risk by considering permission requests together with user privacy concerns collected with the Mobile Users' Information Privacy Concerns (MUIPC) instrument. In short, the indicator needs a mechanism to serve as the basis for the colour coding: a knowledge base, expert ratings, mathematical models, crowdsourcing and so on.
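For illustration only, the sketch below combines per-permission risk ratings (as might come from an expert-rated knowledge base) with the user's stated concern levels into a single 0-1 score that can feed the colour mapping above. The permission names, ratings, weighting scheme and MUIPC-style normalisation are hypothetical, not the actual formulas used in [2], [5] or [6].

```typescript
// Illustrative scoring mechanism: per-permission risk ratings (e.g. from an
// expert-rated knowledge base) weighted by the user's stated concern for the
// affected information type. All values and the averaging scheme are
// hypothetical assumptions for this sketch.
const permissionRisk: Record<string, number> = {
  location: 0.9,
  contacts: 0.7,
  camera: 0.6,
  network: 0.2,
};

// User concern per permission, e.g. elicited with a MUIPC-style
// questionnaire and normalised to 0-1 (assumed scale).
const userConcern: Record<string, number> = {
  location: 1.0,
  contacts: 0.5,
  camera: 0.8,
  network: 0.1,
};

function appRiskScore(requestedPermissions: string[]): number {
  if (requestedPermissions.length === 0) return 0;
  const total = requestedPermissions
    .map((p) => (permissionRisk[p] ?? 0.5) * (userConcern[p] ?? 0.5))
    .reduce((sum, r) => sum + r, 0);
  // Average so the score stays in 0-1 and can feed the colour mapping above.
  return total / requestedPermissions.length;
}

// Example: aggregate an app's requested permissions into one score.
console.log(appRiskScore(["location", "contacts", "network"]));
```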
Since users' understanding of the meaning conveyed by icons is paramount, and there is no universally accepted set of privacy icons, two works selected the icons they used through a user validation study [3][6].
Bock and Momen [3] found a preference for the label bar with written descriptions. They state that "It can also be considered as achromatopsia-friendly". The assessed label bars are similar to Bootstrap's pill badge component. The authors opted for an odd-numbered (five-point) scale instead of three, providing a gradation from 'very low' to 'very high' risk instead of 'low', 'medium' and 'high'. The number of gradations will depend on the use case. Example using Bootstrap (a markup sketch follows the scales below):
Three-level scale: low / medium / high
Five-level scale: very low / low / medium / high / very high
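A minimal sketch of how such a label bar could be rendered, assuming Bootstrap 5's pill badge and contextual background classes. The mapping of the five levels to Bootstrap classes is an assumption; the intermediate shades described in [3] (dark green, light green, orange) would need custom CSS classes in a real implementation.

```typescript
// Sketch: render a five-level label bar as Bootstrap pill badges.
// The class mapping is an assumption; intermediate shades would need custom CSS.
type FiveLevel = "very low" | "low" | "medium" | "high" | "very high";

const badgeClass: Record<FiveLevel, string> = {
  "very low": "bg-success", // dark green in [3]; Bootstrap green here
  "low": "bg-success",      // light green in [3]
  "medium": "bg-warning",   // yellow
  "high": "bg-warning",     // orange in [3]; closest contextual class
  "very high": "bg-danger", // red
};

function renderBadge(level: FiveLevel): string {
  return `<span class="badge rounded-pill ${badgeClass[level]}">${level}</span>`;
}

// Example: render the full five-level label bar as HTML markup.
const labelBar = (["very low", "low", "medium", "high", "very high"] as FiveLevel[])
  .map(renderBadge)
  .join(" ");
console.log(labelBar);
```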
Platforms: personal computers, mobile devices
Related guidelines: Enhance Privacy Awareness by Communicating Privacy Risks, Promote User Awareness and Decision-Making on Permission/Authorisation Requests
Example
The privacy discrepancy interface (left) highlights the discrepancy between user privacy concerns and the riskiness of each app's permission request [6]. The Privacy Pal interface (top right) helps users visualise privacy and security risks associated with granting permissions to third-party apps [2]. Summary interfaces for privacy granules (bottom right) [1].
Use cases
- Nudging user behaviour in contexts where users need guidance or reminders to consider their privacy preferences.
- App discovery/download/installation/update/details interfaces.
- Any permission or authorisation request interfaces.
Pros
- Leverages colour-coded schemes familiar to users and can promote better alignment of user privacy behaviours with their concerns. It also provides direct feedback, so users can modify permissions to match their privacy concern levels [6].
Privacy Attribute(s)
Transparency [7] is the main privacy attribute since this mechanism involves the proactive distribution of information to users, promoting visually accessible communication of privacy risk level, and helping users to make privacy-informed decisions.
Other related privacy attributes:
Control
Communication of the privacy risk level supports control by allowing users to make self-determined decisions about the sharing of their personal data.
References
[1] Anand Paturi, Patrick Gage Kelley, and Subhasish Mazumdar. Introducing privacy threats from ad libraries to android users through privacy granules. Proceedings of NDSS Workshop on Usable Security (USEC’15). Internet Society. Vol. 1. No. 2. 2015. http://dx.doi.org/10.14722/usec.2015.23008
[2] Rachel Tucker, Carl Tucker and Jun Zheng. Privacy Pal: Improving Permission Safety Awareness of Third Party Applications in Online Social Networks. In: IEEE. 2015 IEEE 17th International Conference on High Performance Computing and Communications, 2015 IEEE 7th International Symposium on Cyberspace Safety and Security, and 2015 IEEE 12th International Conference on Embedded Software and Systems. [S.l.], 2015. p. 1268–1273. https://doi.org/10.1109/HPCC-CSS-ICESS.2015.83
[3] Sven Bock and Nurul Momen (2020). Nudging the user with privacy indicator: a study on the app selection behavior of the user. In: Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society. [S.l.: s.n.], 2020. p. 1–12. https://doi.org/10.1145/3419249.3420111
[4] Gökhan Bal (2014). Designing privacy indicators for smartphone app markets: A new perspective on the nature of privacy risks of apps. In: Proceedings of the 20th Americas Conference on Information Systems, AMCIS 2014. [S.l.: s.n.], 2014. https://aisel.aisnet.org/amcis2014/MobileComputing/GeneralPresentations/6
[5] Abhishek Kumar, Tristan Braud, Young D. Kwon, and Pan Hui (2020). Aquilis: Using contextual integrity for privacy protection on mobile devices. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, ACM New York, NY, USA, v. 4, n. 4, p. 1–28, 2020. https://doi.org/10.1145/3432205
[6] Corey Brian Jackson and Yang Wang. Addressing the privacy paradox through personalized privacy notifications. Proceedings of the ACM on interactive, mobile, wearable and ubiquitous technologies, ACM New York, NY, USA, v. 2, n. 2, p. 1–25, 2018. https://doi.org/10.1145/3214271
[7] Susanne Barth, Dan Ionita, and Pieter Hartel (2022). Understanding Online Privacy — A Systematic Review of Privacy Visualizations and Privacy by Design Guidelines. ACM Comput. Surv. 55, 3, Article 63 (February 2022), 37 pages. https://doi.org/10.1145/3502288