Promote User Awareness and Decision-Making on Permission/Authorisation Requests
Problem Summary
Users usually lack awareness of how their personal information is disclosed when responding to permission or authorisation requests. Contributing factors include overlooking long permission lists, being unaware of potential third-party access to their data or of the sensitivity of the disclosed data, and lacking a clear risk assessment of the consequences of accepting such requests.
Rationale
Providing users with comprehensive insight into the privacy threats arising from requests to disclose their data would enhance awareness and guide them towards privacy-informed decisions.
Solution
An interface enhanced with comprehensive privacy notifications, combining a user-friendly privacy summary with icon-based indicators or a privacy meter.
The purpose is to present users with a privacy summary interface that visually communicates privacy risks in contexts where they must assess the impact of granting permissions on their personal data.
Which summary interface communicates best depends on the context. The central unit of analysis is the permission, but how permissions are aggregated into a summary depends on the approach taken by the researchers.
Kang et al. [1] calculated privacy risk by counting the privacy-sensitive permissions in an app's request, dividing that count by the total number of dangerous permissions, and using the resulting ratio as the privacy risk score. The result is communicated via a privacy meter, leveraging users' familiarity with similar meters, such as password strength indicators. Tucker, Tucker and Zheng [6] likewise borrowed the familiar appearance of password strength meters, assessing app permissions for potential misuse and benefit in order to determine privacy risk.
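To make the calculation concrete, here is a minimal Python sketch of this ratio-based score, assuming an illustrative subset of Android's dangerous permissions rather than the exact set used in [1]; the meter thresholds are likewise our own, not the authors'.

```python
# Minimal sketch of the ratio-based privacy risk score described in [1].
# The permission list below is an illustrative subset of Android's
# "dangerous" permissions, not the exact set used by the authors.

DANGEROUS_PERMISSIONS = {
    "READ_CONTACTS", "READ_SMS", "ACCESS_FINE_LOCATION",
    "RECORD_AUDIO", "CAMERA", "READ_CALL_LOG", "READ_PHONE_STATE",
}

def privacy_risk_score(requested_permissions: set[str]) -> float:
    """Ratio of requested dangerous permissions to all dangerous permissions."""
    sensitive = requested_permissions & DANGEROUS_PERMISSIONS
    return len(sensitive) / len(DANGEROUS_PERMISSIONS)

def meter_level(score: float) -> str:
    """Bucket the score into the kind of levels a strength-style meter shows."""
    if score < 0.25:
        return "low"
    if score < 0.5:
        return "moderate"
    if score < 0.75:
        return "high"
    return "very high"

app_request = {"CAMERA", "ACCESS_FINE_LOCATION", "READ_CONTACTS", "INTERNET"}
score = privacy_risk_score(app_request)
print(f"risk = {score:.2f} -> {meter_level(score)}")  # risk = 0.43 -> moderate
```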
Bal, Rannenberg and Hong [2] introduced a privacy risk communication system for Android that provides users with privacy risk information from a second-order perspective, i.e., threats arising from the user-profiling and data-mining capabilities enabled by apps' long-term data-access behaviour.
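As a loose illustration (not the authors' actual model), the second-order idea can be sketched as flagging profiling capabilities from accumulated access events rather than from the permission list alone; the resource names, threat mapping and threshold below are all hypothetical.

```python
# Loose illustration of the second-order idea behind Styx [2]: privacy risk
# is inferred from long-term data-access behaviour, not the permission list.

from collections import Counter
from dataclasses import dataclass

@dataclass
class AccessEvent:
    app: str
    resource: str   # e.g. "location", "contacts"
    timestamp: float

# Hypothetical mapping from heavily accessed resources to profiling threats.
PROFILING_THREATS = {
    "location": "movement profile",
    "contacts": "social graph",
    "browser_history": "interest profile",
}

def second_order_risks(events: list[AccessEvent],
                       threshold: int = 50) -> dict[str, list[str]]:
    """Flag, per app, the profiling capabilities implied by heavy long-term access."""
    counts = Counter((e.app, e.resource) for e in events)
    risks: dict[str, list[str]] = {}
    for (app, resource), n in counts.items():
        if n >= threshold and resource in PROFILING_THREATS:
            risks.setdefault(app, []).append(PROFILING_THREATS[resource])
    return risks
```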
In Liccardi et al. [3], a sensitivity score is calculated based on the presence of sensitive permissions in an app's permission list, provided that the network permission (full internet access) is also present. The results are communicated to users through icons that convey the sensitivity score.
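A minimal sketch of this conditional score, with hypothetical permission names; the key point is that sensitive permissions count only when paired with full internet access.

```python
# Sketch of the conditional sensitivity score in [3]; permission names are
# illustrative. Sensitive permissions contribute only when the app can also
# send data off the device via full internet access.

SENSITIVE = {"READ_CONTACTS", "ACCESS_FINE_LOCATION", "READ_SMS", "RECORD_AUDIO"}

def sensitivity_score(permissions: set[str]) -> int:
    """Count sensitive permissions, but only if INTERNET access is also requested."""
    if "INTERNET" not in permissions:
        return 0  # data cannot leave the device, so no exposure is counted
    return len(permissions & SENSITIVE)
```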
Paturi, Kelley and Mazumdar [4] employed multiple methods to establish privacy categories, which they call 'privacy granules': analysing permission requirements and app features, and conducting both static and dynamic code analysis, including the examination of third-party libraries. The results are communicated to users under two categories, app providers and third-party libraries; within each category, privacy risks are presented according to location, identity and query privacy granules. The icons representing these granules were validated with users.
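The analysis pipeline itself (static and dynamic code analysis) is beyond a short sketch, but the resulting two-category, three-granule summary can be represented roughly as below; the permission-to-granule rules are invented for illustration.

```python
# Illustrative representation of the two-category, three-granule summary
# in [4]; the detection logic (static/dynamic analysis) is out of scope,
# and the permission-to-granule rules here are hypothetical.

GRANULES = ("location", "identity", "query")

def summarise_granules(app_perms: set[str], lib_perms: set[str]) -> dict:
    """Map detected data access onto privacy granules per category."""
    rules = {
        "location": {"ACCESS_FINE_LOCATION", "ACCESS_COARSE_LOCATION"},
        "identity": {"READ_PHONE_STATE", "GET_ACCOUNTS"},
        "query": {"INTERNET"},  # stands in for observed outbound queries
    }
    def hits(perms: set[str]) -> dict[str, bool]:
        return {g: bool(perms & rules[g]) for g in GRANULES}
    return {"app_provider": hits(app_perms),
            "third_party_libraries": hits(lib_perms)}
```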
Lin et al. [5] explored the idea of privacy based on user expectations of what an application should or should not do. Results are presented to users as a percentage, illustrating the extent to which these expectations were violated.
To measure this violation of expectations, the authors analysed the permissions of a group of apps and conducted a survey to identify where the apps diverged from users' expectations. Using the survey results, they created a prototype privacy summary indicating how other users feel about the app requesting certain permissions.
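A minimal sketch of the resulting measure, assuming the violation percentage is simply the share of surveyed users who did not expect a given (app, permission) access; the response data below is fabricated for illustration.

```python
# Sketch of the expectation-violation percentage in [5]: for a given
# (app, permission) pair, the share of surveyed users who did not expect
# that access. The survey responses are fabricated for illustration.

def expectation_violation(responses: list[bool]) -> float:
    """Share of respondents who did NOT expect the app to use the permission."""
    surprised = sum(1 for expected in responses if not expected)
    return surprised / len(responses)

# 100 simulated answers to "Did you expect this app to access your location?"
responses = [False] * 80 + [True] * 20
pct = expectation_violation(responses)
print(f"{pct:.0%} of users did not expect this access")  # 80% of users ...
```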
Kelley, Cranor and Sadeh [7] presented a short "Privacy Facts" display, prototyped as a modification of the Android marketplace interface. This simplified display, which fits on the main application screen, presents a checklist of data practices, including the types of information collected (e.g., personal, location) and their usage (e.g., advertising, analytics), and can assist users in selecting apps that request fewer permissions.
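A rough sketch of how such a checklist display could be rendered; the categories follow the description above, while the layout and example values are our own.

```python
# Sketch of a "Privacy Facts"-style checklist display as in [7]; categories
# follow the description above, the rendering and values are illustrative.

FACTS = {
    "Collects": {"personal information": True, "location": True},
    "Uses it for": {"advertising": False, "analytics": True},
}

def render_privacy_facts(facts: dict[str, dict[str, bool]]) -> str:
    """Render a compact, checklist-style privacy facts panel as text."""
    lines = ["PRIVACY FACTS", "-" * 24]
    for section, items in facts.items():
        lines.append(section)
        for item, present in items.items():
            lines.append(f"  [{'x' if present else ' '}] {item}")
    return "\n".join(lines)

print(render_privacy_facts(FACTS))
```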
Platforms: personal computers, mobile devices
Related guidelines: Enhance Privacy Awareness by Communicating Privacy Risks, Encourage Users to Consider Privacy Implications Before Sharing Online, Communicate Privacy Risk with Colour-Coded Privacy Indicators
Example
Screenshots of the proof-of-concept version of Styx [2].
From left to right: summary interfaces for privacy granules [4] and for the sensitivity score within the permission list [3].
From top left to bottom right: summary interfaces for user expectations [5], Privacy Pal [6] and the privacy meter [1].
Privacy Facts display [7].
Use cases
- Inspection of installed app details.
- Implementation in app markets for use during app discovery or updates.
- Implementation of permission or authorisation request interfaces.
Pros
- Helps users to compare different applications regarding their privacy properties [2].
- Users largely favoured the icon-based privacy interface, which made identifying threats and queries easier than with traditional permission lists. User studies highlighted that icons with simple designs and accurate threat representations captured users' attention more effectively than text-heavy descriptions [4].
- Users exhibited improved accuracy in identifying the resources used by target apps. Additionally, the study highlighted that users feel more at ease when informed about why their sensitive resources are needed [5].
- Presenting privacy/permission information clearly and simply on the main screen can steer users towards apps that request fewer permissions, particularly when applications are otherwise similar [7].
Cons
- Developing a scalable privacy evaluation system entails combining automated application analysis with crowdsourcing techniques. It is vital to test effectiveness, especially when users view the interface only briefly (e.g., for 5-10 seconds). Challenges remain in determining whether access to sensitive resources is necessary and in understanding its impact on users' privacy perception [5].
- Needs further testing of the interface and long-term effects on user perceptions and behaviour [2][4].
Privacy Notices
Considering the design space for privacy notices [8], this guideline can be applied to the following dimensions:
- At Setup
The proposed guideline can be used to present a privacy notice to users when they first use the system. Privacy notices at setup should be concise and focus on relevant data practices [8].
- Just in time
According to [8], just-in-time notices precede data collection, often appearing near input fields or as summary dialogues, reducing user interruptions. Another case is an app requesting access to sensitive information, which should be accompanied by a just-in-time privacy notice [8].
- On demand
The proposed guideline can be used to present a privacy notice to users when they actively seek privacy information, for example, in privacy dashboards or privacy settings interfaces.
- Non-blocking
This guideline can be coupled with non-blocking controls (privacy choices), providing control options without forcing user interaction.
- Decoupled
This guideline can be applied to privacy notices decoupled from privacy choices.
- Blocking
This guideline can be paired with blocking controls (privacy choices), requiring users to make decisions or give consent based on the information in the notice.
- Visual
This guideline is for a visual notice, using visual resources such as colours, text and icons.
- Primary
This guideline can be applied to the same platform or device the user is interacting with.
- Secondary
This guideline can be applied to secondary channels if the primary channel does not have an interface or has a limited one.
Transparency
Transparency [9] is the main privacy attribute addressed here, since this mechanism involves the proactive distribution of information to users and promotes visually accessible communication of privacy information.
The research papers collectively underscore the importance of making privacy information accessible and understandable to enhance user awareness and decision-making.
Other related privacy attributes:
- Control: providing users with comprehensive insights into the privacy threats arising from permission or authorisation requests supports control by allowing them to make self-determined decisions about sharing their personal data.
References
[1] Jina Kang, Hyoungshick Kim, Yun Gyung Cheong, and Jun Ho Huh (2015). Visualizing Privacy Risks of Mobile Applications through a Privacy Meter. In: Lopez, J., Wu, Y. (eds) Information Security Practice and Experience (ISPEC 2015). Lecture Notes in Computer Science, vol 9065. Springer, Cham. https://doi.org/10.1007/978-3-319-17533-1_37
[2] Gökhan Bal, Kai Rannenberg, and Jason I. Hong (2015). Styx: Privacy risk communication for the Android smartphone platform based on apps' data-access behavior patterns. Computers & Security, vol. 53, pages 187-202. https://doi.org/10.1016/j.cose.2015.04.004
[3] Ilaria Liccardi, Joseph Pato, Daniel J. Weitzner, Hal Abelson, and David De Roure. 2014. No technical understanding required: helping users make informed choices about access to their personal data. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MOBIQUITOUS '14). ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), Brussels, BEL, 140–150. https://doi.org/10.4108/icst.mobiquitous.2014.258066
[4] Anand Paturi, Patrick Gage Kelley, and Subhasish Mazumdar. Introducing privacy threats from ad libraries to android users through privacy granules. Proceedings of NDSS Workshop on Usable Security (USEC’15). Internet Society. Vol. 1. No. 2. 2015. http://dx.doi.org/10.14722/usec.2015.23008
[5] Jialiu Lin, Shahriyar Amini, Jason I. Hong, Norman Sadeh, Janne Lindqvist, and Joy Zhang (2012). Expectation and purpose: understanding users' mental models of mobile app privacy through crowdsourcing. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp '12). Association for Computing Machinery, New York, NY, USA, 501–510. https://doi.org/10.1145/2370216.2370290
[6] Rachel Tucker, Carl Tucker and Jun Zheng. Privacy Pal: Improving Permission Safety Awareness of Third Party Applications in Online Social Networks. In: IEEE. 2015 IEEE 17th International Conference on High Performance Computing and Communications, 2015 IEEE 7th International Symposium on Cyberspace Safety and Security, and 2015 IEEE 12th International Conference on Embedded Software and Systems. [S.l.], 2015. p. 1268–1273. https://doi.org/10.1109/HPCC-CSS-ICESS.2015.83
[7] Patrick Gage Kelley, Lorrie Faith Cranor, and Norman Sadeh (2013). Privacy as part of the app decision-making process. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). Association for Computing Machinery, New York, NY, USA, 3393–3402. https://doi.org/10.1145/2470654.2466466
[8] Florian Schaub, Rebecca Balebako, Adam L Durity, and Lorrie Faith Cranor (2015). A Design Space for Effective Privacy Notices. In: Symposium on Usable Privacy and Security (SOUPS 2015). [S.l.: s.n.], p. 1–17. https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf
[9] Susanne Barth, Dan Ionita, and Pieter Hartel (2022). Understanding Online Privacy — A Systematic Review of Privacy Visualizations and Privacy by Design Guidelines. ACM Comput. Surv. 55, 3, Article 63 (February 2022), 37 pages. https://doi.org/10.1145/3502288