Title:
Evaluating the effects of nudging and deterrence on users' behavior for privacy-by-design
Authors:
- Mauro Iacono
- Michele Mastroianni
Published in:
(2024). ECMS 2024, Proceedings of the 38th ECMS Conference
Edited by: Daniel Grzonka, Natalia Rylko, Grazyna Suchacka, Vladimir Mityushev, European Council for Modelling and Simulation.
Proceedings DOI: https://doi.org/10.7148/2024
ISSN: 2522-2422 (ONLINE)
ISSN: 2522-2414 (PRINT)
ISSN: 2522-2430 (CD-ROM)
ISBN: 978-3-937436-84-5
ISBN: 978-3-937436-83-8 (CD)
Communications of the ECMS, Volume 38, Issue 1, June 2024
Cracow, Poland, June 4th – June 7th, 2024
Paper DOI: https://doi.org/10.7148/2024-0521
Citation format:
Mauro Iacono, Michele Mastroianni (2024). Evaluating the effects of nudging and deterrence on users' behavior for privacy-by-design. In: ECMS 2024 Proceedings, edited by Daniel Grzonka, Natalia Rylko, Grazyna Suchacka, Vladimir Mityushev, European Council for Modelling and Simulation. doi:10.7148/2024-0521
Abstract:
The definition of privacy-related specifications is crucial in the design process of any system subject to the GDPR. Privacy-related requirements can be seen as qualitative non-functional requirements which result in additional functional requirements during the specification process. Since compliance with the GDPR is essentially assessed by means of risk analysis of data treatments, a quantitative dimension of evaluation is nevertheless needed: consequently, defining a quantitative approach to privacy-related specifications is desirable, and suitable tools should be identified or provided. While there are analogies with the security and dependability fields, so that tools may to some extent be borrowed from those domains, the privacy domain also requires that the human factor be modeled, together with the external influences acting on it. In this sense, risk can be evaluated much as it is in the security field, and this is actually done in the privacy domain by approaches such as DPIA, but nudging and deterrence play a different role and are worth some reflection.
In this paper we discuss this perspective on privacy-related specification and examine the use of a tool, Pythia, which does not originate in the risk analysis domain but can, in our opinion, be profitably used to define privacy policies as a complement to privacy-aware system design cycles and to assess their impact. We present an improved analysis of a model from our previous research to show that this point of view on privacy-aware system design is distinctive and should be considered in design processes.