Exploring solutions to the privacy paradox in the context of e-assessment: informed consent revisited


ORIGINAL PAPER

Ekaterina Muravyeva¹ · José Janssen¹ · Marcus Specht² · Bart Custers³

© The Author(s) 2020

Abstract

Personal data use is increasingly permeating our everyday life. Informed consent for personal data use is a central instrument for ensuring the protection of personal data. However, current informed consent practices often fail to actually inform data subjects about the use of their personal data. This article presents the results of a requirements analysis for informed consent from both a legal and a usability perspective, considering the application context of educational assessment. The requirements analysis is based on European Union (EU) law and a review of current practices. As the main outcome, the article presents a blueprint that will serve as the basis for the development of an informed consent template supporting data controllers in establishing an effective and efficient informed consent form. Because the blueprint and, subsequently, the template distinguish between legal and usability requirements, they also provide a basis for mapping legal requirements in other (non-European) contexts.

Keywords: Informed consent · Personal data · Sensitive data · e-Assessment · Privacy paradox

* Ekaterina Muravyeva [email protected]
José Janssen [email protected]
Marcus Specht [email protected]
Bart Custers [email protected]

1 Open University of the Netherlands, Valkenburgerweg 177, 6401 DL Heerlen, The Netherlands
2 Delft University of Technology, Mekelweg 5, 2628 CD Delft, The Netherlands
3 Leiden University, Rapenburg 70, 2311 EZ Leiden, The Netherlands

This publication reflects the views of the authors only, and the European Commission cannot be held responsible for any use which may be made of the information contained therein.

Introduction

Technology is increasingly permeating our day-to-day experiences in education, work, and leisure activities. A recent report by the Rathenau Institute (Van Est et al. 2014) states that the "new technological wave" requires further study from legal and ethical perspectives. The report distinguishes four levels of human-technology interaction: technology in us (e.g., pills that, once inside a human body, can monitor the body's condition and/or support its proper functioning); technology between us (e.g., mobile phones and social networks that allow people to connect and communicate with each other); technology about us (e.g., navigation systems that determine the current location and lead along a programmed route, or video surveillance for security purposes); and technology just like us (e.g., robots that are programmed to perform tasks delegated to them). Regardless of the precise level of interaction, human-technology interaction is challenging, particularly when (sensitive) personal data is involved, such as in health care or educational assessments (Kobsa et al. 2016; Wang and Kobsa 2013). This aspect brin