Dr. Daniel Franzen is a research associate conducting research in the FreeMove project on enabling privacy-preserving donation of mobility data.

What is the goal of FreeMove, and what is the role of the HCC research group in this project?
The overall goal of the FreeMove research project is to develop a framework of guidelines and tools that helps companies, institutions, and software developers enable their users to donate data, in this case movement data, in a privacy-preserving manner, and to handle the collected data accordingly. Privacy-preserving data donation is a very complex topic that can only be tackled in an interdisciplinary network. The FreeMove project therefore involves colleagues from the fields of law (Berlin University of the Arts), data science (HTW - University of Applied Sciences), technology (Technische Universität Berlin), and civil society (Technologiestiftung Berlin). The perspective that we bring to the project with the Human-Centered Computing (HCC) research group is, as the name suggests, strongly human-centered: we look at which stakeholders play a role in the context of a data donation and what requirements and needs they have. Within the project, the HCC research group is primarily concerned with how all these different stakeholder requirements can be reconciled, and in particular with finding a well-functioning compromise that preserves the richness and accuracy of the data while at the same time meeting the users' demand for privacy.
At the centre of our research interest lies the question of how we can communicate this complex issue of data donation and explain the solutions and methods that are employed to ensure privacy. We strive to find a solution that supports data donors in reaching an informed and sovereign decision.

Why should users who have an interest in privacy donate their movement data in the first place?
We are used to being asked to consent to the recording and use of our data when using digital services, apps, or software. The motivation for using such services, even if we have to agree to a certain amount of data sharing, is often social in nature: we maintain social contacts through an app, or we want to plan our movement through the city conveniently and use navigation services for this purpose. In the process, we often accept that our data is recorded and analyzed by companies.
In the public sector, for example on the part of cities and municipalities, there is also a need for data. In contrast to the product orientation just described, this need is oriented towards the common good. But how do cities, municipalities, research institutions, and other common-good-oriented institutions get hold of such data? One approach is the voluntary donation of data. The motivation for such a donation lies, for instance, in the fact that my data can contribute to the demand-oriented planning and improvement of public infrastructure such as cycle path networks or public transport.
The question of motivation and the added value of a data donation can therefore be answered quite simply. With our research, we want to help ensure that data donation is made possible in a way that actually preserves the privacy of the users. We must therefore communicate, firstly, what added value a data donation has, e.g. for the common good; secondly, what methods are used to protect privacy; and thirdly, what residual risks remain for the data donors.

How do you go about your research in order to develop proposals for solutions to this complex topic?
There are already very good but little-tested technical methods, so-called "Privacy Preserving Technologies" or "PPT" for short, which, for example, greatly reduce the risk of users being identified in the data. Within the FreeMove project, we are primarily testing the "Differential Privacy" (DP) approach. Put simply, DP can be used to prevent attackers who want to learn personal information about a person from identifying whether or not that person's data is contained in a database. Even though methods like DP can reduce the risk significantly, they cannot fully protect data donors against any identification in the data. Potential data donors, i.e. the users of an app or software, must understand what residual risk is associated with their data donation despite the use of PPT.
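To make the DP idea above concrete, the classic Laplace mechanism for a simple count query can be sketched in a few lines of Python. This is a textbook illustration under assumed parameters, not code from the FreeMove project; the function and parameter names are illustrative.

```python
import math
import random

def dp_count(values, threshold, epsilon):
    """epsilon-differentially private count via the Laplace mechanism.

    A count query has sensitivity 1 (adding or removing one person's data
    changes the result by at most 1), so adding Laplace noise with scale
    1/epsilon yields epsilon-DP. Illustrative sketch, not FreeMove code.
    """
    true_count = sum(1 for v in values if v > threshold)
    scale = 1.0 / epsilon
    # Inverse-CDF sampling from Laplace(0, scale) using only the stdlib.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

A smaller epsilon means more noise and stronger privacy, but also a less accurate count: exactly the accuracy-versus-privacy compromise discussed above.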
Currently, we are focusing on the challenge of how to communicate to users the residual risk associated with their data donation. In the privacy domain, however, there is not yet much research on risk communication. So far, risk in this context is described quite vaguely, for example: "there is little risk that your data will be leaked." It is not yet common to quantify the concrete risk for users. The technical methods we are currently working on, such as differential privacy, allow us to quantify the risk much more precisely. However, a bare quantified risk is still not easy to understand. For this reason, we are drawing on communication strategies that have already been used successfully to communicate other risks; the field of medicine offers a good orientation here. We are now transferring strategies and solutions developed and tested in medicine to the area of privacy.

And how do you check whether your proposed solutions achieve their goal?
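As a sketch of how such a quantified residual risk might be derived and then framed, the standard Bayesian consequence of epsilon-DP bounds how much an attacker's belief about a person's membership in the dataset can grow, and that bound can be phrased as a natural frequency, a strategy borrowed from medical risk communication. The helpers below are illustrative assumptions, not FreeMove tooling.

```python
import math

def posterior_bound(prior: float, epsilon: float) -> float:
    """Worst-case posterior belief that a given person's data is in the
    dataset after an attacker observes one epsilon-DP output, starting
    from a prior belief `prior`. Standard bound: posterior odds are at
    most e^epsilon times the prior odds. Illustrative, not FreeMove code.
    """
    odds = math.exp(epsilon) * prior / (1.0 - prior)
    return odds / (1.0 + odds)

def as_natural_frequency(prior: float, epsilon: float, population: int = 1000) -> str:
    """Phrase the bound as a natural frequency ('x in 1000'), the framing
    used successfully in medical risk communication."""
    post = posterior_bound(prior, epsilon)
    return (f"An attacker's certainty can rise from {round(prior * population)} in "
            f"{population} to at most {round(post * population)} in {population}.")
```

For example, with a prior of 0.5 and epsilon = ln(3), the attacker's belief can rise to at most 0.75, i.e. from 500 in 1000 to at most 750 in 1000.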
As I said, we are not starting from scratch, but instead transfer tried-and-tested communication strategies and methods to our application context in the privacy domain. However, since risk communication is a very complex topic, we cannot simply assume that these strategies will work in the privacy domain exactly as they do in the medical context. Therefore, to get a first impression of which of these transferred approaches basically work, we conduct online studies, in a first step with crowdworkers, for example. This will of course be followed by studies in real contexts. In the future, we will work closely with the German Aerospace Centre (DLR), which already uses an app to collect and analyze movement data from users in the context of studies. We can build on this and integrate our proposed solutions into the data collection with this app. In this way, we can evaluate whether the methods and communication approaches we have developed influence the behavior of the participants with regard to data donation.
Interview & Text: Dr. Daniel Franzen and Katrin Glinka