Digital self-tracking devices and services – such as step counters on smartphones, fitness wristbands or smartwatches with a pulse measurement function – are now commonplace for personal use. But it is not just the users themselves who can benefit from the extensive data this technology collects: it also offers opportunities for health services, medical research and care. For example, doctors and therapists can use the data to provide their patients with faster, more targeted and more personalized treatment. Many health insurance companies are already encouraging their customers to use self-tracking devices.

The health data collected via these devices is usually processed and passed on, which raises a number of questions for users: Which data is being collected? For what purpose? And who is it being passed on to? The General Data Protection Regulation (GDPR) does grant users comprehensive rights regarding their personal data, but – as Uwe Laufs, researcher at Fraunhofer IAO and TESTER project manager, explains – the picture is different in practice: “A major hurdle for the self-determined use of data from self-tracking technology is not a lack of rights for individuals but a lack of transparency and the practical problems involved in exercising these rights. The TESTER project aims to make a valuable contribution in this area.”
So how can self-tracking systems and processing methods be designed to ensure that users feel sufficiently well informed and empowered when it comes to the processing of their highly sensitive self-tracking data? In the TESTER research project, the Fraunhofer Institute for Industrial Engineering IAO is working with the Institute of Human Factors and Technology Management IAT at the University of Stuttgart as well as the Department of Public Law, IT Law and Environmental Law at the University of Kassel and practice partner Actimi GmbH to investigate and develop a privacy assistant which will support the self-determined use of data from self-tracking technology. The aim is to maximize transparency and intervenability for self-tracking and to place greater emphasis on users’ needs with regard to the protection of their data. The research project is funded by the German Federal Ministry of Education and Research (BMBF).
Users’ motives shed light on data transparency
How much information do users need? What level of intervention should be possible? To develop needs-based support technology with high user acceptance, the research team will start by conducting qualitative and quantitative interviews to determine users’ different motives for self-tracking as well as their preferences regarding the transparency of data processing. Experiments in the Fraunhofer IAO smart home laboratory, where scenarios from users’ home lives are replicated realistically, will also help to characterize user behavior. The research team will then use these findings to develop appropriate concepts for the privacy assistant. They plan to employ methods from the fields of usability, user experience and machine learning to develop a user interface that optimizes transparency for users. They will also develop suitable tools, such as software interfaces for providers, to support intervenability. Finally, the privacy assistant will be integrated into one of Actimi GmbH’s medical applications so that it can be tested in a real software environment.
Legal conformity as a benchmark for technology design
The project combines usability issues such as transparency and intervenability with legal considerations. “Due to its sensitive nature, health data is subject to special legal protection,” says Prof. Gerrit Hornung, who leads the legal work at the University of Kassel. “The only way to ensure that the technical solutions are both legally compliant and in keeping with all relevant interests is to take the legal requirements into account at an early stage.” In this context, the project considers not only legal aspects but also social issues regarding transparency and intervenability for users of self-tracking technology. To guard against discrimination in medical care, considerations relating to equality rights are also being incorporated into the development of the privacy assistant. In addition, the project team is analyzing legal requirements that allow users to personalize self-tracking systems according to their health-related preferences.