“In Germany, we're very sensitive when it comes to big data”

Professor Dr. Steffen Augsberg looks ‘with cautious optimism’ towards the digital future. The Professor of Public Law at Justus Liebig University Giessen, and chairman of the ‘Big Data’ task group within the German Ethics Council, explains his reasons in an interview with Simone Harr. On June 1, 2018, at the invitation of Professor Dr. Martin Gersch, Professor Augsberg gave a lecture at the Einstein Center Digital Future (ECDF). He spoke about ‘Big Data and the Regulatory Framework – Recommendations by the German Ethics Council’.


Since May 25, 2018, the EU General Data Protection Regulation (GDPR) has been in force. What are your thoughts on this?

The EU General Data Protection Regulation has advantages in certain areas – for instance regarding the protection of minors or in connection with e-commerce. But the Regulation has its flaws. It’s going to generate plenty of work for lawyers and is a major burden for small companies, in particular. The GDPR’s effect is somewhat excessive. In addition, it is unclear whether major Internet companies will actually make significant changes to their business models. After all, Facebook’s business model is based on the collection and utilization of data. The GDPR brings improvements and problems, and simply fails to cover certain areas. As a regulatory model, it is not consistently innovative and, especially with regard to big data, it needs to be supplemented.


The amount of data circulating worldwide doubles every year. What does this mean for individuals?

The sheer amount of data makes individuals more anonymous. However, each individual’s own data set is an important part of the whole, and the repercussions of this are problematic – take, for example, individualized pricing for online bookings: for some people, a flight to Berlin is more expensive than for others. It’s exactly this sort of hidden manipulation that is highly problematic.


How can scientists deal with big data in a responsible manner?

I must be aware of what I am doing with the data I need for my research, and I must consider its sources. If I take datasets that are freely available, I need to be fully aware that the individuals concerned have often not given comprehensive consent to such use. As soon as you start collecting data yourself, you must inform the persons concerned about what is going to be done with their data. One solution could be a cascade model of consent, in which each individual can indicate, for example, whether or not they also consent to their data being used for other projects. The donation of data is another conceivable option. In principle, researchers must develop an awareness of how they use data and what importance this data has for the individuals concerned.


What expectations do you have for research projects, such as those connected to the ECDF, with regard to the use of big data?

I do not have any expectations that differ significantly from those I have for the work of other researchers. One thing is clear: when you work on digitalization, you cannot avoid putting on the ‘data-protection glasses’. For example, scientists could develop a code of quality standards that goes beyond the minimum legal requirements.


When it comes to e-health, Denmark is a few steps ahead. Patients and employees in the health sector can view data on an online platform. Is anything like this conceivable in Germany in the near future?

I only have to look at the delayed implementation of the electronic health card to feel rather pessimistic about how things are likely to develop in the e-health sector. There can hardly be any doubt that it makes good sense to have all of one’s own healthcare data in one place. But it must be ensured that not just anyone can gain access to this data. In Germany, we have some interesting projects in this area, which, however, are also meeting a lot of resistance.


What is the reason for this?

In Germany, we are very sensitive when it comes to data protection in general and big data in particular. It’s an interesting cultural phenomenon. It may have something to do with the experience of dictatorship – twice over. The older section of the population, especially, has difficulties with digitalization, and this may also be because they do not have the relevant devices or do not know how to use them. This is a point that we also need to consider in this context. Digitalization has inclusive, but also exclusive, effects.


How does digitalization affect the working world in particular?

Ways of working are changing. In many areas we are in the process of making ourselves redundant, which is a somewhat frightening development. We started with traditional electronic data processing, are already working with big data, and are now on the path to artificial intelligence. As part of this process, more and more intellectually demanding activities will be taken over by AI. The impact is coming closer to home: we have become accustomed to machines welding cars together, but until now we have been rather reluctant to accept the idea of administrative staff being replaced. We need to focus our efforts on creative processes, on soft factors.


How is the Ethics Council dealing with this development?

I am hopeful that we will soon be discussing the issue of artificial intelligence in the Ethics Council. It is time to ask ourselves the question: what exactly is the human factor? Of course, intelligence is not a prerequisite for being human, but it is closely linked to our understanding of ourselves. Other entities that are equally intelligent, or even more intelligent, are challenging us. It’s a phenomenon that requires close observation, because it can easily get out of hand.


How do you view the digital future?

With cautious optimism.


Why?

In Germany, I believe we have a very particular type of fear connected with technology. Whenever anything new is introduced, we want it to be free of risk. But that is impossible. We must also see the positive sides of technical innovation: back in the 1970s, the number of lives lost every year on the roads amounted to the population of a small town. To put it brutally: we sacrificed these people on the altar of individual mobility. Fortunately, technology has since developed further, and the number of fatal road accidents has decreased enormously. In addition, we have to acknowledge that the digital future will make many things easier for us. We also need to be in no doubt that, even if Germany remains skeptical about artificial intelligence and big data, there is not going to be a worldwide moratorium on them. So if we are not careful, we might end up being left behind. That would also mean missing out on the opportunity to incorporate into the design process those values that are now, and will always be, important to us.