Interview with Peter Schaar on the future challenges of the digital age
October 14, 2015
Interview with Peter Schaar, the former German Federal Commissioner for Data Protection and Freedom of Information (BfDI) and current chairman of the European Academy for Freedom of Information and Data Protection (EAID) on the topics of data protection, spyware, discrimination, monopolisation of opinion and future challenges of the digital age.
1. Mr. Schaar, personal data is becoming more and more of a macroeconomic asset, and we are moving towards becoming a data economy. The storage and use of personal data by different interest groups and for a variety of purposes is increasing steadily. The result is the loss of sovereignty over our data. Do you agree?
The question as to who has access to our data is becoming ever more important in light of current data collection and processing practices. Contrary to expectations, today's information society is primarily transparent in only one direction like a one-way mirror, with transparent users on one side and largely non-transparent, digital power centres on the other. No society in which ever more data is available at the global level is immune to cultural impoverishment and oppression. For the future of a democratic information society it is vital that the use of information technology is shaped in such a way that the values and rights fought for and obtained over centuries are safeguarded even under rapidly changing technological conditions. These include data protection and transparency in equal measure. As defined by the German Constitutional Court back in 1983 in its ruling on the then contentious issue of the census, data protection is the right to informational self-determination. The free development of personality guaranteed by the German constitution also includes – according to Germany's Constitutional Court – the basic right of every individual to determine the disclosure and appropriation of their personal data. This is even truer today, and technology can certainly help to ensure that this maxim is fulfilled. Unfortunately, it must be noted that many digital business models as well as government supervisory measures have the opposite effect and in fact restrict informational self-determination. Hence, a rethink is urgently required.
2. From your point of view, what are the economic, political and also societal consequences of these developments?
It may sound paradoxical, but transparency can also help strengthen the right to informational self-determination. Moreover, transparency can actually promote democratic mechanisms of participation and co-decision. Even though we don't need complete transparency – which would also give rise to data protection concerns – we need considerably greater insight into procedures, structures and decision-making processes. It does not suffice to create transparency after the fact. It is at least equally important, as a precautionary measure, to create clarity in order to prevent misguided developments in the public and private sectors from the outset. The right to information under data protection law, the right of access to documents under administrative law, and the right to access government information, which are all guaranteed under freedom of information laws, are insufficient. Rather, freedom of information laws must be developed further into transparency laws, under which government bodies are obliged to provide much more information on their own initiative, i.e. proactively. It would be desirable for not only the documents themselves to be made public but also the data used in the decision-making process (open data). The digital economy, too, needs considerably greater transparency, especially in light of the procedures used in individual profiling and the assessment of human behaviour. Many of the people affected will undoubtedly be unable to understand the complex algorithms involved. Disclosure, however, makes them verifiable by experts, consumers and data protection advocates. Experience with successful open-source projects – such as the Linux operating system – shows that such transparency of algorithms is useful and feasible, and that it even contributes to improvements in software quality, because errors can be identified more quickly.
Finally, we need transparency regarding prices and costs in connection with digital business models. Only if we know the value of the data we divulge can we make a rational decision about whether we want to use services and which ones to use. The internet offers numerous technological resources that are suitable for this purpose.
3. Critical voices have claimed that, for instance, Apple users are quoted different prices than Android users for the same services or products, because they are presumed to have differing levels of purchasing power. Moreover, there is debate as to whether insurance companies will in future levy different premiums, or even refuse to insure certain individuals, based on personal movement profiles and/or electronic health data. Such price differentiation, or the refusal to provide certain insurance products on the basis of our data profiles (of which we are unaware), has been labelled discrimination by critics. Are these merely rumours, or are consumers actually quoted different prices based on the device they use? How can consumers protect themselves from such discrimination?
Irrespective of the examples you mentioned, it cannot be denied that companies are increasingly working on models to individualise and personalise offers and that this also includes price differentiation. I expect we will see not only the rough differentiations you describe but also much more sophisticated methods. Even today, individual interest rates to be paid on a bank loan are already based on scoring models which include a number of parameters. Hence, it is of decisive importance in both retail trade and the services sector to ensure greater transparency. Today every individual already has the right to request information on their personal data. Transparency regarding the creation of profiles, by contrast, is underdeveloped. Citizens and consumers must be much better informed. I consider legislation to that effect to be urgently required.
4. Price differentiation, as mentioned above, is only one way companies can influence us through the use of personal data on the internet. Guided data acquisition means that every citizen increasingly tends to find their own opinions confirmed. As soon as a user, for example, utilises the results of a Google search, the underlying filtering algorithms produce a personalised selection, so that the software vendor or programmer can define exactly which results the user gets to see and which ones they do not. Advertising pop-ups probably harbour less of a risk. However, there is cause for concern if this channel is also used to exert political influence. This holds true not only for Germany but especially for countries lacking a sound democratic and legal footing. What medium- to long-term effects do you expect from the danger of a monopolisation of opinion by means of self-learning algorithms, and what do we citizens have to consider when seeking information online?
Already, we are often caught in the filter bubble. We do not really know how search engines compile the results of our searches. According to the Google Doctrine, internet democracy is a kind of machine that concentrates a host of information at a company, which evaluates and uses it according to algorithms that are not available to the public. The way in which companies handle this data ultimately depends on their economic interests. Rules are laid down unilaterally without user involvement; nor do users have any influence on the search results presented to them. If they don't like these rules, users have no choice other than to stop using the service. The fact that government bodies also deliberately intervene in order to manage information is confirmed by examples from many countries. Even Western democracies are not immune to this, as shown by press reports about the Pentagon's alleged financing of a Facebook study on emotional manipulation. Users with merely superficial knowledge of computer technology stand little chance of protecting themselves against such manipulation. Hence, it is all the more important to make conscious decisions when using the internet and other digital technologies. This includes both the selection of services and data protection settings. Particularly important, however, is a critical attitude towards information obtained online. In this respect, it is always a good idea to use several sources of information and to form one's own picture of reality. The internet even facilitates such strategies.
5. The above-mentioned developments are also leading some internet firms in the data protection and data security segment to specialise in helping citizens regain control and sovereignty over their own data. The business models of these firms include cloud-based solutions that allow us to store our personal data securely and to decide for ourselves whether to make it accessible only to certain people, companies or interest groups. How do you rate the likelihood of success of these – relatively new – business models, and should such offerings also be provided by the public sector? Do you see individuals being able to monetise their personal data themselves in future, and what form might such a solution take?
Data protection and the guarantee of security and confidentiality of data are of substantial economic significance. The success of corresponding business models depends on a variety of factors: the performance and user-friendliness of the products, their price and of course consumer awareness. For instance, many commercial users have come to realise in recent times that they expose themselves to huge risks if they ignore data protection and IT security. The Snowden revelations have led to a considerable decline in revenues for US cloud services. Ultimately, however, the right legal framework must be in place: companies doing flourishing business in Europe simply cannot be allowed to ignore the EU data protection laws with which their European competitors have to comply. On this issue I'm counting on the imminent reform of EU data protection legislation which is intended, first and foremost, to create a “level playing field” where non-EU vendors have to comply with the local rules and regulations. I am sceptical about the self-monetisation of personal data, as it ultimately increases the volume of personal data that is given away. There is a good reason why German law bans the trade in human organs even though some individuals would be prepared to give their consent. We should not allow our fundamental right to informational self-determination to be bought, even though making money out of our own personal data might appear beneficial from a short-term perspective.
6. Besides the various internet firms there are also government authorities gathering our personal data and compiling personal movement profiles in order to improve the clear-up rate for criminal cases or to prevent as many terrorist attacks as possible. One aspect being debated is whether an imbalance has already developed between “ensuring public safety” and “exercising control over the public”. Government authorities are storing and evaluating personal data. Where will this practice lead in the medium to long term? To what extent is the state in a position to provide protection if it itself is part of the tracking system?
The authorities have jumped on the bandwagon. They demand and frequently obtain access to the data generated by digital business models. I am particularly sceptical about measures such as data retention that turn companies into pseudo-sheriffs. All the same, in Germany we have the Federal Constitutional Court and in Europe the European Court of Justice and the European Court of Human Rights, institutions that impose limits on the state's urge to collect data. When democratic states continually ramp up the surveillance of their citizens, even for perfectly understandable reasons, they jeopardise the principles of a democracy governed by the rule of law. Unfortunately, I cannot discern any change of tack by governments on this count.
7. Let us conclude by speculating about the future. What is your vision of digital-age communication that is secure and unmonitored, and if necessary also anonymous, and what preconditions have to be satisfied so that every user can once again surf the net authentically?
One important step could be for encrypted communication to become standard. Without encryption, the confidentiality of internet use is at best comparable with that of a postcard. The fact that each internet user can already encrypt their emails does not obviate the need for comprehensive secure solutions. On the one hand, the issue is not merely emails; on the other, most users feel they cannot cope. That is why the providers have to step in. We need binding standards across multiple platforms. Identification solutions must be tamper-proof but at the same time allow users to remain anonymous or use pseudonyms. Interesting solutions do exist, but so far they have been implemented only inadequately. In my opinion it is the duty of the data protection authorities to ensure that providers fulfil their legal obligations – at least those that apply in Germany – to enable users of internet services to remain anonymous and use pseudonyms. Without data protection, without unmonitored spaces for discussion, information and opinion-forming, and without transparency of societal decision-making processes, the information society would become a nightmare. From a democratic theory standpoint “there is no alternative” – to use an expression that is often rightly criticised – to taking legal and technical precautions that ensure that individual freedoms and opportunities for political participation are expanded. The possibility of being an anonymous user is an important aspect, but it is by no means enough.
Chairman of The European Academy for Freedom of Information and Data Protection (EAID), former Federal Commissioner for Data Protection and Freedom of Information (2003-2013).
Born in Berlin in 1954. Economics graduate. 1979-1986: Performed various functions in public-sector institutions. 1986-2002: Started out as a team leader for technological data protection, deputy data protection commissioner in Hamburg. 2002/03: Founder and CEO of a data protection firm. 2003-2013: Federal Commissioner for Data Protection and Freedom of Information. Since 2007: Lecturer at the University of Hamburg. Numerous publications, including the books Datenschutz im Internet (2002), Das Ende der Privatsphäre (2007), Total überwacht – Wie wir in Zukunft unsere Daten schützen (2014), Das digitale Wir – Der Weg in die transparente Gesellschaft (2015). Awards and prizes: “Das politische Buch” prize from the Friedrich-Ebert-Stiftung (2008); “eco Internet AWARD” (2008), “Deutscher Datenschutzpreis” from the Gesellschaft für Datenschutz und Datensicherheit (2013), “Louis D. Brandeis Privacy Award” from the US organisation Patients Privacy Rights (2014).
This interview was conducted by Thomas F. Dapp (+49) 69 910-31752 and Parinaz Khademi.
Original version in German: September 22, 2015
© Copyright 2016. Deutsche Bank AG, Deutsche Bank Research, 60262 Frankfurt am Main, Germany. All rights reserved. When quoting please cite “Deutsche Bank Research”.
The above information does not constitute the provision of investment, legal or tax advice. Any views expressed reflect the current views of the author, which do not necessarily correspond to the opinions of Deutsche Bank AG or its affiliates. Opinions expressed may change without notice. Opinions expressed may differ from views set out in other documents, including research, published by Deutsche Bank. The above information is provided for informational purposes only and without any obligation, whether contractual or otherwise. No warranty or representation is made as to the correctness, completeness and accuracy of the information given or the assessments made.
In Germany this information is approved and/or communicated by Deutsche Bank AG Frankfurt, licensed to carry on banking business and to provide financial services under the supervision of the European Central Bank (ECB) and the German Federal Financial Supervisory Authority (BaFin). In the United Kingdom this information is approved and/or communicated by Deutsche Bank AG, London Branch, a member of the London Stock Exchange, authorized by UK’s Prudential Regulation Authority (PRA) and subject to limited regulation by the UK’s Financial Conduct Authority (FCA) (under number 150018) and by the PRA. This information is distributed in Hong Kong by Deutsche Bank AG, Hong Kong Branch, in Korea by Deutsche Securities Korea Co. and in Singapore by Deutsche Bank AG, Singapore Branch. In Japan this information is approved and/or distributed by Deutsche Securities Inc. In Australia, retail clients should obtain a copy of a Product Disclosure Statement (PDS) relating to any financial product referred to in this report and consider the PDS before making any decision about whether to acquire the product.