"Digital dictatorship" – Prisoners in the new world?

Dialogue forum, 12 April 2018

It is virtually impossible to leave no traces behind in the digital world. Data are being collected, analysed and stored everywhere. With the help of artificial intelligence, individual behaviour patterns can be quickly identified. The right to self-determination in the digital world can easily fall by the wayside. Can we still win the battle for individual freedom?

Artificial intelligence permeates our everyday life – and in the future, it will be making many of our decisions. The new technology brings opportunities, for example if autonomous cars can reduce the number of road deaths. In the field of medicine, entirely new possibilities are also opening up. On the other hand, artificial intelligence poses new dangers, the key word being manipulation. For example, politicians can try to win elections through the use of "nudging" – the directing of human behaviour using psychological and technological tricks. Companies, for their part, use such techniques because they want to encourage customers to buy their products.


In a digital world, every step leaves a trace behind. It is virtually impossible to switch back fully, says Daniel Steil of FOCUS Online.

One problem is that we often disclose personal data voluntarily, indeed willingly. Daniel Steil, Managing Director and Editor-in-Chief of FOCUS Online, had this warning for the audience: "When we decide to depict our lives on social networks, we turn the switch to 'always on' – and it is then virtually impossible to switch back without leaving traces." For this reason, he has never posted a photo of his daughter on a social network, because he believes she should be allowed to decide for herself at a later date what kind of presence she wishes to have there. Apple, Amazon, Facebook and Google – whom US author Scott Galloway calls "the Four Horsemen of the Apocalypse" – all know more about us than we would like. "Google is able to reconstruct our daily routines," Steil explained. "It knows exactly where we are and when. Apple – through the iPhone – and Facebook also have data about our movements."

Learning curve to acquire digital competence
It is therefore essential to equip people from a very young age with digital media competence. But according to a 2017 study by the Bertelsmann Foundation, only 8 per cent of school principals consider digital media competence a strategic target, and only 15 per cent of teachers are online savvy. That is one reason why people in Germany are not especially careful with their data. "We have a lot of catching up to do in this area," Steil cautioned. And he believes we should not expect very much from our politicians: "The word digitalisation occurs more than 200 times in the new coalition agreement, yet there is no Digital Ministry and no appropriate strategy," the FOCUS editor pointed out.

Dr. Thilo Weichert, former State Commissioner for Data Protection in Schleswig-Holstein, also sees shortcomings in the political arena. "In the 20th century, Germany was a global driver and role model for data protection; today, German politicians have become delayers and obstructionists," he said. Europe now has a high-level data protection regulation, which the German federal government sought to block, partly to protect the economic interests of companies in relation to big data. "Thank goodness the attempt failed," he said. If industry wants a greater degree of digitalisation, data protection must be improved at the same time. The objective must be to prevent the totalitarian concepts that are being pursued and are technically possible. Weichert is highly critical: "The coalition agreement makes no attempt to look at design-related issues. That is a catastrophe."

He believes Europe should take a fundamentally different path in terms of digitalisation from the USA or China. The requirements for this are already in place, since Europe has a General Data Protection Regulation (GDPR) that is firmly anchored in legislation. The lawyer and political scientist outlined what the digital future might look like: "We need oversight of algorithms, carried out by people, not computers, to ensure that ethical principles are followed." Ethical standards can only be worked out through cultural discourse in our society; they should not be left to machines. Weichert cited China as a negative example, where the government is establishing a social credit system that records people's positive and negative behaviour. "So someone who has a lot of credits is allowed to travel to the West, while people at the other end of the scale face imprisonment," Weichert explained.


Weichert stresses that data protection must be integrated from the very beginning of any software or hardware development process.

A level playing field for data protection
How can we protect ourselves against the possibilities of extensive surveillance and behavioural prediction? For Henrik Klagges, managing director of TNG Technology Consulting GmbH, this can only be achieved through a level playing field. "Respect for individual rights must be guaranteed at machine level," he said – just as with autonomous driving, where in cases of doubt a computer must decide whom the vehicle hits if a crash is unavoidable. We need to transfer our standards into a kind of coded value system to achieve a programmed ethical code that can then respond directly. But what we feed into these ethical systems is another question for society. "It would be better if we made this choice at an early stage, before the systems get the idea of taking this responsibility away from us," he warned. For a system that works with data teaches itself to make decisions. This can work surprisingly well, but it can also lead to inhumane decision-making.

However, the major problem remains that many people voluntarily submit to the digital dictatorship, and are unable, for example, to live without social networks. "31 million Germans are on Facebook, and 79 per cent of them use the service at least once a day," Steil pointed out. He does not believe that the digital corporations will voluntarily ensure greater data protection. "Unless some tough decisions are reached in Brussels, we will not obtain a fair market." Weichert agreed with this opinion: "Alongside enhanced awareness in the population, regulation is a key tool." Specific areas of the brain can be triggered by addictive substances such as alcohol and cigarettes – and also by social media – and the government has enforced restrictions on such products. Data protection specialist Weichert sounded a word of warning: "The problem is that politicians often have no idea about the basic questions that determine our digital lives."

 
Klagges explains that many tools exist that offer greater data protection, such as the TOR network.

Surfing anonymously, making use of alternatives
So each person needs to achieve more informational self-determination and decide when, and with what restrictions, they publish personal data. "There are tools, such as the so-called TOR network, that allow you to surf anonymously," explained IT specialist Klagges. However, this involves a degree of effort that many people shy away from for the sake of convenience. Another option would be to use alternatives to Google for internet searches. And with e-mail services, there are also offerings that comply with data protection regulations, where content is not analysed and used for advertising purposes. At the same time, a person who has no overview of what information about themselves is known and stored is not really in a position to make independent decisions. Perhaps one day, as in the case of the census back in 1983, the Federal Constitutional Court will have to rule where the boundaries lie for the collection of personal data.

For the experts all agreed on one thing: the answer should not be to shift the responsibility entirely onto the citizen as the end user. Even assuming that future generations are equipped with adequate digital media competence, the government still needs to meet its protective duty by introducing regulations. Europe can play a pioneering role in this context – and Germany still has a long way to go.

The next Dialogue Forum will take place on 16 May 2018 on the subject of "Work environment 4.0 – Of robots and men".

23 April 2018