Vision

‘If something is free on the internet, you pay for it with your personal information’, according to the philosophy professor Jeroen van den Hoven. He is serving on a committee that advises the Royal Netherlands Academy of Arts and Sciences (KNAW) on the ethical and legal aspects of computer science research.

Constant streams of data can reveal the most remarkable patterns. For example, they can reveal the spread of communicable diseases, the level of confidence in the economy or how many people are withdrawing money from ATMs at a given time. Van den Hoven's point is that the big data revolution will help us to understand society.
There are also disadvantages, however, notes Van den Hoven. Privacy is becoming an enormous problem, and the current European laws are inadequate. One of their core principles is that permission is always required to use personal data. ‘But this is fighting a losing battle. Companies want to know everything there is to know about you. As a rule, if something is free on the internet, you pay for it with your personal information’.

We also play into this by granting all sorts of permissions to apps. Van den Hoven therefore expects a number of Snowden moments. ‘America lost a large share of its cloud market after Snowden. This data world is also bound to experience its own Chernobyls and Fukushimas. In response to Fukushima, Germany declared that it would stop using nuclear energy’. Van den Hoven sees another threat in the area of security: if we can work out the most convenient way to get rid of a virus, we could also work out the most deadly route it could take. Furthermore, if data are so valuable, who should be able to profit from them? ‘In principle, data should be open to the extent that they yield useful information in the common interest’, asserts Van den Hoven.

The Royal Netherlands Academy of Arts and Sciences (KNAW) would like to see guidelines for research in this high-tech domain involving apps, the internet, Facebook, Twitter and software (e.g. for facial recognition and emotion detection). Will we see a shift towards restrictions on the generation of all these data in the future? Van den Hoven thinks that we will. ‘You could allow a critical group to examine what you are planning to do with your research. If the risks are too great, you will have to do something else’. This would mean stricter assessment by the Netherlands Organisation for Scientific Research (NWO) and universities, or more intelligent solutions (e.g. data minimisation, anonymisation or pseudonymisation).

‘You don’t have to know who is who and who was where in order to study patterns of behaviour. Make technology such that it helps to prevent problems (e.g. with passwords, authorisation, deleting data after a certain time and saving log-in data). It is funny to see that information technology is also increasingly being used to determine whether big data are being abused, for example to investigate whether bank loan applications are turned down in particular postal code areas. Sousveillance: the watched watching the watchers’.

Photo © Sam Rentmeester.

