Editorial: Big data

Relative to the global population, the per capita capacity for digital information storage has roughly doubled every 40 months since the 1980s. According to Wikipedia, 2.5 exabytes (2.5 × 10¹⁸ bytes) of data were created every day in 2012. Three years earlier, the amount of digitally stored data from sources such as social networking sites, smartphones, remote sensing devices and government surveillance was estimated at 500 exabytes. According to a calculation by a journalist at The Guardian, if all of this data were printed in books, it would fill 10 stacks reaching from Earth to Pluto. The term ‘big data’ is clearly justified. All of this data presents opportunities to gain new insights into economic trends, disease prevention, crime fighting and other areas. Analysing it, however, requires expertise: in 2014, the National Think Tank identified a shortage of data scientists in the Netherlands.

In an effort to interest science students in the field, two Think Tank participants are organising the first workshop on data science. In Delta Magazine, Sabine Roeser, Chair of the Human Research Ethics Committee at TU Delft, cautions against forgetting the other side of the data mines: ‘They often involve privacy-sensitive information. It is also important to consider how we can be certain that everyone from whom the data were collected has provided informed consent. The ethical aspects of working with big data are largely unexplored territory’. In this issue of Delft Outlook, we explore how TU Delft is applying big data to societal problems.

Frank Nuijens

Photo: Sam Rentmeester
