Data donation: Auditing socially relevant algorithms
17 Dec 2017 13:00h - 14:00h
Event report
This session was organised by Algorithm Watch, a German organisation that researches automated decision-making and its relevance for society. Two of the organisation’s founders, Mr Matthias Spielkamp and Ms Lorena Jaume-Palasi, explained the organisation’s work and highlighted one of its current projects: an analysis of Google search result personalisation in the context of the 2017 German elections.
Spielkamp explained that discussions on automated systems and their social influence are gaining prominence and are now debated on a global scale. Yet there is still a lack of evidence and concrete research on how algorithms work and how they influence society. One question that has arisen, mainly in the context of political elections, concerns algorithmic personalisation and the related hypothesis that search results appear different to everyone, confirming pre-existing world views.
Algorithm Watch designed a browser plugin that collected the Google search results of 16 queries related to the German elections – such as the names of political leaders and their parties – from each user who had installed it. In the end, the organisation collected more than 5 million search results. Both the plugin and the datasets were made publicly available so that others could contribute to further research on this topic.
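The privacy-preserving shape of such a data donation can be sketched as follows. This is a minimal, hypothetical illustration, not Algorithm Watch's actual plugin code: the function name, field names, and sample queries are assumptions, chosen to match the report's description that only the query and its results – no user identifiers – were submitted.

```python
# Hypothetical sketch of a data-donation payload. Field names and the
# sample queries are illustrative assumptions, not the real plugin's schema.
from datetime import datetime, timezone

# A few election-related queries of the kind the report describes
# (the actual study used 16 such queries).
QUERIES = ["Angela Merkel", "CDU", "Martin Schulz", "SPD"]

def build_donation(query, organic_urls):
    """Package one query's results for donation, without personal data."""
    return {
        "query": query,
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "organic_results": organic_urls,  # ranked list, top to bottom
        # deliberately no user ID, cookies, or browsing history
    }

payload = build_donation("CDU", ["cdu.de", "wikipedia.org/wiki/CDU"])
print(sorted(payload.keys()))
```

The design choice the report highlights is visible in the payload itself: because no user identifiers are attached, donations from privacy-conscious users can be included, at the cost of being unable to segment results by demographics later.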
Although the data is still being examined, the preliminary results demonstrate that the degree of personalisation is relatively limited: approximately 8 out of the 9 organic search results did not differ between users, and the results that did differ most likely reflect the user’s geographic location rather than political viewpoints. This is in line with Google’s own statements about the degree to which it personalises search results. Having examined the results of Google’s search engine, the organisation is now conducting similar research on Google News, where the preliminary results seem to show greater differentiation between users.
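The kind of comparison behind the "8 out of 9" figure can be sketched as a simple overlap count between two users' organic result lists for the same query. This is an assumed, simplified illustration; the domains below are invented and the actual study's analysis will have been more involved.

```python
# Hypothetical sketch: how much do two users' organic results for the
# same query overlap? All data here is invented for illustration.

def overlap(results_a, results_b):
    """Count URLs appearing in both users' organic result lists."""
    return len(set(results_a) & set(results_b))

user_a = ["cdu.de", "wikipedia.org/CDU", "spiegel.de/cdu", "zeit.de/cdu",
          "faz.net/cdu", "tagesschau.de/cdu", "welt.de/cdu",
          "sueddeutsche.de/cdu", "local-news.de/cdu"]
# Same top 8 results, with one slot varying (e.g. by location):
user_b = user_a[:8] + ["regional-paper.de/cdu"]

print(f"{overlap(user_a, user_b)} of {len(user_a)} organic results shared")
# prints "8 of 9 organic results shared"
```

Aggregating such pairwise overlaps across all donated result sets gives a crowd-level picture of personalisation without needing access to Google's internals.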
The discussion raised many questions about the details of the research process as well as about search result personalisation itself. For example, the results related to Alternative for Germany, one of the newer and more controversial political parties in Germany, showed more variation between users than those for the more established parties, most likely because the other parties have accumulated a very stable information base on Google over the last couple of years. It was suggested that future research might usefully look at policy topics that feature highly on the election agenda, such as migration, rather than at politicians and their parties, as these search results might show more variety.
The topic of data protection was also raised, and Jaume-Palasi and Spielkamp explained that they did not collect any personal data, only the search results of the 16 queries. This presented limitations, as they do not know whether the people who downloaded the plugin are representative of the German population, and they are unable to break down the search results by gender, age, or geographic origin. At the same time, by not collecting personal data, the study was able to include people who would otherwise be reluctant to share their data with third parties.
Auditing socially relevant algorithms is a multidisciplinary effort, comprising not only technical analysis of the source code, but also of the data on which it is based, the legal and ethical dimensions, and the context in which the algorithm operates. Ultimately, the crowdsourced nature of the project demonstrated that, although algorithms may seem like black boxes, a joint effort by society can clarify some of their inner workings, even without completely reverse-engineering them.
By Barbara Rosen Jacobson