Misinformation, responsibilities and trust
27 Nov 2019 15:00h - 16:30h
Event report
Misinformation has become a pressing issue on the agenda of governments around the world. ‘Different forms of misinformation and distrust take different shapes even in a global era’, according to Ms Minna Horowitz (University of Helsinki). This means there is no silver bullet that can solve the problem once and for all; rather, sharing experiences and reflections can improve the measures adopted to counter misinformation.
Horowitz pointed out that the phenomenon of misinformation is heterogeneous because different contexts have different vulnerabilities. She suggested that the issue be looked at through three levels of analysis: macro (states), meso (media systems), and micro (individuals). A simple example of a macro vulnerability: if a society is going through turmoil, there are more opportunities for misinformation to spread. At the micro level, by contrast, the vulnerability lies with individuals: citizens who are not media literate are more likely to believe false information.
Highlighting the granularity of the issue, several panellists commented on how the nuances of misinformation depend on the platform through which it spreads. When the main channel is a private messaging app such as WhatsApp, people receive messages almost exclusively from people they know, which makes it easy for false information to be perceived as true: when information is sent by someone you trust, you are more likely to believe there is some truth to it. By contrast, on other platforms such as Twitter, users interact with information from other users in a different way.
The session covered a wide range of experiences with misinformation in diverse contexts. Panellists from India and Kenya addressed the particularities of countries where democratic processes are often not guaranteed. One of the panellists explained: ‘In a culture where rule of law is always an issue and the craziest things you can think of are quite believable, I think it is even more difficult to distinguish fake news from real news. The average person (…) might actually believe that the government is capable of doing something like this or some private company is out to do this sort of thing’. On a different note, a participant pointed out that in Middle Eastern countries hoaxes tend to revolve around religious topics, which makes it difficult to debunk them without offending people’s sensitivities.
Which initiatives taken so far can be considered good practices? There was reasonable consensus that multistakeholder and multidisciplinary approaches are needed to tackle misinformation. Technology is often put on a pedestal, with people expecting a technological fix, but technology alone cannot solve complex societal issues. Promoting media literacy and fact-checking are good practices that take us in the right direction, but they are not sufficient either. As Ms Amrita Choudhury (Director of CCAOI) put it: ‘In the digital world there are best practices that an individual can follow if you can teach them how to practice that. But at the end of the day, you also need the people who are providing those services to be a bit more responsible and the government who looks after things to at least protect your interests, not only their interests.’
The main conclusion from the session was that it is important to understand the nuances of the misinformation phenomenon. When thinking of potential solutions, for instance, it is relevant to consider how different contexts, age groups, and platforms interact with information. We live in a global society and, at the same time, we still have our national, regional, and local contexts, Horowitz commented. There was also a call to push back on regulatory and legal tools and to focus instead on collaboration, self-governance, quality journalism, and fact-checking. Mr Yongjiang Xie (Cyber Security Association of China) said that the law should be the last measure for dealing with fake news.
By Paula Szewach