IGF 2019 – Dynamic Coalition on Child Online Safety
26 Nov 2019 09:30h - 11:00h
Event report
[Read more session reports and live updates from the 14th Internet Governance Forum]
The discussion was opened by Ms Marie Laure Lemineur, Head of the Global Programme Combating Sexual Exploitation of Children Online at ECPAT International, who reminded the audience that ECPAT is a network of 40 NGOs based in many countries, combating all forms of sexual exploitation, especially the sexual exploitation of children online.
The session focused on how to identify and remove child sexual abuse material (CSAM) and terrorist propaganda. Special attention was given to the protection and wellness of hotline operators.
In their interventions, all the panellists shared the best practices their respective organisations have implemented for moderators’ protection and psychological wellness. Overall, three main considerations were put forward. First, all the speakers agreed that repeated exposure to disturbing content has a significant impact on hotline personnel’s psychological well-being; it is therefore important for employers to highlight moderators’ social role, value their work, and stress its positive contribution to society. Second, preventive steps are essential to avoid stress and burnout. For example, Mr Marco Pancini, Director of EU Public Policy at Google, explained that Google ensures its moderators have access to professional help and mindfulness meditation sessions, so that employees can step back from the individual tasks they perform and the stress associated with them. Third, managers are trained to identify signs of stress and to remain open to feedback from their moderators.
Mr John Carr, ECPAT International, UK, drew the audience’s attention to Dr Sarah T. Roberts’ work ‘Digital Refuse: Canadian Garbage, Commercial Content Moderation and the Global Circulation of Social Media’s Waste’. He explained that it investigates how western countries’ waste (e.g. old TV sets and refrigerators) is shipped to developing countries, and that it points out a striking correlation: the countries receiving this physical waste are often the same countries where commercial content moderators, who handle social media’s ‘waste’, are recruited. Carr concluded by noting the significant difference between the terms and conditions under which moderators work in the countries receiving this ‘digital dump’ and those of moderators employed at headquarters in California or Seattle, even though both groups belong to the same company.
Mr Michael Tunks, Policy Manager at the Internet Watch Foundation (IWF), explained that the IWF is a ‘hotline’ which works to remove child sexual abuse material in the UK. The IWF is a self-regulatory body working closely with the Internet industry, the government, and law enforcement agencies. He reiterated the importance of ensuring moderators’ welfare, given their constant exposure to disturbing content, and highlighted the important role of technology in reducing that exposure. For example, when disturbing content is uploaded, it passes through a series of web crawlers that find duplicates or similar images, facilitating their removal without requiring analysts to view the same material repeatedly.
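The duplicate-matching step described above can be illustrated with a minimal sketch. The snippet below is purely illustrative and is not the IWF’s actual pipeline: it flags crawled files whose cryptographic hash matches a list of already-classified material, sparing analysts from reviewing the same image twice. In practice, hotlines rely on robust perceptual hashes (such as Microsoft’s PhotoDNA) that also catch re-encoded or slightly altered copies; all function and path names here are hypothetical.

```python
# Illustrative sketch only: hash-based duplicate detection of the kind
# described above. Real deployments use perceptual hashing (e.g. PhotoDNA)
# rather than the exact cryptographic hash shown here, so that resized or
# re-encoded copies are also caught. All names below are hypothetical.
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_known_duplicates(crawled_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Flag crawled files whose hash matches previously classified material,
    so no analyst has to view an already-assessed image again."""
    return [
        p for p in crawled_dir.iterdir()
        if p.is_file() and sha256_of_file(p) in known_hashes
    ]

# Hypothetical usage: the hash list would come from an assessment database
# maintained by trained analysts.
# matches = flag_known_duplicates(Path("crawled"), known_hashes)
```

The design point is that matching happens against hashes, never against the images themselves, so already-classified material can be queued for removal automatically and human exposure is limited to genuinely new content.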
Ms Karuna Nain, Global Safety Programs Manager at Facebook, joined the other panellists in stressing the importance of moderators’ well-being. She noted that there is no ‘one-size-fits-all’ approach to this goal; rather, the issue needs to be tackled from different angles. She also clarified that the provision of psychological support to moderators is a contractual requirement at Facebook.
The session was concluded by an intervention from Mr Larry Magid, member of the Safety Advisory Boards of Facebook, Twitter, and Snapchat. He drew the audience’s attention to the fact that employees of organisations dealing with missing and exploited children also receive the same kind of support as moderators. He stressed that not only the established technology industry, but also start-ups, must take responsibility for ensuring appropriate resources and working conditions for their employees.
By Marco Lotti