The EU’s battle against illegal hate speech made important strides forward during the past year, with 90% of all flagged content being assessed within 24 hours, while 71% of the content deemed to be illegal hate speech was removed.
These numbers came out of the European Commission annual evaluation of the Code of Conduct on countering illegal hate speech online.
The report notes, however, that the platforms need to further improve transparency and feedback to users. They also need to ensure that flagged content is evaluated consistently over time: separate, comparable evaluations carried out over different periods showed divergences in performance.
Věra Jourová, Vice-President for Values and Transparency, said: “The Code of Conduct remains a success story when it comes to countering illegal hate speech online. It offered urgent improvements while fully respecting fundamental rights. It created valuable partnerships between civil society organisations, national authorities and the IT platforms. Now the time is ripe to ensure that all platforms have the same obligations across the entire Single Market and to clarify in legislation the platforms’ responsibilities to make users safer online. What is illegal offline remains illegal online.”
Didier Reynders, Commissioner for Justice, said: “I welcome these good results. We should, however, not satisfy ourselves with these improvements and we should continue the good work. I urge the platforms to close the gaps observed in most recent evaluations, in particular on providing feedback to users and transparency. In this context, the forthcoming Digital Services Act will make a difference. It will create a European framework for digital services, and complement existing EU actions to curb illegal hate speech online. The Commission will also look into taking binding transparency measures for platforms to clarify how they deal with illegal hate speech on their platforms.”
The fifth evaluation shows that on average:
- 90% of flagged content was assessed by the platforms within 24 hours, up from just 40% in 2016.
- 71% of the content deemed to be illegal hate speech was removed in 2020, compared with only 28% in 2016.
- The average removal rate, similar to the one recorded in the previous evaluations, shows that platforms continue to respect freedom of expression and avoid removing content that may not qualify as illegal hate speech.
- Platforms responded and gave feedback to 67.1% of the notifications received, up from 65.4% in the previous monitoring exercise. However, only Facebook informs users systematically; all the other platforms need to improve.
The results obtained from four years of implementing the Code of Conduct will feed into the ongoing reflections on how to strengthen measures addressing illegal content online in the future Digital Services Act package, on which the Commission recently launched a public consultation.
The Commission will also consider ways to prompt all platforms dealing with illegal hate speech to set up effective notice-and-action systems.
In addition, in 2020 and 2021 the Commission will continue to facilitate dialogue between IT companies and civil society organisations working on the ground to tackle illegal hate speech, in particular to foster engagement with content moderation teams and mutual understanding of local legal specificities of hate speech.