Published on: 21.12.17

IGF Day 3: Dynamic Coalitions Day

Author: Jutta Croll, Organisation: Stiftung Digitale Chancen

IGF Tag 3: Dynamic Coalitions Day (deutsche Version des Artikels)

IGF session

21.12.2017. The third day of the IGF started with the Main Session of the Dynamic Coalitions. These are groups formed by organizations and individuals who share common interests and work together on their issues. Altogether 17 groups are currently active, and their composition also reflects the multi-stakeholder principle.

The Dynamic Coalition on Child Online Safety (DC-COS) was established in Rio de Janeiro in 2007. It is coordinated by ECPAT International and works on all questions related to risk potentials and protection measures for children on the internet. In 2007 the Digital Opportunities Foundation represented the European project Youth Protection Roundtable at the IGF, and we have been affiliated with the Dynamic Coalition ever since to take part in the international exchange on youth media protection.

In the Main Session, the working steps and results achieved in the twelve months since the IGF in Guadalajara were presented and discussed. The Dynamic Coalition on Child Online Safety focused on the Declaration of Rome and the coalition members' input to this document. Overall, synergies between the individual coalitions became clear and shall be harnessed even more in the future. These include issues such as net neutrality, public access, and a general concept for consent to online activities discussed under the term “Consent by Design”.

In the afternoon, the Dynamic Coalition's workshop on Child Online Safety took place, with the monitoring of content as its topic. John Carr explained on behalf of ECPAT International that platform providers have stepped up their efforts to delete illegal content. At the same time, there are reports about freelancers in South East Asia, mostly working under bad conditions, who perform this task. According to John Carr, it is unacceptable that the Western world is dumping its content waste, as it does with its electronic garbage, in developing countries, where poorly paid women, sometimes even in the presence of their children, have to view and sort out images depicting extreme violence and the sexual abuse of children.

Karuna Nain (Facebook) and Marco Pancini (Google) explained how their companies deal with complaints about content. The biggest challenge is the sheer volume of reports. All illegal content is deleted after careful examination, but much of the reported content violates neither the law nor the platforms' terms of service. Michael Tunks, Internet Watch Foundation, described the staff welfare programme for employees who deal with large amounts of intolerable material. Larry Magid, a member of the safety advisory boards of Facebook and other large companies, said he had not been aware that people monitoring such content on a freelance basis are not covered by welfare programmes of this kind. Several of the approximately 50 workshop participants confirmed this from their own experience. At the same time, they critically questioned the fact that it is increasingly up to the platform providers to decide whether content qualifies for deletion. Catrin Bauer-Bulst from the European Commission explained in response that there is no monitoring obligation in Europe, but that companies, once they become aware of illegal content, must follow applicable law. Unlike in the USA, the E-Commerce Directive does not provide a “Good Samaritan” rule exempting platform providers from liability for the decision whether content is deleted or allowed to stay online.

The speeches and discussions showed the manifold facets of the topic and demonstrated the great need for comprehensive strategies and holistic protection concepts for more online safety of children and young people. There is a lot to do for the Dynamic Coalition on Child Online Safety in 2018!

Further information

Source: Stiftung Digitale Chancen

Youth protection, censorship of content