
Exchange of views with the ICT Coalition for Children Online

On the 20th of February, the ICT Coalition for Children Online (a group of ICT companies such as Google, Facebook and others, committed to integrating child online safety into their services and devices) organised a Forum to exchange views on the latest developments and initiatives around child safety online. Several companies and organisations presented their activities:

  • Orange presented their #supercoders initiative: 20,000 kids have participated in coding workshops since 2014, across 20 countries as of 2018.
  • Twitter presented the progress they have made since 2017 in addressing abuse and harmful content on the platform, including the ability to mute or block accounts or hashtags and the option to export block lists so that users can coordinate the blocking of abusive accounts. On privacy and data protection, users now have the option to disable the sharing of private data. Twitter has also partnered with CoderDojo to promote coding among children.
  • Parent Zone showed the digital literacy and critical thinking tool they developed in partnership with Vodafone, which helps parents and children tell fake news from real news online and flag it.
  • European Schoolnet presented the outcomes of Safer Internet Day 2018, a global success with a social reach of 2.7 million people across 350 cities worldwide.
  • The European Commission presented its Digital Education Action Plan and its broader strategy for creating a better Internet for kids. The main topics of focus will be fake news, online radicalisation and online relationships, as well as a SaferInternet4EU Ambassadors initiative to help spread the Commission’s work.

Martin Schmalzried, representing COFACE Families Europe, presented a number of recommendations and trends to focus on going forward:

1/ More user control:

  • Going from the “right to be forgotten” to the “right to change your mind”: give users the option to automatically hide their posts after a set amount of time. This would be especially useful for children, who change a great deal as they grow into teenagers.
  • Control over algorithms and content filtering: open up the options users have over how the content they see is sorted. This could mean letting users select open-source or alternative algorithms developed by third parties, or sort their content by “standard” methods such as “sort by date” (a rough sketch of such a feed pipeline, covering the auto-hiding idea above as well, follows this list).
  • Control over data and privacy: as the GDPR’s date of application nears, companies need to rethink how they handle user data, and the balance between online and offline. For instance, for connected toys or Internet of Things devices, could some features work without a connection to the Internet, relying on the local area network alone? Recording a child’s play session, something which might be very sensitive, could be done on a hard drive on the family’s local area network instead of being uploaded to the company’s servers. Gradually, we could envisage a Web where data is fully under users’ control (for instance, all data hosted on their personal cloud), with users granting various online services permission to access select parts of that data.
  • Not all online users are digitally literate or technically savvy, so even if companies develop or introduce new features, there is no guarantee they will be used. A system similar to the “content curators” described below could help in that respect: delegating default settings to a trusted third party (importing the “default settings” of a person you trust).
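
To make the first two bullets above more concrete, here is a minimal sketch, in TypeScript, of what such a user-controlled feed pipeline could look like. Everything in it is hypothetical (the Post and FeedSettings shapes, the hideAfterDays setting, the buildFeed function): no existing platform exposes this exact interface, and it is a sketch of the idea, not an implementation of any company’s feed.

```typescript
// Hypothetical sketch only: no existing platform exposes this interface.

interface Post {
  id: string;
  createdAt: Date;    // when the post was published
  engagement: number; // e.g. likes + shares, used by the default sort
}

interface FeedSettings {
  // "Right to change your mind": posts older than this many days are
  // hidden; null means posts never expire. In a real system this would
  // be set by the post's author, not the viewer.
  hideAfterDays: number | null;
  // The sort algorithm the user picked: a built-in such as "sort by
  // date", or a comparator shipped by a third party.
  sort: (a: Post, b: Post) => number;
}

// Two "standard" algorithms a user could choose between.
const sortByDate = (a: Post, b: Post) =>
  b.createdAt.getTime() - a.createdAt.getTime();
const sortByEngagement = (a: Post, b: Post) => b.engagement - a.engagement;

const MS_PER_DAY = 24 * 60 * 60 * 1000;

// Build the feed: drop expired posts, then apply the chosen sort.
function buildFeed(
  posts: Post[],
  settings: FeedSettings,
  now = new Date()
): Post[] {
  const visible = posts.filter((p) => {
    if (settings.hideAfterDays === null) return true;
    const ageInDays = (now.getTime() - p.createdAt.getTime()) / MS_PER_DAY;
    return ageInDays <= settings.hideAfterDays;
  });
  return [...visible].sort(settings.sort);
}

// Usage: a chronological feed with posts auto-hidden after 30 days.
// const feed = buildFeed(allPosts, { hideAfterDays: 30, sort: sortByDate });
```

The point of the sketch is the design choice: if the sort order is exposed as a plain, swappable function, third parties could ship alternative or open-source algorithms that users simply drop in.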

2/ Increase trust:

  • Content curators: parents, and users in general, no longer trust algorithms to provide them with the best or most relevant content. New business models and dedicated features should be created for content curators, who would receive remuneration for compiling online content that corresponds to certain core values or standards and that other Internet users could trust.
  • Fake news: the debate over which news sources and which information can be “trusted” online has grown considerably in recent years. However, there is no easy solution to fake news, and policy makers and companies should be wary of adopting a single methodology for identifying “trusted” news sources. Rather, two main actions should be taken. The first is to ensure that all users receive diversified exposure to news, which can be done via algorithms: for any news story a user reads, other articles about the same story should be displayed to show that there are diverging points of view (a rough sketch of this idea follows below). Second, critical thinking skills should go beyond checking sources and facts to identifying the (political or commercial) objectives behind any news story or content. Is the article trying to “sell” something? Is there a political or ideological objective behind it?
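
As an illustration of that first action, here is a minimal sketch of diversified exposure, again with entirely hypothetical names (Article, storyId, diversifiedCoverage). Note that grouping articles into stories (the storyId below) is itself a hard clustering problem that the sketch simply assumes solved.

```typescript
// Hypothetical sketch: alongside any article a user reads, surface
// coverage of the same story from other outlets.

interface Article {
  id: string;
  storyId: string; // identifier of the underlying news story (assumed given)
  outlet: string;  // publisher of this article
  url: string;
}

function diversifiedCoverage(
  read: Article,
  corpus: Article[],
  maxSuggestions = 3
): Article[] {
  const suggestions: Article[] = [];
  const seenOutlets = new Set([read.outlet]);
  for (const a of corpus) {
    if (a.storyId !== read.storyId || a.id === read.id) continue;
    // At most one article per outlet, and never the outlet already
    // read, so the user sees genuinely different points of view.
    if (seenOutlets.has(a.outlet)) continue;
    seenOutlets.add(a.outlet);
    suggestions.push(a);
    if (suggestions.length >= maxSuggestions) break;
  }
  return suggestions;
}
```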

3/ Explore alternative business models:

  • Online advertising is the predominant business model, but it has resulted in a “race to the bottom”: too much advertising, clickbait, and the fuelling and exacerbation of problems such as fake news. Alternatives worth exploring include Patreon-style online crowdfunding platforms, cryptocurrency mining, and digital tokens (like the Basic Attention Token) which reward content creators based not on the number of clicks but on the time spent looking at content, promoting quality over quantity (a rough sketch of this settlement logic follows below). Overall, users should have a choice of monetisation models for the digital content and services they use.
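
Purely as an illustration of the time-based idea (not of the Basic Attention Token’s actual protocol), here is a sketch of how a user’s monthly budget could be split among creators in proportion to attention time rather than clicks; the names (AttentionLog, settleMonth) are hypothetical.

```typescript
// Hypothetical sketch of time-based remuneration: a user's monthly
// budget is split among creators in proportion to attention time.

// Seconds of attention per creator, measured over the month.
type AttentionLog = Map<string, number>;

function settleMonth(
  attention: AttentionLog,
  monthlyBudget: number // in tokens, or any currency unit
): Map<string, number> {
  let total = 0;
  for (const seconds of attention.values()) total += seconds;

  const payouts = new Map<string, number>();
  if (total === 0) return payouts; // nothing watched, nothing paid
  for (const [creator, seconds] of attention) {
    payouts.set(creator, (monthlyBudget * seconds) / total);
  }
  return payouts;
}

// Example: 10 tokens split across three creators by attention share.
const log: AttentionLog = new Map([
  ["creatorA", 3600], // 1 hour
  ["creatorB", 1800], // 30 minutes
  ["creatorC", 600],  // 10 minutes
]);
console.log(settleMonth(log, 10)); // creatorA: 6, creatorB: 3, creatorC: 1
```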

4/ Be careful about “web mobbing”:

  • While there is plenty of reprehensible content, and there are plenty of reprehensible people, online, we should collectively reflect on how to make sure the Web doesn’t become a “Medieval village” where users can be victims of “digital stoning”. We should not underestimate the importance of the rule of law and legal due process, and we should address the issue of online users delivering “digital justice” directly through harassment, cyberbullying or the public shaming of other users. The most recent example is the outcry over sexual harassment and the #MeToo tweet campaign. While sexual harassment is a very serious problem, online users should not be the ones to decide how to “punish” sex offenders or people harassing women. This is only one recent example; situations where children become victims of mass harassment or shaming for posting something silly are not rare and can cause real harm and trauma.

For more information about the ICT Coalition, visit the official website, or contact our Senior Policy and Advocacy Officer, Martin Schmalzried: mschmalzried@coface-eu.org

© Picture by photoroyalty / Freepik
