On 6 and 7 June, DG CNECT held the Next Generation Internet Summit, part of a series of actions (including a consultation to which COFACE Families Europe responded) aimed at building a strategy to foster the development of the internet as a powerful, open, data-driven, user-centric, interoperable platform ecosystem, for the benefit of both companies and citizens. Three main topics were investigated during the conference:
- New technologies disrupting the economy: business, employment and skills
- New technologies disrupting the public sphere: information, democracy and new media (in which COFACE participated)
- New technologies blurring online and offline worlds and disrupting the personal sphere
A number of issues were raised during the debates and COFACE actively contributed to the discussion.
The issue of “fake news” and how to deal with it has been discussed at length at all levels, and major players such as Google and Facebook have been pressured into addressing the phenomenon, which is largely exacerbated by the use of their networks. So far, these platforms have chosen to “flag” news that has been identified as “disputed”, to lower the exposure of “fake news” by adjusting their algorithms, and to cut the advertising revenue which is the bread and butter of many “fake news” outlets. However, this does not solve the very sensitive question of what counts as “fake news”, nor how to respect freedom of speech and freedom of the press.
COFACE Families Europe has advocated a solution expressed by MEP Eva Kaili: instead of censoring “fake news”, use algorithms to diversify users’ newsfeeds, exposing them to a diversity of opinions and breaking the “filter bubble”. At present, the algorithms used by most online services are configured to show a user content similar to what he/she already likes or shares, which only reinforces exposure to “fake news” for certain user profiles. A diversifying algorithm could, for instance, display articles on the same topic from different media outlets and viewpoints. More generally, COFACE has advocated that algorithms be “liberalized”, in other words, that users have more control over the algorithms that filter what they see online.
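A diversification step of this kind could be sketched roughly as follows. This is a minimal illustration only, assuming a made-up `Article` record and `build_feed` function; no real platform’s ranking system or API is implied:

```python
# Illustrative sketch of newsfeed diversification: rank most of the feed by
# engagement (similarity to what the user already likes/shares), but reserve
# a share of slots for the same topics covered by outlets not yet in the feed.
# All names and the scoring scheme are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Article:
    topic: str
    outlet: str
    engagement_score: float  # how closely it matches the user's past activity

def build_feed(candidates, feed_size=10, diversity_share=0.3):
    """Fill most slots by engagement score, then add articles on the same
    topics from outlets that are not already represented in the feed."""
    ranked = sorted(candidates, key=lambda a: a.engagement_score, reverse=True)
    n_diverse = int(feed_size * diversity_share)
    feed = ranked[: feed_size - n_diverse]
    seen_outlets = {a.outlet for a in feed}
    topics = {a.topic for a in feed}
    alternatives = [a for a in ranked[feed_size - n_diverse:]
                    if a.topic in topics and a.outlet not in seen_outlets]
    feed.extend(alternatives[:n_diverse])
    return feed
```

The design choice here is that diversity is introduced per topic, so a user interested in, say, an election still sees election coverage, but from outlets outside their usual bubble.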
But the problem of “fake news” extends to all media, not just a select few platforms or outlets. The mainstream press cannot be considered exempt. Freedom of the press has historically been defined vis-à-vis State/government control. Now, the “new” threat to freedom of the press is its underlying business model and the private ownership/corporate capture of many press and media outlets, which inevitably influences the content, quality and impartiality of the news.
“Fact checking” initiatives are useful, but only in an environment where people are receptive to critical thinking. Dire socio-economic conditions, for instance, are fertile ground for “fake news”, and fact checking alone will not help.
Technology forces us to revisit old problems
Advances in technology such as automation, decentralized online services and blockchain, and the challenges that come with them, will inevitably reopen long-standing debates: inequality and the gap between rich and poor, the distribution of the added value of production, and ownership and control over the means of production. Problems such as “fake news”, cyber security and ransomware are mere symptoms of larger problems such as growing inequality and social exclusion.
Code is law
In the near future, and especially with the advent of blockchain technology, the saying “code is law” will become even more true than before. Developers and coders will be able to “bake” certain rules into their code, and thus influence both the experience people have online and the rights and responsibilities users have.
Democratic decision making can be embedded in code. Wikipedia and community-based moderation are examples of solutions built around more participation and democracy, directly embedded in the functioning of these services. Blockchain is also an example of code with embedded democratic principles: compromising it requires control of more than 50% of the network. The most important thing is to ensure that the technologies of tomorrow (notably the decentralized ones) are coded with these democratic/participatory principles in mind, so that internet users have a say and can participate in and contribute to shaping the future of the internet.
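The majority principle behind this can be shown with a toy model. This is not a real blockchain; the vote-counting function and chain names are assumptions for illustration only:

```python
# Toy illustration of majority consensus: each node "votes" for the version of
# the ledger it considers valid, and the network adopts the version backed by
# the most nodes. An attacker controlling less than half of the network
# therefore cannot impose a forged history on its own.

from collections import Counter

def network_consensus(node_votes):
    """node_votes: list of chain identifiers, one per node.
    Returns the chain the network accepts (the one with the most votes)."""
    counts = Counter(node_votes)
    chain, _ = counts.most_common(1)[0]
    return chain

honest = ["chain-A"] * 6        # honest majority
attacker = ["chain-FORGED"] * 4  # attacking minority
network_consensus(honest + attacker)  # the honest majority's chain prevails
```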
Blockchain is full of promise
Blockchain technology will remain at center stage in future reflections on the “next generation internet”.
Among the many things blockchain might enable in the near future is disintermediation: the removal of third parties from interactions between people, consumers, businesses, etc. For instance, blockchain could make it possible to pay without going through a bank, or to offer car-sharing services without paying a fee to platforms like Uber.
A good overview of the disruptive impact of blockchain can be viewed here.
Decentralized technologies based on blockchain, and specifically virtual currencies, might become another element of the “checks and balances” system that keeps governments from abusing their power, alongside the separation between the executive, the legislative and the judiciary. Virtual currencies will never fully replace official, government-backed currencies, but they could be used as a form of “boycott” against a government which no longer has popular support. Virtual currencies may also help address inequality, by enabling the poor to trade and exchange goods and services among themselves, bypassing currencies in which a very small number of participants control most of the available wealth.
There are many challenges ahead for blockchain technology, especially the issue of liability. When third parties are removed, it is unclear where liability lies. For instance, if your blockchain payment account gets hacked, who is liable? Who can you turn to? With a bank, there is a clearly identifiable entity which bears legal responsibility if your funds are stolen or lost.
The internet and its impact on politics
We should not underestimate the impact of social media and the internet on democracy, politics and political campaigns, nor where it is leading us: sophisticated polling and individualized campaigning based on people’s preconceptions and ideas. In essence, instead of politicians who campaign on novel ideas and visions of the future, social media allows politicians simply to “test the waters”, see what people already think, and try to convince them, through individual targeting, that they will act in accordance with those preconceptions or stereotypes, with little regard for the coherence of their political project. Social media is used to manipulate public opinion not only by political parties and politicians during election campaigns but more broadly, for instance in international propaganda battles.
See this article for more information about this phenomenon.
Beyond the GDPR and data portability
The ultimate goal in data protection and privacy, and in solving the “lock-in” effect of certain online service providers, is to separate the “data” layer from the “services” layer. This means that instead of posting a video on Youtube, you would upload it to a cloud service of your choice (a commercial cloud service, a decentralized/free cloud service, or even your own home server) and grant Youtube authorization to access that video, an authorization which could be withdrawn at any time. You could give the same authorization to other video hosting services. The same applies to any online service, such as social networks: your posts could be hosted on your personal cloud, and you could authorize multiple social networks to access them. Such a proposal is a direct extension of the current “data portability” right in the GDPR, but pushes its logic further. However, we must also make sure that strong data protection regulations are in place, especially regarding the use online services and companies make of such data, because this model could exacerbate the scope and breadth of the data they can access. Uses of data for calculating risk premiums or making predictive analytics which violate individual rights and freedoms should either be banned or strongly restricted.
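The grant-and-revoke mechanism described above can be sketched in a few lines. This is a hypothetical personal data store, not any existing product; the class and method names are illustrative assumptions:

```python
# Sketch of separating the "data" layer from the "services" layer: the user
# keeps content on storage they control and hands out revocable access tokens
# to individual services. All names here are hypothetical.

import secrets

class PersonalDataStore:
    def __init__(self):
        self._items = {}    # item_id -> content, held on the user's own storage
        self._grants = {}   # token -> (service_name, item_id)

    def upload(self, item_id, content):
        self._items[item_id] = content

    def grant_access(self, service_name, item_id):
        """Issue a service a token for one item; the user keeps the master copy."""
        token = secrets.token_hex(8)
        self._grants[token] = (service_name, item_id)
        return token

    def revoke(self, token):
        """Withdraw an authorization at any time, as the text proposes."""
        self._grants.pop(token, None)

    def fetch(self, token):
        """A service presents its token; access works only while the grant stands."""
        if token not in self._grants:
            raise PermissionError("access revoked or never granted")
        _, item_id = self._grants[token]
        return self._items[item_id]
```

The key property is that the same item can be granted to several competing services at once, and revoking one token does not affect the others, which is what breaks the “lock-in” effect.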
COFACE Families Europe has initiated a reflection on all of these issues in its recent conference, Families on the Move. Download the conference report here.