
COFACE speaking on Privacy and the Digital Single Market

On the 24th of April, COFACE Families Europe, represented by Martin Schmalzried, spoke at the Federation of European Direct and Interactive Marketing (FEDMA) event on privacy and the digital single market, hosted and moderated by MEP Axel Voss. COFACE Families Europe spoke on the first panel, entitled “Privacy and online services: can users have it all?”, and covered a range of issues related to online business models that are over-reliant on advertising and how this affects privacy.

  • Diversifying away from advertising as the sole business model

While it is understandable that advertising has become the dominant business model online for a variety of reasons, especially the ability to generate more revenue by gradually increasing the volume of advertising, it has its disadvantages. The quality of online content has dropped significantly, since attracting users with “catchy” titles or pictures (also known as click-baiting) is enough to generate revenue. It is essential to start looking at the other business models available in order to achieve a certain balance and prevent the advertising model from destroying itself. Examples include donation-based (Wikipedia), subscription-based (Medium, Netflix) and cryptocurrency-based (Musicoin, Steem.it, Artbyte, BAT – Basic Attention Token) models.

  • Forget about consent

While the GDPR (General Data Protection Regulation), which applies from the end of May 2018, and the e-Privacy Directive currently under revision focus heavily on consent (freely given, explicit…), it is important to remember that users are not in a position to refuse to give it. If they do not provide consent, they are de facto excluded from certain key online services, which may lead to online exclusion and isolation. The focus should instead be on which uses online service providers may make of the data that users generate. Should advertising targeted on political preferences, aimed at convincing (manipulating) people into voting for a specific party, be allowed? Should it be permissible to use gender, sexual orientation, race, religion, age and so on to target certain ads in ways that may lead to social exclusion? There are no easy answers, but these are questions that we will eventually have to tackle as a society, and we may as well start now.

  • Diversity for Democracy

Targeted advertising is only the tip of the iceberg. The more important issue is the growing “customization” of content to users’ preferences based on their past behaviour. While this may sound convenient to many, it also creates the well-documented “filter bubble” phenomenon, where users are exposed to ideas and content matching their previous preferences, leaving little room for discovering “new” ideas and content or opinions that conflict with their own. A democratic society, however, requires that people be exposed to a diverse set of opinions and ideas. One solution would be to open up the algorithms, making them open source, allowing users more control over which algorithm sorts their content, and letting them “reset” it periodically so that it reflects the person they are rather than the person they were.
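
To make the “filter bubble” and “reset” ideas concrete, here is a minimal, purely illustrative sketch (not any platform’s actual algorithm): a feed ranker that scores items by how often the user has engaged with a topic before. Every click reinforces the profile and narrows the feed; resetting the profile restores a neutral ordering.

```python
# Toy illustration of preference-based content ranking and a periodic "reset".
# All item topics and scores below are invented for the example.

def rank_feed(items, preference_profile):
    """Order items so that topics the user engaged with before come first."""
    return sorted(items, key=lambda item: preference_profile.get(item["topic"], 0), reverse=True)

def update_profile(preference_profile, clicked_item):
    """Each click reinforces the profile, deepening the bubble over time."""
    topic = clicked_item["topic"]
    preference_profile[topic] = preference_profile.get(topic, 0) + 1
    return preference_profile

def reset_profile():
    """The periodic reset discussed above: start again from a neutral profile."""
    return {}

items = [{"topic": "politics-left"}, {"topic": "politics-right"},
         {"topic": "science"}, {"topic": "sports"}]
profile = {"politics-left": 5}  # accumulated past behaviour

print([i["topic"] for i in rank_feed(items, profile)])          # bubble: familiar topic ranked first
print([i["topic"] for i in rank_feed(items, reset_profile())])  # after reset: no accumulated bias
```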

  • Transparency: a sweet dream

While transparency over how personal data is used is indeed essential to creating trust, it seems like an unachievable goal. The algorithms responsible for sorting content (mostly generated via deep learning) are becoming so complex that it is not easy to explain to users why their newsfeed, or the content they see, is arranged one way or another. Rather than trying to explain how algorithms work to people with low digital literacy, the focus should be on high privacy by default and on regulation which delimits the purposes of data processing. If users know that companies cannot abuse their data, they do not necessarily need to know how the algorithms work.

  • Stop the blame game, start looking for solutions

In conclusion, the main problem is that the current targeted-advertising model has no limits and is caught up in a “race to the bottom”. Whenever a space for potential profits (even unethical ones) remains unoccupied by the “market”, some company will fill it in order to gain a competitive edge over its rivals. Facebook was caught up in the Cambridge Analytica scandal, but had it not been Facebook, it would have been another company. That is the sad truth of the current economic system and of free markets with no clear rules. Online advertising has no limits, and this is precisely what is killing it, driving people to download ad blockers or shift towards subscription-based services. If online advertising is to survive as a business model, it has to be regulated to avoid going too far and alienating users past the breaking point. COFACE Families Europe has issued several recommendations in this regard: applying the same logic as the AVMSD (Audiovisual Media Services Directive), which caps advertising on TV at 20% of total content (12 minutes per hour), and developing indicators that allow users to track and compare the prevalence of advertising across different online services.
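
As a rough sketch of the kind of indicator proposed above, the share of advertising in a service could be measured and compared with an AVMSD-style 20% cap (the TV rule’s 12 minutes per hour). The service names and figures below are invented for illustration only.

```python
# Hypothetical ad-prevalence indicator, mirroring the AVMSD logic of capping
# advertising at 20% of content (12 minutes per hour of TV).

AD_CAP = 0.20  # maximum share of content that may be advertising

def ad_prevalence(ad_seconds, total_seconds):
    """Share of viewing/browsing time taken up by ads."""
    return ad_seconds / total_seconds

# (ad seconds, total seconds) per hour of use -- made-up example figures
services = {
    "service_a": (900, 3600),   # 15 min of ads per hour -> 25%
    "service_b": (600, 3600),   # 12 min of ads per hour -> 20%
}

for name, (ads, total) in services.items():
    share = ad_prevalence(ads, total)
    status = "within cap" if share <= AD_CAP else "exceeds cap"
    print(f"{name}: {share:.0%} advertising ({status})")
```

Such an indicator would let users compare services at a glance, much as nutrition labels let consumers compare foods.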

In the end, failing to reach some form of regulation will simply drive users towards decentralized online services, which come with their own set of problems. The choice is in the hands of advertisers, online service providers and regulators alike: do you want to kill your golden goose?

For more information, visit the official website of the event or contact Martin Schmalzried: mschmalzried@coface-eu.org