EU rewrites rules on removal of illegal content
The European Union is set to rewrite the rules of the road governing how search engines, online marketplaces, social media and other web platforms find and remove illegal content.
The Digital Services Act (DSA) is based on a fundamental principle, according to a press release from the European Council: "What is illegal offline should also be illegal online."
The law casts a wide net, covering everything from child sexual abuse imagery and terrorist content to the sale of dangerous goods and copyright infringement. New age-verification requirements and parental controls to protect children are also part of the package.
The legislation accompanies the Digital Markets Act (DMA), which oversees the collection and use of personal data by tech giants. While the DSA places particular responsibilities on Google, Facebook, Apple, and Amazon, it is much broader, affecting any business – such as an Internet Service Provider (ISP) – that acts as an intermediary between Internet users and the content they are looking for.
Read more: 6 ways the EU’s digital markets law will change big tech
Like the DMA, the DSA has sharp teeth. Financial penalties for "very large online platforms" used by 10% or more of EU citizens can reach 6% of global income. But, while the law is stricter for large platforms, it applies to all but the smallest.
However, its path has not been easy and various interest groups are still fighting for changes.
As always, the biggest battle concerns the responsibility of platforms to remove illegal and copyright-infringing content.
While ISPs and content platforms will remain free from liability for the content posted by users, the speed at which they must act is subject to change. Depending on who you ask, it’s way too fast or way too slow.
According to the digital rights group Electronic Frontier Foundation (EFF), the rules could shift from the current requirement that infringing content be removed promptly once the platform is made aware of it to a de facto 24-hour deadline if platforms want their liability to remain limited.
Such a short window would effectively force platforms to act on any infringement complaint using automated filters, the EFF said.
Unsurprisingly, the International Federation of the Phonographic Industry (IFPI), an umbrella organization for the recording and content industry, sees it differently. The IFPI is particularly concerned by the way the DSA treats search engines, which it fears will "make them the beneficiaries of a broad and unwarranted 'safe harbor'" and "remove all incentives for search engines to stop allowing access to illegal or harmful content – and to make money from such activity," the group said in a statement.
Despite these fights, the DSA – at this point – also promises greater transparency. In particular, it requires the biggest platforms to disclose how content is moderated and in which languages. They will also have to make public information about the algorithms they use to make recommendations, and give researchers access to data on how those algorithms operate.
Platforms would also have to tell content creators when a post's visibility has been limited or its monetization stopped. This would be especially important for creators funded by advertising on YouTube, TikTok and other social media sites.
The EU also said the DSA will make it easier for users to challenge content moderation decisions of large platforms and force large commerce platforms to monitor third-party sellers more carefully.
The responsibility for notifying authorities of suspected serious criminal offenses will be extended from online platforms to all hosting services, and the DSA will also transfer enforcement duties for the largest platforms from national regulators to the European Commission, while leaving oversight of the rest to national authorities.