Europe's Digital Services Act: On a Collision Course With Human Rights

Last year, the EU introduced the Digital Services Act (DSA), an ambitious and thoughtful project to rein in the power of Big Tech and give European internet users more control over their digital lives. It was an exciting moment, as the world’s largest trading bloc seemed poised to end a string of ill-conceived technological regulations that were both ineffective and incompatible with fundamental human rights.

We were (cautiously) optimistic, but we didn’t kid ourselves: the same bad-idea-havers who convinced the EU to mandate over-blocking, under-performing, monopoly-preserving copyright filters would also try to turn the DSA into yet another excuse to subject Europeans’ speech to automated filtering.

We were right to worry.

The DSA is now steaming full-speed-ahead on a collision course with even more algorithmic filters – the decidedly unintelligent “AIs” that the 2019 Copyright Directive ultimately put in charge of 500 million people’s digital expression in the 27 European member states.

Copyright filters are already working their way into national law across the EU as each country implements the 2019 Copyright Directive. Years of experience have shown us that automated filters are terrible at spotting copyright infringement, both underblocking (permitting infringement to slip through) and overblocking (removing content that doesn’t infringe copyright) – and filters can be easily tricked by bad actors into blocking legitimate content, including (for example) members of the public who record their encounters with police officials.
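
To make those failure modes concrete, here is a minimal, hypothetical sketch of the kind of fingerprint matching upload filters rely on. The chunk-hashing scheme and every name in it are illustrative assumptions, not any real platform’s system (real filters use perceptual hashing, but they fail in similar ways): a trivially re-encoded copy of a song slips through, while a bystander’s recording of a police encounter is blocked just because the song is audible in the background.

```python
import hashlib

# Hypothetical fingerprint filter -- an illustrative sketch, not any real
# platform's system. Real filters use perceptual hashes, but the failure
# modes are similar in kind.

CHUNK = 1024

def fingerprint(media: bytes) -> set:
    """Hash fixed-size chunks of an upload."""
    return {
        hashlib.sha256(media[i:i + CHUNK]).hexdigest()
        for i in range(0, len(media), CHUNK)
    }

# "Reference database" holding one copyrighted recording.
copyrighted_song = b"LA LA LA " * 500
reference_db = fingerprint(copyrighted_song)

def filter_decision(upload: bytes) -> str:
    """Block the upload if any of its chunks matches the reference database."""
    return "BLOCK" if fingerprint(upload) & reference_db else "ALLOW"

# Underblocking: a trivially re-encoded copy changes every chunk hash, so the
# infringing upload sails through even though a human would recognize the song.
pirated_copy = copyrighted_song.replace(b"LA", b"La")
print(filter_decision(pirated_copy))      # ALLOW -- false negative

# Overblocking: a bystander's police-encounter video happens to contain the
# song playing in the background, so the whole recording is removed.
commentary = b"[bystander narrates the encounter] ".ljust(CHUNK, b".")
police_recording = commentary + copyrighted_song + b"[more commentary]"
print(filter_decision(police_recording))  # BLOCK -- false positive
```

The asymmetry is the point: anyone who wants to infringe can cheaply perturb their upload, while people with every right to speak have no way of knowing which stray background signal will get them silenced.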

But as bad as copyright filters are, the filters the DSA could require are far, far worse.

The Filternet, Made In Europe

Current proposals for the DSA, recently endorsed by an influential EU Parliament committee, would require online platforms to swiftly remove potentially illegal content. One proposal would automatically make any “active platform” potentially liable for the communications of its users. What’s an active platform? One that moderates, categorizes, promotes or otherwise processes its users’ content. Punishing services that moderate or classify illegal content is absurd – these are both responsible ways to approach illegal content.

These requirements give platforms the impossible task of identifying illegal content in real time, at speeds no human moderator could manage – with stiff penalties for guessing wrong. Inevitably, this means more automated filtering – something the platforms often boast about in public, even as their top engineers are privately sending memos to their bosses saying that these systems don’t work at all.

Large platforms will overblock, removing content according to the fast-paced, blunt determinations of an algorithm, while appeals for the wrongfully silenced will go through a review process that, like the algorithm, will be opaque and arbitrary. That review will also be slow: speech will be removed in an instant, but only reinstated after days, weeks, or even 2.5 years.

But at least the largest platforms would be able to comply with the DSA. It’s far worse for small services, run by startups, co-operatives, nonprofits and other organizations that want to support, not exploit, their users. These businesses (“micro-enterprises” in EU jargon) will not be able to operate in Europe at all if they can’t raise the cash to pay for legal representatives and filtering tools.

Thus, the DSA sets up rules that allow a few American tech giants to control huge swaths of Europeans’ online speech, because they are the only ones with the means to do so. Within these American-run walled gardens, algorithms will monitor speech and delete it without warning, and without regard to whether the speakers are bullies engaged in harassment – or survivors of bullying describing how they were harassed.

It Didn’t Have to be This Way

EU institutions have a long and admirable history of attention to human rights principles. Regrettably, the EU legislators who’ve revised the DSA since its introduction have sidelined the human rights concerns raised by EU experts and embodied in EU law.

For example, the E-Commerce Directive, Europe’s foundational technology regulation, balances the need to remove unlawful content with the need to assess content to evaluate whether removal is warranted. Rather than establishing a short and unreasonable deadline for removal, the E-Commerce Directive requires web hosts to remove content “expeditiously” after they have determined that it is actually illegal (this is called the “actual knowledge” standard) and “in observance of the principle of freedom of expression.” 

That means that if you run a service and learn about an illegal activity because a user notifies you about it, you must take it down within a reasonable timeframe. This isn’t great – as we’ve written, it should be up to courts, not disgruntled users or platform operators, to decide what is and isn’t illegal. But as imperfect as it is, it’s far better than the proposals underway for the DSA.

Those proposals would magnify the defects within the E-Commerce Directive, following the catastrophic examples set by Germany’s NetzDG and France’s Online Hate Speech Bill (a law so badly constructed that it was swiftly invalidated by France’s Constitutional Council), and set deadlines for removal that preclude any meaningful scrutiny. One proposal requires action within 72 hours, and another would have platforms remove content within 24 hours, or even within 30 minutes for live-streamed content.

The E-Commerce Directive also sets out a prohibition on “general monitoring obligations” – that is, it prohibits Europe’s governments from ordering online services to spy on their users all the time. Short deadlines for content removals run afoul of this prohibition and cannot help but violate freedom of expression rights. 

This ban on spying is complemented by the EU’s landmark General Data Protection Regulation (GDPR) – a benchmark for global privacy regulations – which stringently regulates the circumstances under which a user can be subjected to “automated decision-making” – that is, it effectively bans putting a user’s participation in online life at the mercy of an algorithm.

Taken together, the ban on general monitoring and the restrictions on harmful, non-consensual automated decision-making safeguard European internet users’ human rights to live without constant surveillance and judgment.

Many proposals for DSA revisions shatter these two bedrock principles, calling for platforms to detect and restrict content that might be illegal or that has been previously identified as illegal, or that resembles known illegal content. This cannot be accomplished without subjecting everything that every user posts to scrutiny.
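
A short sketch shows why – the hook name and the storage below are hypothetical, not any real platform’s API. An obligation to keep previously identified or similar material off a service only works if the check runs on every post from every user before it is published, which is general monitoring by construction:

```python
import hashlib

# Hypothetical "stay-down" hook -- the names and the exact-hash comparison are
# illustrative assumptions, not a real platform API. Matching content that
# merely "resembles" removed material would need fuzzier (and more error-prone)
# comparisons, but the structural point is the same: the check must inspect
# every upload.

previously_removed = set()  # fingerprints of content already judged illegal

def fingerprint(post: bytes) -> str:
    return hashlib.sha256(post).hexdigest()

def on_every_upload(post: bytes) -> bool:
    """Runs for each post by each user, before publication.
    Returns True if the post is withheld."""
    return fingerprint(post) in previously_removed
```

There is no way to scope such a check to suspect accounts or suspect posts; by design, it scrutinizes everything, from everyone, all the time.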

It Doesn’t Have to be This Way

The DSA can be salvaged. It can be made to respect human rights and kept consistent with the E-Commerce Directive and the GDPR. Content removal regimes can be balanced against speech and privacy rights, with timeframes that permit careful assessment of the validity of takedown demands. The DSA can treat appeals systems for content removal as co-equal with the removal process itself, and platforms can be obliged to create and maintain robust and timely appeals systems.

The DSA can contain a prohibition on automated filtering obligations, respecting the GDPR and making a realistic assessment about the capabilities of “AI” systems based on independent experts, rather than the fanciful hype of companies promising algorithmic pie in the sky.

The DSA can recognize the importance of nurturing small platforms, not merely out of some fetish for “competition” as a cure-all for tech’s evils, but as a means by which users can exercise technological self-determination, banding together to operate or demand social online spaces that respect their norms, interests and dignity. This recognition would mean ensuring that any obligations the DSA imposes take account of the size and capabilities of each actor – a recommendation made in the EU Commission’s own DSA Impact Assessment, and one that has been roundly ignored so far.

The EU and the Rest of the World

European regulation is often used as a benchmark for global rulemaking. The GDPR created momentum that culminated in privacy laws such as California’s CCPA, while NetzDG has inspired even worse regulations and proposals in Australia, the UK, and Canada.

The mistakes that EU lawmakers make in crafting the DSA will ripple out all over the world, affecting vulnerable populations who have not been given any consideration in drafting and revising the DSA (so far).

The problems presented by Big Tech are real, they’re urgent, and they’re global. The world can’t afford a calamitous EU technology regulation that sidelines human rights in a quest for easy answers and false quick fixes.


Source: https://www.eff.org/deeplinks/2021/10/europes-digital-services-act-collision-course-human-rights-0

