Yesterday we exposed the dangers of Shadow Regulation: the secretive web of backroom agreements between companies that seeks to control our behavior online, often driven by governments as a shortcut and less accountable alternative to regulation.
Today we are proposing a set of criteria, summarized in the infographic below, which turns this critical account of private agreements gone wrong into a positive agenda for how they could be done better. EFF co-founder John Perry Barlow wrote:
You claim there are problems among us that you need to solve. You use this claim as an excuse to invade our precincts. Many of these problems don't exist. Where there are real conflicts, where there are wrongs, we will identify them and address them by our means. We are forming our own Social Contract. This governance will arise according to the conditions of our world, not yours. Our world is different.
But what are the means by which we will address problems and wrongs that we find online? What mechanisms of governance are we using to craft our own Social Contract?
In general, it's best to make decisions as close to the user as possible—preferably by empowering users themselves. But that isn't always possible; often a user falls subject to foreign laws or corporate policies that they have no say in, and that they can't simply opt out of or route around. In such cases, a Social Contract can usefully coordinate these laws and policies so that they are consistent and broadly fair to everyone affected.
Sometimes we can use treaties to do this (like the Marrakesh Treaty), or standards documents (like those of the IETF), or non-binding high-level statements of principles (like the NETmundial Principles), using so-called “multi-stakeholder” processes which, although not democratic in the same way as national parliaments, at least offer all affected parties a chance to be heard.
The problem is that the term “multi-stakeholder” alone isn't very meaningful. The higher the stakes, the more likely it is that a process purporting to be “multi-stakeholder” will actually be captured and become a form of opaque and corrupt Shadow Regulation. So we're proposing a more specific set of criteria for adding to our Social Contract for the Internet in a way that meaningfully includes all those affected, wherever they are in the world.
Finally, there is no point in inviting affected communities to help develop policies for the Internet if their recommendations are ignored. This doesn't mean that these need to be enforceable in their own right; often the solutions developed will simply be available for voluntary adoption by stakeholders, and the extent to which this happens is the best measure of their success. If there is justification for them to be enacted or enforced by formal decision-making bodies (e.g. through laws or treaties), then there should be a clear pathway for that to happen.
These criteria may seem a little abstract, but they'll become clearer as we use them to suggest improvements to institutions and processes that we don't think measure up, especially those that amount to Shadow Regulation. You can check out particular cases that we've identified under the “Blog posts” tab on our Shadow Regulation issue page.