Speaking Freely: Ethan Zuckerman



Ethan Zuckerman is a professor at the University of Massachusetts at Amherst, where he teaches Public Policy, Communication and Information. He is starting a new research center called the Institute for Digital Public Infrastructure. Over the years, he’s been a tech startup guy (with Tripod.com), a non-profit founder (Geekcorps.org) and co-founder (Globalvoices.org), and throughout it all, a blogger.

This interview has been edited for length and clarity.

York: What does free speech or free expression mean to you? 

It is such a complicated question. It sounds really easy, and then it gets really complicated really quickly. I think freedom of expression is this idea that we want to hear what people think and feel and believe, and we want them to say those things as freely as possible. But we also recognize at the same time that what one person says has a real effect on what other people are able to say or feel comfortable saying. So there’s a naive version of freedom of expression which sort of says, “I’m going to say whatever I want all the time.” And it doesn’t do a good job of recognizing that we are in community. And that the ways in which I say things may make it possible or not possible for other people to say things. 

So I would say that freedom of expression is one of these things that, on the surface, looks super simple. You want to create spaces for people to say what they want to say and speak their truths no matter how uncomfortable they are. But then you go one level further than that and you start realizing, oh, okay, what I’m going to do is create spaces that are possible for some people to speak and not for other people to speak. And then you start thinking about how you create a multiplicity of spaces and how those spaces interact with one another. So it’s one of these fractally complicated questions. The first cut at it is super simple. And then once you get a little bit into it it gets incredibly complicated. 

York: Let’s dig into that complexity a bit. You and I have known each other since about 2008, and the online atmosphere has changed dramatically in that time. Back then we were both, I would say, pretty excited about how the internet was able to bring people together across borders, across affinities, and so on. What are some of the changes you’ve seen, and how do you think we can preserve a sense of free expression online while also countering some of these downsides or harms? 

Let’s start with the context you and I met in. You and I both were very involved in the early years of Global Voices. I’m one of the co-founders along with Rebecca MacKinnon and a whole crew of remarkable people who started this online community as a way of trying to amplify voices that we don’t hear from very often. A lot of my career on the internet has been about trying to figure out whether we can use technology to help amplify voices of people in parts of the world where most of us haven’t traveled, places that we seldom hear from, places that don’t always get attention in the news and such. So Rebecca and I, at the beginning of the 2000s, got really interested in ways that people were using blogs and new forms of technology to report on what was going on. And for me it was places like Sub-Saharan Africa. Rebecca was interested in places like North Korea and sort of getting a picture of what was going on in some of those places, through the lens, often, of Chinese business people who were traveling to those places. 

And we started meeting bloggers who were writing from Iraq, which was under US attack at that point, and who were writing from countries like Madagascar, which had a lot going on politically, but almost no one knew about it or was hearing about it. So you and I started working in this context of, can we amplify these voices? Can we help people speak freely and have an audience? Because that’s one of these interesting problems—you can speak freely if you’re anonymous and on an onion site, etc., but no one’s going to hear you. So can we help people not just speak freely, but can we help them find an audience as well? And some of the work that I was doing when you and I first met was around things like anonymous blogging with WordPress and Tor, and literally building guides to help people who are whistleblowers in closed societies speak online. 

You and I were also involved with the Berkman Center at Harvard, and we were both working on questions of censorship. One of the things that’s so interesting for me—to sort of go back in history—is to think about how censorship has changed online, and who those opponents to speech are. We started with the assumption that it was going to be the government of Saudi Arabia, or the government of Tunisia, or the government of China, who was going to block certain types of speech at the national level. You know, “You can’t say this. You’re going to be taken down, or, at worst, arrested for saying this.” We then pivoted, to a certain extent, to worries about censorship by companies, by platforms. And you did enormous amounts of work on this! You were at war with Facebook, now Meta, over their work on the female-presenting nipple. We were now looking at the different ways in which companies might decide that something was allowable speech or unallowable speech based on standards that had nothing to do with what their users thought, but really reflected the platforms’ own decisions. 

Somewhere in the late 20-teens, I think the battlefield shifted a little bit. And I think there are still countries censoring the internet, there are still platforms censoring the internet, but we got much better at censorship by each other. And, for me, this begins in a serious way with Gamergate, where you have people—women, critics of the gaming industry—talking about feminist counter-narratives in video games. And the reaction from certain members of an online community is so hostile and so abusive, there’s so much violent misogyny aimed at people like Anita Sarkeesian and sort of other leaders in this field, that it’s another form of silencing speech. Basically the consequences for some people speaking are now so high, like the amount of abuse you’re going to suffer, whether it’s swatting, whether it’s people releasing a video game to beat you up—and that’s what happened to Anita—it doesn’t silence you in the same way that, like, the Great Firewall or having your blog taken down might silence you. But the consequences for speech get so high that they really shift and change the speech environment. And part of what’s so tricky about this is some of the people who are using speech to silence speech talk about their right to free speech and how free speech protects their ability to do this. And in some sense, they’re right. In another sense, they’re very wrong. They’re using speech to raise the consequences for other people’s speech and make it incredibly difficult for certain types of speech to take place. 

So I feel like we’ve gone from these very easy enemies—it’s very easy to be pissed off at the Saudis or the Chinese, it’s really satisfying to be pissed off at Facebook or any of the other platforms. But once we start getting to the point where we’re sort of like, hey, your understanding of free speech is creating an environment where it’s very hard or it’s very dangerous for others to speak, that’s where it gets super complicated. And so I would say I’ve gone from a firm supporter of free speech online, to this sort of complicated multilayered, “Wow, there’s a lot to think about in this” that I sort of gave you based on your opening question. 

York: Let’s unpack that a bit, because it’s complicated for me as well. I mean, over the years my views have also shifted. But right now we are seeing an uptick in attempts to censor legitimate speech, from the various bills that we’re seeing across the African continent against LGBTQ+ speech, to Saudi Arabia, which is always an evergreen example; Sudan just shut down the internet again, Israel shut down the internet in Palestine, Iran still has some sort of ongoing shutdown, etc. etc. I mean, name a country and there’s probably something ongoing. And, of course, that includes the US with the Kids Online Safety Act (KOSA), which will absolutely have a negative impact on free expression for a lot of people. And of course we’re also seeing abortion-related speech being chilled in the US. So, with all of those examples, how do we separate the question of how we deal with this idea of crowding or censoring each other’s speech from the very real, persistent threats to speech that we’re seeing? 

I think it is totally worthwhile to mention that actors in this situation have different levels of power. Look at something like the Kids Online Safety Act (KOSA), which has the real danger of essentially leaving what counts as prohibited speech up to individual state attorneys general. And we are seeing different American state attorneys general essentially say we are going to use this to combat “transgenderism,” we’re going to use this to combat what they see as the “LGBTQ agenda,” but what a lot of the rest of us see as humanity and people having the ability to express their authentic selves. When you have a state essentially saying, “We’re going to censor content accessible to people under 18,” first of all, I don’t think it will pass Supreme Court muster. I think even under the crazy US Supreme Court at the moment, that’s actually going to get challenged successfully. 

When I talk about this progression from state censorship to platform censorship to individual censorship, there is a decreasing amount of power. States have guns, they can arrest you. There’s a lot of things Facebook can do to you, but they can’t, at this point, arrest you. They do have enormous power in terms of large swaths of the online environment, and we need to hold that sort of power accountable as well. But these things have to be an “and”, not an “or.” 

And, at the same time, as we are deeply concerned about state power and we’re deeply concerned about platform power, we also have to recognize that changes to a speech environment can make it incredibly difficult for people to participate or not participate. So one of the examples of this, in many ways, is the changes to Twitter under Elon Musk, where technical changes as well as moderation changes have made it a less safe space for a lot of people. And under the heading of free speech, you now have an environment where it is a whole lot easier to be harassed and intimidated to the point where it may not be easy to be on the platform anymore. Particularly if you are, say, a Muslim woman coming from India. This is a subject that I’m spending a lot of time looking at with my friend and student Ifat Gazia: how Hindutva is sort of using Twitter to gang up on Kashmiri women and create circumstances where it’s incredibly unsafe and unpleasant for them to be speaking, where anything they say will turn into misogynistic trolling as well as attempts to get them kicked off the platform. And so, what’s become a free speech environment for Hindu nationalism turns out to make that a much less safe environment for the position that Kashmir should be independent or that Muslims should be equal Indian citizens. And so, this then takes us to this point of saying we want either the State or the platform to help us create a level playing field, help us create a space in which people can speak. But then suddenly we have both the State and the platform coming in and saying, “you can say this, and not say this.” And that’s why it gets so complicated so fast. 

York: There are many challenges to anonymous speech happening around the world. One example that comes to mind is the UK’s Online Safety Act, which digs into it a bit. We also both have written about the importance of anonymity for protecting vulnerable communities online. Have your views on anonymity or pseudonymity changed over the years? 

One of the things that was so interesting about early blogging was that we started seeing whistleblowers. We started seeing people who had information from within governments finding ways to express what was going on within their states and within their countries. And I think to a certain extent, kind of leading up to the rise of WikiLeaks, there was this sort of idea that anonymity was almost a mark of authenticity. If you had to be anonymous, perhaps it was because you were really close to the truth. Many of us took leaks very seriously. We took this idea that this was a leak, this was the unofficial narrative, we should pay an enormous amount of attention to it. I think, like most things in a changing media environment, the notion of leaking and the notion of protected anonymity has gotten weaponized to a certain extent. I think, you know, WikiLeaks is its own complicated narrative, where things which were insider documents within, say, Kenya, early on in WikiLeaks’ history, sort of turned into giant document dumps with the idea that there must be something in here somewhere that’s going to turn out to be important. And, often, there was something in there, and there was also a lot of chaff in there. I think people learned how to use leaking as a strategy. And now, anytime you want people to pay attention to a set of documents, you say, I’m going to go ahead and “leak” them. 

At the same time, we’ve also seen people weaponize anonymity. And a story that you and I are both profoundly familiar with is Gay Girl in Damascus, where you had someone using anonymity to claim that she was a lesbian living in a conservative community and talking about her experiences there. But of course it turned out to be a middle-aged male Scotsman who had taken on this identity in the hopes of being taken more seriously. Because, of course, everyone knows that middle-aged white men never get a voice in online dialogues, so he had to make himself into a queer Syrian woman to have a voice in that dialogue. Of course, the really amusing part of that, and what we found out in unwinding that situation, was that he was in a relationship with another fake lesbian who was another dude pretending to be a lesbian to have a voice online. So there’s this way in which we went from this very sort of naive, “it’s anonymous, therefore it’s probably a very powerful source,” to, “it’s anonymous, it’s probably yet another troll.” 

I think the answer is that anonymity is really complicated. Some people really do need anonymity. And it’s really important to construct ways in which people can speak freely. But anyone who has ever worked with whistleblowers—and I have—will tell you that finding a way to actually put your name to something gives it vastly more power. So I think anonymity remains important, and we’ve got to find ways to defend and protect it. I think we’re starting to find that the sort of Mark Zuckerberg idea, “you get rid of anonymity and the web will be wonderful,” is complete crap. There are many communities that end up being very healthy with persistent pseudonyms or even anonymity. It has more to do with the space and the norms associated with it. But anonymity is neither the one-size-fits-all solution to making whistleblowing safe, nor is it the case that “oh no, if you let anonymity in, your community will collapse.” Like everything in this space, it turns out to be complicated and nuanced. And both more and less important than we tend to think. 

York: Tell me about an early experience that shaped your views on free expression. 

The story of Hao Wu is the story I want to tell here. When I think about freedom of expression online, I find myself thinking a lot about his story. Hao Wu is a documentary filmmaker. At this point, a very accomplished documentary filmmaker. He has made some very successful films, including one called The People’s Republic of Desire, about Chinese live-streaming, which has received a great deal of acclaim. He has a new film out called 76 Days, about the lockdown of Wuhan. But I got to know him very indirectly, because he was making a film in China about the phenomenon of underground Christian churches. He got arrested and held for five months, and we knew about him through the Global Voices community because he had been an active blogger. We’d been paying attention to some of the work he was doing, and suddenly he’d gone silent. 

I ended up working with Rebecca MacKinnon, who speaks Chinese and was in touch with all the folks involved, and I was doing the websites and such, building a Free Hao Wu blog. And we used that, and sort of platformed his sister, as a chance to advocate for his release. And what was so fascinating about this was Rebecca and I spent months writing about and talking about what was going on, and encouraging his sister to speak out, but she—completely understandably—was terrified about the consequences for her own life and her own career and family. At a certain point she was willing to write online and speak out. But what stuck with me was that experience of realizing that something can feel very straightforward and easy from your perspective, miles and miles away from the political situation: here’s this young man who is a filmmaker and a blogger and clearly a smart, interesting person, he should be able to speak freely, of course we’re going to advocate for his release. And then talking to his family and seeing the genuine terror that his sister had, that her life could be entirely transformed, and transformed negatively, by advocating for something as simple as her brother’s release. 

It’s interesting, I think about our mutual friend Alaa Abd El-Fattah, who has spent most of his adult life in Egyptian prisons, getting detained again and again and again. His family, his former partner, and many of his friends have spent years and years and years advocating for him. This whole process of advocating for someone’s ability to speak, advocating for someone’s ability to take political action, advocating for someone’s ability to make art—the closer you get to the situation, the harder it gets. Because the closer you are to the situation, the more likely it is that the injustice you’re advocating to have overturned is one that you’re experiencing as well. And it’s really interesting. I think it makes it very easy to advocate from a distance, and often much harder to advocate when you’re much closer to a situation. I think any situation where we find ourselves yelling about something on the other side of the world is a good moment to sort of check and ask: are the people who are yelling the people who are directly affected by this? Are they not yelling because the danger is so high, or are they not yelling because maybe we misunderstand and are advocating for something that seems right and seems obvious but is actually much more complicated than we might otherwise think? 

York: Your lab is advocating for what you call a pluraverse. So you recognize that all these major platforms are going to continue to exist, people are going to continue to use them, but as we’re seeing a multitude of mostly decentralized platforms crop up, how do we see the future of moderation on those platforms? 

It’s interesting, I spend a ton of my time these days going out and sort of advocating for a pluraverse vision of the internet. And a lot of my work is trying both to set up small internet communities with very specific foci associated with them and to think about an architecture that allows for a very broad range of experiences. One thing I’ve found in all this is that small platforms often have much more restrictive rules than you would expect, and often for the better. And I’ll give a very tangible example. 

I am a large person. I am, for the first time in a long time, south of 300 pounds. But I have been between about 290 and 310 for most of my adult life. And I started running about six months ago. I was inspired by a guy named Martinus Evans, who ran his first marathon at 380 pounds, and started a running club called the Slow AF Running Club, which has a very active online community and advocates for fitness and running at any size. And so I now log on to this group probably three or four times a week to log my runs, get encouragement, etc. I had to write an essay to join this community. I had to sign on to an incredible set of rules, including no weight talk, no weight loss talk, no body talk. All sorts of things. And you might say, I have freedom of speech! I have freedom of expression! Well, I’m choosing to set that aside so that I can be a member of this community and get support in particular ways. And in a pluraverse, if I want to talk about weight loss or bodies or something like that, I can do it somewhere else! But to be a part of this extremely healthy online community that’s really helping me out a lot, I have to sort of agree and put certain things in a box. 

And this is what I end up referring to as “small rooms.” Small rooms have a purpose. They have a community. They might have a very tight set of speech regulations. And they’re great—for that specific conversation. They’re not good for broader conversations. If I want to advocate for body positivity, or for being healthy at any weight, or any number of other things, I’m going to need to step into a bigger room. I’m going to need to go to Twitter or Facebook or something like that. And there the rules are going to be very different. They’re going to be much broader. They’re going to encourage people to come back and say, “Shut up, you fat fuck.” And that is in fact what happens when you encounter some of these things on a space like Reddit. So this world of small rooms and big rooms is a world in which you might find yourself advocating for very tight speech restrictions on specific platforms, if the community chooses them. And you might be advocating for very broad, open rules in the large rooms, with the notion that there’s always going to be conflict and there’s a need for moderation. 

Here is one of the problems that always comes up in these spaces. What happens if the community wants to have really terrible rules? What if the community is Kiwi Farms and the rules are we’re going to find trans people and we’re going to harass them, preferably to death? What if that tiny room is Stormfront and we’re going to party like it’s 1939? We’re going to go right back to white nationalism and Christian nationalism and anti-Jewish and anti-Muslim hatred? And things get really tricky when the group wants to trade Child Sexual Abuse Material (CSAM), because they certainly do. Or when they want to create nonconsensual sexual imagery? What if it’s a group that wants to make images of Taylor Swift doing lots of things that she has never done or certainly has not circulated photos of? 

So I’ve been trying to think about this architecturally. The way I want to handle it is to have the friendly neighborhood algorithm shop. And the friendly neighborhood algorithm shop lets you do two things. It lets you view social media on a client that you control through a set of algorithms that you care about. So you can go in and say, “I don’t want any politics today,” or “I want politics, but only highly-verified news,” or “frankly, today give me nothing but puppies.” I think you should have the ability to choose algorithms that are going to filter your media, and choose to use them that way. But I also think the friendly neighborhood algorithm shop needs to serve platforms. And I think some platforms may say, “Hey, we’re going to have this set of rules and we’re going to enforce them algorithmically, and here are the ones we’re going to enforce by hand.” And I think certain algorithms are probably going to become de rigueur. 
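
To make the algorithm-shop idea concrete, here is a minimal Python sketch of reader-chosen feed filters over a toy Post record. The field names and the filters (no_politics, verified_news_only, only_puppies) are hypothetical illustrations rather than Zuckerman's actual design or any real platform's API; a real client would pull posts over a protocol such as ActivityPub and let the reader pick filters from a shared catalog.

from dataclasses import dataclass, field
from typing import Callable, Iterable, List, Set

@dataclass
class Post:
    author: str
    text: str
    topics: Set[str] = field(default_factory=set)
    source_verified: bool = False  # assumption: set by some upstream verification step

FeedFilter = Callable[[Post], bool]

def no_politics(post: Post) -> bool:
    return "politics" not in post.topics

def verified_news_only(post: Post) -> bool:
    # Non-news posts pass through; news posts must come from a verified source.
    return "news" not in post.topics or post.source_verified

def only_puppies(post: Post) -> bool:
    return "puppies" in post.topics

def apply_filters(posts: Iterable[Post], chosen: List[FeedFilter]) -> List[Post]:
    # Keep only the posts that pass every filter the reader has chosen today.
    return [p for p in posts if all(f(p) for f in chosen)]

timeline = [
    Post("a", "Election roundup", {"politics", "news"}, source_verified=True),
    Post("b", "Unsourced rumor", {"politics", "news"}),
    Post("c", "Look at this corgi", {"puppies"}),
]

print(apply_filters(timeline, [verified_news_only]))  # keeps the verified story and the corgi
print(apply_filters(timeline, [only_puppies]))        # keeps only the corgi

The same kind of filter could, in principle, run on the platform side over everything a server hosts rather than on a reader's own feed, which is the second role described for the shop above.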

I think having a check for known CSAM is probably a bare minimum for running a responsible platform these days. And making the sorts of tools that Facebook and others have created to scan large sets of images for known CSAM available to even small platform operators is probably a very helpful thing to do. I don’t think you’re going to require someone to do this for a Mastodon node, but I think it’s going to be harder and harder to run a Mastodon node if you don’t have some of those basic protections in place. Now this gets real hard really quickly. It gets real hard because we know that some other databases out there—including databases of extremist and terrorist content—are not reviewable. We are concerned that those databases may be blocking content that is legitimate political expression, and we need to figure out ways to be able to audit them and make sure that they’re used correctly. We are also, around CSAM specifically, starting to experience a wave of people generating novel CSAM that may not actually involve an actual child, but is recombined from other images to create new scenarios. I’ve got to be honest with you, I don’t know what we’re going to do there. I don’t know how we anticipate it and block it, and I don’t even know the legal status of blocking some of that imagery where there is not an actual child harmed. 
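
As a rough illustration of what a "check for known CSAM" involves mechanically, here is a deliberately simplified Python sketch that compares an upload's hash against a local list of known-bad hashes. The file names and the reject handler are hypothetical, and production systems rely on perceptual hashes (for example Microsoft's PhotoDNA or Meta's open-source PDQ) distributed through vetted clearinghouse programs, which still match images that have been resized or re-encoded; a plain SHA-256 comparison like this one only catches exact copies.

import hashlib
from pathlib import Path
from typing import Set

def load_known_hashes(path: str) -> Set[str]:
    # One hex-encoded SHA-256 hash per line in a local list (hypothetical file).
    return {line.strip().lower()
            for line in Path(path).read_text().splitlines()
            if line.strip()}

def sha256_of_file(path: str) -> str:
    # Stream the file in chunks so large uploads never have to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_prohibited(upload_path: str, known_hashes: Set[str]) -> bool:
    return sha256_of_file(upload_path) in known_hashes

# Hypothetical use at upload time on a small instance:
# known = load_known_hashes("known_csam_hashes.txt")
# if is_known_prohibited(tmp_upload_path, known):
#     reject_upload_and_report(tmp_upload_path)  # hypothetical handler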

So these aren’t complete solutions. But I think we need to get to the point where we’re running a lot of different communities, where there’s an algorithmic toolkit available to do some of the moderation we want around each community, and where there’s an expectation that you’re doing that work. And if you’re not, it may be harder and harder to keep that community up and running and have people interact and interoperate with you. I think that’s where I find myself doing a lot of thinking and a lot of advocacy these days. 

We did a piece a few months ago called “The Three-Legged Stool,” which is our manifesto for how to do a pluraverse internet and also have moderation and governability. It’s this sort of idea that you want to have quite a bit of control through what we call the loyal client, but you also want the platforms to have the ability to use these sorts of things. So you’ve got folks out there who are basically saying, “Oh no, Mastodon is going to become a cesspit of CSAM.” And, you know, there’s some evidence of that. We’re starting to see some pockets of that. The truth is, I don’t think Mastodon is where it’s mostly happening. I think it’s mostly on much more closed channels. But something we’ve seen from day one is that when you have the ability to do user-generated content, you’re going to get pornography, and some of that pornography is going to go beyond the bounds of what’s legal. And you’re going to end up with that line between pornography and other forms of imagery that are legally prohibited. So there’s gotta be some architectural solution, and I think at some point, running a node without having thought about those technical and architectural solutions is going to start feeling deeply irresponsible. And I think there may be ways in which not only does it end up being irresponsible, but people may end up refusing service to you if you’re not putting those basic protections into place. 

York: Do you have a free speech or free expression hero? 

Oh, that’s interesting. I mean, I think this one is probably one that a lot of people are going to say, but it’s Maria Ressa. I think the place where free expression feels absolutely the most important to defend, to me, is in holding power to account. And what Maria was doing with Rappler in the Philippines was trying to hold an increasingly autocratic government responsible for its actions. And in the process she found herself facing very serious consequences—imprisonment, loss of employment, those sorts of things—and managed to find a way to turn that fight into something that called an enormous amount of attention to the Duterte government and opened global conversations about how important it is to protect journalistic freedom of expression. So I’m not saying that journalistic freedom of expression is the only freedom of expression that’s important; I think enormous swaths of freedom of expression are important, but I think it’s particularly important. And I think freedom of expression in the face of real power and real consequences is particularly worth lauding and praising. And I think Maria has done something very interesting, which is that she has implicated a whole bunch of other actors, not just the Philippine government, but also Facebook and also the sort of economic model of surveillance capitalism. And she encouraged people to think about how all of these are playing into freedom of expression conversations. So I think that ability to take a struggle where the consequences for you are very personal and very individual and turn it into a global conversation is incredibly powerful.


Source: https://www.eff.org/deeplinks/2024/05/speaking-freely-ethan-zuckerman

