Facebook deserves every bit of flak being flung at them for their cavalier and irresponsible attitude toward privacy. Another problem, however, has gotten much less attention: activists around the world who use Facebook to promote political and human rights causes frequently find their Facebook groups and accounts suspended.
In all cases that I've heard of, Facebook has been either unresponsive or unsympathetic to activists' complaints. Since the beginning of this year, pro-democracy groups in Hong Kong have complained of repeated deletion of their Facebook groups. One of these groups was called "Never Forget June Fourth," set up to commemorate the June 4, 1989 Tiananmen massacre. Last month some Hong Kong pro-democracy Facebook users wrote an open letter to Mark Zuckerberg (on Facebook). Readers will be shocked and stunned to learn that neither The Zuck nor his colleagues bothered to respond to this letter:
Dear Mark,

We are users of Facebook with a shared interest in the political development and democratization of Hong Kong and China. We set up a political group page named "Never Forget June Fourth." This group was formed to engage in a free exchange of ideas and remembrances around Tiananmen Square in the spring of 1989, and there has never been any intention to be abusive or bullying, or to "take any action on Facebook that infringes or violates someone else's rights or otherwise violates the law," to quote your own official policy.

Nevertheless, the creator and key administrators of "Never Forget June Fourth" are being unfairly harassed by Facebook, which has blocked them from posting new content or working on the group at all.

This latest violation of their rights comes on top of numerous other examples of harassment. Some pages have been closed without notice. Facebook has attributed the problems to "technical difficulties," an explanation that strains common sense.

Therefore we are calling on Facebook to end this harassment (apparently done on behalf of Beijing). We feel, especially in light of what is happening at Google, that both the regional and global press will be interested in this story, should we choose to take it to them.

Please end your harassment, which will spare us from taking this drastic step. The ball is in your court, and we are certain you can feel the moral weight of how this case may impact your reputation.

Sincerely,
Creator and Group of Administrators

While the group was eventually restored and revived (here), the people involved feel that Facebook has failed to communicate clearly with them so that these kinds of problems can be avoided in the future. They still have no idea what happened or why. Many suspect that Facebook's abuse-reporting mechanisms are themselves being abused by governments and other powerful entities. Some - including the authors of the above letter - suspect, thanks to Facebook's lack of honesty and transparency, that Facebook is responding to pressure from governments, including Beijing. I myself doubt there is direct collusion between Facebook and the Chinese government, but I have concluded - based on the company's actions and inactions - that it doesn't give a toss about the human rights activists using its service, in spite of what its executives say in speeches about Facebook being a force for world peace and whatnot. In this discussion thread one Hong Kong user posted the following set of suggestions for how Facebook might treat activists with more care and respect:

I don't want to reiterate what people have already said here. Instead I'll point out a few things.

1. There is no doubt that Facebook is a commercial company. However, it is now the leader in social networking on the planet. It therefore undeniably bears a certain social responsibility.

2. There is no clear guideline on when a politically oriented group gets banned or limited. Today, the removal of a group is an entirely subjective decision, which is not good enough. I oppose the removal of groups without a public reason, including groups holding opposing political views.

3. Abuse of the reporting mechanism is a problem. If a report is in fact invalid, and it is an organized act by a gang, there is no punishment at all. This really hurts freedom of speech. It leads people to open second accounts to avoid getting banned, or to form gangs of their own to retaliate against opponents using the same reporting mechanism.

OK - better to suggest than just complain. Here are my suggestions:

A. If it is a political group, please list the guidelines and restrictions for the admins: what can be done and what can't, and what extra limitations may be applied to political groups. If a violation is confirmed, warnings should be sent to all admins. If several warnings are ignored, Facebook has reason to take action. If a group is banned, please leave a page explaining why.

B. In confirmed cases of abuse of the reporting mechanism, "bad marks" should accumulate against the offending user. Past a certain point a warning is given, and beyond that the account is liable to be disabled.

C. For political groups, could admins have better management options? For example: ban a certain user for three days just to let him calm down, or limit certain unfriendly actions to three posts per 24 hours. For all posts, an "unlike" option, an admin vote count toward removal or warnings, and a warning flag for Facebook operators would all be useful.

I think the above requires internal discussion and system modification on your side. But I hope your company will take my advice seriously.

Regards,
Justin

There are other examples from other regions. In March, Jillian York reported on Global Voices Advocacy that Facebook removed a Moroccan secularist group along with the account of its founder. While the group and the founder's personal account were eventually restored, "Facebook did not respond to any requests for an explanation."
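For what it's worth, suggestion B above amounts to a simple strike-count policy, and it's easy to see how little machinery it would require. Here is a minimal sketch in Python; the class name, method names, and thresholds are all hypothetical illustrations of the idea, not anything Facebook actually exposes:

```python
# Sketch of the "bad marks" idea from suggestion B above.
# All names and thresholds are hypothetical, chosen only for illustration.

WARN_THRESHOLD = 3     # bad marks before a warning is sent
DISABLE_THRESHOLD = 5  # bad marks before the account is liable to be disabled

class ReporterRecord:
    """Tracks abuse reports filed by one user that were confirmed invalid."""

    def __init__(self):
        self.bad_marks = 0

    def record_invalid_report(self):
        """Call when a report this user filed is confirmed invalid.

        Returns the action the platform should take: "ok", "warn",
        or "disable" once the accumulated bad marks cross a threshold.
        """
        self.bad_marks += 1
        if self.bad_marks >= DISABLE_THRESHOLD:
            return "disable"
        if self.bad_marks >= WARN_THRESHOLD:
            return "warn"
        return "ok"
```

Under these illustrative thresholds, a reporter's third confirmed-invalid report triggers a warning and the fifth makes the account liable to be disabled - organized false-reporting gangs would burn themselves out quickly, while a single honest mistake costs nothing.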
I know of a couple of specific instances in which U.S.-based free speech and civil liberties organizations with contacts at Facebook have tried to get Facebook executives to have a conversation with some members of the human rights community to discuss this problem. Their efforts have all met with total radio silence. Facebook has shown zero interest in holding even a private conversation about this issue. Thus I can only conclude one or all of the following: They are in denial. Or they really don't care. Or they are overwhelmed and upset that people won't love them unconditionally and are sticking their heads in the sand. Or they have such a Messiah complex that they really think anybody who wants to talk to them about their faults can only be an enemy. If you have any further conclusions to contribute, feel free to add them in the comments section of this post.
Facebook is certainly not the only company with problems stemming from abuse-prevention and content moderation. There have been several cases in which human rights activists got their YouTube accounts deleted or disabled (the accounts were restored after an outcry from activists). Other activists have had takedown and suspension problems with Flickr, other Yahoo services, and domain hosting companies. But at least Google (YouTube's parent company) and Yahoo have been willing to engage with activists to discuss how content moderation and abuse-prevention mechanisms and procedures might be improved so that they can avoid inadvertent "censorship" of human rights activism taking place on their platforms. Both participated actively in a recent conference call convened by the Global Network Initiative, during which activists and people from the Internet industry held a frank discussion about how company practices can be improved - and how activists might also better educate themselves about terms of service and moderation procedures in order to anticipate and prevent problems. (Click here and scroll down to the bottom of the page for a contact e-mail if you want to get involved with further discussions.)
At last week's Global Voices Summit in Santiago, Chile, YouTube's Victoria Grand joined me, Jillian York, and Oiwan Lam of Hong Kong for a wide-ranging discussion on this issue. I've embedded the video below. At the beginning of the session I asked everybody in the room (probably around 80 people) whether they'd ever had problems with content being removed or accounts suspended on one of the social networking services they use. A couple dozen hands went up. I asked people for concrete suggestions about how netizens (I hate the word "users") and companies could work together on preventing inadvertent suppression of dissent on social networking platforms. Here are a few points that multiple people repeated:
- Automated moderation and abuse-prevention processes will inevitably result in mistakes that hurt activists. Human judgment - informed by adequate knowledge of cultures, languages, and political events around the world - needs to be brought into the mix.
- Companies need to be as transparent and open as possible about how their takedown, moderation, and suspension procedures work. Otherwise they have nobody but themselves to blame if users cease to trust them.
- Companies should designate staff members to focus on human rights. Their job should be to develop channels for regular communication with the human rights community.
- It's almost impossible for globally popular social networking and content-sharing services to hire enough staff with enough knowledge of political movements and disputes in all obscure corners of the world in all kinds of languages. But communities like Global Voices and others with large networks of bloggers and online activists all over the world are ready and willing to help companies keep abreast of political hot-button issues and online movements around the world - and even provide help with obscure languages - so that extra care can be taken, and political activism won't be mistaken for spam or some other form of abusive behavior. We just need to figure out how to set up workable mechanisms through which this kind of feedback, advice, and communication can take place.
- Activists need to pay closer attention to the Terms of Service used by social networking platforms, and be more proactive in educating themselves about how moderation, takedown, and abuse-prevention mechanisms work. We probably need a "Guide to avoiding account suspension and takedown for human rights activists."
- It might also be good to have some kind of respected clearing house organization - or consortium of organizations - which can help mediate and resolve problems between activists and companies.
- People who rely on social networking and content sharing platforms run by companies to do political and social activism should engage more actively with company administrators to improve policies and practices. Anticipate problems and help solve them not only for yourself but for everybody else in the community. Act like a citizen. Not a passive "user."
Global Voices Summit: Discussion of Content Moderation, Part I:
I'm shocked by the way Facebook "moderates content," and it makes me seriously consider quitting the site, because I always thought it offered some support to all the human rights activists in Asia whom I'm supporting there....
Thanks for opening my eyes to freedom of speech in times of globalization, and to the "Western" arrogance of assuming a higher, more developed standard of human rights.
Posted by: alexandra | May 15, 2010 at 05:20 AM