UPDATE: Click here to read the response to this post by Facebook's Barry Schnitt, Director of Policy Communications, followed by reactions from others, including Rafik Dammak, whose account suspension is discussed in this post.
Privacy is only one of many problems eroding users' trust in Facebook. Don't forget Facebook's inconsistent and opaque approach to account deactivation and page takedowns. I recently wrote about how Hong Kong democracy activists have had pages disabled or deleted with no explanation. Robert Scoble recently reported that a Texas radio show had its Facebook page disabled, and that he gets "a new email from someone who has gotten removed from Facebook every week." [This sentence has been corrected thanks to "Wrldvoyagr" in the comments section, below.] Scoble continues:
They need a more mature approach toward customer service. People and content are still getting deleted by spam filters which no one understands or can explain to me and there isn't a good place to go to appeal content deletions. This is the problem with not having a federated system that runs on our own servers. Facebook has too much control over our digital lives and that power to delete content really freaks me out.

The problem is especially upsetting for Facebook's international users, particularly those who come from countries where there is heavy censorship and political repression. Rafik Dammak, a Tunisian student living in Tokyo who is an upstanding and active member of the global Internet community and represents non-commercial Internet users on one of ICANN's councils, announced last week that his Facebook account was suspended without warning. "They don't bother to give rationales, we have to 'trust' them," he later wrote on Twitter.
While Rafik has no way of knowing for sure why Facebook deactivated his account, he has some theories. In an e-mail exchange, he and Nasser Weddady, a Mauritanian activist based in Boston, described a concerted "abuse-reporting" campaign being carried out against Tunisian exiles, activists, and other Facebook users whom the campaign's organizers dislike. Rafik pointed to a couple of Facebook pages (here and here) which he says are devoted to targeting and organizing abuse-reporting efforts against specific Facebook users. France 24 has a report in French about those groups here. Here's how Nasser explained it to me:
This phenomenon seems to have been first triggered by the mass protests over things like the Danish cartoons and portrayals of Mohamed. But the novelty now is the extent to which some pro-government elements in Tunisia have resorted to this tactic to shut down dissenter groups or individual user accounts. Most seem to be targeted on account of their political views, but that is not always the case, hence the name in Arabic: Moubid, which means insecticide. This is how it works:
- A user with a large following designates a group as "subversive."
- He spreads the news to his followers by inviting other users, with the explicit request to pass it on in order to create a snowball effect.
- Facebook agrees to shut down the group, most likely based on the number of complaints.
As you can see, it is a double abuse: an abuse by users who embark on witch hunts just because someone said those accounts or Facebook groups are bad, and an abuse of Facebook's opaque policy on designating groups as violating its user agreement.
There was a similar phenomenon during the last Gaza war, when pro-Palestinian users' accounts and groups were targeted by pro-Israel groups and individuals in a similar way as in Tunisia. While it is very difficult to prove that the Tunisians learned the technique from the pro-Israel advocates, the first malicious precedent we know of in the Middle East is that one.

In early May Curt Hopkins wrote an article on ReadWriteWeb about how Facebook's move over the past year to make people's accounts and pages public by default had made it easier for such harassment to occur. While he quotes Facebook representatives who insisted that they don't deactivate accounts without human review, it's clear that something remains seriously wrong with their system.
This larger context also helps explain why moderate and cosmopolitan Muslim Facebook users who believe in free speech and who are generally against censorship were so alienated and upset by the fact that Facebook allowed the "Everybody Draw Muhammad Day" page - which on the several occasions when I looked at it was full of obscene and gratuitous anti-Muslim hate speech - to stay up for more than a week. It's well known that Facebook quickly takes down other racist and anti-Semitic pages. Yet they allowed a page full of nastiness and hate against the Muslim faith to stay up. We also know from this interview given by one of the page's administrators to Radio Free Europe that Facebook admins were in touch with the page's creators and "not in a negative way," as administrator Andy Freiheit (a pseudonym) put it.
Many things remain seriously wrong in Facebookistan.

Leo Laporte was not the Texas radio host; he merely reported about the disabling. Leo is a tech show host in Petaluma, CA (live.twit.tv).
Posted by: Wrldvoyagr | May 29, 2010 at 06:55 PM
But, yet, you have a Facebook account...why? I stay away from Facebook for a variety of privacy issues...this issue you point out is just another good reason to stay away.
Posted by: GZ Expat | May 29, 2010 at 07:17 PM
Think that these privacy issues will continue to persist as long as Mark Zuckerberg is at the helm at FB. Basically, he doesn't believe in privacy, and has said so many times. Very much like many governments in this respect.
All his moves to "protect privacy" are just so many small tactical concessions to mollify criticism, and then when he thinks things have quieted down, he goes back to doing what he did before, which is coming up with new ways to bundle and sell users' data.
Of course, this is not unusual. Many US politicians and business people are the same way. They are all basically sociopaths. So why should we expect Mark Zuckerberg to be any different? America has a long, fine tradition of putting sociopaths in charge, then acting with dismay and outrage when they do outrageous things.
狗嘴长不出象牙来 ("A dog's mouth grows no ivory.")
Posted by: Paul Denlinger | May 29, 2010 at 10:09 PM
What happened to Rafik happened to me before. Facebook is a black hole and they just don't care. How can anyone get their attention?
Posted by: Charles Mok | May 29, 2010 at 11:40 PM
GZ expat, good question. There are a number of reasons I haven't left yet, which I think I will explain at length in my next blog post. Meanwhile I've removed anything from my account that isn't already public information and tightened my privacy settings, and mainly now use my FB account to share information with my 900+ person network about FB's problems.
Posted by: Rebecca MacKinnon | May 30, 2010 at 12:15 PM
Thanks for your attention to these issues. We agree freedom of speech is a fundamental right and we’re proud of the role Facebook plays in enabling free expression around the world. However, we’re disappointed that you didn’t reach out to us to confirm any of the information in your post. As a result, there are a number of inaccuracies in the story and you generally misunderstand and misrepresent our system. Specifically:
--We deal with a lot of users and, inevitably, someone breaks the rules. Just because they say they didn't do anything wrong doesn't mean they didn't. When we investigate these issues, most of them end up being legitimate disables for users who are doing something that violates our terms of service. This was the case with the Texas radio station issue and with Rafik Dammak. In both cases, the user sent a large number of friend requests that were rejected by other users and ignored warnings about spamming. As a result, the accounts were disabled, and the Pages for which they were the sole administrator were disabled. If the users consent, we would be happy to provide the account data that corroborates our decision. We are sending messages to both users asking for their consent. The suggestion that our automated system has been programmed to censor those who criticize us or take some other specific position is counter to everything Facebook stands for.
--We offer an appeal system and, when we’re wrong, we admit it and often give people second chances even when we’re right. We recognize that we can always do better and we’re currently working to improve the appeals flow to make the process more straightforward and easier to navigate for our users.
--The theory that the number of complaints impacts our decisions is demonstrably false. We've recognized that the simple tactic of stuffing the complaint box would be a threat to free speech if we used it as a trigger for action. That's why the number of complaints plays no role in our decisions and never has. For example, Holocaust denial groups receive complaints every day and have garnered much negative attention in the press (http://www.cnn.com/2009/TECH/05/08/facebook.holocaust.denial/index.html). The groups remain.
--The suggestion of an anti-Muslim bias is wrong and offensive. We review content that is reported to us and will remove it if it expresses hatred that directly targets individuals or groups. If you find hatred that directly targets individuals or groups on the site, the sole reason it remains is that it either hasn't been reported to us or we haven't had a chance to review the report yet.
--In almost all cases, a professional reviewer investigates a report before action is taken.
--One exception is spam where automated systems help keep Facebook free of spam. Of course, in some cases, someone may inadvertently trigger an automatic system that is intended to prevent spam. The system always provides warnings to users when they’re getting close to hitting the limit for a given feature. After a certain number of warnings, they must go through a process that educates them about how they may have been misusing our features. At that point, access is fully restored. Those who do not comply could be temporarily disabled. Users can always write to our appeals queue if they feel they’ve been treated unjustly. The suggestion that these systems act without warning is false and is not the way the system is designed.
--Finally, Facebook has 400 million users, and that is a tremendous community to manage. Of course we may make the occasional mistake, but our intention is always to keep the service safe, secure, and free of spam and hate.
If you have questions about this or future issues, please reach out to me (barry@facebook) or my colleagues ([email protected]). We try to be responsive, even on holiday weekends.
Sincerely,
Barry Schnitt
--
Barry Schnitt
Director, Policy Communications
Facebook
[email protected]
650.543.4979
Posted by: Barry | May 30, 2010 at 02:50 PM
"If you find hatred that directly targets individuals or groups on the site, the sole reason it remains is that it either hasn't been reported to us or we haven't had a chance to review the report yet." This comment by Barry Schnitt appears to be quite mendacious, in the context of the offensive 'Draw Muhammad Day' group.
While holding no brief for the Lahore High Court order blocking FB in Pakistan (about which there is enough smoke, mirrors, sound and fury, signifying roughly as much worth) until today (the court convenes again in a couple of hours), the fact that several tens of thousands of people protested the page should have alerted FB, if it is indeed as upright about content 'offensive to individuals and groups' as it claims.
Many people have reported (in email lists discussing freedom of the media) their frustration about the alacrity with which pro-Hitler content is taken down, yet the DMD page stayed up for a week or more, triggering the court order. If Mr Schnitt still needs evidence of this, he should also wonder whether he is fully informed about internal FB actions.
At the very least, FB, as a social networking site, needs to be uber-sensitive to the likelihood of reactionary anti-freedom responses to its opacity and lack of sensible privacy controls.
Posted by: Vic | May 30, 2010 at 11:43 PM
@Barry,
I have a number of points to respond to, but limited time, so I'll choose the most important:
You say that Facebook offers an appeals system, yet the e-mail that Rafik (and others) received states quite clearly the following:
"We will not be able to reactivate it for any reason, nor will we provide further explanation of your violation or the systems we have in place. This decision is absolutely final."
If the decision is absolutely final, then how is there an appeals process? You seem to contradict yourself.
Frankly, I don't trust your explanation one bit. Having seen numerous users slighted in this manner by Facebook without recourse or ability to return (often with no response to appeals), I say this: Facebook, it's time to step up. YouTube has an enormous community too, but they somehow manage to do the right thing. It's about time you join them.
(And, for the record, though I'm not the author of this post, I've reached out to Facebook over similar issues numerous times and have never, not once, received a response.)
Posted by: Jillian C. York | May 31, 2010 at 12:00 AM
In response to Barry's comment:

First, thank you for your reply. After almost one week, I got a clearer response about what happened, instead of the standard messages I previously received (Gmail at times shows that those are just copy and paste).

Second, I got the email about disclosing the data to the public today, but without reference to Rebecca's post. I feel this is a trap and it is not really helpful for building trust. Facebook is asking to disclose data to the public when User Operations staff clearly refused for days to give proof and rationales for what they were stating about my supposed misbehavior.

About that point, I didn't receive any warnings related to friend requests recently (or in the last months). I only shortened my friend list by removing people and also withdrawing sent friend requests, for more privacy. Maybe I received some warnings... 2 or 3 years ago, when I started using Fb and made the mistake of thinking I should have more friends, agreeing to send invitations after Fb accessed the contacts in my Gmail account (not wise at all). I also didn't receive any emails warning me whether I had misbehaved or not.

As for the appeal system, I don't want to be ironic, but I received messages like this: "You will no longer be able to use Facebook. This decision is final and cannot be appealed." That was the first message, and then in the second message: "We will not be able to reactivate it for any reason, nor will we provide further explanation of your violation or the systems we have in place. This decision is absolutely final." User Operations clearly stated that they won't give further information.

I had hoped that we could discuss the issue when I sent emails asking for more concrete rationales and proof, but the reaction only happened thanks to Rebecca, when she mentioned my case.

Finally, let's be positive: I think that such incidents may help Facebook to understand that communicating well and avoiding standard messages in response to user complaints can only improve users' perception and satisfaction. I also advise Facebook to review its governance model and involve users as part of its corporate social responsibility. I also invite Facebook representatives to attend the Internet Governance Forum and to see the discussions and debates about the governance of social networks.

Thank you, Barry, for replying,
Rafik
Posted by: Rafik | May 31, 2010 at 01:33 AM
Thanks Barry for your response. The people responding to your comment are raising issues that seem very valid to me. I look forward to a continued dialogue on this issue.
Your company is certainly not alone in having user trust problems caused by less-than-ideal handling of content takedown and account suspension. A lot of people would like to help you do a better job, because that would be a win-win for everybody concerned.
I believe you and your colleagues are aware of the Global Network Initiative (globalnetworkinitiative.org), a multi-stakeholder initiative which brings together companies, human rights groups, socially responsible investors, and academics in an effort to help Internet companies uphold core principles of free expression and privacy. We are working to start a dialogue between companies, human rights groups, and other concerned user groups to help develop "best practices" on account suspension and takedown so that your most vulnerable users will not be hurt by the way in which you manage your user accounts and their content. We recognize it's hard for companies going it alone to get things right at all times, so we are hoping to create some community mechanisms through which activists can get help when they feel that their content has been unfairly removed, or accounts unfairly suspended, on a range of global social networking and content-sharing platforms. This effort is open to non-GNI members and I hope you and your colleagues will participate - along with other people including Rafik who have been involved with this thread.
Posted by: Rebecca MacKinnon | May 31, 2010 at 01:58 PM