
What happens when you report abuse on Facebook?

If you encounter abusive content on Facebook, do you press the "Report abuse" button?

Facebook has lifted the veil on the processes it puts into action when one of its 900 million users reports abuse, in a post published by the Facebook Safety Group on the site earlier this week.

Reporting abuse on Facebook

Facebook has four teams that deal with abuse reports on the social network. The Safety Team deals with violent and harmful behaviour, the Hate and Harassment Team tackles hate speech, the Abusive Content Team handles scams, spam and sexually explicit content, and the Access Team assists users whose accounts have been hacked or impersonated.

Facebook User Operations teams
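
To make that division of labour concrete, here is a minimal sketch, in Python, of how a report might be routed to those four teams. It is purely illustrative: the team names come from Facebook's post, but the report categories, the REPORT_ROUTING table and the route_report function are assumptions for the sake of the example, not Facebook's actual implementation.

# Illustrative sketch only -- not Facebook's actual code.
# Maps an assumed report category to one of the four User Operations
# teams described in Facebook's post.
REPORT_ROUTING = {
    "violence": "Safety Team",
    "hate_speech": "Hate and Harassment Team",
    "harassment": "Hate and Harassment Team",
    "scam": "Abusive Content Team",
    "spam": "Abusive Content Team",
    "sexually_explicit": "Abusive Content Team",
    "hacked_account": "Access Team",
    "impersonation": "Access Team",
}

def route_report(category: str) -> str:
    """Return the team that would handle a report category (hypothetical)."""
    team = REPORT_ROUTING.get(category)
    if team is None:
        raise ValueError("Unknown report category: " + category)
    return team

print(route_report("hate_speech"))  # Hate and Harassment Team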

Clearly it's important that Facebook stays on top of issues like this 24 hours a day, so the company has based its support teams in four locations worldwide: in the United States, staff are based in Menlo Park, California, and Austin, Texas, while teams in Dublin and in Hyderabad, India, cover other timezones.

According to Facebook, abuse complaints are normally handled within 72 hours, and the teams can provide support in up to 24 different languages.

If Facebook staff determine that posts conflict with the site's community standards, action can be taken to remove the content and, in the most serious cases, to inform law enforcement agencies.

Facebook has produced an infographic which shows how the process works, and gives some indication of the wide variety of abusive content that can appear on such a popular site.

The graphic is, unfortunately, too wide to show easily on Naked Security.

Facebook reporting guide (infographic)

Of course, you shouldn't forget that just because you feel a piece of content is abusive or offensive, it doesn't mean that Facebook's team will agree with you.

As Facebook explains:

Because of the diversity of our community, it's possible that something could be disagreeable or disturbing to you without meeting the criteria for being removed or blocked. For this reason, we also offer personal controls over what you see, such as the ability to hide or quietly cut ties with people, Pages, or applications that offend you.

To be frank, the speed of Facebook's growth has sometimes outrun its ability to protect users. It feels to me that there was a greater focus on signing up new members than on respecting the privacy and safety of those who had already joined. Certainly, when I received death threats from Facebook users a few years ago, I found the site's response pitiful.

I like to imagine that Facebook is now growing up. As the website approaches a billion users, Facebook loves to describe itself as one of the world's largest countries.

Real countries invest in social services and other agencies to protect their citizens. As Facebook matures I hope that we will see it take even more care of its users, defending them from abuse and ensuring that their experience online can be as well protected as possible.