Such inconsistent Facebook rulings are not unusual, ProPublica has found in an analysis of more than 900 posts submitted to us as part of a crowd-sourced investigation into how the world’s largest social network implements its hate-speech rules. Based on this small fraction of Facebook posts, its content reviewers often make different calls on items with similar content, and don’t always abide by the company’s complex guidelines. Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech.

We asked Facebook to explain its decisions on a sample of 49 items, sent in by people who maintained that content reviewers had erred, mostly by leaving hate speech up, or in a few instances by deleting legitimate expression. In 22 cases, Facebook said its reviewers had made a mistake. In 19, it defended the rulings. In six cases, Facebook said the content did violate its rules but its reviewers had not actually judged it one way or the other because users had not flagged it correctly, or the author had deleted it. In the other two cases, it said it didn’t have enough information to respond.

“We’re sorry for the mistakes we have made - they do not reflect the community we want to help build,” Facebook Vice President Justin Osofsky said in a statement.