Facebook oversight board
The Facebook Oversight Board (FOB) has announced its first five decisions. After (ongoing) controversies on its creation (see here and here), and criticism at every step of the Board's path (see here and here), the first decisions shed some light on the FOB's approach to its cases and to content moderation more generally. The FOB overturned four of Facebook's takedown decisions and upheld one. That is: the Board concluded that in 80% of the cases the content had to be put back up, and in 20% of them its removal was correct and the content had to stay down. At the outset, these numbers can be seen as a strong stand on freedom of expression: "more free speech on Facebook! Put up those posts!" the Board seems to say (for a summary of the decisions, see here).

The Facebook Oversight Board issued its first five decisions Thursday. The rulings are well thought out and show that the board members, charged with reviewing Facebook decisions to remove content and making recommendations on Facebook policies, take their job seriously. More than anything, though, they show the futility of moderating content across networks with more than 3 billion users, nearly half the people on earth. The cases involve posts in five languages and, often, subtleties of meaning and interpretation. Two touch on deep-seated global conflicts: China's oppression of Uighur Muslims and the ongoing border war between Armenia and Azerbaijan. We've long known that the vast majority (now approaching 90 percent) of Facebook users are outside the US, but the breadth of these cases drives home the magnitude of Facebook's challenge.

Facebook has touted automation as one solution to that challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook's automated systems removed an Instagram post in Portuguese from a user in Brazil showing bare breasts and nipples. But the post was an effort to raise awareness about breast cancer, an exception to Facebook's general policy against nudity and an issue that has bedeviled Facebook for a decade. To its credit, Facebook restored the post before the Oversight Board heard the case, but the episode still underscores the problems with letting algorithms do the work. In the other case, involving a quote purportedly from Nazi propaganda chief Joseph Goebbels, Facebook's memory feature had actually recommended that the user recirculate a post from two years earlier. The older post had presumably been allowed to remain, raising questions about the consistency of Facebook's standards for reviewing content.

A decision on any one of those posts can be enormously complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned attempting to reach Europe in 2015, and contrasted the reaction to the photo with what the user said was a "lack of response by Muslims generally to the treatment of Uighur Muslims in China." The Myanmar government has persecuted members of its Rohingya minority, who are Muslim, driving many into neighboring Bangladesh, and Facebook has been accused of being used to stoke hatred of the Rohingya. Facebook removed the post because it translated its text to say there is "indeed something wrong with Muslims psychologically." Given the criticism of its prior treatment of the Rohingya, that was understandable, the board said. But the board translated the phrase slightly differently, as "those male Muslims have something wrong in their mindset." The board said its translators suggested that the terms used were not derogatory or violent, and it ordered the post reinstated.

"Taken in context, the board believes that the text is better understood as a commentary on the apparent inconsistency between Muslims' reactions to events in France and in China," the board said. "That expression of opinion is protected under Facebook's Community Standards and does not reach the level of hate speech." Complex indeed. Now, imagine wading through those considerations millions, or even billions, of times a day.

None of this means Facebook won't continue to be a successful business. Indeed, on Wednesday, the company reported that profit rose 58 percent last year, to $29 billion. Wall Street freaked out in 2018 when Facebook said it would hire tens of thousands of additional moderators to police hate speech and misinformation. Today, that looks like a brief hiccup: Facebook is valued at roughly $770 billion, making it the eighth most valuable company in the world. But it does suggest that Facebook, which set out to "make the world more open and connected," won't ever be free of policing those connections.