Facebook's newly established Oversight Board – a kind of supreme court for resolving moderation disputes – has just announced guiding decisions in five cases. In four of them, it found that the company had removed posts on incorrect grounds.
Since some things seem to have been lost in the translation into English, we shall not make matters worse by translating yet again. So we simply paste in what Ars Technica reports:
- A user in Myanmar posted to Facebook that ”Muslims have something wrong in their mindset” (or maybe ”something is wrong with Muslims psychologically”—translations differ) arguing that Muslims should be more concerned about the genocide of Uighurs in China and less focused on hot-button issues like French cartoons mocking the Prophet Muhammad. Facebook removed the post as anti-Muslim hate speech. The Oversight Board re-instated it, finding that it was best understood as ”commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China.”
- A user wrote that (in the board’s paraphrase) Azerbaijanis ”are nomads and have no history compared to Armenians” and called for ”an end to Azerbaijani aggression and vandalism.” The post was made during an armed conflict between Armenia and Azerbaijan. The board upheld Facebook’s decision to remove it as hate speech.
- A user in Brazil uploaded an image to raise awareness about breast cancer that featured eight pictures of female breasts—five of them with visible nipples. Facebook’s software automatically removed the post because it contained nudity. Human Facebook moderators later restored the images, but Facebook’s Oversight Board still issued a ruling criticizing the original removal and calling for greater clarity and transparency about Facebook’s processes in this area.
- A user in the United States shared an apocryphal quote from Joseph Goebbels that ”stated that truth does not matter and is subordinate to tactics and psychology.” Facebook removed the post because it promoted a ”dangerous individual”—Goebbels. The user objected, arguing that his intention wasn’t to promote Goebbels but to criticize Donald Trump by comparing him to a Nazi. The board overturned Facebook’s decision, finding that Facebook hadn’t provided enough transparency on its dangerous individuals rule.
- A French user criticized the French government for refusing to allow the use of hydroxychloroquine for treating COVID-19. The post described the drug as ”harmless.” Facebook removed the post for violating its policy against misinformation that can cause imminent harm. The board overturned the ruling, concluding that the post was commenting on an important policy debate, not giving people personal medical advice.
The idea is that these decisions will serve as precedents – although it is not entirely clear how that is supposed to work in practice.
"The big challenge is that Facebook is far too large—it has 2.7 billion users and 15,000 moderators—for the Oversight Board to review more than a tiny fraction of its moderation decisions. In order for the Oversight Board to have a meaningful impact on Facebook as a whole, the small number of cases it does review must create precedents that are followed by moderators in millions of other cases."