Facebook updates its standards to clarify what it considers sarcasm
Facebook has announced that it is adding information to its Community Standards about its so-called satire exception for moderating content.
The change was made in response to a recent decision by its Oversight Board.
Facebook said the update will allow its review teams to take satire into account when assessing potential hate speech violations.
The company said the update will be completed by the end of the year, fully implementing the Oversight Board’s recommendation regarding the satire exception, while it assesses the feasibility of the board’s other recommendations.
In response to another Oversight Board recommendation, which advised Facebook to ensure it has adequate procedures in place to properly assess satirical and contextual content, including by providing content moderators with additional resources, the company revealed that it is working on a new satire framework for its regional teams. However, it is still working out how to apply this review at scale.
Facebook said the stakeholders it consulted – from academic experts and journalists to comedians and representatives of satirical publications – noted that humor and satire are highly subjective across people and cultures.
The company was also told that it is important for humor and satire to be reviewed by humans with the relevant cultural context.
Facebook on satire:
“Given the context-specific nature of satire, we cannot immediately scale this kind of additional assessment or consultation to all of our content moderators,” Facebook said.
“We need time to weigh the potential trade-offs between identifying and escalating content that might qualify for our satire exception and the risk of deprioritizing higher-risk content, as well as slower review times among our content moderators,” the company added.
In the case reviewed by the Oversight Board, an American Facebook user shared a version of the “two buttons” comic in which the cartoon character’s face was replaced with a Turkish flag. The two buttons were labeled “The Armenian Genocide is a lie” and “The Armenians were terrorists that deserved it.”
According to the Oversight Board, Facebook said it removed the post because the phrase “The Armenians were terrorists that deserved it” asserted that Armenians are criminals based on their nationality and ethnicity, violating the company’s Community Standard on hate speech.
The company explained that the comic was not covered under an exception that allows users to share hateful content to condemn it or raise awareness.
The majority of the Oversight Board’s members disagreed, finding that the comic was covered by this exception, and overturned the company’s decision on the matter.