Meta’s Oversight Board overturns takedown decision for Pakistan child abuse documentary


Meta’s external advisory group, the Oversight Board, has overturned the social media company’s decision to take down a news documentary revealing the identities of child victims of sexual abuse and murder in Pakistan — an exceptional case based on newsworthiness.

The 11-minute documentary, posted by the broadcaster Voice of America (VOA) Urdu on its Facebook page in January 2022, was reported by 67 users between its posting and July 2023 over disturbing details of the crimes committed by Javed Iqbal, who murdered and sexually abused about 100 children in Pakistan in the 1990s. It contained images of newspaper clippings that showed the child victims’ names and faces, as well as images of people in tears.

Initially, Meta’s automated and human reviews did not find the post in violation. However, after it was escalated internally and separately flagged by the company’s High Risk Early Review Operations system for its high likelihood of going viral, Meta’s policy team removed the post for violating the Child Sexual Exploitation, Abuse and Nudity policy. By the time it was pulled, the post had been viewed about 21.8 million times and shared approximately 18,000 times.

After Meta referred the case to the board, a majority of the Oversight Board found that the content should be allowed on the platform despite violating the Child Sexual Exploitation, Abuse and Nudity Community Standard.

“For the majority, the public interest in reporting on these child abuse crimes outweighed the possible harms to the victims and their families,” the Oversight Board said in a blog post Tuesday explaining its extraordinary decision.

It noted that the documentary was produced to raise awareness, not to sensationalize the gruesome details of the crimes, which took place about 25 years ago and left no surviving victims.

“This passage of time is the most important factor because it means possible direct harms to the child victims had diminished. Meanwhile, the public interest in child abuse remains,” the board said.

The board pointed out that Meta decided to pull the content only after it had been available on the platform for over 18 months, and questioned whether the company devotes sufficient resources to reviewing Urdu-language videos.

While most of the board favored overturning the takedown decision, a minority argued the content should remain unavailable, as it was possible to discuss the issues raised in the video without revealing the names and faces of the victims.

The Oversight Board recommended that Meta create a section within each Community Standard describing what exceptions and allowances apply. When Meta does not allow certain exceptions that apply to other policies, such as news reporting or awareness raising, the company should explain its rationale in that section, the board said.

“While the rarely used newsworthiness allowance — a general exception that can be applied only by Meta’s expert teams — was relevant here, the Board notes that no specific policy exceptions, such as raising awareness or reporting on, are available for the Child Sexual Exploitation, Abuse and Nudity policy. Meta should provide more clarity to users about this,” the board noted. “Additionally, it could be made clearer to people in the public language of this policy what qualifies as identifying alleged victims ‘by name or image.’”

Meta acknowledged that it erred in removing the content, which it had taken down over the substantial harms it posed to the victims and their families even though the events happened more than two decades ago. The company welcomed the Oversight Board’s decision and said it will reinstate the content within seven days.

The board, which started its work in 2020 after Meta CEO Mark Zuckerberg conceptualized its formation in 2018, has ruled on some important cases, including one criticizing Facebook for banning former President Donald Trump “indefinitely.” In February this year, it called on Meta to reform its “incoherent” rules about altered videos.


