Leaked document says Facebook may be underreporting child abuse images

A training document used by Facebook content moderators raises questions about whether the social network underreports images of potential child sexual abuse, The New York Times reports. The document allegedly told moderators to “err on the side of an adult” when evaluating images, a practice moderators challenged but company executives defended.

At issue is how Facebook moderators should handle images in which the subject’s age is not immediately obvious. The decision carries significant consequences: suspected child abuse images are reported to the National Center for Missing and Exploited Children (NCMEC), which forwards them to law enforcement. Images depicting adults, on the other hand, can be removed from Facebook if they violate its rules, but are not reported to outside authorities.

But as The NYT points out, there is no reliable way to determine age from a photograph. Moderators are reportedly trained to use a method that is more than 50 years old to identify “progressive phases of puberty,” but the methodology “was not designed to determine someone’s age.” And because Facebook’s guidelines instruct moderators to assume that photos they are unsure about depict adults, moderators suspect that many images of children slip through.

This is further complicated by the fact that Facebook’s contract moderators, who work for outside companies and do not get the same benefits as full-time employees, may have only seconds to make a decision and may be penalized for making the wrong call.

Facebook, which reports more child sexual abuse material to NCMEC than any other company, says erring on the side of adults is meant to protect users’ privacy and to avoid false reports that could hamper authorities’ ability to investigate actual cases of abuse. The company’s head of safety, Antigone Davis, told the paper that false reports could also expose Facebook to legal liability. Notably, not all companies share Facebook’s philosophy on this issue: Apple, Snap, and TikTok are all said to take the “opposite approach” and report images when they are unsure of the subject’s age.
