TECH: Facebook may be underreporting images of child abuse, leaked document suggests

A training document used by Facebook's content moderators raises questions about whether the social network is under-reporting images of potential child sexual abuse, The New York Times reports. The document reportedly tells moderators to "err on the side of an adult" when assessing images, a practice that moderators have taken issue with but company executives have defended.

At issue is how Facebook moderators should handle images in which the age of the subject isn't immediately obvious. That decision can have significant implications, as suspected child abuse imagery is reported to the National Center for Missing and Exploited Children (NCMEC), which refers images to law enforcement. Images that depict adults, on the other hand, may be removed from Facebook if they violate its rules, but aren't reported to outside authorities.

But, as The NYT points out, there isn't a reliable way to determine age based on a photograph. Moderators are reportedly trained to use a more than 50-year-old system to identify "the progressive phases of puberty," but the methodology "was not designed to determine someone's age." And, since Facebook's guidelines instruct moderators to assume photos they aren't sure of depict adults, moderators suspect many images of children may be slipping through.

This is further complicated by the fact that Facebook's contract moderators, who work for outside firms and don't get the same benefits as full-time employees, may only have a few seconds to make a determination, and may be penalized for making the wrong call.

Facebook, which reports more child sexual abuse material to NCMEC than any other company, says erring on the side of adults is meant to protect users' privacy and to avoid false reports that may hamper authorities' ability to investigate actual cases of abuse. The company's Head of Safety Antigone Davis told the paper that it may also be a legal liability for them to make false reports. Notably, not every company shares Facebook's philosophy on this issue. Apple, Snap and TikTok all reportedly take "the opposite approach" and report images when they're unsure of an age.
