One of the tools is a pop-up message that Facebook plans to display through its apps when people use search terms related to child exploitation.
The message details the consequences of viewing this content and provides information on how to get help from the relevant organizations.
Another measure targets the non-malicious sharing of child-exploitative content: people who share this material see a safety alert about the harm it can cause.
The alert warns that the content violates Facebook’s rules and explains the legal consequences of sharing this material.
Facebook removes the content, reports it to NCMEC and deletes accounts that promote such material.
Facebook has also updated its child safety policies to make clear that it will delete Pages, groups and Instagram accounts dedicated to sharing otherwise innocent photos of children with captions, hashtags or comments containing inappropriate references to the children.
While photos or videos that people share may not violate Facebook’s rules, the accompanying text can help the social network better determine whether the content is sexualising children and whether an account, page, or group should be removed.
In addition, the company has updated its reporting menu on Facebook and Instagram: users can select an option called Include a Child under the Nudity & Sexuality section.
Facebook explains that material reported in this way takes priority for content reviewers.
It has also adopted Google’s Content Safety API to detect when posts may contain child abuse and to prioritize them for reviewers.
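The triage described above can be illustrated with a small sketch. This is a hypothetical model, not Facebook’s actual system: the field names, the `involves_child` flag and the classifier score combination are all illustrative assumptions about how reports flagged as involving a child could jump ahead of other reports in a review queue.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: float                          # lower value = reviewed sooner
    report_id: str = field(compare=False)    # not used for ordering

def enqueue_report(queue, report_id, involves_child, classifier_score):
    """Push a report onto the review queue (hypothetical scheme).

    Reports flagged as involving a child get a lower base priority, so
    they are always reviewed before unflagged reports; within each group,
    a higher classifier score (assumed in [0, 1]) moves a report up.
    """
    base = 0.0 if involves_child else 1.0
    heapq.heappush(queue, Report(base - classifier_score, report_id))

queue = []
enqueue_report(queue, "r1", involves_child=False, classifier_score=0.2)
enqueue_report(queue, "r2", involves_child=True,  classifier_score=0.9)
enqueue_report(queue, "r3", involves_child=False, classifier_score=0.8)

review_order = [heapq.heappop(queue).report_id for _ in range(len(queue))]
# "r2" comes first: it is flagged as involving a child
```

The single numeric priority keeps the queue a plain binary heap; an alternative would be two separate queues, one drained before the other.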
The company has long used various detection systems to root out content that exploits children and to flag potentially inappropriate interactions with children or potential grooming situations.
Facebook says it also looks for networks that violate its rules on child exploitation, much as it pursues coordinated inauthentic behavior and dangerous organizations.