Facebook said on Tuesday that it is stepping up its fight against child abuse with new tools to detect such content and establish strict rules.
Antigone Davis, Facebook's global head of safety, said in a blog post: “Using our apps to harm children is abhorrent and unacceptable.”
“We are developing targeted solutions, including new tools and policies to reduce the sharing of this type of content.”
The social media giant updated its guidelines to make clear that it will remove Facebook or Instagram profiles, pages, groups, and accounts dedicated to sharing otherwise-innocent images of children when they are accompanied by captions, hashtags, or comments containing inappropriate signs of affection or commentary.
Davis said: “We have always removed content that explicitly sexualizes children, but content that isn't explicit and doesn't depict child nudity is harder to define.”
“Under this new policy, while the images alone may not break our rules, the accompanying text can help us better determine whether the content sexualizes children, and whether the associated profile, page, group, or account should be removed.”
Among the new tools being tested is a pop-up that appears in response to search terms associated with child exploitation, warning about the consequences of viewing such material and pointing people to help for changing their behavior.
Davis said Facebook is also testing a safety alert that informs people who share child exploitative content about the harm it causes and the legal consequences of sharing it.
In addition to deleting content that violates Facebook’s rules, such posts will also be reported to the National Center for Missing and Exploited Children (NCMEC).
Davis said: “We are using the insights from this security alert to help us identify the behavioral signals of those who may be at risk of sharing this material.”
According to Facebook, an analysis of the illegal child exploitative content it reported to NCMEC at the end of last year found that more than 90 percent of it was the same as or visually similar to previously reported content.
Davis said that copies of just six videos accounted for more than half of the content reported during that period.
Facebook worked with NCMEC and other organizations to categorize people's apparent intent in sharing such content.
It concluded that more than 75 percent of the sharing it evaluated did not appear malicious; instead, people seemed to be sharing the content to express outrage or in poor humor.
Facebook plans to provide end-to-end encryption across all of its messaging platforms, a plan that has drawn scrutiny from law enforcement agencies, with police saying the move could allow criminals to hide their communications.