Instagram is aiming to make it harder for potential scammers and criminals to coerce teens into sending nude photos and extort them for money.

The company announced Thursday that it is testing new features to curb an alarming trend known as financial sextortion, which often targets kids and teenagers. In these schemes, scammers coax victims into sending nude images and then threaten to post them online, either on public websites or in feeds where the victims' friends will see them, unless the victims send money or gift cards.

In the coming weeks, Instagram said, it will roll out several new features to a subset of users, such as blurring nude images sent in direct messages and warning users when they've interacted with someone who has engaged in financial sextortion. The tools will come to all users worldwide soon after.

“It is a really horrific crime that preys on making people feel alone and ashamed,” Antigone Davis, Meta’s director of global safety, told CNN. “It’s been well documented that the crime is growing, and this is one of the reasons that we want to get out ahead and make sure people are aware of what we’re doing as we continually evolve our tools.”

Non-consensual sharing of nude images has been a problem for years, typically among people who seek revenge on victims they know personally. But the FBI recently said it has seen an increase in financial sextortion cases from strangers, often started by scammers overseas. In some cases, sextortion has resulted in suicide.

The latest tools build on Meta's existing teen safety features, including strict settings that prevent messaging between unconnected accounts, more frequent safety notices and an option to report DMs that threaten to share or request intimate images. Last year, Meta teamed up with the National Center for Missing and Exploited Children (NCMEC) to develop Take It Down, a platform that lets young people create a unique digital fingerprint for explicit images they want taken down from the internet.
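
That "digital fingerprint" is, in essence, a hash computed locally from the image, so only the fingerprint, not the photo itself, has to be shared for matching. The article doesn't specify NCMEC's algorithm; the sketch below uses a perceptual hash purely as an illustration.

```python
# Illustration of the "digital fingerprint" idea behind Take It Down: a hash
# computed locally from the image, so only the fingerprint (not the photo)
# needs to be shared for matching. NCMEC's actual algorithm isn't specified
# in the article; a perceptual hash is shown here purely as an example.
from PIL import Image
import imagehash

def fingerprint(image_path: str) -> str:
    """Return a compact, shareable fingerprint of an image."""
    return str(imagehash.phash(Image.open(image_path)))  # 64-bit perceptual hash

def matches(fp_a: str, fp_b: str, max_distance: int = 8) -> bool:
    """Two fingerprints match when their Hamming distance is small."""
    a, b = imagehash.hex_to_hash(fp_a), imagehash.hex_to_hash(fp_b)
    return (a - b) <= max_distance  # imagehash defines '-' as Hamming distance
```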

The tools also come as Meta, along with other social networks, faces thousands of wrongful death lawsuits over how the platforms have allegedly harmed young users, from facilitating the sale of lethal drugs to enabling eating disorders and social media addiction.

Meta told CNN it will first test its nudity-protection feature within Instagram's direct messages. When an explicit picture is sent, the platform will blur the image and warn the recipient that it contains nudity. The alert will also remind users that they don't need to respond and ask whether they want to block the sender. A notification will also appear when the platform detects that a user is about to send a nude photo, nudging them to reconsider.
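
In rough terms, the recipient-side flow might look something like the sketch below. Every name is an illustrative assumption rather than Instagram's actual code, and the nudity check is a stub standing in for the on-device model discussed further down.

```python
# Hedged sketch of the recipient-side flow described above. Every name is an
# illustrative assumption, not Instagram's code; the nudity check is a stub
# standing in for the on-device model discussed further down.
from PIL import Image, ImageFilter

def classifier_flags_nudity(image_path: str) -> bool:
    """Placeholder for the on-device nudity classifier (always False here)."""
    return False

def prepare_dm_image(image_path: str) -> dict:
    """Blur a flagged image and attach the safety prompt before it is shown."""
    if classifier_flags_nudity(image_path):
        blurred_path = image_path + ".blurred.png"
        Image.open(image_path).filter(ImageFilter.GaussianBlur(radius=40)).save(blurred_path)
        return {
            "image": blurred_path,  # the blurred copy is shown by default
            "warning": "This photo may contain nudity.",
            "options": ["You don't have to respond.", "Block sender", "Report"],
        }
    return {"image": image_path, "warning": None, "options": []}
```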

The tool will be on by default for teens under 18, while adults will receive a notification encouraging them to turn it on.

Meta told CNN that the technology uses on-device machine learning to determine whether a photo contains nudity, meaning the analysis happens on the user's device rather than on Meta's servers. Meta already prohibits nudity on news feeds and other public areas of its platforms.
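
On-device inference generally means running a compact, pre-trained model directly on the phone. The sketch below shows the general shape of such a check using a hypothetical exported ONNX model; the file name, input layout and threshold are assumptions, not details Meta has disclosed.

```python
# Minimal sketch of an on-device nudity check, assuming a hypothetical
# pre-trained classifier exported to ONNX ("nudity_classifier.onnx").
# Meta has not published its model; the file, input name, output layout
# and threshold below are all illustrative assumptions.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("nudity_classifier.onnx")  # hypothetical model file

def contains_nudity(image_path: str, threshold: float = 0.8) -> bool:
    """Run the classifier locally; the photo never leaves the device."""
    img = Image.open(image_path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 255.0     # normalize to [0, 1]
    x = np.transpose(x, (2, 0, 1))[np.newaxis, ...]   # NCHW batch of one
    (probs,) = session.run(None, {"input": x})        # assumes a single output
    return float(probs[0, 1]) >= threshold            # class 1 = "nudity" (assumed)
```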

Meta said it is also working to identify accounts that may be engaging in sextortion scams by detecting and monitoring likely sextortion behavior. That includes making those accounts harder to interact with, such as by blocking their outgoing messages, and alerting users who may have interacted with an account that has been removed for sextortion. The alert will direct them to expert-backed resources, according to the company.
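
Meta hasn't disclosed which behavioral signals it looks for, but the enforcement side it describes is straightforward to picture. The sketch below is a hypothetical outline of those actions, with the flag itself treated as given.

```python
# Hedged outline of the enforcement steps described above for an account the
# system flags as likely engaging in sextortion. The detection signals
# themselves aren't disclosed, so the flag is treated as given and every
# name here is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class Account:
    account_id: str
    flagged_for_sextortion: bool = False
    recent_contacts: list[str] = field(default_factory=list)

def enforce(account: Account) -> list[str]:
    """Return the actions taken against a flagged account."""
    if not account.flagged_for_sextortion:
        return []
    actions = ["block_outgoing_dms", "remove_account"]
    # Alert anyone who interacted with the removed account and point them
    # to expert-backed resources.
    for user_id in account.recent_contacts:
        actions.append(f"notify:{user_id}:safety_resources")
    return actions
```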

In November, Meta joined Lantern, a program operated by the industry group Tech Coalition that enables tech companies to share information about accounts and behaviors that violate their child safety policies. The company said it is now integrating Lantern with its latest sextortion-prevention tools; for example, if a link shared on Instagram originated on another social media network, the other platform would be notified.
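
Lantern's actual schema and transport aren't described in the article, but the kind of cross-platform signal involved might look roughly like this hypothetical payload.

```python
# Hypothetical sketch of the kind of cross-platform signal Lantern enables:
# a link reported on Instagram is traced back to the service it came from,
# packaged as a structured record and shared so that service can review the
# source account. Field names and the sharing step are assumptions; Lantern's
# real schema and API are not described in the article.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from urllib.parse import urlparse

@dataclass
class SextortionSignal:
    reported_url: str
    origin_platform: str   # host the link points back to
    violation_type: str
    reported_at: str

def build_signal(reported_url: str) -> SextortionSignal:
    origin = urlparse(reported_url).hostname or "unknown"
    return SextortionSignal(
        reported_url=reported_url,
        origin_platform=origin,
        violation_type="financial_sextortion",
        reported_at=datetime.now(timezone.utc).isoformat(),
    )

# The serialized signal would then be shared through the coalition's channel.
payload = json.dumps(asdict(build_signal("https://example.com/payment-demand")))
```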

“What I would like to see accomplished with this announcement is that parents are more aware of this crime and take time to learn about it,” Davis said. “I also want to make sure parents know it’s important to let their kids know that it is okay to come to them if something has happened. They shouldn’t feel ashamed to come forward and there are tools available that can help.”
