Meta's newest teen safety measure will automatically hide nude photos in Instagram direct messages.

Meta said on Thursday that it is testing new Instagram features designed to protect young people from unsolicited nudity and sextortion scams. One feature, “Nudity Protection in DMs,” automatically blurs photos containing nudity.

The company said it would also encourage teens to protect themselves by warning them to think twice before sharing intimate photos. Meta believes this will strengthen defenses against scammers who send nude images to trick others into sending their own in return.

The company also said it is making changes to make it harder for potential scammers and criminals to find and message teens. Meta said it is developing new technology to detect accounts that may be engaged in sextortion scams and will restrict how these suspicious accounts can interact with other users.

Meta also said on Thursday that it has expanded the data it shares with Lantern, a cross-platform online kid safety initiative, to include additional “sextortion-specific signals.”

The social networking giant has long had policies prohibiting users from sending unsolicited nudes or attempting to coerce others into sharing intimate images. But that hasn't stopped these problems from occurring and causing distress for many teens and young people, sometimes with devastating consequences.

We have compiled a more detailed list of the most recent modifications below.

Nudity screens
Nudity Protection in DMs aims to protect young Instagram users from cyberflashing by placing nude photos behind a safety screen. Users can then choose whether or not to view them.

“We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat,” Meta went on to say.

Worldwide, the nudity safety screen will be on by default for users under 18. Adult users will see a notification encouraging them to turn the feature on.

“People sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos and that they can unsend these photos if they’ve changed their mind,” the business said.

Anyone attempting to send a nude photo will see a similar notice prompting them to reconsider.

Because the image analysis happens on the user's own device, Meta says the feature will work in end-to-end encrypted chats.
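Meta hasn't published the details of this on-device flow, but the policy it describes (detect locally, blur behind a safety screen, default-on for minors and opt-in for adults) can be sketched in a few lines. Everything below is a hypothetical illustration: the function names, the placeholder classifier, and the threshold are all assumptions, not Meta's implementation.

```python
NUDITY_THRESHOLD = 0.8  # hypothetical confidence cutoff


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML classifier returning a nudity-confidence
    score in [0, 1]. Running this locally, before encryption, is what lets
    the feature work inside end-to-end encrypted chats: the plaintext image
    never leaves the device.
    """
    # Placeholder heuristic for demonstration only; a real implementation
    # would run a local vision model here.
    return 1.0 if b"nude" in image_bytes else 0.0


def should_blur(image_bytes: bytes, is_minor: bool, opted_in: bool) -> bool:
    """Blur the image behind a safety screen if nudity is detected and the
    protection applies: on by default for under-18s, opt-in for adults.
    """
    protection_active = is_minor or opted_in
    return protection_active and classify_nudity(image_bytes) >= NUDITY_THRESHOLD
```

The key design point the article describes is that the classifier runs client-side, so neither Meta's servers nor anyone else in the conversation needs access to the decrypted image.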

The nudity filter has been in development for over two years.

Safety tips
Another protection: Instagram users who send or receive nudes will be directed to safety tips (with information about the potential risks), which Meta says were developed with guidance from experts.

“These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship with the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are,” the business said in a statement. “They also link to a range of resources, including Meta’s Safety Center, support helplines, StopNCII.org for those over 18, and Take It Down for those under 18.”

The company is also testing pop-up messages for people who may have interacted with an account removed for sextortion. These pop-ups will also direct users to relevant resources.

“We’re also including new child safety helplines from around the world in our in-app reporting features. This means that when minors report relevant concerns—such as nudity, threats to share private images, or sexual exploitation or solicitation—we will direct them to local child safety helplines, where available,” the company said.

Technology to identify sextortionists
Meta says it removes sextortionists' accounts when it becomes aware of them, but it first has to identify bad actors in order to shut them down. So the company is trying to go further by “developing technology to help identify where accounts may potentially be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior.”

“While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps to help prevent these accounts from finding and interacting with teen accounts,” the firm said in a statement. “This builds on the work we already do to prevent other potentially suspicious accounts from finding and interacting with teens.”

It is unclear what technology Meta is using for this detection or which signals might flag a potential sextortionist (we have asked for more details). Presumably, the company will analyze patterns of communication to identify bad actors.

Meta will restrict accounts identified as potential sextortionists from chatting or connecting with other users.

“[A]ny message requests potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it,” the business said in a statement.
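The routing behavior described in that quote is straightforward to model: messages from flagged senders land silently in a hidden folder instead of the primary inbox. This is a minimal sketch of that behavior; the class and function names are assumptions for illustration, not Instagram's actual data model.

```python
from dataclasses import dataclass, field


@dataclass
class Inbox:
    primary: list = field(default_factory=list)          # triggers a notification
    hidden_requests: list = field(default_factory=list)  # silent; user is not notified


def deliver(inbox: Inbox, message: str, sender_flagged: bool) -> None:
    """Route an incoming DM. Messages from accounts flagged as potential
    sextortionists go straight to the hidden requests folder, so the
    recipient is never notified and never has to see them.
    """
    if sender_flagged:
        inbox.hidden_requests.append(message)
    else:
        inbox.primary.append(message)
```

The design choice here is that flagged senders are quarantined rather than blocked outright, which matches Meta's caveat that the signals "aren't necessarily evidence that an account has broken our rules."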

Users who are already chatting with potential scam or sextortion accounts will not have their chats terminated, but will see safety notices “encouraging them to report any threats to share their private images and reminding them that they can say ‘no’ to anything that makes them feel uncomfortable,” according to the company.

Instagram already restricts teens from receiving DMs from adults they aren't connected to. But Meta is going a step further, saying it is testing a feature that hides the “Message” button on teens' profiles from potential sextortion accounts, even when the two are already connected.

“We’re also testing hiding teens from these accounts in people’s follower, following, and like lists, making it harder for them to find teen accounts in search results,” the company said.

It's worth noting that the company is under growing scrutiny in Europe over child safety concerns on Instagram, with regulators questioning its approach since the bloc's Digital Services Act (DSA) took effect last summer.

A long, slow crawl to safety
Meta has announced anti-sextortion measures before, most notably in February, when it expanded access to Take It Down. The third-party tool lets users generate a hash of an intimate image on their own device and share it with the National Center for Missing and Exploited Children, helping build a database of non-consensual image hashes that companies can use to find and remove revenge porn.
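The privacy property that makes this workable is that only a fingerprint of the image, never the image itself, leaves the device. Take It Down's actual hashing scheme isn't specified in this article (real image-matching systems typically use perceptual hashes that survive re-encoding, whereas a cryptographic hash matches only exact bytes), so this sketch uses Python's standard `hashlib` purely to illustrate the flow; the function names are assumptions.

```python
import hashlib


def fingerprint_image(image_bytes: bytes) -> str:
    """Compute a SHA-256 fingerprint of an image locally.

    Only this hex digest -- never the image itself -- would be submitted
    to the matching service, so the sensitive photo never leaves the
    user's device.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_hashes(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Check a candidate upload against a database of reported hashes,
    as a platform participating in the program might do."""
    return fingerprint_image(image_bytes) in known_hashes


# Hypothetical usage: a user reports an image, and a platform later
# checks an upload against the shared hash database.
reported_hashes = {fingerprint_image(b"reported-image-bytes")}
```

Note that with a cryptographic hash like this, any change to the image (resizing, recompression) produces a completely different digest, which is why production systems favor perceptual hashing for this use case.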

The company's earlier approaches to this problem drew criticism because they required young people to upload their nude images. In the absence of strict laws protecting minors, Meta was left to self-regulate for years, with mixed results.

However, the United Kingdom's Children's Code (which took effect in 2021) and the EU's more recent DSA have imposed requirements on platforms in recent years, forcing digital giants like Meta to pay more attention to protecting minors.

For example, in July 2021, Meta began defaulting young people’s Instagram accounts to private shortly before the UK compliance date. In November 2022, Meta further tightened the privacy settings on adolescents’ Instagram and Facebook accounts.

The company said in January that it would automatically apply stricter message settings for teens on Facebook and Instagram, just ahead of the DSA's full compliance deadline in February.

This slow, incremental rollout of protections for underage users raises questions about why Meta waited so long to implement stronger safeguards. It suggests the company opted for a cynical minimum of protection to limit the impact on usage, prioritizing engagement over safety. That is exactly what Meta whistleblower Frances Haugen has repeatedly criticized her former employer for.

When asked why the company is not also implementing these new protections on Facebook, a Meta spokeswoman told Eltrys, “We want to respond to where we see the greatest need and relevance—which, when it comes to unwanted nudity and educating teens on the risks of sharing sensitive images—we believe is on Instagram DMs, so that’s where we’re focusing first.”
