Apple is expanding the iPhone feature that blocks unsolicited nudes

 (Nic Coury / AFP via Getty Images)

Apple will start protecting iPhone users against unsolicited nudes in more places with its next software update.

Starting with iOS 17.2, which is expected to arrive in December, Apple will blur sensitive images in Contact Posters in both the Contacts and Phone apps, as well as in stickers sent via Messages.

Apple introduced its Sensitive Content Warning feature with iOS 17 in September. When switched on, the filter blurs images and videos sent via Messages, AirDrop, Contact Posters, and FaceTime video messages.

Flagged media is hidden behind a pop-up message that asks users whether they wish to view it anyway, along with guidance on how to stay safe.

Apple’s decision to expand the feature could be aimed at those trying to sidestep its filter using some of its latest iOS additions. It introduced the option to create visual calling cards with the launch of Contact Posters on iOS 17.

Apple also allowed users to create and place stickers anywhere in a message with the software update.

How to block unsolicited nudes on iPhone

The Sensitive Content Warning feature is turned off by default on your iPhone.

To switch it on, open Settings > Privacy & Security, scroll down, tap Sensitive Content Warning, then toggle it on.

Once it’s active, you can switch it off for individual apps and services.

Apple's nudity filter can be switched on for individual apps (Apple)

With the filter turned on, messages containing media that Apple has flagged as nudity will show a larger icon with the contents blurred. You can tap View to see the image or video.

If you’re unsure, tap the alert button to find resources or block the person who sent the sensitive content.

Can Apple see your photos?

Apple first offered the option to filter nudes in messages sent to and from children in 2022. The safety feature, known as “Communication Safety in Messages”, uses on-device machine learning to scan for sensitive content in messages.

With the launch of iOS 17, Apple introduced a similar function for adults, which is now being updated.

All the scanning is carried out “on-device”, meaning that the images are analysed by the iPhone itself, and Apple never sees either the photos being analysed or the results of the analysis, the company said.
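The on-device pattern Apple describes can be illustrated with a small sketch. This is a hypothetical toy, not Apple's actual implementation: `looks_sensitive` stands in for the on-device machine-learning classifier, and the point is that the analysis runs locally and produces only a blur/no-blur decision, so the photo itself never leaves the device.

```python
def looks_sensitive(image_bytes: bytes) -> bool:
    # Stand-in for an on-device ML classifier; a real system would run a
    # trained model over the pixels. Toy heuristic for illustration only.
    return b"nsfw" in image_bytes


def handle_incoming_media(image_bytes: bytes, warning_enabled: bool) -> dict:
    """Decide locally whether to blur incoming media.

    Nothing is transmitted off the device: the only outputs are the
    blur decision and an (always-false) upload flag.
    """
    flagged = warning_enabled and looks_sensitive(image_bytes)
    return {
        "blurred": flagged,              # UI blurs the preview if flagged
        "uploaded_for_analysis": False,  # analysis never leaves the device
    }
```

Calling `handle_incoming_media(photo, True)` blurs a flagged photo, while turning the warning off (`False`) passes everything through unblurred; in neither case does any data leave the device.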