
Apple’s nudity-blurring Messages feature gets international release

Apple’s “communication safety in Messages” feature, which is designed to automatically blur images containing nudity sent to children using the company’s messaging service, is now rolling out to additional countries. After launching in the US last year, the feature is now coming to the Messages apps on iOS, iPadOS, and macOS for users in the UK, Canada, New Zealand, and Australia. Exact timings are unclear, but The Guardian reports that the feature is coming to the UK “soon.”

Scanning happens on-device and does not affect the end-to-end encryption of messages. Instructions on how to enable the feature, which is integrated with Apple’s existing Family Sharing system, can be found here.

The opt-in feature scans incoming and outgoing pictures for “sexually explicit” material in order to protect children. If such material is found, the image is blurred, and the child is shown guidance on finding help, along with reassurance that it’s okay not to view the image and to leave the conversation. “You’re not alone, and can always get help from someone you trust or with trained professionals,” reads the pop-up message. “You can also block this person.”

As with the feature’s initial release in the US, children will have the option of messaging an adult they trust about a flagged photo. When Apple originally announced the feature last August, it suggested that this notification would happen automatically. Critics were quick to point out that this approach risked outing queer kids to their parents, and could otherwise be abused.

Apple is also expanding the rollout of a new feature for Spotlight, Siri, and Safari searches that will point users towards safety resources if they search for topics relating to child sexual abuse.

Alongside these two child safety features, Apple originally announced a third initiative last August that involved scanning photos for child sexual abuse material (CSAM) before they’re uploaded to a user’s iCloud account. However, this feature drew intense backlash from privacy advocates, who argued it risked introducing a backdoor that would undermine the security of Apple’s users. The company later announced it would delay the rollout of all three features while it addressed concerns. Having released the first two features, Apple has yet to provide an update on when the more controversial CSAM detection feature will become available.
