To protect children, the iPhone will detect pornographic photos in messages


Apple will roll out a new iOS 16 feature in France to warn its youngest users.

On iOS, parents will soon be able to better protect their children from explicit photos. Apple has announced to several media outlets, including BFMTV, that a new feature called "Communication Safety in Messages" will arrive in France at the start of the school year, alongside the launch of iOS 16. Its goal is to better protect minors when they use Messages, Apple's SMS and iMessage app.


Warning messages

Specifically, if parents enable the feature (via the Family Sharing settings), the iPhone will analyze every image received in Messages before displaying it. If the on-device machine learning detects nudity, the photo is blurred by default. If the young user still chooses to view it, a warning is displayed, pointing out in particular that the image may be upsetting and that the person shown in the photo may not have consented to its being shared.
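Apple does not expose the classifier used by Messages to app developers in iOS 16. For illustration only, the same on-device approach can be sketched with the SensitiveContentAnalysis framework that Apple later opened to third-party apps (iOS 17 and later, and it requires a dedicated entitlement); the helper below is a hypothetical example, not the mechanism Messages itself uses.

```swift
import Foundation
import SensitiveContentAnalysis  // Apple's public on-device nudity classifier (iOS 17+)

/// Hypothetical helper: returns true when the locally run classifier flags the image,
/// i.e. when the UI should blur it and show a warning before letting the user reveal it.
/// Assumes the com.apple.developer.sensitivecontentanalysis.client entitlement and that
/// the user (or a parent, via Screen Time) has enabled sensitive-content warnings.
func shouldBlurIncomingImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the feature is switched off on this device, nothing is analyzed.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The analysis runs entirely on the device; the image never leaves the phone.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive   // nudity detected: blur by default
    } catch {
        return false                  // analysis failed: fall back to showing the image
    }
}
```

The key point mirrors the article: the classification runs entirely on the device, so the photo never has to leave the phone.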

As Apple explains on its website, the protection works both ways: a warning also appears if a minor tries to send a photo in which nudity has been detected. The user is then encouraged to talk to a trusted family member, especially if they feel pressured into sharing such images. Apple has also updated its Siri voice assistant so it can answer requests for advice on what to do in such an uncomfortable situation.


In 2021, Apple announced a tool to scan iCloud photos for potential child pornography content. That initiative was ultimately shelved in the face of criticism from privacy advocates. The problem does not arise with this new option: all image analysis is performed locally, on the smartphone, preserving iMessage's end-to-end encryption.

Author: Raphael Grably
Source: BFM TV
