Apple Expands Nude Image Detection in iOS 17 to Protect Minors

Apple is expanding its protections against unwanted nude images with iOS 17’s Communication Safety feature, which is available to all users of its mobile operating system.

Communication Safety uses machine learning to scan messages for nude images. When one is detected and the recipient is under 18, the sender is warned and can choose to send the message anyway or cancel it.

The new feature aims to prevent minors from receiving unwanted nude images and to help parents and guardians keep their children safe online.

Beyond the new Communication Safety feature, iOS 17 includes further privacy and security improvements:

  • Users gain more control over their data through a new privacy dashboard and features such as hiding their email address from senders and generating strong passwords.
  • iOS 17 is currently in beta testing and is expected to reach the general public later in 2023.

Below are more details regarding the Communication Safety feature:

  • The feature is enabled by default for users under 18.
  • It can be turned off at any time.
  • It uses machine learning to scan messages for nude images.
  • When an explicit image is detected, the sender is warned that the recipient is under 18 and can choose to send the message anyway or cancel it.
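The warning flow described in the bullets above can be sketched as a small decision function. This is a hypothetical illustration of the described behavior, not Apple’s actual implementation; the function name, parameters, and return values here are invented for clarity:

```python
def handle_outgoing_image(image_is_nude: bool, recipient_age: int,
                          sender_confirms: bool) -> str:
    """Hypothetical model of the Communication Safety flow:
    warn the sender when a nude image is addressed to a minor,
    then honor their choice to send anyway or cancel."""
    if image_is_nude and recipient_age < 18:
        # The sender is warned that the recipient is under 18 and
        # must explicitly confirm before the message goes out.
        return "sent" if sender_confirms else "canceled"
    # Non-explicit images, or adult recipients, are delivered normally.
    return "sent"
```

For example, a flagged image addressed to a 15-year-old is only sent if the warned sender confirms; otherwise the message is canceled.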

Communication Safety gives parents and guardians a powerful tool for shielding minors from inappropriate content online, including unsolicited explicit images sent over SMS.
