What’s New In Apple’s Child Safety Initiative

Posted on 16-08-2021
Posted by devmin

Apple has recently announced its new Child Safety initiative, which aims to empower people and enrich their lives through advanced technology. The basic idea behind this endeavor is to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM). For this purpose, Apple is introducing new technology in three major areas:

  1. the Messages App,
  2. iCloud Photo Library,
  3. and Siri & Search.

The Child Safety features will arrive later this year, for accounts set up as families in iCloud, in updates to iOS 15, iPadOS 15, and macOS Monterey.

Let’s explore what’s new in Apple’s Child Safety Initiative!

Communication Safety in Messages

New communication tools will enable parents to assist their children in navigating online communication in a more informed and safe way. While users' private communications will remain unreadable by Apple, the Messages app will use machine learning to alert users about sensitive content.

The Communication Safety feature in the Messages app on iPhone, iPad, and Mac is optional and aims to give parents more oversight without compromising privacy. If enabled, the Messages app can warn children, and in some cases their parents, when sexually explicit photos are received or sent. The feature uses on-device machine learning to analyze image attachments; if a photo is identified as sexually explicit, it is automatically blurred and the child is warned. This way, parents can be reassured that their children are less likely to encounter unwanted or explicit content.
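To make that flow concrete, here is a minimal Swift sketch of the blur-on-flag idea. The `isSexuallyExplicit` classifier is a hypothetical stand-in, since Apple has not published its on-device model; only the Core Image blur step uses a real, public API.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical stand-in for Apple's on-device classifier. The real
// Messages feature scores attachments with a private ML model that
// Apple has not published; we return false so the sketch runs as-is.
func isSexuallyExplicit(_ image: CIImage) -> Bool {
    return false
}

// Blur an attachment before display if the classifier flags it.
func prepareAttachmentForDisplay(_ image: CIImage) -> CIImage {
    guard isSexuallyExplicit(image) else { return image }
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = image
    blur.radius = 40 // heavy blur so the content is unrecognizable
    // Gaussian blur expands the image's extent, so crop back to the original.
    return (blur.outputImage ?? image).cropped(to: image.extent)
}
```

The key design point the sketch illustrates is that everything happens on the device: the image never leaves the phone for classification, which is how Apple can claim the feature does not break the privacy of Messages.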

CSAM Detection for iCloud Photos

CSAM stands for Child Sexual Abuse Material, and Apple aims to protect children from such content by detecting images identified as known CSAM. Through CSAM detection in iCloud Photos, Apple will be able to provide valuable information to law enforcement.

As part of Apple’s design for privacy, iOS and iPadOS will use new cryptographic tools to inhibit the spread of CSAM online.

Before an image is stored in iCloud Photos, it is checked on the device against a database of known CSAM hashes. The matching process uses a cryptographic technique called private set intersection, which determines whether there is a match without revealing the result or breaching the user's privacy.
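Conceptually, the on-device check is a set-membership test, as in the simplified Swift sketch below. The `knownCSAMDigests` set is hypothetical, and a plain SHA-256 lookup stands in for the real pipeline, which uses a perceptual hash (NeuralHash) and private set intersection so that neither the device nor Apple learns the outcome of any single comparison.

```swift
import Foundation
import CryptoKit

// Hypothetical known-hash database. In the real system the digests are
// derived from NCMEC-provided CSAM hashes and shipped in blinded form,
// so the device cannot read them directly.
let knownCSAMDigests: Set<Data> = []

// Simplified stand-in for the on-device check. A plain SHA-256 set
// lookup is used here purely to illustrate the membership test.
func matchesKnownCSAM(imageData: Data) -> Bool {
    let digest = Data(SHA256.hash(data: imageData))
    return knownCSAMDigests.contains(digest)
}

// Called conceptually just before an image is uploaded to iCloud Photos.
func willUpload(imageData: Data) {
    if matchesKnownCSAM(imageData: imageData) {
        // In Apple's design, a cryptographic "safety voucher" accompanies
        // the upload; no decision is made from a single match on-device.
        print("Match: a safety voucher would accompany this upload.")
    }
}
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files, whereas a perceptual hash like NeuralHash is designed to match visually similar images even after resizing or re-encoding; that difference is exactly why the sketch above is an illustration, not the actual technique.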

However, the system does not apply to users who have iCloud Photos disabled; photos kept only in the on-device library are never checked.

Enabling iCloud Photos therefore allows Apple to detect known Child Sexual Abuse Material (CSAM) images and report them to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works with US law enforcement. Apple has confirmed that the process applies only to photos being uploaded to iCloud Photos, not to videos.

Siri and Search

The third area where Apple is releasing this update is Siri and Search. As part of its broader guidance program, Apple is expanding the help that Siri and Search provide so families can stay safe online and find assistance in dangerous situations. Siri, for instance, will tell users how and where to file a report of CSAM or child exploitation.

The updated Siri and Search give parents and children a wider range of information and assistance when they face an unsafe situation. In addition, Siri and Search will intervene when users search for topics related to CSAM, explaining that interest in this topic is harmful and pointing users to resources for getting help.
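As a rough illustration of what such an intervention might look like, the Swift sketch below maps a search query to a response. The `SearchIntervention` type and the keyword checks are entirely hypothetical; Apple has not described how these queries are actually classified.

```swift
import Foundation

// Hypothetical outcomes for a query; Apple's real categories are unpublished.
enum SearchIntervention {
    case none
    case showSafetyResources   // query touches on CSAM-related topics
    case showReportingGuidance // user asks how to report exploitation
}

// A real system would use an on-device classifier, not keyword matching;
// the string checks here only make the branching logic concrete.
func intervention(for query: String) -> SearchIntervention {
    let lowered = query.lowercased()
    if lowered.contains("report child exploitation") {
        return .showReportingGuidance
    }
    if lowered.contains("csam") {
        return .showSafetyResources
    }
    return .none
}
```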

The Bottom Line

The recent update from Apple has been the talk of the town since the announcement, but it has drawn mixed reviews from Apple users. Some advocate the features from a safety standpoint and give the initiative a thumbs up; a substantial number of others see the update as a breach of their security and privacy. Once the features roll out to family accounts by the end of this year, we will be better able to judge the true impact of this recently announced initiative.
