Apple has started rolling out the iOS 15.2 beta update, which brings one of the Child Safety features that the company announced earlier this year, though with a slight modification. The latest update adds a Communication Safety feature to the Messages app, which, as the name implies, is aimed at keeping children safer online.
The new feature isn’t enabled by default and has to be activated manually for the Messages app. Once it is enabled, the app can reportedly detect nudity in images that are sent or received by children. If such an image is detected, it will automatically be blurred and the child will see warnings about the content, as per a report by MacRumors.
The app will also reportedly offer resources so that the child can contact someone they trust for help, and if a child receives a nude image, it will ask them not to view the photo. It is worth noting that when Communication Safety was first announced, Apple said that if a child under the age of 13 viewed a nude image in Messages, their parents could opt to receive a notification about it.
However, Apple appears to have removed this notification option, as it could put children at risk in situations involving parental violence or abuse. Instead, the feature now steers children towards guidance from a trusted adult.
The company says that the Messages app analyses image attachments on the device to check for nudity, and that this does not affect user privacy because messages remain end-to-end encrypted; Apple still has no access to their contents.
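To make the on-device analysis concrete, the hedged Swift sketch below shows how an image attachment could be run through a local Core ML classifier before being displayed. Apple has not published the model or API behind Communication Safety, so the NudityClassifier model and the 0.9 confidence cut-off here are purely illustrative assumptions; the point is that classification happens entirely on the handset, so the encrypted message itself is never exposed.

```swift
import UIKit
import Vision
import CoreML

// Hypothetical result type for on-device sensitivity analysis.
enum AttachmentVerdict {
    case safe
    case sensitive(confidence: Float)
}

// "NudityClassifier" stands in for whatever bundled Core ML model Apple uses;
// neither the model nor its labels are public, so treat it as a placeholder.
func classifyAttachment(_ image: UIImage,
                        completion: @escaping (AttachmentVerdict) -> Void) {
    guard let cgImage = image.cgImage,
          let mlModel = try? NudityClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(.safe)
        return
    }

    // The classification runs entirely on the device: the attachment is never
    // uploaded anywhere, which is how end-to-end encryption is preserved.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let top = (request.results as? [VNClassificationObservation])?.first
        if let top = top, top.identifier == "sensitive", top.confidence > 0.9 {
            completion(.sensitive(confidence: top.confidence))   // blur and warn
        } else {
            completion(.safe)                                    // show normally
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
    } catch {
        completion(.safe)
    }
}
```

In this framing, a "sensitive" verdict would drive the blur-and-warn flow described above, while the message payload itself stays encrypted end to end.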
Apple also announced another safety feature a few months back: CSAM (child sexual abuse material) detection. This is separate from the Communication Safety feature and is expected to roll out at a later date.
With this feature, the Cupertino giant aims to detect child sexual abuse imagery in iCloud Photos. However, its launch was delayed after Apple said it would first address the concerns raised by privacy advocates. The feature works by scanning the hashes of a user’s iCloud Photos against a list of known CSAM, an approach that drew privacy criticism. If enough matches are detected, Apple’s moderators are alerted and can disable the account and report the images to legal authorities.
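For a rough sense of how threshold-based matching works, the hedged Swift sketch below compares photo hashes against a known-hash list and only flags an account once the match count crosses a threshold. Apple’s real pipeline uses its NeuralHash algorithm together with cryptographic techniques so that nothing is revealed to the server below the threshold; the PerceptualHash and CSAMMatchScanner types here are illustrative assumptions, not Apple’s implementation.

```swift
import Foundation

// Simplified, illustrative sketch of threshold-based hash matching. Apple's
// actual design (NeuralHash combined with private set intersection and
// threshold secret sharing) is far more involved and never exposes individual
// matches to the server; the names and numbers below are assumptions.
struct PerceptualHash: Hashable {
    let value: UInt64
}

struct CSAMMatchScanner {
    let knownHashes: Set<PerceptualHash>   // database of known CSAM hashes
    let reportThreshold: Int               // matches required before review

    /// Counts how many photo hashes appear in the known list and only signals
    /// once the threshold is crossed, so a handful of accidental collisions
    /// can never trigger human review on their own.
    func shouldFlagAccount(photoHashes: [PerceptualHash]) -> Bool {
        let matches = photoHashes.filter { knownHashes.contains($0) }.count
        return matches >= reportThreshold
    }
}

// Usage: Apple publicly mentioned a threshold on the order of 30 matches, so a
// single hit produces no signal at all.
let scanner = CSAMMatchScanner(
    knownHashes: [PerceptualHash(value: 0xDEAD_BEEF)],   // placeholder entries
    reportThreshold: 30
)
print(scanner.shouldFlagAccount(photoHashes: [PerceptualHash(value: 0xDEAD_BEEF)]))
// prints "false": one match is well below the threshold
```

The threshold is the key design choice: it is meant to keep isolated false positives from ever reaching a human reviewer, which is the step that drew much of the privacy debate.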