Tuesday, September 27, 2022

Apple’s iOS 15.2 beta update adds Child Safety feature for Messages app


Apple has started rolling out the iOS 15.2 beta update, which includes one of the Child Safety features the company announced earlier this year, albeit in a slightly modified form. The update adds a Communication Safety feature to the Messages app, which, as the name implies, is aimed at keeping children safer online.

The new feature isn’t enabled by default and must be activated manually in the Messages settings. Once enabled, the app can reportedly detect nudity in images that are sent or received by children. If a nude image is detected, it will automatically be blurred and the child will receive warnings about the content, according to a report by MacRumors.

Apple will also reportedly provide resources so that children can contact someone they trust for help. If a child receives a nude image, the app will ask the child not to view the photo. It is worth noting that when Communication Safety was first announced, Apple said that if a child viewed a nude image in Messages, parents of children under the age of 13 would have the option to receive a notification.

Apple appears to have removed this notification option, however, as it could put children at risk in situations involving parental violence or abuse. Instead, the app will help direct children toward guidance from a trusted adult.

The company says the Messages app analyses image attachments to check for nudity, and that this will not affect user privacy: messages remain end-to-end encrypted and Apple still has no access to them.

Apple also announced another safety feature a few months back: CSAM detection (detection of child sexual abuse material). It is separate from the Communication Safety feature and is expected to roll out in the future.

With this feature, the Cupertino giant aims to detect child sexual abuse imagery and trafficking in iCloud Photos. But the launch was delayed, as Apple said it would first address the concerns raised by privacy advocates. The CSAM-detection feature is meant to find child sexual abuse images by scanning a user’s iCloud Photos against a list of known CSAM, which is what raised those privacy concerns. If the feature detects enough matches, it alerts Apple’s moderators, who can then disable the account and report the images to legal authorities.
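The "enough matches" rule described above can be sketched generically. This is only an illustration of threshold-based matching against a known list, not Apple's actual system (which uses perceptual hashing and cryptographic threshold techniques); the hash values, function names, and threshold below are all assumptions for the sketch.

```python
# Illustrative sketch only: threshold-based matching against a known list.
# NOT Apple's actual CSAM-detection pipeline. The digests and the
# threshold value here are hypothetical placeholders.

# Hypothetical database of digests of known flagged images.
KNOWN_HASHES = {"a3f1", "9c0d", "77b2", "5e88"}

# Moderators are alerted only once this many matches accumulate.
MATCH_THRESHOLD = 3

def count_matches(photo_hashes):
    """Count how many of a user's photo digests appear in the known set."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_alert(photo_hashes):
    """True only when matches reach the threshold, mirroring the
    'enough matches' rule the article describes."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD
```

The point of a threshold, rather than alerting on a single match, is to reduce the impact of false positives: one coincidental match does not trigger review.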


