Apple’s iOS 15.2 beta update adds Child Safety feature for Messages app

Apple has started rolling out the iOS 15.2 beta update, which brings one of the Child Safety features the company announced earlier this year, though with a slight modification. The latest update adds a Communication Safety feature to the Messages app, which, as the name implies, is aimed at keeping children safer online.

The new feature isn’t enabled by default and has to be activated manually for the Messages app. Once it is enabled, the app can reportedly detect nudity in images that are sent or received by children. If such an image is sent or received, it is automatically blurred and the child is shown warnings about the content, according to a report by MacRumors.

The app will also reportedly point children to resources for contacting someone they trust for help. If a child receives a nude image, the app will ask them not to view the photo. It is worth noting that when Communication Safety was first announced, Apple said that if a child under the age of 13 viewed a nude image in Messages, their parents could opt to receive a notification about it.

Apple appears to have removed this notification option, however, as it could put children at risk in situations involving parental violence or abuse. Instead, the feature now points children toward guidance from a trusted adult.

The company says that the Messages app analyses image attachments for nudity on the device itself, and that this does not affect user privacy because messages remain end-to-end encrypted. Apple still has no access to the messages.
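To make the reported flow concrete, here is a minimal Swift sketch of what an on-device check-and-blur step like this could look like. Apple has not published an API for Communication Safety, so `communicationSafetyEnabled` and `imageLikelyContainsNudity(_:)` below are hypothetical stand-ins for the private settings flag and classifier; only the Core Image blur uses real APIs.

```swift
import UIKit
import CoreImage.CIFilterBuiltins

// Hypothetical stand-ins: the real settings flag and classifier are private
// to Apple's Messages app and are not exposed as public APIs.
let communicationSafetyEnabled = true

func imageLikelyContainsNudity(_ image: UIImage) -> Bool {
    // Placeholder for an on-device ML classifier; no image data leaves the device.
    return false
}

/// Returns the image to display: blurred if the on-device check flags it,
/// otherwise the original attachment.
func imageForDisplay(_ attachment: UIImage) -> UIImage {
    guard communicationSafetyEnabled,
          imageLikelyContainsNudity(attachment),
          let input = CIImage(image: attachment) else {
        return attachment
    }

    // Blur the flagged image locally with Core Image before it is shown.
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = input
    blur.radius = 30

    let context = CIContext()
    guard let output = blur.outputImage,
          let cgImage = context.createCGImage(output, from: input.extent) else {
        return attachment
    }
    return UIImage(cgImage: cgImage)
}
```

The structural point the sketch illustrates is that the attachment is inspected and blurred entirely on the device before anything is displayed, which is why Apple can claim the check does not weaken end-to-end encryption.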

Separately, Apple announced another safety feature a few months back: CSAM (child sexual abuse material) detection. This is different from the Communication Safety feature and is expected to be rolled out in the future.

With this feature, the Cupertino giant aims to detect child sexual abuse and trafficking imagery in iCloud Photos. Its launch was delayed, however, after Apple said it would first address the concerns raised by privacy advocates. The CSAM detection feature is meant to find child sexual abuse images by scanning a user’s iCloud Photos against a list of known CSAM, an approach that raised privacy concerns. If the feature detects enough matches, it will alert Apple’s moderators, who can then disable the account and report the images to legal authorities.
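Based on that public description, a minimal Swift sketch of threshold-based matching might look like the following. The hash list, the per-photo hashes and the threshold value are illustrative stand-ins; Apple’s actual system relies on a perceptual hashing scheme (NeuralHash) and parameters that are not covered in this article.

```swift
import Foundation

// Illustrative stand-ins, not Apple's actual database or parameters.
let knownCSAMHashes: Set<String> = []  // would ship as an opaque database of known-image hashes
let reportingThreshold = 30            // illustrative value; the real threshold is not part of this article

/// Counts how many of a user's photo hashes match the known-CSAM list.
func matchCount(for photoHashes: [String]) -> Int {
    photoHashes.filter { knownCSAMHashes.contains($0) }.count
}

/// Only once enough matches accumulate would the account be flagged
/// for human review; a handful of matches alone triggers nothing.
func shouldEscalateToHumanReview(photoHashes: [String]) -> Bool {
    matchCount(for: photoHashes) >= reportingThreshold
}
```

The design point the sketch captures is that isolated matches do not, on their own, trigger a report; only an accumulation past the threshold would reach Apple’s moderators.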
