Wednesday, February 1, 2023

Apple’s iOS 15.2 beta update adds Child Safety feature for Messages app


Apple has started rolling out the iOS 15.2 beta update, which brings one of the Child Safety features the company announced earlier this year, albeit with a slight modification. The update adds a Communication Safety feature to the Messages app which, as the name implies, is aimed at keeping children safer online.

The new feature isn’t enabled by default and must be activated manually in the Messages app. Once enabled, the app can reportedly detect nudity in images sent to or received by children. If a nude image is received, it is automatically blurred and the child is shown warnings about the content, according to a report by MacRumors.

The app will also reportedly offer resources for the child to contact someone they trust for help. If a child receives a nude image, the app will ask them not to view the photo. Notably, when Communication Safety was first announced, Apple said that if a child under the age of 13 viewed a nude image in Messages, their parents would have the option to receive a notification.

Apple appears to have removed this notification option, however, as it could put children at risk in situations involving parental violence or abuse. Instead, the feature now steers children toward guidance from a trusted adult.

The company says the Messages app analyses image attachments to check for nudity, and that this will not affect user privacy: messages remain end-to-end encrypted, and Apple still has no access to them.

Apple also announced another safety feature a few months back: CSAM detection (detection of child sexual abuse material). This is separate from the Communication Safety feature and is expected to roll out in the future.

With this feature, the Cupertino giant aims to detect child sexual abuse and trafficking imagery in iCloud Photos. However, the launch was delayed after Apple said it would first address concerns raised by privacy advocates. The feature is designed to find child sexual abuse images by scanning a user’s iCloud Photos library against a list of known CSAM, which is what raised the privacy concerns. If it detects enough matches, it alerts Apple’s moderators, who can then disable the account and report the images to legal authorities.
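The match-and-threshold flow described above can be sketched in a few lines of Python. This is an illustrative approximation only: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, and the hash values, database contents, and threshold below are all placeholders invented for the example.

```python
# Illustrative sketch of threshold-based hash matching.
# NOT Apple's implementation: the hash function, database, and
# threshold here are placeholder assumptions.
import hashlib

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # placeholder database
MATCH_THRESHOLD = 2  # hypothetical: alert only after several matches

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real perceptual hash is robust
    # to resizing, cropping, and re-encoding, unlike SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photo_hashes, known_hashes):
    # Count how many of the user's photo hashes appear in the known set.
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_alert(photo_hashes, known_hashes=KNOWN_CSAM_HASHES,
                 threshold=MATCH_THRESHOLD):
    # Moderators are alerted only once the match count crosses the
    # threshold, so a single coincidental match does not flag an account.
    return count_matches(photo_hashes, known_hashes) >= threshold
```

The threshold is the key privacy design choice the article alludes to: an account is surfaced for human review only after multiple independent matches, which keeps isolated false positives from triggering an alert.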

