Apple Will Alert Parents About Kids Exchanging Sexually Explicit Images

Toward the end of 2021, Apple will launch new tools that alert parents and children when a child sends or receives sexually explicit photos through the Messages app. The feature is being introduced with the aim of limiting the spread of Child Sexual Abuse Material (CSAM).

Apple will also be able to detect known CSAM images on iPhones and iPads while respecting user privacy.
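At a high level, this kind of detection works by comparing a fingerprint (hash) of each photo against a database of fingerprints of already-identified CSAM images, so the photos themselves never need to be viewed by a person. The sketch below only illustrates that matching idea: Apple's announced system uses an on-device perceptual hash (NeuralHash) with additional cryptographic safeguards, whereas this example substitutes a plain SHA-256 digest and a hypothetical KnownImageMatcher type.

```swift
import Foundation
import CryptoKit

// Illustrative only: a plain SHA-256 digest stands in for Apple's
// perceptual hash purely to show the hash-matching concept.
struct KnownImageMatcher {
    // Hypothetical database of hex-encoded hashes of known images.
    let knownHashes: Set<String>

    // Hash the raw image bytes and check for membership in the database.
    func isKnownImage(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage with placeholder data (no real hashes or images).
let matcher = KnownImageMatcher(knownHashes: [])
let photoBytes = Data([0x00, 0x01, 0x02])   // stands in for image data
print(matcher.isKnownImage(photoBytes))     // false: empty database
```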

Furthermore, these warnings are designed to guide the child toward choosing not to view the content. If the child taps on such content anyway, they will be warned that proceeding will notify their parents, and that their parents want them to be safe.
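The flow described above boils down to: warn the child first, and only alert a parent if the child chooses to proceed. The following sketch is purely illustrative; the type and member names are hypothetical and do not reflect Apple's actual Messages implementation.

```swift
// Hypothetical sketch of the warning flow described above.
enum ChildChoice {
    case declinedToView   // the child heeds the warning
    case viewedAnyway     // the child proceeds despite the warning
}

struct SensitiveContentWarning {
    let parentalNotificationsEnabled: Bool

    // Returns true when a parental alert should be sent.
    func shouldNotifyParent(after choice: ChildChoice) -> Bool {
        switch choice {
        case .declinedToView:
            return false
        case .viewedAnyway:
            // Per the described behavior, proceeding notifies the guardian
            // when the feature is enabled for the child's account.
            return parentalNotificationsEnabled
        }
    }
}

let warning = SensitiveContentWarning(parentalNotificationsEnabled: true)
print(warning.shouldNotifyParent(after: .viewedAnyway))  // true
```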

In principle, many people see this as an excellent and overdue step. Child sexual exploitation is deeply damaging, its effects can last a lifetime, and stronger attempts to thwart those who perpetrate such crimes are long overdue.