Apple’s controversial plan to scan photos for child sexual abuse material was put on hold after major protests from data protection activists, but child protection remains on Apple’s agenda, and the company is making a first push with iOS 15.2.
A new function integrated into the Messages app is intended to protect children from receiving and sending nude pictures.
Parents can set up their children’s iPhones, iPads, and Macs to recognize nude images and warn the child against both viewing and sending such content.
To do this, the feature pixelates incoming images and displays a warning; a similar notice appears when the child tries to send such an image. However, the user can dismiss the warning and still receive or send the image.
When Apple first announced the plans, parents were to be notified automatically whenever such an image was sent or received. This has since changed slightly: the child can now choose to contact their parents via a button in the warning.
The function is off by default: for it to be active at all, parents must enable it in the parental controls settings. This opt-in design is intended to counteract concerns that authorities or other third parties could exploit the interface for surveillance.
Apple has also updated Siri to make it more difficult to find child sexual abuse material and to provide better assistance to victims of sexual violence. A user who searches for such material is informed about the harmful nature of the content and shown links to prevention resources.
This article originally appeared on Macwelt. Translation by Karen Haslam.