Apple is delaying its child protection features announced last month, including a controversial feature that would scan users’ photos for child sexual abuse material (CSAM), following intense criticism that the changes could diminish user privacy.
The changes had been scheduled to roll out later this year.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to The Verge. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”