On Friday, the Cupertino, California-based giant said it would need more time to gather input and make improvements before launching its child safety features, a month after the company announced a system to scan iPhones for images of child sexual abuse.
The previous month, approximately 90 policy and rights groups worldwide had urged Apple to abandon its plans to scan children's messages for nudity and adults' phones for images of child sexual abuse.
Critics of the program argued that the feature could be exploited by repressive governments seeking to find other material for censorship or arrests.
Apple said in a statement on Friday:
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”