
Apple Says Feature to Find Child Images Doesn’t Create Backdoor

  • Company defends rollout of new tools after privacy concerns
  • Apple to analyze user devices for explicit photos of children
[Photo: Apple iPhone 12 goes on sale in stores. Photographer: David Paul Morris/Bloomberg]

Apple Inc. defended its upcoming child safety features against privacy concerns, saying it doesn’t believe its tool for detecting known child pornographic images on a user’s device creates a backdoor that reduces privacy.

The Cupertino, California-based technology giant made the comments in a briefing Friday, a day after revealing new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children. The company reiterated that it doesn’t scan a device owner’s entire photo library for abusive images; instead, it uses cryptographic hashes to compare a user’s photos against a database of known abusive material provided by the National Center for Missing and Exploited Children.
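The distinction Apple is drawing can be sketched in a few lines of code. The Swift below is purely illustrative: Apple’s system is reported to use a perceptual hash (“NeuralHash”) matched through a private set intersection protocol rather than the plain SHA-256 shown here, and every function name and hash value in this sketch is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's actual pipeline uses a perceptual
// hash and cryptographic private set intersection; plain SHA-256 stands
// in here to show the shape of database matching, not the real scheme.

/// Hex-encoded SHA-256 digest of raw image bytes (hypothetical helper).
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hashes of known abusive images, as supplied by a clearinghouse
/// such as NCMEC. The value below is a placeholder, not real data.
let knownHashes: Set<String> = [
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
]

/// A photo is flagged only if its fingerprint matches the known database;
/// the contents of non-matching photos are never inspected or reported.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}
```

The key property the sketch captures is that matching is one-way: the device can test whether a photo’s fingerprint appears in the known-hash set, but nothing about photos outside that set is revealed, which is the basis of Apple’s claim that the feature is not a general-purpose backdoor.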