Apple to install software on iPhones to look for child abuse, reports say

Apple will soon be adding software to iPhones that will scan a user’s photos for images of child abuse, according to reports.

The Financial Times first reported that the tech giant will be adding a system called “neuralMatch,” according to Reuters.

The system is expected to be introduced to users soon and could be announced as early as this week.

Two security experts told the Financial Times that the system has already been shown to academics.

The automated system would flag images that appear to be illegal and send them to human reviewers. If the photos are confirmed to be illegal, law enforcement would be contacted, the Financial Times reported.

It would compare images on a phone against fingerprints of known child sexual abuse material (CSAM) before sending any matches to a person to review, Gizmodo reported.
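
The reports do not detail how the matching works, but systems like this typically compare an image fingerprint against a database of fingerprints of known CSAM. Below is a minimal sketch of that matching step in Python; the fingerprint set, file paths, and function names are hypothetical, and production systems use perceptual hashes, which survive resizing and re-encoding, rather than the exact cryptographic hash shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known illegal images.
# Real systems use perceptual hashes rather than SHA-256, which
# only matches byte-for-byte identical files; this illustrates
# the matching step, not Apple's actual method.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(image_path: Path) -> str:
    """Compute a fingerprint for an image file."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def flag_for_review(photo_dir: Path) -> list[Path]:
    """Return photos whose fingerprints match the known set.

    Matches would be queued for human review, not reported
    to law enforcement automatically.
    """
    return [
        path
        for path in photo_dir.glob("*.jpg")
        if fingerprint(path) in KNOWN_FINGERPRINTS
    ]
```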

When the Financial Times asked Apple about “neuralMatch,” the company did not comment, and Apple has not otherwise confirmed the plan, Gizmodo reported.

Photos stored on cloud-based servers are already scanned for child abuse imagery, the Financial Times reported, but Apple’s plan is to scan the images stored on a user’s phone itself.

But some are questioning whether the move crosses the line that separates a user’s privacy from law enforcement’s reach.

Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, doesn’t agree with the move, Gizmodo reported.

Green, in a series of tweets Wednesday, called it a “really bad idea.”

He’s not alone.

Ross Anderson, a professor of security engineering at the University of Cambridge, called the plan “appalling” when he spoke with the Financial Times.

“It is an absolutely appalling idea because it is going to lead to distributed bulk surveillance of ... our phones and laptops,” Anderson told the media outlet.