Markets
11/08/2021

Apple Defends Its New Photo-Scanning Child Protection System




Despite a backlash from customers and privacy advocates, iPhone maker Apple has defended its new system that scans users' phones for child sexual abuse material (CSAM). The technology searches for matches against known abuse material before photos are uploaded to the company's iCloud storage.
 
Critics of the move, however, alleged that the technology could become a "backdoor" the company uses to eavesdrop on people. More than 5,000 people and organizations have signed an open letter against the technology.
 
This has prompted Apple to promise not to "expand" the system for any reason.
 
In a warning issued last week, digital privacy campaigners alleged that the technology could be used by authoritarian governments to bolster anti-LGBT regimes or to crack down on political dissidents in countries where protests are deemed illegal.
 
Apple, however, said it "will not accede to any government's request to expand" the system.
 
In a question-and-answer document it published, the company also pointed to numerous safeguards it had put in place to stop the system from being used for anything other than detecting child abuse imagery.
 
"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," it said.
 
In the past, however, the company has made concessions in various countries in order to keep doing business in those markets. Last New Year's Eve, for example, the iPhone maker removed 39,000 apps from its Chinese App Store as part of the Chinese government's efforts to curb the number of unlicensed games.
 
Apple also said that the design of the anti-CSAM technology prevents the company itself from seeing or scanning a user's photo album.
 
Instead, the system looks for matches, securely on the device, against a database of hashes of known CSAM images provided by child safety organizations.
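To illustrate the matching idea in its most generic form, the Python sketch below hashes a photo and checks the digest against a set of known hashes. The function names and sample data are invented for the example, and it deliberately uses an ordinary SHA-256 digest; Apple's published design instead relies on a perceptual hash compared through a private set-intersection protocol, so the comparison itself reveals nothing about non-matching photos.

```python
import hashlib

# Generic hash-set matching, NOT Apple's actual protocol: Apple describes a
# perceptual "NeuralHash" compared on-device via private set intersection,
# whereas this sketch simply hashes the raw bytes with SHA-256.

def image_hash(image_bytes: bytes) -> str:
    """Digest of the image content; a stand-in for a perceptual image hash."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abuse images, as supplied by
# child safety organizations (one made-up entry for the example).
known_hashes = {image_hash(b"example known image bytes")}

def should_flag_before_upload(image_bytes: bytes) -> bool:
    """Return True only if the photo's hash appears in the known-hash database."""
    return image_hash(image_bytes) in known_hashes

print(should_flag_before_upload(b"an ordinary holiday photo"))   # False
print(should_flag_before_upload(b"example known image bytes"))   # True
```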
 
Apple also claimed that falsely flagging innocent people to police is all but impossible. "The likelihood that the system would incorrectly flag any given account is less than one in one trillion per year," it said. Positive matches are also subject to human review.
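One rough way to see how an account-level error rate can end up that low is to treat false matches as independent events and ask how likely it is that enough of them accumulate on a single account to cross a reporting threshold. The sketch below computes that probability as a plain binomial tail; the per-photo error rate, upload count, and threshold are assumptions invented for the example, not figures Apple has published, and the calculation is not Apple's own analysis.

```python
import math

# Illustrative assumptions only; Apple has not published these parameters.
n = 10_000   # assumed photos an account uploads in a year
p = 1e-4     # assumed chance that any single photo falsely matches a known hash
t = 30       # assumed number of matches required before an account is flagged

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """Log-probability of exactly k false matches among n independent uploads."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log1p(-p))

# Probability of reaching the threshold: t or more false matches out of n uploads.
flag_probability = sum(math.exp(log_binom_pmf(n, k, p)) for k in range(t, n + 1))
print(f"P(account falsely reaches threshold) = {flag_probability:.2e}")
```

Even with these deliberately generous assumptions, requiring dozens of independent matches before any review pushes the chance of falsely flagging an account to a vanishingly small number, which is the intuition behind Apple's "one in one trillion" claim.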
 
Privacy advocates, however, argue that the only thing effectively preventing the technology from being used for other purposes is the company's promise not to allow that to happen.
 
The digital rights group the Electronic Frontier Foundation, for example, argued that "all it would take... is an expansion of the machine learning parameters to look for additional types of content".
 
"That's not a slippery slope; that's a fully-built system just waiting for external pressure to make the slightest change," it warned.
 
(Source: www.technologyreview.com)

Christopher J. Mitchell
