Apple Inc (AAPL.O) is planning to install software on U.S. iPhones that will scan for child abuse imagery, the Financial Times reported on Thursday, citing people familiar with the matter.
Earlier this week, the company detailed its planned system, called "neuralMatch," to academics in the United States via a virtual meeting, the report said, adding that the plan could be publicized more widely as soon as this week.
https://www.ft.com/content/14440f81-d405-452f-97e2-a81458f5411f
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices.
Apple detailed its proposed system, known as "neuralMatch," to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.
The automated system would proactively alert a team of human reviewers if it believes illegal imagery has been detected; the reviewers would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
Apple confirmed its plans in a blog post, saying the scanning technology is part of a new suite of child protection systems that would "evolve and expand over time". The features will be rolled out as part of iOS 15, expected to be released next month.
"This innovative new technology allows Apple to provide valuable and actionable information to the National Center for Missing and Exploited Children and law enforcement regarding the proliferation of known CSAM [child sexual abuse material]," the company said.
"And it does so while providing significant privacy benefits over existing techniques, since Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account."
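To make the described behaviour concrete, the sketch below shows the general shape of hash-list matching with a reporting threshold. It is illustrative only and is not Apple's implementation: the real system reportedly uses a perceptual "NeuralHash" rather than a cryptographic hash, and the names photoHash, scanLibrary, knownHashes and matchThreshold here are hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for a photo fingerprint. Apple's system reportedly
// uses a perceptual hash; a plain SHA-256 is used here only to keep the
// sketch self-contained and runnable.
func photoHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

struct ScanResult {
    var matchCount = 0
    var flaggedForHumanReview = false
}

// Compare each photo's hash against a set of hashes of known CSAM supplied by
// a clearinghouse such as NCMEC. `matchThreshold` is an assumed parameter:
// only a collection of matches, not a single hit, flags the account for
// human review, mirroring the threshold behaviour Apple describes.
func scanLibrary(photos: [Data], knownHashes: Set<String>, matchThreshold: Int) -> ScanResult {
    var result = ScanResult()
    for photo in photos where knownHashes.contains(photoHash(photo)) {
        result.matchCount += 1
    }
    result.flaggedForHumanReview = result.matchCount >= matchThreshold
    return result
}
```

Under these assumptions, a flagged result would be passed to human reviewers, who would verify the material before any report to law enforcement.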
Reporting by Akanksha Rana in Bengaluru; Editing by Arun Koyyur