Apple delays child abuse detection system after backlash

Apple has bowed to pressure over the planned launch of software to detect images of child sexual abuse on iPhones, following a backlash from privacy activists.

The company said it would delay and potentially modify the new system, which was originally scheduled to launch this year.

“We have decided to take more time over the next few months to gather feedback and make improvements before releasing these critically important child safety features,” Apple said in a statement.

One of the planned features involved a system for matching images uploaded from a user’s iPhone to iCloud Photos against a database of known child sexual abuse imagery.
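
For illustration only, here is a minimal sketch of how matching against a database of known-image fingerprints can work. It assumes a simplified exact-hash comparison to stay runnable; Apple’s actual design used its perceptual NeuralHash algorithm combined with cryptographic protocols, and the names and placeholder values below are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images, a stand-in
# for the real database maintained by child-safety organisations and
# distributed to devices in hashed form only.
KNOWN_IMAGE_HASHES = {
    "a3f1c9d2e8b74f06",  # placeholder fingerprint, not a real entry
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image.

    A real system uses a perceptual hash (Apple's is called NeuralHash)
    so that resized or re-encoded copies of an image still match; an
    exact cryptographic hash is used here purely to keep the sketch
    self-contained and runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    """Check an image's fingerprint against the known-image database."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES
```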

But the new controls, which were announced last month, raised widespread alarm among privacy and human rights groups, who feared that an iPhone image-scanning tool could be abused by repressive regimes.

The American Civil Liberties Union was among those who warned that any system to detect data stored on a phone could also be used against activists, dissidents and minorities.

“Given the widespread interests of governments around the world, we cannot be sure that Apple will always resist requests to scan iPhones for additional selected material,” Daniel Kahn Gillmor, a technologist at the ACLU, noted last week. “These changes are a step towards much worse privacy for all iPhone users.”

Apple’s change of course has dismayed some child protection campaigners. Andy Burrows, head of online child safety policy at the UK charity NSPCC, said the move was “incredibly disappointing” and that the company “should have stood its ground”.

Apple’s initial proposal had been well received by officials in the US, UK and India, but it angered Silicon Valley during delicate negotiations between the tech industry and regulators over tackling online child abuse.

The head of WhatsApp called it “very worrying”. The Electronic Frontier Foundation, the Silicon Valley digital rights group, said it was a “shocking about-face for users who have relied on the company’s leadership in privacy and security”.

In an email circulated internally at Apple, child safety campaigners dismissed the complaints from privacy activists and security researchers as the “screeching voices of the minority”.

Apple had spent weeks vigorously defending its plan, which it said involved “cutting edge” cryptographic techniques to ensure that the company itself couldn’t see what images were stored on customers’ devices.

It said the system would be used only for child protection, and that the involvement of a team of human reviewers, together with a minimum number of images that had to be detected before an account was flagged, would all but eliminate the risk of errors or abuse.
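
As a rough illustration of that threshold safeguard, the sketch below counts matches per account and only escalates for human review once a minimum number of matched images is crossed. The structure and names are assumptions for illustration; Apple executives later suggested a real threshold on the order of 30 images, but the figure here is not a published parameter.

```python
from collections import defaultdict

# Assumed threshold: an account is surfaced for human review only after
# this many matched images. Apple suggested a figure on the order of 30;
# the exact value here is illustrative.
MATCH_THRESHOLD = 30

# Running count of matched images per account (hypothetical structure).
match_counts: dict[str, int] = defaultdict(int)

def record_match(account_id: str) -> bool:
    """Record one matched image for an account.

    Returns True only once the account crosses the threshold, at which
    point (in the described design) human reviewers would inspect the
    matches before any report is made; a single match on its own never
    triggers a report.
    """
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```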

But Craig Federighi, Apple’s senior vice-president of software engineering, admitted that introducing the child sexual abuse detection system alongside a separate tool, which could alert parents if their children received sexually explicit photos via iMessage, had caused confusion.

“It’s really clear that a lot of the messages got mixed up pretty badly in terms of how things were understood,” Federighi told The Wall Street Journal last month. “Looking back, introducing these two features at the same time was a recipe for this kind of confusion.”


