Dozens of organizations are asking Apple not to scan user messages and photos


In an open letter released on Thursday, more than 90 privacy organizations called on Apple to abandon its child-protection plan, which provides for scanning of messages exchanged by minors through iMessage, as well as photos uploaded by all users to the company's cloud.

The organizations are concerned that Apple may eventually succumb to legal demands from authoritarian regimes and extend the surveillance, to the detriment of privacy and the right to free expression.

"While these tools are about protecting children and disseminating child pornography, we are concerned that they will be used for censorship, threaten the privacy and security of people around the world, and have devastating consequences for many children," the organizations wrote. in the open letter.

The campaign, according to Reuters, was organized by the American non-profit Center for Democracy & Technology. Signatories include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

An Apple spokesperson said the company had addressed the concerns in a document released on Friday, which states that the technical design of the new tool prevents governments from abusing it.

The letter was signed by several organizations in Brazil, where courts have repeatedly blocked Facebook's WhatsApp over the company's refusal to decrypt messages for police investigations. The country's senate has also passed a law that would impose traceability of messages. A similar law was passed in India last year, Reuters points out.

Reactions to the tool, announced two weeks ago, have caught Apple off guard, and the company has been quick to point out that the photo scanning does not concern images stored locally on devices, but images uploaded by users to the company's cloud.

The system only deals with images of child pornography that have been marked as such in an international database, the company assured, emphasizing that all major Internet companies apply similar controls.

A different tool will recognize and blur nude images in messages exchanged by underage users via iMessage, allowing them to be viewed only with parental approval.

The letter argues that this measure puts at risk children who grow up with intolerant or overly strict parents, as well as children seeking educational material.

More broadly, the organizations say, the new tools effectively undermine message encryption in iMessage, which Apple has strongly defended in other cases.

"Once the 'backdoor' feature is introduced, governments could force Apple to extend it [..] and detect images that are considered controversial for reasons other than sexual provocation," the letter said.

Apple, for its part, has said it would reject any such requests to extend the controls, but it has not said whether it would withdraw from authoritarian countries rather than comply with their rulings.

Source: In.gr