Apple delays its photo-scanning plans, issuing a vague statement promising "improvements." But Apple said it still plans to implement the system after making those improvements in response to criticism.
Apple made this announcement today to Ars and other news organizations:
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
The statement is vague: it does not say what kinds of changes Apple will make, or even which advocacy groups and researchers it will collect input from. But given the backlash Apple faced from security researchers, privacy advocates, and privacy-conscious customers, the changes will likely aim to address concerns about user privacy and about governments gaining broader access to customers' photos.
Privacy groups warn of government access
It's not clear how Apple could implement the system in a way that addresses its critics' biggest privacy concerns. Apple has said it would refuse government demands to expand image scanning beyond CSAM. But privacy and security advocates argue that once the system is deployed, Apple will likely be unable to avoid handing more user content over to governments.
"Once this capability is built into Apple products, the company and its competitors will face enormous pressure, and potentially legal requirements, from governments around the world to scan photos not only for CSAM but also for other images a government finds objectionable," more than 90 policy groups from the US and around the world wrote in an open letter. "Those images may be of content that companies have tagged as 'terrorist' or violent extremist material, or even unflattering images of the very politicians who will pressure the company to scan for them. That pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis."
Apple previously announced that devices with iCloud Photos enabled would scan photos before they are uploaded to iCloud. Since an iPhone uploads every photo to iCloud shortly after it is taken, new photos would be scanned almost immediately for any user who has iCloud Photos turned on.
Apple also said it would add a tool to the Messages app that "analyzes photo attachments" to determine whether a photo is sexually explicit. That system will be optional for parents, who can enable it so that Apple devices warn children, and alert their parents, when sexually explicit photos are received or sent. The features were slated to arrive in the US later this year as part of updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, though the photo-scanning system could launch later than those updates; Apple never gave a specific release date.

As we wrote previously, Apple says its CSAM-scanning technology "analyzes an image and converts it to a unique number specific to that image," and a photo is flagged when its hash is identical or nearly identical to the hash of an image in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) only after about 30 CSAM images are detected, a threshold Apple set to ensure "less than a one in one trillion chance per year of incorrectly flagging a given account." Apple said the threshold could change in the future while still maintaining that one-in-a-trillion false-positive rate.
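The match-counting and threshold logic described above can be sketched in a few lines. This is a toy illustration, not Apple's implementation: the function names, the integer "hashes," and the database are all invented for the example, and Apple's actual system uses its proprietary NeuralHash perceptual hash plus cryptographic techniques that hide matches below the threshold.

```python
# Toy sketch of threshold-based hash matching. Plain integers stand in for
# image hashes; all names here are hypothetical.

REPORTING_THRESHOLD = 30  # Apple's stated number of matches before review

def count_matches(photo_hashes, known_csam_hashes):
    """Count how many of a user's photo hashes appear in the known-CSAM set."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)

def account_flagged(photo_hashes, known_csam_hashes):
    """An account is flagged for review only at or above the threshold."""
    return count_matches(photo_hashes, known_csam_hashes) >= REPORTING_THRESHOLD

# Usage: 29 matches stay below the threshold; 30 cross it.
database = set(range(1000))                      # stand-in hash database
benign = [10_000 + i for i in range(100)]        # photos with no matches
print(account_flagged(benign + list(range(29)), database))  # False
print(account_flagged(benign + list(range(30)), database))  # True
```

Note that the real system also tolerates "nearly identical" hashes; an exact set-membership test like the one above is a simplification of that perceptual matching.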
Apple has argued that its system is actually an advance for privacy because of the way it scans photos. "If you look at any other cloud service, they currently are scanning photos by looking at every single photo in the cloud and analyzing it. We wanted to be able to spot such photos in the cloud without looking at people's photos, and came up with an architecture to do this," Craig Federighi, Apple's senior vice president of software engineering, said last month. He said Apple's system is "much more private than anything that's been done in this area before." Apple partnered with NCMEC on the project, and NCMEC dismissed the privacy criticism as "the screeching voices of the minority." Apple apparently endorsed that characterization, since it distributed NCMEC's statement in an internal memo to employees on the day it announced the photo-scanning plans.