
Thu Aug 5, 2021, 05:30 PM

Apple unveils plans to scan US iPhones for images of child sex abuse

Source: The Hill

Apple will roll out an update later this year that will include technology in iPhones and iPads that allows the tech giant to detect images of child sexual abuse stored in iCloud, the company announced Thursday.

The feature is part of a series of updates Apple unveiled aimed at increasing child safety, but security researchers and advocates are warning the scanning update — along with one that aims to give parents protective tools in children’s messages — could pose data and security risks beyond the intended purpose.

With the new scanning feature, Apple will be able to report detected child sexual abuse material to the National Center for Missing and Exploited Children (NCMEC), which acts as a comprehensive reporting center and works in collaboration with law enforcement agencies across the country. The company will also disable users' accounts if the abusive content is found, Apple said in the update.

Matthew Green, a security professor at Johns Hopkins University, told the Times, “This will break the dam — governments will demand it from everyone.”

Read more: https://thehill.com/policy/technology/566603-apple-unveils-plans-to-scan-us-iphones-for-images-of-child-sex-abuse

27 replies, 1913 views

Replies to this discussion thread
27 replies (Author | Date | Post #)
Reply Apple unveils plans to scan US iPhones for images of child sex abuse (Original post)
left-of-center2012 Aug 5 OP
getagrip_already Aug 5 #1
Mawspam2 Aug 5 #2
Locrian Aug 5 #3
dalton99a Aug 6 #15
Journeyman Aug 5 #4
Mysterian Aug 6 #24
Girard442 Aug 5 #5
Sgent Aug 6 #10
Girard442 Aug 6 #18
Sgent Aug 6 #23
Kablooie Aug 6 #16
keithbvadu2 Aug 5 #6
marble falls Aug 5 #7
Hugh_Lebowski Aug 5 #8
Earth-shine Aug 5 #9
dalton99a Aug 6 #14
Girard442 Aug 6 #19
ExTex Aug 6 #20
dalton99a Aug 6 #22
Sgent Aug 6 #11
DVRacer Aug 6 #12
dalton99a Aug 6 #13
Lokilooney Aug 6 #17
Demovictory9 Aug 6 #21
Mysterian Aug 6 #25
dalton99a Aug 6 #27
TexasBushwhacker Aug 6 #26

Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 05:42 PM

1. Really. Bad. Idea. n/t



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 05:56 PM

2. I never liked Apple. Won't use Apple products...

...unless required by work. When I do, I always keep the camera lens covered with electrical tape just for shit like this.



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 06:35 PM

3. all the more disgusting ....

that they are using child abuse to "justify" the abuse of privacy.
I H.A.T.E. Apple.



Response to Locrian (Reply #3)

Fri Aug 6, 2021, 02:18 AM

15. Reminds me of Trump's little trick when he resumed federal executions


he started with a notorious racist murderer




Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 06:39 PM

4. I hate how they use the threat of child pornography to infringe on our rights . . .

I know of no one who doesn't wish to see child pornographers caught and dealt harshly with, but too often the new methods of detection and apprehension require a collective loss of privacy and fourth amendment guarantees.



Response to Journeyman (Reply #4)

Fri Aug 6, 2021, 09:13 PM

24. The few rights we have left thanks to the "war" on some drugs

I wish the Fourth Amendment was as sacred as the Second.



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 09:37 PM

5. There was an article many years ago about a drugstore chain that developed film.

They supposedly reported any sexually exploitative images of children, but when the reporter queried company management about the criteria for screening images, they clammed up. Basically, no one could be sure what kind of kid pix could get photographers in legal trouble.

The situation hasn't changed. Could a parent end up in court for a picture of a two-year-old's bare nipple?

And how would this even work? Apple reports troubling material to NCMEC, which does what? Reports the finding to local law enforcement, who break down doors? Or just splashes a scarlet M for "molester" on their (virtual) doors and ruins their lives?



Response to Girard442 (Reply #5)

Fri Aug 6, 2021, 12:19 AM

10. Apple is putting all of your photos

through a formula that yields a number, and then sees if that number matches their database of numbers from the NCMEC. If it does, they forward your info to them.
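The matching step this post describes can be sketched roughly as follows. This is a hypothetical illustration only: Apple's actual system uses a perceptual hash ("NeuralHash"), not a cryptographic one, and the function and database names here are made up.

```python
import hashlib

# Stand-in for the hash database NCMEC would provide (hypothetical name).
KNOWN_CSAM_HASHES: set[str] = set()

def photo_fingerprint(image_bytes: bytes) -> str:
    """Reduce a photo to a fixed-size number; SHA-256 here is just a
    stand-in for 'a formula that yields a number'."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known: set[str]) -> bool:
    """True if the photo's fingerprint appears in the known-hash database."""
    return photo_fingerprint(image_bytes) in known
```

The key design point the post gets at: Apple compares numbers against numbers, so it never needs to "look at" the photo itself to declare a match.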



Response to Sgent (Reply #10)

Fri Aug 6, 2021, 04:41 PM

18. A formula? Like, a program that uses AI to assign a porn score to each photo?



Response to Girard442 (Reply #18)

Fri Aug 6, 2021, 09:03 PM

23. No, it's a complex hash function

AI is used in the other product they announced, but this one is a matching function.



Response to Girard442 (Reply #5)

Fri Aug 6, 2021, 03:38 AM

16. I remember a grandmother arrested for pictures of her granddaughter in the bath.

The photo developer reported her.



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 10:01 PM

6. Next year they will be looking for visits to certain web sites... such as DU?

Depending on which party is in office and on the Supreme Court.



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 10:08 PM

7. One of those ideas one says, "Great idea!" about, and less than five minutes later, goes, "Nah."



Response to left-of-center2012 (Original post)

Thu Aug 5, 2021, 10:30 PM

8. I thought I was going to have to say something controversial ...

And perhaps even end up shunned by my friends here.

I am happy to discover I don't, based on what's above.

I have ZERO ISSUE with Apple deciding to DELETE any pictures on their 'cloud storage' that are clearly illegal, to be clear. They shouldn't be expected to absorb that liability.

But actually 'scanning phones', the hardware people own, if that's what's being discussed, and then actually reporting people to the authorities?

I'm not sure I'm comfortable, despite the apparent value.



Response to Hugh_Lebowski (Reply #8)

Thu Aug 5, 2021, 11:58 PM

9. The article says it will scan content in iCloud and eventually examine email and text messages ...

presumably, while they are in transit.

It does not say they will scan the actual phones.

So, if you do backups to iCloud, your content will be scanned.

Does anyone here actually think Apple is not already scanning your data? What about Google Drive? Microsoft OneDrive? Carbonite and other cloud backups?

They scan everything in their possession.

Right now, they look for viruses, illegal software, and other potential problems in your backups. With a court order, they'll do a deep scan and pull out your individual files.

Google scans every picture you upload. If Apple is not already doing it, they will soon.

Hey there, Hugh. It's a brave new world.



Response to Hugh_Lebowski (Reply #8)

Fri Aug 6, 2021, 02:12 AM

14. +1. The database will be in the iPhone operating system:

To spot the child sexual abuse material, or C.S.A.M., uploaded to iCloud, iPhones will use technology called image hashes, Apple said. The software boils a photo down to a unique set of numbers — a sort of image fingerprint.

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.

Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.

https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
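The threshold step in the NYT excerpt can be sketched like this. A minimal illustration under stated assumptions: the function names are made up, SHA-256 stands in for Apple's perceptual "image fingerprint", and the threshold value is hypothetical since the excerpt only says "a certain number of matches".

```python
import hashlib

MATCH_THRESHOLD = 30  # hypothetical; the article does not state the number

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for Apple's perceptual image hash ('image fingerprint')."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_flagged(photos: list[bytes], known_hashes: set[str],
                    threshold: int = MATCH_THRESHOLD) -> bool:
    """Count photos whose hash matches the known database; flag the account
    for human review only once the count reaches the threshold."""
    matches = sum(1 for p in photos if image_hash(p) in known_hashes)
    return matches >= threshold
```

The threshold is the safety valve the excerpt describes: a single stray match does not trigger human review, only an accumulation of them does.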



Response to dalton99a (Reply #14)

Fri Aug 6, 2021, 04:53 PM

19. So, basically, if someone wants to set you up to be harassed, they just spam you a flagged pic.

Now it's on your device. You've been tagged as a perv. All your stuff is fair game now.



Response to Girard442 (Reply #19)

Fri Aug 6, 2021, 07:25 PM

22. "Researchers have been able to do this pretty easily"

The tool designed to detect known images of child sexual abuse, called "neuralMatch," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.

The detection system will only flag images that are already in the center's database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn't "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement. "Researchers have been able to do this pretty easily," he said of the ability to trick such systems.

Other abuses could include government surveillance of dissidents or protesters. "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."

https://www.npr.org/2021/08/06/1025402725/apple-iphone-for-child-sexual-abuse-privacy
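Green's point about framing attacks rests on hash collisions: a matching system compares lossy fingerprints, not images, so an attacker who can find a second input with the same fingerprint can trigger a false match. A deliberately weak toy hash makes the idea concrete; this is in no way Apple's NeuralHash, just an illustration of why lossy fingerprints can collide.

```python
def toy_perceptual_hash(data: bytes) -> int:
    """Deliberately weak 'hash' keeping only the byte-sum modulo 256,
    a toy stand-in for the lossy reduction a perceptual hash performs."""
    return sum(data) % 256

def craft_collision(target_hash: int) -> bytes:
    """Build a different input whose toy hash equals target_hash,
    mimicking the 'seemingly innocuous image' attack Green describes."""
    return bytes([target_hash % 256])
```

Against this toy hash the attack is trivial; against a real perceptual hash it is harder but, per Green, "researchers have been able to do this pretty easily."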



Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 12:22 AM

11. More detailed and technical

discussion: https://arstechnica.com/tech-policy/2021/08/apple-explains-how-iphones-will-scan-photos-for-child-sexual-abuse-images/

A lot of people aren't going to like this, but I'm actually in favor of a little more policing of the internet.



Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 01:10 AM

12. This is an idea paved with good intentions

Who is going to say anything against catching kiddie porn? The problem is, we should know they won't stop there. Once the idea of scanning your account for illicit activities becomes normalized, it will expand; it always does. Next it will be drugs or firearms. This is an expansion of the police state. The police could never get the ability to go through your photos without a warrant based on cause, but your Fourth Amendment rights do not apply to corporations.



Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 02:08 AM

13. Kick

Last edited Fri Aug 6, 2021, 02:47 AM - Edit history (1)







Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 03:33 PM

17. It never gets old...



Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 06:30 PM

21. Without a warrant?



Response to Demovictory9 (Reply #21)

Fri Aug 6, 2021, 09:17 PM

25. Warrants!?!

Ahhhhhhh......I remember the old days too...back when we still had a Fourth Amendment. The "war" on some drugs put that on life support.



Response to Demovictory9 (Reply #21)

Fri Aug 6, 2021, 09:43 PM

27. The concept is so 2019


A monorail train displaying Google signage moves past a billboard advertising Apple iPhone security during the 2019 Consumer Electronics Show (CES) in Las Vegas, Nevada, U.S., on Monday, Jan. 7, 2019. Bloomberg | Getty Images



Response to left-of-center2012 (Original post)

Fri Aug 6, 2021, 09:34 PM

26. How soon before hackers start planting child porn

in the cloud, attached to innocent people's IP addresses and then offer to remove it for a price?

