Apple’s spinning mirror: exploiting children for dictatorships
When I first heard about Apple’s new system for scanning images on everyone’s phones to detect whether those phones are being used to create child porn, I didn’t believe it. There is no way that such a system wouldn’t be abused. But Apple has now released both their statement and a description of how the system works, and it’s far worse than that. The system they describe does nothing to detect the exploitation of children or the creation of child pornography, and little if anything to discourage either. All it does is check photos against a hard-coded list of existing photos and detect exact matches.
You can use your iPhone to exploit children all day and their system won’t detect it. It only detects images that have been around long enough to enter the database Apple is using, and then only if the image is uploaded after the central servers have added it to their database. Images uploaded before the central organization learns about them are protected from discovery.1
As incentives go, this seems like a really bad idea. It increases the value of new material and promises to be about as effective as trying to enforce Prohibition by targeting drinkers. There are always more drinkers. The more you crack down, the more you create.
And the more I read the document Apple released describing their CSAM detection system, the more I keep flashing back to the best geek culture movie of the eighties (probably the best movie of the eighties) and the epiphanic scene in a crowded burger bar beneath a flaming demon.
Lazlo: What would you use that for?
Ick: Making enormous Swiss cheese?
Mitch: The applications are unlimited.
Lazlo: No. With the fuel you’ve come up with, the beam would last for, what, fifteen seconds? What good is that?
Chris: Oh, Lazlo, that doesn’t matter. I respect you but I graduated.
Mitch: Yeah, let the engineers figure out a use for it. That’s not our concern.
Lazlo: Maybe somebody already has a use for it. One for which it is specifically designed.
Jordan: You mean Doctor Hathaway had something in mind all along?
Lazlo: Look at the facts. Very high power. Portable. Limited firing time. Unlimited range. All you need is a tracking system and a large spinning mirror and you could vaporize a human target from space.
Think about what Apple’s system does, and how it works (a rough sketch in code follows the list).
- It requires a list of known photos to check against.
- It only triggers on exact matches.
- It is involuntary. You neither opt in nor opt out.
- It reports back to a central authority after a configurable threshold of matches.
- The list of offending photos is hidden. Only the organization that compiled the data set knows what the offending photos are.2
- The system is designed specifically for image matching.
- The system triggers on uploading the offending images to iCloud.
- It does not trigger on images not uploaded, nor on images already uploaded.
- Uploads to iCloud usually occur automatically on receiving and saving an image via text messaging.
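Reduced to code, the flow they describe is startlingly simple. The following is a minimal sketch under my reading of Apple’s document, not their implementation: a plain SHA-256 digest stands in for the perceptual NeuralHash, a visible set stands in for the blinded on-device database, and every name in it is mine.

```swift
import CryptoKit
import Foundation

// A minimal sketch of the described flow, not Apple's implementation.
// SHA-256 stands in for NeuralHash; a plain Set stands in for the
// blinded on-device database; all names are hypothetical.
struct UploadScanner {
    let knownHashes: Set<Data> // opaque digests supplied by a central authority
    let reportThreshold: Int   // matches required before the account is flagged
    var matchCount = 0

    // Runs only when a photo is uploaded to the cloud library;
    // photos that are never uploaded are never checked.
    mutating func scanOnUpload(_ imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        guard knownHashes.contains(digest) else { return false }
        matchCount += 1
        // Nothing is reported until the threshold is crossed.
        return matchCount >= reportThreshold
    }
}
```

Every property in the list above falls out of that structure: the phone can only recognize images someone else has already put on the list, it only looks at upload time, and the phone’s owner has no say in any of it.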
What is this useful for? What if, as Lazlo Hollyfeld asked, the system already has a purpose, one for which it is specifically designed? That purpose clearly is not detecting that a phone is being used to exploit children. But all you’d need is a spinning space mirror of your own, real-time updates of the database of known images, and you could track the spread of anti-government images throughout China or anywhere else, bypassing all of the privacy measures that make the iPhone so desirable under totalitarian governments.
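Swapping in that database would require no new engineering. A hash reveals nothing about the image that produced it, so a list of dissident memes is indistinguishable, both on the phone and at Apple, from a list of abuse imagery. A toy demonstration, again with SHA-256 standing in for NeuralHash and with made-up sample data:

```swift
import CryptoKit
import Foundation

// Hypothetical stand-ins; in reality these would be image bytes.
let abuseImages = ["known abuse image bytes"]
let dissidentMemes = ["tank man meme bytes"]

// Both lists reduce to opaque 32-byte digests. Nothing in a digest
// reveals which kind of image produced it, so neither the phone nor
// Apple can tell one database from the other.
for list in [abuseImages, dissidentMemes] {
    for item in list {
        let digest = SHA256.hash(data: Data(item.utf8))
        print(digest.map { String(format: "%02x", $0) }.joined())
    }
}
```

Hand the phone a different list and it does exactly the same work, with exactly the same privacy story.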
Images are extremely useful for defeating text matching when sending memes. It isn’t just China that finds the spread of anti-authoritarian memes threatening. Both House leaders and state governments such as California’s have demanded that social media services censor memes they disagree with.
Apple’s system could hardly be designed better for handling such requests. To outside appearances, “child porn” is merely the root password Apple is using to disable privacy on our phones. They’re using the legitimate fear of child pornography to enable a tracking system that is far more useful to authoritarian governments than to law enforcement in democracies. It is hard to believe that the Chinese government isn’t going to be all over this now that it’s been announced, if they’re not already the reason it was designed.
Apple is pulling a bit of a switcheroo here, potentially setting up a motte-and-bailey defense. They’re announcing two completely different things, and most reports seem to be conflating them as if they were two related systems. They’re not.3
One is the potentially dangerous but understandable use of machine intelligence to enhance parental controls. It can be turned on and off by parents, scans local images, and provides warnings to parents, not to governments. While there are certainly privacy concerns with this system, there are always privacy concerns when creating tools for parents. The iPhone must be set up as a child’s phone for this tool to work, and iOS already has a lot of tools built in for parents to monitor their children’s activities.
This is hardly the most intrusive item under parental control: parents can track their children’s movement and block arbitrary web sites, for example. Unless we want to pretend that children are adults, some sort of tradeoffs are necessary when parents allow their children to have a phone.4 There are strong arguments that this particular feature is over the line, but ultimately whether that’s true is a question of appropriate tradeoffs.
But that system is completely different from the database image matching system, which is entirely separate and serves a separate purpose. That purpose sure looks like appeasing totalitarian governments. It is not part of parental controls, it runs on all phones, and what the photos are is cryptographically secret. Only whoever Apple got the list from knows what photos are in that list. It is a system for tracking the spread of specific imagery and reporting its use.
I suspect that Apple announced these two items at the same time hoping that people will conflate the reasonable one with the unreasonable one.
It’s all smoke and spinning mirrors.
- May 15, 2024: Apple’s FiVe Minute Crush
I didn’t mean to do two AI-related posts practically back-to-back like this, but Apple’s very dystopian iPad Pro ad brought up some other thoughts, partly because I’ve almost finished posting my series on Alan Moore’s dystopian V stories.
Now, I’d recommend not reading too much into this “interesting” choice of visuals. Part of the problem with the ad is nothing more than the age-old development of a culture of silence in any large and entrenched business. Apple is far from the brotherhood of pirates portrayed in Andy Hertzfeld’s Revolution in the Valley. I suspect a lot of people saw how painfully bad the ad was and simply chose not to stick their necks out.
My first encounter with this culture, in a very similar situation, came tangentially, by way of a Radio Shack toy called “Galactic Man”. I could have sworn I’ve mentioned this on the site before, but I can’t find it now. When I was a young guitarist in Hollywood, I worked part-time at a Radio Shack near Hollywood and Vine. It was a fascinating view of the Hollywood industry from the borderline: desperate property masters would occasionally come in searching for something they suddenly realized they needed, like a giant gold-plated telephone or a boxful of D-cell batteries they’d run out of on set.
The store’s manager kept a box of unsaleable items in the back room. As an employee, you were free to take anything you wanted out of it. That’s where I found Galactic Man. Galactic Man was a Transformers knock-off: a laser gun that transformed into a robot. It was actually kind of cool, except for one possibly insurmountable problem: where does the laser gun’s trigger go when the toy transforms into a robot?
1. This is from my reading of the CSAM description, and I’m not sure about it. The description of inner and outer decryptions was difficult to follow.
2. If I’m reading the document correctly, even Apple doesn’t know what images they’re searching for; all they get is a set of hashes. They can see the images that get reported, but from the description it sounds like verification by Apple is not required for the system to work. A country such as China could simply require that all matches be reported without intervention by Apple.
3. There is also a third item that is so silly it has to be either a deliberate slippery slope or a complete misunderstanding of human nature. If someone searches for offending material, Siri will tell them they’re being naughty and offer counseling advice.
4. On the other hand, there are some things I was surprised to find are not part of parental controls. Parents cannot monitor text messages. So Apple has set up a complex system for alerting parents about explicit images, and nothing for the far simpler task of alerting parents about sexting or other dangerous, text-only activities.
- Apple explains how iPhones will scan photos for child-sexual-abuse images: Jon Brodkin at Ars Technica
- “For years, Apple has resisted pressure from the US government to install a ‘backdoor’ in its encryption systems, saying that doing so would undermine security for all users. Apple has been lauded by security experts for this stance. But with its plan to deploy software that performs on-device scanning and share selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance.”
- Apple’s Plan to “Think Different” About Encryption Opens a Backdoor to Your Private Life: India McKinney and Erica Portnoy at Electronic Frontier Foundation
- “Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”
- CSAM Detection at Apple Computer
- “CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC).”
- Expanded Protections for Children at Apple Computer
- “This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.”
- Lazlo at Real Genius
- “I’ve been thinking about that laser you’ve designed…”