Mimsy Were the Borogoves

Mimsy Were the Technocrats: As long as we keep talking about it, it’s technology.

Apple’s spinning mirror: exploiting children for dictatorships

Jerry Stratton, August 11, 2021

[Image: Siri responding “Nope.” to “Siri, stop spying on me.” June 18, 2018. “What a sense of humor,” I thought.]

When I first heard about Apple’s new system for scanning images on everyone’s phones to detect if they’re being used to create child porn, I didn’t believe it. There is no way that such a system wouldn’t be abused. But Apple has now released both their statement and a description of how the system works, and it’s far worse than that. The system they describe does nothing to detect the exploitation of children or the creation of child pornography and little if anything to discourage its use. All it does is check photos against a hard-coded list of existing photos and detect exact matches.

You can use your iPhone to exploit children all day and their system won’t detect it. It only detects images that have been around long enough to enter the database Apple is using, and then only if the image is uploaded after the central servers have added it to their database. Images uploaded before the central organization learns about them are protected from discovery.1

As incentives go, this seems like a really bad idea. It increases the value of new material and promises to be about as effective as trying to enforce prohibition by targeting drinkers. There are always more drinkers. The more you crack down, the more you make.

And the more I read the document Apple released describing their CSAM detection system, the more I keep flashing back to the best geek culture movie of the eighties—probably the best movie of the eighties—and the epiphanic scene in a crowded burger bar beneath a flaming demon.

Lazlo: What would you use that for?

Ick: Making enormous Swiss cheese?

Mitch: The applications are unlimited.

Lazlo: No. With the fuel you’ve come up with, the beam would last for, what, fifteen seconds? What good is that?

Chris: Oh, Lazlo, that doesn’t matter. I respect you but I graduated.

Mitch: Yeah, let the engineers figure out a use for it. That’s not our concern.

Lazlo: Maybe somebody already has a use for it. One for which it is specifically designed.

Jordan: You mean Doctor Hathaway had something in mind all along?

Lazlo: Look at the facts. Very high power. Portable. Limited firing time. Unlimited range. All you need is a tracking system and a large spinning mirror and you could vaporize a human target from space.

Think about what Apple’s system does, and how it works. (A rough code sketch follows the list.)

  • It requires a list of known photos to check against.
  • It only triggers on exact matches.
  • It is involuntary. You neither opt in nor opt out.
  • It reports back to a central authority after a configurable threshold of matches.
  • The list of offending photos is hidden. Only the organization that compiled the data set knows what the offending photos are.2
  • The system is designed specifically for image matching.
  • The system triggers on uploading the offending images to iCloud.
  • It does not trigger on images not uploaded, nor on images already uploaded.
  • Uploads to iCloud usually occur automatically on receiving and saving an image via text messaging.
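
Mechanically, that list amounts to something like the sketch below. This is my own illustration of the behavior described, not Apple’s code: the type and function names are hypothetical, and Apple’s actual protocol uses a perceptual hash (NeuralHash), blinded hash tables, and threshold secret sharing rather than a plain set and a counter.

    // Hypothetical sketch of the matching flow described above; not Apple's code.
    struct ImageMatcher {
        // Opaque list of hashes supplied by a central authority. Per Apple's
        // description, not even Apple knows what images they correspond to.
        let knownHashes: Set<String>
        // Number of matches required before anything is reported.
        let reportThreshold: Int
        // Running count of matched uploads from this device.
        var matchCount = 0

        // Runs only when an image is uploaded to the cloud service. Images that
        // are never uploaded, or that were uploaded before their hash entered
        // the list, are never checked.
        mutating func checkOnUpload(imageHash: String) -> Bool {
            guard knownHashes.contains(imageHash) else {
                return false // a brand-new image matches nothing
            }
            matchCount += 1
            return matchCount >= reportThreshold // true: report to the authority
        }
    }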
[Image: A laser beam shining through the Purgatory burger sign in Real Genius.]

Real Genius often used subtle symbolism to convey deep meaning. Just as often, it dropped symbols so heavy they warped space.

What is this useful for? What if, as Lazlo Hollyfield asked, the system already has a purpose, one for which it is specifically designed? That purpose clearly is not detecting that a phone is being used to exploit children. But all you’d need is a spinning space mirror, in the form of real-time updates to the database of known images, and you could track the spread of anti-government images throughout China or anywhere else, bypassing all of the privacy measures that make the iPhone so desirable under totalitarian governments.
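
To make the point concrete with the hypothetical sketch above: nothing in that code knows or cares what the hashes describe. Hand it a different list and a lower threshold, and the same machinery tracks whatever the list’s compiler wants tracked.

    // Same hypothetical machinery, different list: now it tracks memes.
    let bannedMemeHashes: Set<String> = [] // filled in by whoever compiles the list
    var memeTracker = ImageMatcher(knownHashes: bannedMemeHashes,
                                   reportThreshold: 1) // report on the first match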

Images are extremely useful for evading text matching when sending memes. It isn’t just China that finds the spread of anti-authoritarian memes threatening. Both House leaders and state governments such as California’s have demanded that social media services censor memes they disagree with.

Apple’s system could hardly be designed better for handling such requests. To outside appearances “child porn” is merely the root password they’re using to disable privacy on their phones. They’re using the legitimate fear of child pornography to enable a tracking system that is far more useful for authoritarian governments than for law enforcement in democracies. It is hard to believe that the Chinese government isn’t going to be all over this now that it’s been announced, if they’re not already the reason it was designed.

Apple is pulling a bit of a switcheroo here, potentially setting up a motte-and-bailey defense. They’re announcing two completely different things, and most reports seem to be conflating them as if they were two related systems. They’re not.3

One is the potentially dangerous but understandable use of machine intelligence to enhance parental controls. It can be turned on and off by parents, scans local images, and provides warnings to parents, not to governments. While there are certainly privacy concerns with this system, there are always privacy concerns when creating tools for parents. The iPhone must be set up as a child’s phone for this tool to work, and iOS already has a lot of tools built in for parents to monitor their children’s activities.

This is hardly the most intrusive item under parental control: parents can track their children’s movement and block arbitrary web sites, for example. Unless we want to pretend that children are adults, some sort of tradeoffs are necessary when parents allow their children to have a phone.4 There are strong arguments that this particular feature is over the line, but ultimately whether that’s true is a question of appropriate tradeoffs.

But that system is completely different from the database image matching system. It is completely separate and has a separate purpose. That purpose sure looks like appeasing totalitarian governments. It is not part of parental controls, it runs on all phones, and what the photos are is cryptographically secret. Only whoever Apple got the list from knows what photos are in that list. It is a system for tracking the spread of specific imagery and reporting its use.

I suspect that Apple announced these two items at the same time hoping that people will conflate the reasonable one with the unreasonable one.

It’s all smoke and spinning mirrors.

May 15, 2024: Apple’s FiVe Minute Crush
[Image: The industrial press in Apple’s Crush! ad after crushing the life out of the arts and artists.]

How out-of-touch do you need to be to see this as an uplifting, inspiring end to an ad featuring the destruction of human-like dolls and faces?

I didn’t mean to do two AI-related posts practically back-to-back like this, but Apple’s very dystopian iPad Pro ad brought up some other thoughts partly due to my having almost finished posting my series on Alan Moore’s dystopian V stories.

Now, I’d recommend not reading too much into this “interesting” choice of visuals. Part of the problem with the ad is nothing more than the age-old development of a culture of silence in any large and entrenched business. Apple is far from the brotherhood of pirates portrayed in Andy Hertzfeld’s Revolution in the Valley. I suspect a lot of people saw how painfully bad the ad was and simply chose not to stick their necks out.

My first encounter with this culture, in a very similar situation, came tangentially, by way of a Radio Shack toy called “Galactic Man”. I could have sworn I’ve mentioned this on the site before, but I can’t find it now. When I was a young guitarist in Hollywood, I worked part-time at a Radio Shack near Hollywood and Vine. It was a fascinating view of the Hollywood industry from the borderline: desperate property masters would occasionally come in searching for something they suddenly realized they needed, like a giant gold-plated telephone or a boxful of D-cell batteries they’d run out of on set.

The store’s manager kept a box of unsaleable items in the back room. As an employee, you were free to take anything you wanted out of it. That’s where I found Galactic Man. Galactic Man was a Transformers knock-off: a laser gun that transformed into a robot. It was actually kind of cool, except for one possibly insurmountable problem: where does the laser gun’s trigger go when the toy transforms into a robot?

  1. This is from my reading of the CSAM description, and I’m not sure about it. The description of inner and outer decryptions was difficult to follow.

  2. If I’m reading the document correctly, even Apple doesn’t know what images they’re searching for—all they get is a set of hashes. They can see the images that get reported, but from the description it sounds like verification by Apple is not required for the system to work. A country such as China could simply require that all matches be reported without intervention by Apple.

  3. There is also a third item that is so silly it has to be a deliberate slippery slope or a complete misunderstanding of human nature. If someone searches for offending material, Siri will tell them they’re being naughty and offer counseling advice.

  4. On the other hand there are some things I was surprised to find are not part of parental controls. Parents cannot monitor text messages. So Apple has set up a complex system for alerting parents about explicit images, and nothing about the far more simple task of alerting parents about sexting or other dangerous, text-only activities.
