Photos. And increasingly, videos as well. We capture copious amounts of them on our phones. They sync automatically with Google Photos or iCloud. We create albums in the app for events and share them with friends and family, who can then add more photos to the shared album. We can switch phones without worrying about losing our photos, as long as we use the same account - everything's backed up on the cloud!
Except there is no cloud, there is only someone else's computer.
You Will Let Us Look At Your Photos And You Will Thank Us For It
CSAM Scanning Programmes
In August 2021, Apple announced a plan to scan photos stored on iCloud for instances of Child Sexual Abuse Material (CSAM) and report them to the authorities. They would later scrap this plan in December 2022.
Google quietly started a similar programme and it continues to this day.
Apple's Promises of Privacy
Apple gave its users the following guarantees:
- Photos will be hashed with a perceptual hash (Apple's "NeuralHash"), designed, somehow, to stay consistent despite rotations and cropping, before being scanned
- Photos will only be checked against known CSAM
- Only when a certain threshold of matched images is reached will Apple review them manually and consider alerting the authorities (a toy sketch of this pipeline follows the list)
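To make the mechanics concrete, here is a toy sketch of that pipeline. This is emphatically not Apple's NeuralHash (a neural network whose details were never fully public); it uses the classic "average hash" perceptual hash instead, and the database, threshold, and function names are all invented for illustration.

```python
# A toy model of perceptual-hash scanning. NOT NeuralHash: this is the
# classic "average hash", and KNOWN_BAD_HASHES / MATCH_THRESHOLD are
# invented names. Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail, then emit one bit per
    pixel: 1 if it is brighter than the mean. Mild edits (re-encoding,
    resizing, small crops) flip few bits, so near-duplicates hash close."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

KNOWN_BAD_HASHES: set[int] = set()  # stand-in for the database of known material
MATCH_THRESHOLD = 30                # hypothetical: matches before human review

def should_escalate(photo_paths: list[str]) -> bool:
    """Return True once enough photos match the database -- the point at
    which a scheme like Apple's would trigger manual review."""
    matches = sum(
        1
        for path in photo_paths
        if any(hamming_distance(average_hash(path), known) <= 5
               for known in KNOWN_BAD_HASHES)
    )
    return matches >= MATCH_THRESHOLD
```

Even this toy version shows where the knobs are: whoever controls the hash database and the threshold controls what gets flagged and how often a human looks.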
What Went Wrong?
There were cases of false positives. Famously, in August 2022, a couple who had previously shared photos of their toddler's infected genitalia with a doctor (and received a quick and accurate diagnosis and treatment plan) lost access to the Google account they had used, and became the subject of a police investigation. They weren't charged, but this points to a deeper problem.
The Revealed Risks
- An AI model that can look at your photos and flag them for illegal material. What happens if it's trained to detect not CSAM but, say, government dissent?
- The fact that "manual review" exists - what if the threshold for it is set really low?
- Evidence gained through illegal surveillance is inadmissible in court, a rule meant to deter police from violating citizens' privacy without reasonable cause. What does that protection look like when you're sharing everything about your life with a corporation?
Same Tech, Different Goals
A homophobic government could strongarm these companies into deploying models that check for pride flags and similar symbols of the LGBT+ community. If you believe the correct thing to do here is to let companies direct police action towards queer people (or any other community a future government may target, some of which may include you) while you try to change the system through state-approved means, you may continue as you are. If, however, you believe the correct thing to do is to take this power away from the tech giants in the first place, then you also know it's preferable to do so one day before the government gets to them than one day after.
Alternatives
Move Everything to a Hard Drive
The simplest alternative, and how we used to do things pre-cloud: organize your photos yourself and store them on a computer or an external disk. It starts getting tough if you have many photos, or if you want to access them remotely.
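If the organizing is the tedious part, a few lines of Python can handle it. This sketch sorts a dump of photos into year/month folders; the source and destination paths are placeholders, and it uses file modification time (EXIF capture dates would be more accurate, but this keeps the example dependency-free).

```python
# A minimal "organize it yourself" sketch: copy photos from a phone dump
# into YYYY/MM folders on a backup disk. Paths are placeholders; the file
# modification time is a crude stand-in for the EXIF capture date.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path("~/phone-dump").expanduser()  # wherever your phone exports land
DEST = Path("/mnt/backup/photos")           # your external disk

for photo in SOURCE.glob("*.jpg"):
    taken = datetime.fromtimestamp(photo.stat().st_mtime)
    folder = DEST / f"{taken:%Y}" / f"{taken:%m}"
    folder.mkdir(parents=True, exist_ok=True)
    shutil.copy2(photo, folder / photo.name)  # copy2 preserves timestamps
```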
Host Your Own Cloud
Nextcloud or something similar gives you a place to automatically upload your photos, which you can then access and share over the internet.
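For a sense of how little magic is involved: Nextcloud exposes your files over WebDAV, so uploading a photo is one authenticated HTTP PUT. The host, username, and app password below are placeholders for your own instance; the endpoint path is Nextcloud's standard WebDAV one.

```python
# Upload one photo to a self-hosted Nextcloud over WebDAV.
# HOST, USER, and APP_PASSWORD are placeholders; generate an app password
# in your Nextcloud security settings. Requires requests.
import requests

HOST = "https://cloud.example.com"  # your Nextcloud instance
USER = "alice"
APP_PASSWORD = "app-password-here"

with open("IMG_0001.jpg", "rb") as f:
    resp = requests.put(
        f"{HOST}/remote.php/dav/files/{USER}/Photos/IMG_0001.jpg",
        data=f,
        auth=(USER, APP_PASSWORD),
    )
resp.raise_for_status()  # 201 Created on first upload, 204 on overwrite
```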
Nextcloud Photos, Ente Photos, Immich, PhotoPrism
These open-source apps do something that pleasantly surprised me: you can host them on a device of your choosing, and they will download an AI model to categorize your photos by faces and objects. You get the full Big Tech Experience without any of the Big Tech Privacy Concerns.
Now, I have not tested these services, so I do not know how user-friendly they are, but I can vouch for their privacy because they are popular FOSS projects with lots of eyeballs on them.
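The categorization itself is not exotic, either. As a taste of purely local photo analysis, here is a sketch using OpenCV's bundled Haar cascade face detector - far cruder than the models Immich or PhotoPrism actually ship, but the principle is the same: the photos and the model stay on your machine.

```python
# A toy taste of local-only photo categorization: find which photos
# contain faces using OpenCV's bundled Haar cascade. Much cruder than
# what the apps above use, but nothing ever leaves your machine.
# Requires opencv-python (pip install opencv-python).
from pathlib import Path

import cv2

LIBRARY = Path("/mnt/backup/photos")  # placeholder library path

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def has_faces(path: Path) -> bool:
    img = cv2.imread(str(path))
    if img is None:  # unreadable or not an image
        return False
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0

people_photos = [p for p in LIBRARY.rglob("*.jpg") if has_faces(p)]
print(f"{len(people_photos)} photos contain faces")
```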