Last Friday, Apple revealed plans to begin scanning the content of U.S. users' iPhones for pictures or messages related to child abuse. This is a surprising turn for Apple, a company long lauded for protecting its users' privacy.
What does this mean for you and for the future of iDevices?
The quick facts
The content scanning mentioned is multi-faceted:
- Images taken with your phone's camera or downloaded from the internet or social media will have their digital fingerprints checked against a database of known Child Sexual Abuse Material (CSAM).
- Messages sent or received using the built-in Messages app will be checked using on-device machine learning, which looks for and warns about potentially explicit content.
- Additionally, Search/Siri will now intervene in the event a search leads to CSAM-related results.
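To make the first bullet concrete, here is a minimal sketch of what fingerprint matching looks like in principle. This is purely illustrative: Apple's actual system uses a proprietary perceptual hash and cryptographic matching, and the fingerprints and threshold below are made up for the example.

```python
# Hypothetical database of known 64-bit image fingerprints.
# (Real CSAM hashes are never distributed in readable form.)
KNOWN_FINGERPRINTS = {0x9F3A5C7E12B4D689, 0x0123456789ABCDEF}

def hamming_distance(a: int, b: int) -> int:
    """Count how many bits differ between two fingerprints."""
    return bin(a ^ b).count("1")

def is_flagged(fingerprint: int, threshold: int = 4) -> bool:
    """Flag an image whose fingerprint nearly matches a known one.

    A small tolerance lets the system catch resized or recompressed
    copies of the same image, at the cost of occasional false matches.
    """
    return any(hamming_distance(fingerprint, known) <= threshold
               for known in KNOWN_FINGERPRINTS)

# A fingerprint one bit away from a database entry is flagged...
print(is_flagged(0x9F3A5C7E12B4D688))  # True
# ...while an unrelated fingerprint is not.
print(is_flagged(0x0000000000000000))  # False
```

The tolerance built into this kind of matching is exactly why false positives are possible, a point the privacy discussion below returns to.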
What this means
For starters, Apple has been careful to state that it will not actively view content on your device unless it has reason to believe that content is CSAM. So chances are your vacation pictures, or even things like pictures of your kids in the bathtub, won't be an issue. Messages are also scanned on-device, meaning no person will be reading your messages.
This all sounds innocent enough. However, the privacy experts at the Electronic Frontier Foundation warn that this is a foot in the door for your private life to become much less private.
It should also be noted that systems like the one Apple is implementing are seldom perfect, and they commonly flag images that are not inappropriate at all, such as statues or even puppies. The last thing anyone wants is to be locked out of their phone, or to have a knock at their door, because they took a picture of their dog.
The other important aspect to understand is the potential for future developments. While the system currently targets child sexual abuse, something almost everyone agrees should be stopped, it also presents the opportunity to expand scanning into other areas. And with a major fight currently brewing over Big Tech's practices of limiting and removing information from their platforms, there is concern that systems like the one Apple is implementing could be abused in the future.
What to do about it
Unfortunately, there is no way to turn off this scanning, and with Android phones not being any more privacy-focused, there's no real permanent solution. However, here are a few things you can do:
For starters, you can turn off uploading photos to your iCloud account. This will prevent your photos from being scanned, at least for now. Offloading your pictures to a PC or external hard drive instead of uploading them to iCloud will let you keep your photos while keeping them private.
For searches, instead of using Siri you can use Safari with a search engine like Google, Bing, or DuckDuckGo. This is more cumbersome than just saying "hey Siri," but it is at least somewhat more private.
Finally, for messages, there are a few alternative messaging apps in the App Store, such as WhatsApp, which has stated it will not implement Apple's new scanning system. Even this isn't a perfect solution, however, as WhatsApp has its own issues. Short of just "not texting," there currently isn't a perfect answer.
Privacy vs protection
This is likely not the last we'll hear of device and software makers like Apple making changes in the name of protecting a vulnerable segment of the population. The question that needs to be asked is: are these changes worth the potentially massive privacy impact on the innocent?
We'd love to hear your input. Reach out to us and let us know what you think.