ICYMI: Aug 2 - Aug 8

ICYMI is posted every Monday recapping privacy news over the last week from around the web.


Apple Introducing New Child Safety Features, Including Scanning Users' Photo Libraries for Known Sexual Abuse Material

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.
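To make the matching idea concrete, here is a highly simplified, hypothetical sketch of on-device matching against a set of known hashes. Apple's actual system is far more involved (a perceptual "NeuralHash", blinded hash databases, private set intersection, and a match threshold before anything is flagged), and none of it is public. Every name below is made up, and SHA-256 stands in for the perceptual hash only to keep the sketch self-contained and runnable.

```python
import hashlib

# Hypothetical stand-in for the (normally unreadable) database of known
# image hashes shipped to the device. For this toy example the entry is
# the SHA-256 hex digest of b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash.

    A real perceptual hash is robust to resizing and re-encoding;
    SHA-256 is not, and is used here only for illustration.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # The on-device part: the photo itself never leaves the device
    # for this comparison, only a match/no-match result would.
    return image_hash(image_bytes) in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_database(b"test"))   # True: digest is in the set
    print(matches_known_database(b"other"))  # False: no match
```

The privacy argument Apple makes hinges on the comparison happening locally against an opaque database, rather than uploading photos for server-side scanning; the sketch above only captures that local set-membership check, not the cryptographic machinery around it.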

Things like this always start with a universally reviled target (pedophiles), but they quickly turn into a slippery slope.

Apple does say that if iCloud Photos backup is disabled, this feature is also disabled. And as a reminder, you should have iCloud backup disabled entirely anyway, since Apple has access to the majority of the content that is uploaded there.


Leaked Document Says Google Fired Dozens of Employees for Data Misuse

Google fired dozens of employees between 2018 and 2020 for abusing their access to the company's tools or data, with some workers potentially facing allegations of accessing Google user or employee data, according to an internal Google document obtained by Motherboard.

A common occurrence across tech companies, and one that will probably never go away. Basically, don't upload anything that you don't want seen, and don't use any service that captures sensitive personal information. You can read a long history of Google and its (lack of) privacy towards users in this post.