October 23, 2021

KRIPTOMAKS

🚀The Bitcoin Foundation Latvia blogs about technology in Latvian and English 📈 News from the world of finance 🔑 Cryptocurrency news 🇺🇸🇩🇪🇷🇺 Podcasts in three languages 💋 The four most important Bloomberg news channels 🎙️ A monthly video journal covering the following topics 💁‍♀️🏙️ Economics 🗿 Artificial intelligence 📡 Space technologies 🤖 Robotics and cybernetics 👩‍🔬👨‍🔬 Biotechnology 🛰️ Space exploration 🛸 Mars colonization 📐 Architecture 🍎 Apple 🌐 Domains 🎮 Games

iCloud users say they’re downgrading because of CSAM scanning

“Is anybody downgrading their iCloud account in light of the recent news regarding hashing people’s photos?”

What you need to know

  • Apple recently announced new measures that will scan iCloud photos on-device for Child Sexual Abuse Material.
  • Users have taken to Reddit to express their displeasure at the move.
  • Multiple users indicate they will downgrade their iCloud storage as a result.

Some users of Apple’s iCloud platform say they will downgrade their plans and stop using iCloud Photos in response to the recently announced Child Safety measures, which include scanning iCloud photos for known CSAM images.

A Reddit discussion thread with nearly 500 upvotes and over 800 comments, started Tuesday, asks: “Is anybody downgrading their iCloud account in light of the recent news regarding hashing people’s photos?”

User JonathanJK said they’d spent two hours “going through my settings, deleting emails and photos to create an offline back up work flow” and had realized that the process was tedious and time-consuming, that much of their data was going to iCloud unnecessarily, that they could get by on the free 5GB tier, and that the “cleansing itself is good for the soul.”

Multiple users responded in kind:

I have to do some planning first, but I will.

Same here. Thinking about setting up NAS storage for photos.

Yeah downgraded to free. If they’re going to f**k with my data, then I’ll store it myself.

As expected, the thread sparked a huge debate in the comments. Apple announced its new Child Safety measures last week, including plans to scan photos uploaded to iCloud for Child Sexual Abuse Material using on-device hashing, matching images against a database of known CSAM content provided by the National Center for Missing & Exploited Children (NCMEC) and other organizations. Some privacy advocates have raised eyebrows at the move, which continues to be a hotbed of discussion online. Apple’s CSAM scanning does not apply to photos that aren’t uploaded to iCloud, so users can “opt out” by disabling the feature.
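
For readers curious about the mechanics, the sketch below shows the general idea of matching files against a set of known hashes, written in Python. It is a minimal illustration, not Apple’s implementation: Apple’s system uses a perceptual NeuralHash that tolerates resizing and re-encoding, plus cryptographic techniques that keep the database hidden from the device, whereas this toy version uses exact SHA-256 digests and a plain in-memory set. All names here (KNOWN_BAD_HASHES, image_hash, flag_before_upload) are hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical stand-in for the database of known hashes that NCMEC
    # and other organizations supply. The real system uses perceptual
    # "NeuralHash" values in a blinded database, not raw SHA-256 digests.
    KNOWN_BAD_HASHES: set[str] = {
        "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
    }

    def image_hash(path: Path) -> str:
        """Compute a SHA-256 digest of the file's raw bytes."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flag_before_upload(path: Path) -> bool:
        """Return True if the file's hash appears in the known-hash set."""
        return image_hash(path) in KNOWN_BAD_HASHES

    # Usage: check a local photo folder before syncing anything to the cloud.
    for photo in Path("Photos").glob("*.jpg"):
        if flag_before_upload(photo):
            print(f"{photo} matches an entry in the hash database")

The key design difference is worth noting: an exact digest like SHA-256 changes completely if even one pixel changes, which is why a deployed system needs a perceptual hash that maps visually similar images to the same value.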