A couple of weeks ago, Apple announced new safety features aimed at protecting children from sexually explicit photos. Apple will now scan iCloud photos for known CSAM and screen photos sent through iMessage to keep kids away from inappropriate pictures. While this sounds great, some have not-so-great things to say about it.
In response to sex trafficking and child abuse, Apple decided to build detection of CSAM, or Child Sexual Abuse Material, into its next software update. The system scans iCloud photos for matches against known abuse imagery. A separate feature for child accounts automatically blurs sexually explicit photos in Messages, so if a predator tries to send explicit photos to a child, the images are blurred as well. Parents of younger children can also get a notification warning them that their child viewed explicit content. Apple says this is an attempt to keep children safe online, especially now that most kids under 12 have their own phones. Understandably, many parents are highly supportive of this. However, not everyone is on board with the new feature.
Some smartphone users already have an idea of how little privacy they actually have. Those who were suspicious of Google and Apple spying on them find this update quite concerning. Not only do they see it as an invasion of privacy, but they speculate there's a darker agenda behind the scenes, such as the government scanning users and sorting them into camps. In other words, anti-CSAM voices speculate that the government could comb through Apple's user data and place people into groups such as LGBT, Black Lives Matter, Antifa, QAnon, etc.
Critics draw a parallel to the Patriot Act: legislation that claimed to protect Americans and enforce security to filter out the bad guys, but that some believe was a means of control. This feature only adds to the ongoing battle of liberty versus security.
What do you think about Apple's CSAM safety feature? Do you support it, or does Apple need to get off of our iCloud? Let us know!