By news@appleinsider.com (Wesley Hilliard)

A child safety group pressed Apple on why its announced CSAM detection feature was abandoned, and the company has given its most detailed response yet on why it backed off its plans.

[Image: Apple's scrapped CSAM detection tool]

Child Sexual Abuse Material is an ongoing, serious concern Apple …
Source:: Apple Insider