Apple could implement its CSAM Detection system to comply with a new amendment to the UK's Online Safety Bill
Last year, Apple Inc. announced, then halted, plans to detect child abuse imagery in its users' photo libraries after privacy advocates voiced widespread concern. At the time, Apple published two documents, "CSAM Detection – A Technical Summary" and "Expanded Protections for Children – Frequently Asked Questions." The program's implementation was delayed but not cancelled.
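For context, the approach Apple described matches images on-device against a database of hashes of known CSAM before upload to iCloud Photos. The sketch below is a deliberately simplified illustration of hash-based image matching using a toy "average hash"; it is not Apple's NeuralHash, whose real pipeline also layers cryptographic protocols (private set intersection and threshold secret sharing, per the Technical Summary) on top. The file paths and hash values here are hypothetical.

```python
# Toy sketch of hash-based image matching -- NOT Apple's NeuralHash.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """64-bit 'average hash': shrink to 8x8 grayscale, threshold on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p >= mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known images (placeholder value).
KNOWN_HASHES = {0x8F3C1B2A44D09E71}

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

A perceptual hash is used rather than an exact cryptographic hash so that minor edits (resizing, recompression) to a known image still produce a nearby hash within the matching threshold.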
It's now being reported that "The UK will compel technology companies to find ways to identify and remove child abuse images from their platforms in an amendment to the Online Safety Bill."
The amendment puts the onus on companies to source or develop methods to comply with regulator Ofcom’s orders or face fines of as much as £18 million ($21.5 million), or 10% of their global annual sales, whichever is higher, the Home Office said.
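In concrete terms, the cap is the larger of the two figures: for a company with, say, £1 billion in global annual sales, the maximum fine would be £100 million, not £18 million. A minimal illustration (the sales figure is hypothetical):

```python
def max_fine_gbp(global_annual_sales_gbp: float) -> float:
    """Greater of the flat £18M cap and 10% of global annual sales."""
    return max(18_000_000, 0.10 * global_annual_sales_gbp)

# Hypothetical example: £1B in global sales -> £100M maximum fine.
print(max_fine_gbp(1_000_000_000))  # 100000000.0
```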
Tech companies have pushed back on the proposed legislation, with Meta complaining that it risks giving governments power to snoop on and censor private messages. The UK Home Office has rejected technology companies’ arguments that tools developed to identify child sexual abuse could damage the security of end-to-end encryption.
UK Home Secretary Priti Patel said: "Privacy and security are not mutually exclusive. We need both, and we can have both and that is what this amendment delivers." (BNN Bloomberg)