Saturday, 14 August 2021

Bringing You Tomorrow's Technology-Led Miscarriage Of Justice Today...

iPhones will send sexting warnings to parents if their children send or receive explicit images – and will automatically report child abuse pictures on devices to the authorities, Apple has announced.
A trio of new safety tools have been unveiled in a bid to protect young people and limit the spread of child sexual abuse material (CSAM), the tech giant said. While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.

Of course it does! 

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user's photo album.

*hollow laughter* I think they forgot to add 'Yet'...

Say, who's providing the data that it's going to match against?

Instead, the system will look for matches, securely on the device, based on a database of 'hashes' - a type of digital fingerprint - of known CSAM images provided by child safety organisations.

Oh. Organisations whose very existence depends on the existance of the thing they are planning to find. Well, I can't see any inherent danger in that, can anyone else? 
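For what it's worth, the 'hash matching' the article describes boils down to something like the sketch below (all names here are hypothetical, and this uses a plain cryptographic hash for illustration; Apple's actual system reportedly uses a perceptual hash, NeuralHash, so that visually similar images still match):

```python
import hashlib

# Hypothetical database of fingerprints of known images, as might be
# supplied by a child-safety organisation. A real deployment would use
# perceptual hashes, not SHA-256, so near-duplicates also match.
known_hashes = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the database."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_hashes

print(matches_known_database(b"example-known-image-bytes"))  # True: exact match
print(matches_known_database(b"holiday-photo-bytes"))        # False: no match
```

The point of matching fingerprints rather than pixels is that the device never needs the offending images themselves, only their hashes, which is what lets Apple claim it isn't 'scanning your photo album'.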

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user's camera roll.

Until they can figure out how to do it? 

3 comments:

  1. Should be obligatory on Luton, Rotherham and every other muslim ghetto taxi rank and the far left racist bbc ? oh look, a squirrel

  2. 'Existance' - to be continued in the Mail.

  3. "oh look, a squirrel"

    Quite!

    "'Existance' - to be continued in the Mail."

    Whoops! 😊
