Pro@programming.dev to Technology@lemmy.world · English · 8 days ago
Google will use hashes to find and remove nonconsensual intimate imagery from Search (blog.google)
Lorem Ipsum dolor sit amet@lemmy.world · English · 8 days ago
There was a GitHub thread about this when the same approach came up for CSAM detection; people managed to circumvent it easily. I'm rather confident this will end up similarly.
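Google has not published which hashing scheme Search will use, so as a hedged illustration only: the sketch below implements an average hash (aHash), one of the simplest members of the perceptual-hash family that systems like PhotoDNA and PDQ belong to. It shows both sides of the comment's point: a small pixel-level edit typically leaves the hash unchanged, while a larger transformation shifts many bits, which is the gap circumvention attacks exploit. The 8x8 toy image and thresholds are assumptions for demonstration, not anything from Google's system.

```python
# Sketch of average hashing (aHash) on a toy 8x8 grayscale "image".
# Real perceptual hashes (PhotoDNA, PDQ) are far more robust, but the
# matching principle is the same: compare hashes by Hamming distance.

def average_hash(pixels):
    """Each bit is 1 if the pixel is at or above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p >= mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 8x8 image with a smooth gradient of pixel values.
img = [[(x * y) % 256 for x in range(8)] for y in range(8)]
h_orig = average_hash(img)

# A tiny edit (brighten one pixel slightly): hash barely moves, so the
# match survives -- this is why hash lists catch simple re-uploads.
img_tweaked = [row[:] for row in img]
img_tweaked[0][0] = min(255, img_tweaked[0][0] + 10)
h_tweaked = average_hash(img_tweaked)

# A heavy transformation (here: inversion; in practice crops, rotations,
# recompression): many bits flip and the hash no longer matches, which
# is the circumvention route the comment alludes to.
img_inverted = [[255 - p for p in row] for row in img]
h_inverted = average_hash(img_inverted)
```

In practice the system keeps a list of hashes of known images and flags uploads whose hash falls within a small Hamming distance of a listed one; adversaries win by pushing an image past that distance threshold while keeping it visually similar.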