Mark with a Z@suppo.fi to Not The Onion@lemmy.world, English · 2 days ago
Meta denies torrenting porn to train AI, says downloads were for “personal use” (arstechnica.com)
cross-posted to: [email protected]
Phoenixz@lemmy.ca · 1 day ago
One wonders how much child porn was in there… But it’s AI, so it’s aaaaalllll fine
manuallybreathing@lemmy.ml · 22 hours ago
It’s better to say child sexual abuse material (CSAM); the term “child porn” both legitimizes the content and implies children could ever be active and consenting participants. Sexualised content involving children is abuse and should be labelled as such.
ulterno@programming.dev · 19 hours ago
Also, there are far too many things with the acronym CP.
Evotech@lemmy.world · 1 day ago
It’s kinda weird how specific you have to be with certain models to not make them very, very young-looking people.