• Phoenixz@lemmy.ca
    1 day ago

    One wonders how much child porn was in there…

But it’s AI, so it’s aaaaalllll fine

    • manuallybreathing@lemmy.ml
      22 hours ago

It’s better to say child sexual abuse material (CSAM); the term “child porn” both legitimizes the content and implies children could ever be active and consenting participants.

Sexualised content involving children is abuse and should be labelled as such.

    • Evotech@lemmy.world
      1 day ago

It’s kinda weird how specific you have to be with certain models to not make them generate very young-looking people.