So J has had to take trainings on AI CSAM (nudify apps: great for bullying and blackmail), and one "silver lining" of "nonconsensual sexual content too cheap to meter" is that once any media is presumptively fake, the moral judgment *may* shift from the person depicted to the person sharing the media.