Discord has updated its policies meant to protect children and teens on its platform after reports came out that predators were using the app to create and spread child sexual abuse material (CSAM), as well as to groom young teens. The platform now explicitly prohibits AI-generated photorealistic CSAM. As The Washington Post recently reported, the rise of generative AI has also led to an explosion of lifelike images with sexual depictions of children. The publication had seen conversations about the use of Midjourney, a text-to-image generative AI accessible through Discord, to create inappropriate images of children.
In addition to banning AI-generated CSAM, Discord now also explicitly prohibits any other type of text or media content that sexualizes children. The platform has banned teen dating servers as well, and has vowed to take action against users engaging in this behavior. An earlier NBC News investigation found Discord servers marketed as teen dating servers where participants solicited nude images from minors.
Adult users have previously been prosecuted for grooming children on Discord, and there are even crime rings extorting underage users into sending sexual images of themselves. Banning teen dating servers entirely could help mitigate the problem. Discord has also added a line to its policy stating that older teens found to be grooming younger teens will be "reviewed and actioned under [its] Inappropriate Sexual Conduct with Children and Grooming Policy."
Aside from updating its rules, Discord recently launched a Family Center tool that parents can use to monitor their children's activity on the chat service. While parents won't be able to see the actual contents of their children's messages, the opt-in tool lets them see who their children are friends with and who they talk to on the platform. Discord is hoping that these new measures and tools, along with its existing ones, which include proactively scanning images uploaded to its platform using PhotoDNA, can help keep its underage users safe.