Discord, the popular chat platform, has updated its policies to better protect children and teenagers from predators who use the app to create and distribute child sexual abuse material (CSAM) and to groom young users. The updates come in response to reports highlighting the misuse of the platform for these illegal and harmful activities.
One of the significant changes in the updated policy is the explicit prohibition of AI-generated CSAM, specifically photorealistic images. With the rise of generative AI technology, there has been an alarming increase in lifelike images featuring sexualized depictions of children. The Washington Post recently reported on conversations discovered regarding the use of Midjourney, a text-to-image generative AI tool, on Discord to create inappropriate images of children. By banning AI-generated CSAM, Discord aims to prevent the proliferation of such harmful content on its platform.
In addition to AI-generated CSAM, Discord now explicitly prohibits any text or media content that sexualizes children. This includes a ban on teen dating servers, which have been found to facilitate the solicitation of nude images from minors. A previous NBC News investigation uncovered Discord servers advertised as teen dating servers whose participants engaged in inappropriate behavior. Discord has taken a firm stance against such conduct and has pledged to take action against users who sexualize children or engage in grooming.
The issue of grooming and exploitation of children on Discord is not new. There have been cases of adult users grooming children on the platform, and even of criminal groups extorting underage users into sharing explicit images of themselves. By banning teen dating servers outright, Discord hopes to mitigate this issue and create a safer environment for its underage users. The platform has also made clear that older teens found to be grooming younger teens will face consequences under its Inappropriate Sexual Conduct with Children and Grooming Policy.
Along with updating its rules, Discord has introduced a new tool called the Family Center. This opt-in feature allows parents to monitor their children’s activity on the platform: while parents cannot see the content of their children’s messages, they can see whom their children are friends and in contact with on Discord. It is an additional measure aimed at keeping underage users safe. Discord believes these new measures, combined with existing safeguards such as proactive scanning of uploaded images using PhotoDNA, will help protect its young users.
Discord’s commitment to child safety is evident through these policy updates and the introduction of the Family Center tool. The platform acknowledges the importance of creating a safe and secure environment for all users, especially children and teenagers. By actively addressing the misuse of its platform and taking steps to prevent the spread of harmful content, Discord aims to protect its community and provide a responsible online space for young users.
It is crucial for social media platforms like Discord to remain vigilant and adapt their policies to address emerging challenges and protect vulnerable users. As technology continues to evolve, so do the methods used by predators to exploit and harm children. Discord’s efforts to stay ahead of these threats and proactively safeguard its users are commendable. The company’s commitment to continuously improving its safety measures should serve as an example for other platforms to follow, as ensuring the well-being of young users must be a top priority in the digital age.