A new report has brought to light concerning statistics regarding child safety issues on the social platform Discord. An NBC News investigation spanning the past six years identified 35 cases in which adults were prosecuted on charges of “kidnapping, grooming or sexual assault” that involved communication on Discord. At least 15 of those prosecutions have ended in guilty pleas or verdicts, while many of the remaining cases are still pending.
In addition, the investigation uncovered 165 more cases, including four crime rings, in which adults were prosecuted for sharing child sexual abuse material (CSAM) on Discord or for using the platform to coerce children into sending sexually explicit images, a practice known as sextortion. These activities predominantly take place in hidden communities and private chat rooms on Discord.
A Google search for “site:justice.gov Discord” turns up numerous disturbing results. In one harrowing case, a teenager was groomed on Discord for months, taken across state lines, raped, and, according to police, found locked in a backyard shed.
Stephen Sauer of the Canadian Centre for Child Protection said, “What we see is only the tip of the iceberg,” underscoring the scale of the problem. Discord has faced criticism over its handling of child abuse complaints before: last year, CNN reported on multiple incidents of CSAM on the platform, with some parents saying Discord offered little assistance.
Earlier this year, the National Center on Sexual Exploitation (NCOSE) issued a statement titled “Discord Dishonestly Responds to How it Handles Child Sexual Abuse Material After Being Named to 2023 Dirty Dozen List.” NCOSE found that CSAM links it had identified and reported remained accessible on Discord’s servers more than two weeks later. The organization criticized Discord’s response as passive and recommended that the platform bar minors until it makes substantial changes. Discord currently has more than 150 million users.
In response, Discord released a transparency report affirming its commitment to child safety. The report stated that the company had disabled 37,102 accounts and removed 17,425 servers for child safety violations. John Redgrave, Discord’s vice president of trust and safety, said he believes the platform’s approach has improved significantly since its 2021 acquisition of the AI moderation company Sentropy. Discord now uses several systems to proactively detect CSAM and analyze user behavior, and the company says it can identify most material that has already been verified and indexed.
Those systems, however, cannot detect abuse material or grooming messages that have not yet been indexed. NBC News identified 242 Discord servers created within the past month that use thinly veiled terms to market CSAM.
Other social media platforms face CSAM problems of their own; a recent report highlighted Instagram’s role in facilitating and promoting a network of accounts dedicated to illicit underage content. Discord, however, has drawn particular scrutiny over its cooperation with law enforcement: in one instance, after Ontario police requested that records be preserved, Discord reportedly asked for payment. Engadget has reached out to Discord for comment on these allegations.
Discord’s child safety record is now under intense scrutiny as these statistics and cases come to light. The platform has been implicated in numerous instances of child exploitation, including grooming, sextortion, and the sharing of CSAM. While Discord says child safety is a priority, questions remain about its ability to proactively detect and remove such harmful content, and the scale of the problem underscores how much work remains in safeguarding children online.