Discord has provided more insight into how the shooter who opened fire in a Buffalo, New York supermarket over the weekend used its service prior to the tragic act of violence.
The shooter, 18-year-old Payton Gendron, is charged with first-degree murder in the mass shooting, which left 10 people dead and three injured. In the month leading up to the attack on the Buffalo Tops grocery store, which he researched and selected in an effort to harm as many Black people as possible, he used Discord to document his plans in extreme detail.
According to Discord, the suspected shooter created a private, invite-only server that he used as a “personal diary chat log.” The server had no other members until 30 minutes before the attack began, when a “small group of people” received an invite and joined.
“Before that, our records indicate no other people saw the diary chat log in this private server,” a Discord spokesperson told TechCrunch. The company declined to provide further details about the server, the members who joined or their activity, citing the ongoing investigation.
In a statement to TechCrunch, Discord described a “mix of proactive and reactive tools” it uses to moderate content, including machine learning, community moderators and reporting tools for users. The company’s safety team also monitors the social network’s servers and takes action based on observed “platform trends or intelligence” on the platform.
“We have a dedicated Counter-Extremism sub-team that works to identify and remove any spaces where users are organizing around violent and hateful ideologies that target racial, ethnic, and religious minorities before they are reported to us,” a Discord spokesperson said. “The team also works to track and deplatform violent and hateful networks that use Discord to disseminate extremist content.”
Discord, a text and voice chat app, is best known for its large, public messaging rooms, but it also allows users to create private, invite-only servers. In updates to the Discord server, which shares a username with the Twitch channel he used to livestream the shooting, the suspect documented his violent, racist views in depth. He also detailed the logistics of how he would carry out the mass shooting, including the gear he would use, his shopping trips leading up to the shooting and his day-of plans.
While it’s unknown what other Discord servers Gendron was active in, he references his activity on the app in the chat logs. “I didn’t even think until now that the people in my discord groups are probably going to get no knock raided by ATF and FBI agents,” he wrote. While Discord served as a kind of digital journal for the atrocities he would later carry out, he also compiled a nearly 200-page screed about his beliefs, weapons and plan to commit violence in Google Docs.
In early May, he expressed concerns that Google might discover his plan for violence in messages sent on the private Discord server. “Ok I’m a bit stressed that a google worker is going to see my manifesto fuck,” he wrote. “WHY did I write it on google docs I should have had some other solution.” Unfortunately, those concerns proved unfounded; Google removed the document for violating its terms of service only after the shooting.
The suspect, who livestreamed the shooting over Twitch, also spent time on 4chan’s /pol/, an infamous message board rife with racism, misogyny and extremism. Unlike mainstream social networks like Discord, 4chan does not do any proactive content moderation and only removes illegal content when required to do so. In Discord chat logs reviewed by TechCrunch, the shooter notes that he “only really turned racist” after encountering white supremacist ideas on 4chan.
Five years ago, Discord was implicated in the Charlottesville Unite the Right rally, an open gathering of white supremacists and other far-right extremists that ended with one counter-protester dead. The rally’s participants and organizers came together in private Discord servers to plan the day’s events and discuss the logistics of what would take place in Charlottesville. The company responded by cracking down on a number of servers hosting extremism, though it maintained that it did not read messages on private servers.
In a company page about its approach to extremism published last year, Discord noted that 15 percent of its workforce — roughly 60 people — worked on its trust and safety team, up from a single employee at the time of the Charlottesville rally. The company explains that its safety team “splits its time between responding to user reports and now proactively finding and removing servers and users engaging in high-harm activity like violent extremist organizing.”
“We’ve been paying close attention to violent extremist groups and movements ever since we learned how the organizers of the 2017 Unite the Right Rally in Charlottesville, Virginia utilized Discord to plan their hateful activities,” the company said.
As on Reddit, most of Discord’s hands-on moderation comes from community moderators within its chat rooms. And like most social media companies, Discord relies on a blend of automated content scanning and human moderators. Last year, the company acquired Sentropy, an AI software company that detects and removes online hate and harassment, to bolster those efforts.
In the years following the deadly violence in Charlottesville, Discord successfully sought to distance itself from its association with the far-right extremists and white supremacists who once called the social network home. More recently, Discord has also put some distance between its current brand and its origins as a popular chat app for gamers, reframing itself as an inviting hub for a huge spectrum of thriving online communities.
“Our deepest sympathies are with the victims and their families,” a Discord spokesperson said of the tragedy in Buffalo, adding that it is assisting law enforcement in the ongoing investigation. “Hate has no place on Discord and we are committed to combating violence and extremism.”