I finally took notice of Discord, an instant messaging platform built around communities called servers, after the Reddit community r/WallStreetBets made use of the platform to organize its buying of GameStop stock in the trading frenzy earlier this year.
Increasingly, digital communities on platforms like Reddit and Discord are driving real-world events built around misinformation, anonymity and decentralized swarm behavior. We must reevaluate the moderation of social media platforms and the users’ roles in self-moderating their online communities.
Since its inception in 2015, Discord has found popularity with video game players, who use the site for real-time voice, text and video chats with fellow players during multiplayer games. With the rise of online multiplayer games like Fortnite, Discord’s membership has exploded; more than 850 million messages are now shared on the platform every day. Aside from its regular chat functions, which take place in public servers organized around specific topics, Discord also offers private, invite-only servers where people can form groups or communities of their own.
Yet these same features also make Discord a breeding ground for misinformation and content unsuitable for younger users, creating problems the rest of the world must reckon with offline. While many Discord servers have moderators, bots and zero-tolerance policies, there is growing concern among users, regulators and parents about whether these measures can effectively address some of the problems I saw during my time on the platform.
When I first signed up for Discord, I decided to learn more about cryptocurrency. I searched the site for popular cryptocurrency servers, clicked on a server aptly named r/CryptoCurrency and agreed to its terms of membership, which I quickly skimmed over. I clicked on the crypto-trading channel and an ever-updating list of coin prices popped up on the right-hand sidebar of my feed. There were 50 messages in the first three minutes.
Most of the messages were incomprehensible to me. There were memes, ridiculous icons and usernames, innumerable emoji reactions to posts and so many acronyms I had yet to understand.
I switched over to a more focused channel — one that made sense for March: taxation. I scrolled through the recent messages and was struck by the questions people were asking: Are there taxes associated with buying cryptocurrency? If I buy Ethereum in dollars and sell it for another cryptocurrency, is the trade taxed? Surely, I thought, Discord was not the place to find important information that could affect someone’s taxes. It seemed people were simultaneously crowdsourcing hard-to-find information and relying on self-described experts.
Clearly, the main issue I faced was whom to trust. I knew virtually nothing about trading or cryptocurrency and had no idea whose advice to take. On Discord and similar platforms, users regularly seek out advice from anonymous users. There are virtually no consequences for intentionally giving dangerous advice or for engaging in shilling, a deceptive practice in which an individual talks up a stock they are invested in so others will buy it and drive up its price.
These problems with moderation were evident during the GameStop stock fiasco. Though Discord banned the r/WallStreetBets server for hate speech shortly after the trading frenzy, the server quickly returned, along with two offshoots. Discord’s insistence that it implemented better moderation of audio, images and coded language rings hollow given how easily these servers reappeared and how easily users created even harder-to-access offshoot servers. Every new offshoot creates another place that requires moderation to protect vulnerable and impressionable users.
However, banning servers on Discord does not mean these people and the misinformation they share will disappear from the internet. Emerging social media platforms like Discord and Parler offer greater anonymity, harder-to-access communities and few legitimate consequences under the guise of free speech. Users intent on creating and sharing misinformation can easily find new ways and places to do so — with dire consequences for those who fall prey.
The short-lived ban of the r/WallStreetBets server highlights the inability of platforms to truly combat misinformation and create safe environments for users. With increasing polarization online and the pandemic changing the structure of life, people seem to be desperate for a sense of community. But the challenge of creating healthy and helpful communities that are inclusive and safe is one that Discord has failed to overcome. Simply put, many platforms are still factories of misinformation and lack the moderation to end its production.
As Georgetown University students who want to see the world become a better place, we have a responsibility to build healthy and inclusive communities on social media. And as online platforms continue to play a role in the spread of misinformation, we must think critically about the role of social media platforms in society and the responsibility these platforms hold.
Young people still dominate platforms like Reddit and Discord, and self-moderation of misinformation in our communities can make a huge difference in someone’s life. New communications and social media technologies offer us all a way to connect, but they must also be managed and regulated using creativity and innovation — tasks that Georgetown students are poised to tackle.
Nicholas Budler is a first-year in the Master in Communication, Culture & Technology program.