
The Good, the Bad, and the Ugly of Online Communities

Since the early days of the internet, online communities have flourished. But these days, have groups like Reddit, QAnon, and even Facebook developed outsized impacts in ways we may not realize?

With collective groups coming together en masse, online and in real life, we're seeing norms being shattered as we progress forward. So we’re bringing you a series of stories that illuminate and celebrate collective power in action and how it's reshaping our world.

Before the widespread adoption of the internet, power dynamics were primarily constrained to real-world locations: the office, the dinner table, and the bedroom. In the modern era, amplified by algorithms and social media clout, power dynamics have shifted, changed, and impacted our society and culture in ways both positive and poisonous.

Whether we're talking about the attack on the Capitol or the latest hot take on WallStreetBets, the power of online communities is undeniable. Masses of anonymous, faceless people sitting behind the perceived anonymity of their screens have collectively done what used to be largely thriller-movie fodder, like move markets and attempt to destroy governments.

Online communities have upended livelihoods and fortunes, and even threatened to tear at the fabric of democracy as we know it. They've also created tremendous social change and increased awareness around everything from sexual harassment in once-cloistered industries to diversity and inclusion in nearly every aspect of society. Everything from the #MeToo movement to the Capitol attack in January originated in online communities and on social media.

So, how did we get here? From listservs to AOL chat rooms, from multi-player video game worlds to the darkest corners of the internet which spawned conspiracy theory group QAnon, how we’ve gathered online has taken on form after form.

But not all digital populaces are created equal, or even want to be, in their desire to effect change for better or for worse. So what can we do to amplify the online collectives doing good in the world, while also addressing those doing harm in the ongoing culture war over the internet?

The rise of the power of online communities

Illustration: smartboy10 / Getty Images

Whitney Phillips is an Assistant Professor of Communication and Rhetorical Studies at Syracuse University who has been studying mis- and disinformation, political communication, and digital ethics since 2008. She's the author of This Is Why We Can’t Have Nice Things, a book on the rise of online communities and the havoc they can wreak, and the co-author of two other books: The Ambivalent Internet: Mischief, Oddity, and Antagonism Online, which examines the way we express ourselves online (for good and ill), and the recent You Are Here, on how to make sense of the chaotic online world we live in and find the truth behind the multitude of false claims made online. She says that online communities have always acted as gathering nodes for good and bad alike, regardless of whether they were perceived as fringe and antisocial or mainstream and highly social.

“The assumption around early trolling spaces and hotbeds of negative activity was that they were fundamentally antisocial because of anonymity,” Phillips says. “We explored this in This is Why We Can’t Have Nice Things, and found that even the fully anonymous forums demonstrated the most social behavior you could observe — each person in the group performs for each other's ‘cameras’ because their identities are wrapped up in group norms about being a good community member. That can be positive and do good, but if a community's aim is to raise hell, then, well, you know what happens. At its core, that impulse to perform for the people around you or in your group is the thing that prods people on, and adhering to the group norm dictates what the outcomes are. If the group is fundamentally focused on pro-social outcomes, anonymity can enhance that, but if the group norms are terrible then you have negative outcomes."

Phillips points to what she identifies as a watershed moment in online communities' power: Gamergate, which bubbled into the media consciousness in August 2014, though its roots stretch back to 2013. Here's what happened: Gamers active on Twitter, IRC (Internet Relay Chat), and the anonymous English-language image board 4chan targeted several women in the video game industry. Why? As a Washington Post story points out, the typical white, "nerdy" male gamers went after notable female game developers, like Zoe Quinn and Brianna Wu, for advocating for increased diversity in gaming.

Quinn was the creator of a popular game called Depression Quest, and traditional gamers took issue with it because it was more of a story and a piece of art than a typical video game. Many prominent video game critics gave it rave reviews. However, as it garnered media coverage, an ex-boyfriend of Quinn's posted some long and uncomfortably intimate blog posts accusing Quinn of sleeping with various people, including a prominent video game journalist, to get coverage and get ahead in her career. The hashtag #GamerGate began popping up, and Quinn became the victim of doxing (when someone publishes private and personal information on the internet with the intent to do harm), as well as threats of rape, violence, and death.

Gamergate eventually gave way to an internet culture war, which Phillips says marked the early beginnings of the right-wing, white supremacist, Capitol-storming groups we are grappling with today.

“Gamergate was a turning point in a lot of ways because it was this hate and harassment campaign that became a cultural inflection point,” Phillips says. “It matters because it was the first time, en masse, that you could clearly see the harm that could result from coordinated harassment starting online. It hadn’t been so prominent before.”

Yet, as Phillips points out, what happened to the people targeted by Gamergate was largely ignored. “All the responses, all the norms that were there in 2013, clearly, demonstrably showed what the consequences of out-of-control hate and harassment were,” she says. “The prevailing thing to do was to do nothing. So, all of those problems were allowed to simmer and increasingly became worse. The more news coverage things like Gamergate got, the more people self-selected to further white supremacy, yet still nothing was done. Then, fast forward to today and Trump arrives. Now, people are ‘shocked,’ and they can't believe that the Internet is being used for ill. Yet when this really became a larger problem, the people with the largest microphones weren't saying anything about it.”

As context, Gamergate was one of the first times that hatred, racism, and sexism from some of the darker, less mainstream corners of the internet made enough waves to rise to the level of media coverage and public consciousness. The more attention it got, the more people were drawn to the movement. Many point to Gamergate as the inflection point for what would become an emerging and more public power of white supremacists and hate groups online.

It’s not all bad, however. Just look at movements like #BlackLivesMatter, which became what it is today as a direct result of the power of online communities. The hashtag began in 2013, after George Zimmerman was acquitted in the killing of Trayvon Martin: Alicia Garza, a labor organizer in California, used the phrase on her Facebook page, and a movement was born. Fast forward to the summer of 2020, when protestors rallied around #BlackLivesMatter, using Twitter, Facebook, Instagram, and other social platforms to organize and take to the streets worldwide to advocate for equality and diversity in all walks of life. The BLM movement uses social media and the power of online communities to bring awareness to issues of racism, oppression, and violence against people and communities of color. It continues to impact our world today.

The rise of BLM underlines the fact that the internet is still also widely used by activists and organizers directly seeking to make positive, social impact and change in America and around the globe.

#MeToo was also born online as a hashtag, and it similarly saw millions of people rally around its message as women (and some men) spoke out and publicly shared their stories of sexual harassment at work or in other parts of their lives.

Though parts of the internet spawn ill-willed actors and groups, people focused on progressive values, equality, and justice are also coming together in powerful ways that, at their best, don’t just remain hashtags or social media activism but grow into real-life movements that can change lives for the better.

The myth of “Free Speech”

Illustration: sorbetto / Getty Images

According to Phillips, understanding how these online communities and movements can both do such good and be so destructive comes down to a cultural understanding of the idea of free speech.

"The first and most pronounced network factor and catalyst of power of online communities is the obsession with a particularly narrow understanding of what free speech is and should be. The way many people and companies talk about freedom of speech is in a negative sense. They say it is freedom from anyone telling you what to say or not say," Phillips says.

That idea goes back to the foundation of our democracy, as Phillips notes. “Free speech is built on the idea of the marketplace of ideas. It is this belief that you have to keep society open and you want more speech, not less — even if that means allowing the worst kind of speech. There’s this idea that rational, deliberate engagement by citizens will ensure that the good things float to the top,” Phillips says. “There are a lot of problems with that model, though. The most basic is that people’s voices don’t carry equally. Free speech only works if everyone’s voices are valued. Certain people’s voices are taken more seriously. You can have dissenting opinions that never enter into consideration. This is kind of a rhetorical survival of the fittest. It goes back to this idea that more free speech is better and more is right. In the world of the internet that’s just not true.”

Why has free speech come to include protections for hate speech? According to Phillips, it comes down to two things: algorithms and monetization. First, you need to understand how social media and search algorithms work. When you search for something, or signal to an internet company like Facebook or Google that you are interested in, like, or frequently read about a specific topic, the algorithm learns that behavior. It will then surface those topics and interests to you more often to keep you coming back for more. That's because these companies make money off your attention and the advertisements they can sell to get you to buy more things. "The networks are ambivalent in the sense that they could be utilized for pro-social ends and negative, antisocial, anti-democratic ends," Phillips says.
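To make that feedback loop concrete, here is a minimal toy sketch in Python of engagement-driven ranking. It is not any platform's actual algorithm; the class name, topics, and weights are all hypothetical, invented purely to illustrate how content a user already engages with keeps floating back to the top of a feed.

from collections import defaultdict

class ToyFeedRanker:
    def __init__(self):
        # Learned "interest" score per topic for one user (all scores start at 0.0).
        self.interest = defaultdict(float)

    def record_engagement(self, topic, weight=1.0):
        # Every search, like, or click nudges that topic's score upward.
        self.interest[topic] += weight

    def rank(self, candidate_posts):
        # Posts on topics the user already engages with sort to the top,
        # which is how attention keeps getting funneled back to the same interests.
        return sorted(candidate_posts,
                      key=lambda post: self.interest[post["topic"]],
                      reverse=True)

ranker = ToyFeedRanker()
ranker.record_engagement("politics", weight=3.0)   # heavy engagement
ranker.record_engagement("gardening", weight=1.0)  # light engagement

feed = ranker.rank([
    {"id": 1, "topic": "gardening"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "cooking"},
])
print([post["topic"] for post in feed])  # -> ['politics', 'gardening', 'cooking']

The toy ranker never "decides" whether a topic is pro-social or harmful; it simply rewards whatever the user already engages with, which is the ambivalence Phillips describes.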

The second factor is monetization. "The freest possible speech is the most monetizable,” Phillips says. "That's why the problem started to present when social platforms started to scale. You have more harassment and it becomes increasingly difficult to ignore. These companies said, okay, we are going to let the market decide and make money off it instead of moderating it. The companies didn't have any financial incentive to step in and say, ‘hey, this is poisoning our spaces.’"

Phillips describes our current era as the “network crisis.”

“This isn’t just going to get fixed on its own,” she says. “It’s not that our systems are broken. They are working as they were designed to work. The marketplace of ideas was calibrated to do it this way, and it will get worse over time.”

As platforms and social media companies become more powerful, it raises questions: should they, like other parts of the media, be regulated in some form or fashion? Should they encourage or follow ethics guidelines, and if so, what would that look like? Currently, the internet operates much like the Wild West. And though that does generate a lot of open speech, societies may be asking themselves whether we are seeing the best outcomes at present.

Our responsibilities as members of powerful online communities

Illustration: nadia_bormotova / Getty Images

Phillips points out, both in our conversation and in a piece she penned for Wired, that even though tech companies have started more strictly enforcing the rules on their platforms, and even if the collective decided to rise up and take companies like Facebook and Google to task, that won’t erase the negative impact that online communities will continue to have. More must be done, not only by companies but by users of these platforms (read: us).

In her book, You Are Here, she and her co-author, Ryan Milner, argue that in order to return to some form of unity, everyone has to come to terms with the history and context that allowed the current societal rift to happen. In the book, Milner and Phillips draw parallels between biomass pyramids that explain predator and prey relationships and networked environments that explain how and why events like the Capitol attacks happened. As she writes in her Wired article, "by focusing only on the top of the pyramid...we fail to address the underlying ecological conditions that nourished them. Those conditions are older than Donald Trump's presidency, older than Facebook's recommendation algorithms, and older than Fox News. They emerge from decades of conspiratorial messages within far-right Evangelical circles, as well as the sophisticated media networks that helped spread them. And if we don’t address these causes, it’s only a matter of time before a new generation comes roaring to the top of the pyramid.”

She suggests that the first step to combat the overwhelmingly negative impact of the power of online communities is education. “This is not just about fact-checking,” she says. “It’s about teaching people new skills and knowing that research can make things worse because of the algorithms. How do we teach students not just what to do on the Internet, but also instill a more communitarian ethic (i.e., how your behavior impacts the community)? How do we focus on community freedoms and diversified voices? We also need to continue to recount how we got into this time and rethink how we are responsible for each other.”

That means that, as responsible consumers of media, we each must consider how our words, tweets, shares, likes, photos, and status updates impact the rest of the network. She frames it as a “Green New Deal for the Digital Age.” As she says, “Where are you putting your money and your attention? What kinds of things are you avoiding putting in the metaphorical and literal river by sharing, not sharing, or amplifying? As individuals, no one is going to solve it alone, but if we can get thinking about our relationships to others we might have a chance.”

She suggests that the best way to frame this is to ask yourself: “What could the downstream consequences be of amplifying, ignoring, or addressing this issue? Who might this impact?”

Essentially — take a breath — and think before you search, post, share, or ignore.

Power dynamics, both good and bad, can be magnified within online communities, and, as we know, online movements can become real ones. In addition to technology companies taking the content posted on their platforms seriously, it’s very much up to us as individuals to take responsibility for both the good and the bad that technology offers.

Abigail Bassett is an Emmy-winning journalist, writer and producer who covers wellness, tech, business, cars, travel, art and food. Abigail spent more than 10 years as a senior producer at CNN. She’s currently a freelance writer and yoga teacher in Los Angeles. You can find her on Twitter at @abigailbassett.
