As the interactive web grows and continues to connect us in new ways, online readers and content consumers are more empowered than ever to voice their opinions. The influence of the comments, opinions and reviews attached to the information we consume online is undeniable, delicately divided between positive, informative feedback and irrelevant, aggressive trolling.
How often do you reference the comments under a news article, social post or product review? How much do other people's opinions shape your perception of the content?
In this blog, I hope to revisit the nature of commenting and how it affects our perception of content. Design, validity and anonymity play a critical role in many of our online interactions.
A recent article published in The New Yorker extensively explores the mental mechanisms behind online comments, touching on a phenomenon called the “online disinhibition effect.”
Coined by psychologist John Suler, online disinhibition describes how Internet users, once online, shed the social restrictions associated with personal identity and face-to-face communication, adopting behaviors they normally wouldn’t express in order to achieve an emotional catharsis. Increased affection, openness or aggression in online comments can be attributed to the protective anonymity and dissociative nature of the web. We can therefore expect anonymous user-generated content (comments) to be mentally encouraged and potentially extreme.
This can be easily observed in online gaming, where players live vicariously through a screen name and, in doing so, disregard social norms—promoting a vividly aggressive and competitive environment.
Consider the cartoon “On the Internet, nobody knows you’re a dog.” Created in 1993 by Peter Steiner, the illustration alludes to the protective anonymity of the web, suggesting any user can have a voice without a verified public identity.
We are more likely to trust others who clearly identify themselves online. Social networks like Facebook, Instagram and Twitter offer a curated, somewhat accurate window into the user behind a profile. Yet those who choose not to identify themselves still hold considerable influence over the content they interact with.
Consider the magazine Popular Science. In September, its news site announced it would be shutting off its commenting system. The announcement, written by the magazine’s online content director, Suzanne LaBarre, says the overwhelming amount of trolling and spam compromised the scientific thought and findings the magazine stands for. LaBarre writes, “Even a fractious minority wields enough power to skew a reader’s perception of a story.” Consequently, the credibility of research, and even potential funding, was endangered.
Most of us think of the web as a powerful tool for free speech, but when the intent of content is compromised by a dissenting minority, original ideas can be skewed beyond the author’s control. In the most extreme cases, trolling has emerged as a serious, strategic effort to discredit original content and polarize its audience. An article in The New York Times explores the secret subculture of trolling and how it affects our lives away from the screen.
Many other online entities are following Popular Science’s lead and taking steps to combat abusive commenting, trolling and spam. The Huffington Post maintains a strict comment policy under which all comments are heavily moderated to preserve the integrity of an article. Repeated ad hominem attacks, derogatory slurs and spam will lead to the banning of the offending user’s IP address.
YouTube has drastically redesigned its commenting system, integrating Google+ in an effort to circumvent negative, irrelevant comments and encourage personalized social transparency. The change has drawn a mixture of support and resentful backlash.
Social media coordinators are effectively the moderators of their social pages, and should be mindful of harmful comments that threaten their content’s messaging. Well-known platforms like Facebook and Reddit offer “Top Comment” or “Most Recent” sorting options that easily change the type of public opinion readers see first.
Countless websites offer public reviews, comment sections and chat forums, but few require identification beyond an email address to contribute an opinion. Public discussion and debate are essential to the free web we know today, yet as Internet users and content consumers, we should be critical of how we interpret information online, especially within public comments. What do you think?
Andrew Osegi is a Content Associate with Kuno Creative and lives in the Live Music Capital of the World, Austin, TX. His focus is in content publishing, social media management and community engagement. He likes breakfast tacos, barbecue and researching the ever-changing trends between technology and culture.
Photo Credit: Brian Moore, The New Yorker, eurobas