Killing comments will not cure our toxic Internet culture




"If I painted a site that we were going to have, then in the end, I said to myself," By the way, at the bottom of all our articles, we will obviously let a pseudonym avatar do and say what they want without moderation "- if there was no convention on the Internet, if it was not this thing that was accepted, one would think it's a crazy idea," Ben Frumin, editor-in-chief of The week, Told Nieman Lab in 2015. Over the past five years, comments have persisted. But in the era of ubiquitous social media, does the internet still need a comment section?

The comments section is, by definition, driven by the readers who use it to share their reactions and opinions about news articles, often through pseudonymous or entirely anonymous accounts. The format cuts both ways for news sites.

On the one hand, it provides a forum for people who might be uncomfortable sharing their opinions offline because of social or legal repercussions, Dr. T. Frank Waddell, an assistant professor at the University of Florida's College of Journalism and Communications, told Engadget.

On the other hand, "the online comment section has somehow become the Wild West of people sharing their opinions," he continued. "And when these conversations become negative, the way in which the information is perceived can have harmful consequences, even if the comments section is completely separate."

A recent study from the University of Texas at Austin found that incivility in comments can have a disproportionately negative impact on readers' perception not just of the article itself but of the outlet as a whole, a phenomenon also known as the "nasty effect." Specifically, the researchers found that people who read stories accompanied only by negative comments "had a less positive attitude toward the site and considered it less valuable," as well as "feeling less loyal to the site and less close to the commenters." The order of the comments (whether you read a positive or a negative one first) makes little difference, but there does appear to be a saturation point of negativity that has to be reached before the effect kicks in.

asdf "data-caption =" asdf "data-credit =" Center for Media Engagement at the University of Texas at Austin "data-credit-link-back =" "data-dam-provider = "data-local-id =" local -1-1174156-1556808665129 "data-media-id =" 3f665bb1-9c8d-451e-9cbd-8fe673f2cb59 "data-original-url =" https://s.yimg.com / os / creatr-uploaded-images / 2019 -05 / b05326d0-6ce9-11e9-9f67-11430329be88 "data-title =" asdf "src =" https://o.aolcdn.com/images/dims?crop=746 % 2C600% 2C0% 2C0 & quality = 85 & format = jpg & resize = 1600% 2C1287 & image = 0%</p>
<p>"The argument was that it was anonymous comments [that caused problems] and people felt emboldened to say the worst things because they had no repercussions, "Craig Newman, <em>Chicago Sun-Times</em> editor-in-chief, said <em>Digiday</em> in 2014. "Then, at some point in the last two years, a change has occurred and anonymity is no longer needed when one has had to vomit a horror." Is almost like there was no more shame. "</p>
<p>"Toxicity and instability are like everywhere," said Engadget, Gina Masullo Chen, assistant professor at the Faculty of Journalism at the University of Texas at Austin and co-investigator of the study of January. "I mean, research really shows it: one in five, 20%."</p>
This is partly due to humans' cognitive bias toward negative information, Waddell said. "It's a consistent tendency that negative information is more memorable and more relevant to our decision making." The fact that this toxicity seems more widespread than it actually is can be attributed to the bandwagon effect.
<p>"In the context of news, we mostly see [the bandwagon effect] when the crowd or online readers are negative. "However, positive comments do not seem to have much of an impact on the readers' opinion." when you have comment sections, where people criticize news stories or the quality of information, we see a lot of negative effects. "</p>
Despite the negative impact comment sections have on our perception of the news, neither Waddell nor Chen expects them to go away anytime soon. "I am not in favor of eliminating comments," Chen said. "First of all, because I don't think we could ever get to that place." She points out that even if every news site on the internet closed its forums, these conversations would still take place, just on social media such as Twitter or Facebook. "I'd rather use techniques that will improve the flow of comments," she explained, "and some of the things that improve them are really strong moderation and pre-moderation."
It's not as easy as just turning off comments, though; doing so could also have direct financial consequences for news sites. "In a phenomenon known as shared reality," Maria Konnikova wrote in The New Yorker in 2013, "our experience of something is affected by whether or not we will share it socially. Take away comments entirely, and you take away some of that shared reality, which is why we often want to share or comment in the first place. We want to believe that others will read and react to our ideas."
In short, comment sections are designed to generate repeat traffic to a page (you post a comment and check back occasionally to see whether it has drawn any responses or reactions) and to encourage users to share that content with others. Disabling the feature would invariably mean lower site traffic and ad revenue.
<p>"Unfortunately, the dependence of online digital media on ad revenue leads to many concessions – such as the need for a comment section to encourage clicks and more time spent on the page," said Waddell. "If we counted less on that, it might be a big advantage for journalism."</p>
<p>"If we close the comments section," he said, "I suspect that these conversations will simply be through other media, such as Twitter, Facebook, or maybe Instagram."</p>

Dr. Chen's research has reached similar conclusions. She pointed out that when news sites began shutting down their comment sections in waves around 2014, industry analysts expected the conversations simply to move to social media. The assumption was that, at least on Facebook, the discussions would improve and become more civil because of the site's "real name" policy, which effectively eliminates user anonymity. "But there's no evidence that that happened," she said.

Chen also said that her own research compared conversations about the White House from before the Trump administration. "We found that Twitter was more uncivil than Facebook, and that people on Facebook had more reasoned conversations," she said. "But that's partly because you have more space on Facebook. You're not limited to 140 characters. It's hard to make a reasoned argument in a very small space."

So, rather than shutting down comments wholesale, Waddell suggests that reporters take a more active role in the discussion once an article is published. "Research shows that when journalists participate in the commenting process, the negative comments that do get shared tend to be a little less influential than when comments are left unmoderated and the audience is allowed to steer the conversation entirely."

"We also found that if you have attractive comments from reporters in the feed, it improves the overall score, even if they just say," Hey, here's a follow-up of my story "or" Oh, excellent comment, "it reinforces the good behavior and improves the content of the conversation," said Chen.

[Image: Facebook Developer Conference]

However, moderating comments can take a serious emotional toll on the humans responsible for doing it. Earlier this year, Facebook came under fire over the appalling working conditions and lack of mental health support its content moderators had to endure. The problem grew so severe that a number of employees began to believe the very conspiracy theories they had been hired to filter out.

As such, a number of companies are looking to outsource the tedious work of moderating comments to algorithms and AI, although these systems have yet to prove capable enough to handle the rigors of the job. "Training algorithms to detect incivility has had mixed success," Chen said. "It's complicated. You know, people say things sarcastically, which an algorithm isn't as good at picking up, but over time we can train them."
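Neither Chen's study nor any particular vendor's system is described in detail here, so as a rough, hypothetical illustration of what "training an algorithm to detect incivility" can mean in practice, the sketch below pairs a bag-of-words text representation with logistic regression. The example comments, labels and review threshold are all made up, and, as Chen notes, sarcasm is exactly the kind of input such a simple model tends to misread.

```python
# Minimal sketch of a toxicity classifier (hypothetical data and threshold).
# Real moderation systems train on far larger labeled corpora and still
# struggle with sarcasm and context.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "Great reporting, thanks for the follow-up.",
    "This is the dumbest article I've ever read.",
    "Interesting point about ad revenue.",
    "Whoever wrote this should be fired, you idiots.",
]
labels = [0, 1, 0, 1]  # 0 = civil, 1 = uncivil (human-applied labels)

# Bag-of-words features plus logistic regression: a common baseline approach.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Pre-moderation sketch: hold high-scoring comments for human review
# instead of publishing them automatically.
new_comment = "What a brilliant, well-argued piece."  # sarcasm would likely fool this model
toxicity_score = model.predict_proba([new_comment])[0][1]
if toxicity_score > 0.5:  # arbitrary threshold for illustration
    print("Hold for human review, score:", toxicity_score)
else:
    print("Auto-approve, score:", toxicity_score)
```

In a real deployment the model would sit in front of the comment queue, with borderline scores routed to human moderators rather than decided automatically.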

"My research shows that people can tolerate some nonsense," Chen said. "It's not as if an uncivil comment is going to be the critical point, no – people can handle imperfectly, they just can not stand it's really horrible."

Images: University of Texas at Austin (graphic); Getty (social media on a phone); AP (Zuckerberg at F8 2019)

