"If I painted a site that we were going to have, then in the end, I said to myself," By the way, at the bottom of all our articles, we will obviously let a pseudonym avatar do and say what they want without moderation "- if there was no convention on the Internet, if it was not this thing that was accepted, one would think it's a crazy idea," Ben Frumin, editor-in-chief of The week, Told Nieman Lab in 2015. Over the past five years, comments have persisted. But in the era of ubiquitous social media, does the internet still need a comment section?
The comments section is, by definition, driven by the readers who use it to voice their reactions and opinions to news articles, often via pseudonymous or entirely anonymous accounts. This format works both for and against news sites.
On the one hand, it provides a forum for people who might be uncomfortable sharing their opinions offline because of social or legal repercussions, Dr. Frank Waddell, an assistant professor at the University of Florida's college of journalism and communications, told Engadget.
On the other hand, "the online comment section has somehow become the Wild West of people sharing their opinions," he continued. "And when these conversations become negative, the way in which the information is perceived can have harmful consequences, even if the comments section is completely separate."
A recent study from the University of Texas at Austin found that incivility in comments can have a disproportionately negative impact on readers' perception not just of the article itself but of the outlet as a whole, AKA the "nasty effect." Specifically, the researchers found that people who read stories containing only negative comments "had a less positive attitude towards the site and considered it less valuable," as well as "feeling less loyal to the site and less close to the commenters." The order of the comments (whether you read a positive or a negative one first) made little difference, but there does appear to be a saturation point beyond which additional negativity has no further effect.
Dr. Chen's research has reached similar conclusions. She pointed out that when news outlets began shutting down their comment sections in 2014, industry analysts expected the conversations to simply migrate to social media. The assumption was that, at least on Facebook, conversations would improve and become more civil because of the site's "real name" policy, which effectively eliminates user anonymity. "But there's no evidence that that happened," she said.
Chen also said that her own research compared conversations about the White House prior to the Trump administration. "We found that Twitter was more uncivil than Facebook, and that people on Facebook had more reasoned conversations," she said. "But that's partly because you have more space on Facebook. You are not limited to 140 characters. It's hard to argue sensibly in a very small space."
So instead of shutting down comments wholesale, Waddell suggests that reporters play a more active role in the discussion that follows an article's publication. "Research shows that when journalists participate in the commenting process, negative comments that might have been shared tend to be a little less influential than when comments are left unmoderated and the audience is allowed to fully guide the conversation."
"We also found that if you have attractive comments from reporters in the feed, it improves the overall score, even if they just say," Hey, here's a follow-up of my story "or" Oh, excellent comment, "it reinforces the good behavior and improves the content of the conversation," said Chen.
However, moderating comments can take a serious emotional toll on the humans responsible for doing it. Earlier this year, Facebook came under fire for the appalling working conditions and inadequate mental health support its content moderators had to endure. The problem became so severe that a number of employees began to believe the very conspiracy theories they had been hired to filter out.
As such, a number of companies are looking to outsource the tedious work of comment moderation to algorithms and AI, though these systems have yet to prove they can handle the rigors of the job. "Training algorithms to detect incivility has had mixed success," Chen said. "It's complicated. You know, people speak in sarcasm, which an algorithm is not as good at catching, but over time we can train them."
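To make the kind of supervised approach Chen describes concrete, here is a minimal sketch of an incivility classifier in Python using scikit-learn. The example comments, their "civil"/"uncivil" labels, and the test sentence are all invented for illustration; a real system would be trained on thousands of human-labeled comments rather than this toy dataset.

```python
# Minimal sketch of a comment-incivility classifier: TF-IDF features plus
# logistic regression, a common simple baseline for text classification.
# The inline dataset and labels below are hypothetical, for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: 1 = uncivil, 0 = civil (invented labels).
comments = [
    "Great reporting, thanks for the follow-up!",
    "This is a thoughtful take, I learned something.",
    "You are an idiot and this article is garbage.",
    "Anyone who believes this trash is a moron.",
    "Interesting point, though I disagree with the framing.",
    "Shut up, nobody wants your stupid opinion.",
]
labels = [0, 0, 1, 1, 0, 1]

# Build and train the pipeline: word/bigram TF-IDF feeding a linear model.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score a new comment; the probability could gate auto-removal vs. human review.
prob_uncivil = model.predict_proba(["What a pathetic excuse for journalism."])[0][1]
print(f"Probability comment is uncivil: {prob_uncivil:.2f}")
```

A bag-of-words baseline like this also illustrates why sarcasm trips these systems up: the model sees only word frequencies, not tone or context, which is part of why Chen characterizes the results so far as mixed.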
"My research shows that people can tolerate some nonsense," Chen said. "It's not as if an uncivil comment is going to be the critical point, no – people can handle imperfectly, they just can not stand it's really horrible."