In social media debates, threats, hate speech and nasty comments are increasingly common.
How do Norwegian political parties treat hate speech on their Facebook pages? What are their options and how do they sanction users?
Associate Professor Karoline Andrea Ihlebæk of OsloMet and Associate Professor Bente Kalsnes of Kristiania University College set out to find the answers to these questions.
They interviewed the people responsible for the Facebook pages of eight political parties at Stortinget, the Norwegian parliament.
“The moderators told us that they tolerate a lot of hateful comments. However, sometimes moderating is necessary because you cannot turn off comments on a Facebook page. That option only exists in Facebook groups,” Ihlebæk explains.
Hiding comments is the most common response
“Our study shows that the most common response is to hide hateful comments. Hidden comments are only visible to the user and his or her network. Nobody else can see them.”
The researchers discovered that the moderators use the hide function to avoid negative comments and criticism from other users.
“In this way, they avoid triggering negative reactions, since the users are not aware that their comments have been hidden.”
Ihlebæk and Kalsnes are critical of moderators hiding rather than deleting posts.
“We understand that this is a useful function, but how does it affect the debate if the users do not know that they have been moderated?” Ihlebæk asks. “Participants will then believe they are still participating in the debate and will thus not have the opportunity to learn from their mistakes. That is unfortunate.”
Lack of transparency
The associate professor also points out that the political parties do not inform users that they use this function.
“The lack of transparency about how moderation works can make matters worse,” Ihlebæk believes.
She underlines that Facebook also has a responsibility here.
“It would lead to more transparency if Facebook had a function telling its users when moderators have hidden their comments.”
“In order to prevent the comments section from getting out of control, it would have been helpful to have the option of turning off page comments in some situations.”
Deleting comments is also a common response among the political parties. However, all the parties claim that they rarely block users since they want people to participate in debates.
If comments under a post get out of control, the only alternative on Facebook pages is to delete the whole post. Moderators rarely choose this option.
At the same time, Facebook does have a filter function.
“Many of the moderators use the filter and add words that often appear in hateful comments. Comments containing these words are not visible to other users,” Ihlebæk adds. “Words such as quisling, Nazi, racist and Satan are among those the parties filter out.”
The question of responsibility
The researchers also asked how the interviewees perceived their responsibility for moderating the parties' Facebook pages.
“We felt that they were very aware of their responsibility. Some of them spoke about their editorial responsibility and others about their responsibility to facilitate a respectful debate.”
Most parties had Facebook moderation guidelines and procedures in place.
“The political parties we spoke to do not have the same resources. Some had incorporated procedures better than others, but all of the parties had discussed how they were going to conduct moderation.”
The parties’ debate rules and guidelines were not always easily accessible for users of the pages.
“The guidelines were difficult to find on the parties' pages, and none of the parties stated that they hide hateful comments. On the other hand, this has a lot to do with how Facebook has designed its pages,” Ihlebæk explains.
The parties report that moderation takes time.
Ihlebæk and Kalsnes also tested Facebook’s tools for moderating pages and groups as a part of their study.
“In an ideal world, users should be notified that their comments have been moderated and for what reason. However, giving this feedback would require a lot of resources,” Ihlebæk concludes.
Reference
Kalsnes B, Ihlebæk KA. Hiding hate speech: political moderation on Facebook. Media, Culture & Society. September 2020.