How do political parties stop hate speech on Facebook?

Threats, hate speech and nasty comments are becoming increasingly common in social media debates.

How do Norwegian political parties deal with hate speech on their Facebook pages? What options do they have, and how do they sanction users?

Associate Professor Karoline Andrea Ihlebæk of OsloMet and Associate Professor Bente Kalsnes of Kristiania University College set out to answer these questions.

They interviewed the people responsible for the Facebook pages of the eight political parties represented in Stortinget, the Norwegian parliament.

“The moderators told us that they tolerate a lot of hateful comments. However, sometimes moderating is necessary because you cannot turn off comments on a Facebook page. That option only exists in Facebook groups,” Ihlebæk explains.

Facebook pages and groups

Facebook pages are for companies, products, organisations, public figures and the like. You create a page when you want to promote yourself or your company, or when you want people to follow you without being Facebook friends. As page administrator, you receive various data about your followers, and your posts show up in their news feeds.

A Facebook group is a better choice for a group of people with common interests and/or something specific to discuss. It works best when members visit the group often and actively participate. You can create an open, closed or secret group. If you are a member of a group, you receive content from the group in your news feed.

Hiding comments is the most common response

“Our study shows that the most common response is to hide hateful comments. Hidden comments are only visible to the user and his or her network. Nobody else can see them.”

The researchers discovered that moderators use the hide function to avoid negative comments and criticism from other users.

“In this way, they avoid triggering negative reactions, since the users are not aware that their comments have been hidden.”

Ihlebæk and Kalsnes are critical of moderators hiding rather than deleting posts.

“We understand that this is a useful function, but how does it affect the debate if users do not know that they have been moderated?” Ihlebæk asks. “They will believe they are still taking part in the debate and will not have the opportunity to learn from their mistakes. That is unfortunate.”

Lack of transparency

The associate professor also points out that the political parties do not inform users that they use this function.

“The lack of transparency about how moderation works can make matters worse,” Ihlebæk believes.

She underlines that Facebook also has a responsibility here.

“It would lead to more transparency if Facebook had a function telling its users when moderators have hidden their comments.”

“In order to prevent the comments section from getting out of control, it would have been helpful to have the option of turning off page comments in some situations.” 

Deleting comments is also a common response among the political parties. However, all the parties claim that they rarely block users since they want people to participate in debates.

If comments under a post get out of control, the only alternative on Facebook pages is to delete the whole post. Moderators rarely choose this option.

At the same time, Facebook does have a filter function.

“Many of the moderators use the filter and add words that often appear in hateful comments. Comments containing these words are not visible to other users,” Ihlebæk adds. “Words such as quisling, Nazi, racist and Satan are examples of words that the parties filter out.”
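
To illustrate the mechanics, here is a minimal sketch of how such a word filter might work in principle. It is written in Python for illustration; the word list comes from the article, but the function name and matching logic are assumptions, not Facebook's actual implementation.

    # Hypothetical sketch of a moderation word filter.
    # Not Facebook's implementation; for illustration only.

    BLOCKED_WORDS = {"quisling", "nazi", "racist", "satan"}  # words named in the article

    def is_filtered(comment: str) -> bool:
        """Return True if the comment contains a blocked word."""
        words = (w.strip(".,!?\"'") for w in comment.lower().split())
        return any(w in BLOCKED_WORDS for w in words)

    # A filtered comment is hidden from other users; the author is not
    # told that it has been filtered.
    print(is_filtered("You are a quisling!"))    # True
    print(is_filtered("Thanks for the update"))  # False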

How to moderate

  • Filter: Add a list of words that often appear in hateful comments. Comments containing these words will be hidden from other users. Users are not informed that a comment has been filtered.
  • Targeting: Target the post in order to limit which users see it in their news feed.
  • Block users: Limit users' possibilities by blocking them from commenting on the page. If a user is blocked, he or she will be notified.
  • Hide comments: A hidden comment is still visible to the user who posted it and to that user's network, but to nobody else (see the sketch after this list).
  • Delete comments: A deleted comment is removed for everyone; the user will see that it is gone.
  • Delete post: Remove the whole post.
  • Turn off comments: Only possible for posts in groups, not on pages.
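
To make the difference between hiding and deleting concrete, here is a minimal Python sketch of the visibility rules described above. The class, fields and network representation are hypothetical assumptions made for illustration; they model the behaviour the researchers describe, not Facebook's code.

    # Hypothetical model of hidden vs. deleted comments; illustration only.
    from dataclasses import dataclass

    @dataclass
    class Comment:
        author: str
        text: str
        hidden: bool = False   # hidden: only the author and their network see it
        deleted: bool = False  # deleted: gone for everyone, and visibly so

    def visible_to(comment: Comment, viewer: str, networks: dict) -> bool:
        """Can this viewer see the comment? networks maps user -> set of contacts."""
        if comment.deleted:
            return False
        if comment.hidden:
            return viewer == comment.author or viewer in networks.get(comment.author, set())
        return True

    networks = {"anna": {"bob"}}  # Bob is in Anna's network
    c = Comment(author="anna", text="...", hidden=True)
    print(visible_to(c, "anna", networks))   # True: Anna does not notice the moderation
    print(visible_to(c, "bob", networks))    # True: her network still sees the comment
    print(visible_to(c, "carol", networks))  # False: hidden from everyone else

This is the transparency problem the researchers point to: from the author's point of view, nothing has changed.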

The question of responsibility

The researchers also asked how the interviewees perceived their responsibility for moderating the parties' Facebook pages.

“We felt that they were very aware of their responsibility. Some of them spoke about their editorial responsibility and others about their responsibility to facilitate a respectful debate.”

Most parties had Facebook moderation guidelines and procedures in place.

“The political parties we spoke to do not have the same resources. Some had established their procedures more thoroughly than others, but all of the parties had discussed how to conduct moderation.”

The parties’ debate rules and guidelines were not always easily accessible for users of the pages. 

“The guidelines were difficult to find on the parties' pages, and none of the parties stated that they hide hateful comments. On the other hand, this has a lot to do with how Facebook has designed its pages,” Ihlebæk explains.

The parties report that moderation takes time.

Ihlebæk and Kalsnes also tested Facebook’s tools for moderating pages and groups as a part of their study. 

“In an ideal world, users should be notified that their comments have been moderated and for what reason. However, giving this feedback would require a lot of resources,” Ihlebæk concludes.

Reference

Kalsnes, B. and Ihlebæk, K. A. Hiding hate speech: Political moderation on Facebook. Media, Culture & Society, September 2020 (journals.sagepub.com).

A research article from the Faculty of Social Sciences (SAM)
Published: 05/02/2021
Last updated: 10/02/2021
Text: Heidi Ertzeid
Photo: Maskot / NTB Scanpix
