News

Small State, Large Platforms: A New Way of Understanding Internet Moderation

29 November 2024, 13:28
Content moderation on social media in small countries like Bosnia and Herzegovina requires joint effort, coordination, and cooperation between organizations, individuals and state agencies. This was a key conclusion of this year’s Internet Governance Forum (IGF) in BiH.

This post is also available in: Bosnian

Jillian C. York at the IGF panel in Sarajevo. Photo: Detektor

During a panel of the fifth Internet Governance Forum, held in Sarajevo, participants exchanged knowledge on how small countries like BiH moderate online content on major platforms and social media networks.

Jillian C. York, Director for International Freedom of Expression at the Electronic Frontier Foundation, observed that content moderation on social media in 2008 was nothing like what we deal with today.

“Back then, the platforms were still small. Now the world is different, and we have different platforms, different rules and regulations within countries themselves. At first, moderation was human-led, while today it comes down to algorithms. As much as 90 percent of content on Meta, the company that owns Facebook and Instagram, is first moderated through algorithms,” York said.

She noted that platforms often reflect the worldview of their home country. For example, the United States tends to have a broader understanding of the right to freedom of expression, while Germany takes a stricter view, especially concerning hate speech.

Problems are also evident in content moderation at the technical level.

“During the genocide in Myanmar, there were eight people who moderated one social media network. Moderation must take place in the local context. What we do know is that good moderation and a good understanding of local culture is crucial,” York said.

Such inconsistencies make it difficult for smaller states and societies to gain representation. Additionally, York noted that moderators assigned to a particular country often operate from other locations, and that moderation in, for example, Gaza and Palestine is under external pressure.

York did not have information on the number of Meta moderators dedicated to Bosnia and Herzegovina.

“In small countries, moderation is inconsistent, and local specificities aren’t taken into account,” York said, emphasizing that sometimes social media networks have an outsized influence on small societies.

Maida Culahovic, Policy Coordinator at “Why Not?” CSO, highlighted the lack of clarity surrounding the role of BiH’s institutions in addressing moderation issues.

“The role of institutions must be clearly specified. We don’t have that. A long time ago, we spoke about an Internet free of regulation, when that role was technical and passive, as intermediaries between content uploaders and consumers,” said Culahovic.

She stressed the need to distinguish between illegal content and harmful content, which, while not necessarily illegal, can negatively impact freedom of expression, public discourse, and democratic processes.

“We need to advocate for a meaningful and realistic approach, considering what we have at our disposal and our influence on large platforms, since we are small and not as relevant,” Culahovic said, citing Slovakia as a good example of a small European country that has managed to assert itself despite having few native-language moderators.

Maja Calovic from the Coalition for Content Moderation also emphasized that BiH’s market size, language, and limited number of speakers make the country less visible to major platforms, necessitating coordinated action from the non-governmental sector.

“There are organizations that specialize in different things – anti-LGBT attacks, disinformation, hate speech (…) We need to empower the civil sector as well as others involved in these processes to be as active as possible,” Calovic said.

Violations of the law online can be reported to authorities like the police, data protection agencies, the Press Council, and organizations dealing with disinformation.

“However,” Calovic noted, “there is no point where this can be communicated to the platforms, because we don’t have the capacity for that.”

She explained that platforms have internal mechanisms based on a mix of guidelines, laws, business practices, and even cultural attitudes specific to their country of origin.

“They all have tools to report certain types of content. As for how effective this is, everyone can find examples of how quickly they react and whether they react,” Calovic concluded.

Edin Ikanovic from the Srebrenica Memorial Center discussed genocide denial online, contrasting the large number of instances with the very small number of resulting indictments.

“Now, we’re again in a situation where instances of denial are on the rise because we don’t have judicial results,” he said.

He stressed that reporting controversial content denying the Srebrenica genocide often requires hard work, not just a simple click.

“We know how to report content and we know the procedure. Reporting by one person means nothing. In order to remove certain content, you have to hire an army of people, and only then is there a chance of removing the content,” Ikanovic concluded.

Irvin Pekmez
