Photo by Vanco Dzambaski
As part of the advocacy efforts within the project Reporting Diversity Network – The New Agenda, Innovative Media hosted a panel discussion titled “Democracy at Risk: Fact-checking and Content Moderation on Social Media” on April 2, 2025.
The panel was dedicated to the integrity of facts and effective content moderation as paramount to safeguarding democratic processes. It explored the pressing challenges of combating hate speech online and how effective media monitoring serves as an early warning system, identifying emerging hate speech trends and allowing for timely interventions to prevent escalation and mitigate potential harm before it spills over into real-world violence.
Speakers at this panel were: Jona Plumbi, journalist, Faktoje, Albania; Maida Ćulahović, Media and Digital Policy Expert, Why Not, Bosnia and Herzegovina; Nacho Strigulev, journalist, Blue Link, Bulgaria; and Despina Kovachevska, Media Monitoring Specialist, Innovative Media, North Macedonia. Elida Zylbeari, president of Innovative Media, moderated the panel.

The panel was a call to action to all institutions responsible for detecting and denouncing hate speech online as part of their official duty to be more active, as data from RDN monitoring show that the presence of hate speech on digital platforms is overwhelming in North Macedonia and the Balkan region. A call was also issued to all stakeholders to be more proactive in their respective fields, so that together we can fight this phenomenon.
Elida Zylbeari said that the twin pillars of fact-checking and content moderation are not mere technicalities; they are the lifeblood of a healthy democracy. Without them, the lines between fact and fiction blur, and the voices of reason are drowned out by a cacophony of falsehoods.
“Social media platforms, the modern-day town squares, cannot remain neutral in the face of blatant falsehoods and harmful content. They bear a responsibility to create an enabling environment, one that promotes accuracy and discourages the spread of harmful misinformation. This doesn’t mean censorship; it means implementing transparent policies, enforcing community standards, and investing in technologies that can identify and flag misleading content,” she added.

Despina Kovachevska, a media monitoring specialist at Innovative Media, said that hate speech and disinformation are interconnected:
“Through hate speech, disinformation is justified, and stigma and bias increase. And, especially when we have unprofessional media that chase clicks and profit instead of thoroughly checking the facts, then we have a big problem,” shared Kovachevska.

Jona Plumbi, from the fact-checking organization Faktoje, noted that a multifaceted approach to the problem is needed.
“Media literacy in our country is at a very low level; thus, media literacy should be integrated into education,” said Plumbi.

Maida Ćulahović, a Media and Digital Policy Expert from the organization Why Not in Bosnia and Herzegovina, said that without clear legislation, the platforms will not voluntarily implement measures to address content moderation.
“In the last elections, we tried to work with the platforms on posts and content that violate the code of conduct [of Meta’s platforms], illegal content, or content that contradicts the electoral code, and there was very little responsiveness. Out of 119 reported pieces of content, we received a response for only 25. And only 3 were removed, but only after we appealed the initial decision,” Ćulahović said.

Nacho Strigulev, a journalist from Blue Link, Bulgaria, continued the discussion on the same topic, saying that we cannot rely on platforms to defend the information space because they are, at the end of the day, profit-oriented companies:
“Platforms have no problem transforming themselves into weapons against democracy and the democratic order, because such content brings clicks and interaction. On the other hand, through that they make a profit,” said Strigulev.
He further noted that such recommendation algorithms [for example, “For You” pages] are everywhere and should be regulated, and turned off during election periods. Strigulev also drew attention to various chatbots that use AI-based technology, saying that between 6 and 50 percent of them share disinformation that we know is linked to the Kremlin, including sharing sources from the Pravda network of pages.

In the end, the speakers agreed that the EU Digital Services Act is the best path toward harmonizing national legislation in the process of EU accession. This should not be done pro forma, but as an inclusive and transparent process, especially if we want the big platforms to take this region more seriously.
This discussion was part of a larger conference celebrating International Fact-Checking Day—Together for Truth: Whole-of-Society Approach to Safeguard Democracy—that brought together civil society organizations, media, fact-checkers, activists, as well as high officials, ambassadors and representatives of national institutions.
