Share This Now! How Conspiracy Theories Swamped North Macedonia

The fight against the spread of fake news is becoming more sophisticated in North Macedonia – but those involved in disseminating misinformation are upping their game as well.

The day starts with coffee and unread messages: a few from friends, a few work-related, a paid furniture ad, and one with lots of exclamation marks insisting it must be read immediately, before it is deleted from the Internet. This is because it reveals a big secret, hidden from ordinary people.

That “secret” may refer to the “fake” pandemic, the “dangerous” new vaccine, the “global conspiracy against Donald Trump”, the “dark truth about child-eating elites” – an especially popular theme – and so on.

The sender or sharer may well be an ordinary person whom we know personally or through social networks, and who is sending such content for the first time or only occasionally.

Spreading misinformation through personal messages has become increasingly common in North Macedonia, as elsewhere.

But this is not the only novelty. As the fight against fake news has intensified, with changes to social network algorithms and the involvement of independent fact-checkers, the techniques that allow false content to remain undetected on social networks for as long as possible have also evolved.

“Sending personal messages is an attempt to spread misinformation faster, before it can be detected,” explains Rosana Aleksoska from Fighting Fake News Narratives, F2N2, a project led by the well-known Skopje-based NGO MOST, which searches for misinformation on the Internet.

Among the newer methods used to avoid detection, she notes, are the mass sharing of print screens instead of whole texts and, in countries like North Macedonia that use the Cyrillic script, the deliberate mixing of Cyrillic and Latin letters.

See and share before it’s removed
One video that recently went viral on social networks in North Macedonia, fuelling panic about COVID vaccines, was released on December 8.

In it, a former journalist appears to interpret a document outlining possible contraindications and side-effects of the newly developed Pfizer vaccine against COVID-19 – but presents them as established facts.

It got more than 270,000 views and 5,300 shares on Facebook.

While the video reached a large audience, those numbers only partly show just how far the misinformation spread.

The video soon found itself in the inboxes of many other people, after Facebook acquaintances sent it to them in a direct message, urging them to see it as soon as possible, before it was deleted or marked as fake.

People who believe in conspiracy theories, or regularly participate in disseminating them, send direct messages to one another, flagging that new material has been released.

At first glance, this might sound like a small, obscure group hanging out online.

But the results of a recent public opinion poll conducted by the Balkans in Europe Policy Advisory Group, BiEPAG, showed that only 7 per cent of the population in the region do not believe any of the best-known conspiracy theories, and over 50 per cent believe in all of them. The combined percentage of all those who said they believed in all or just in some of the theories was over 80 per cent.

With these huge numbers, it is not surprising that more misinformation also ends up in the virtual mailboxes of those who “don’t believe”, persuading them to switch sides. Some of these people receive three or four such messages a week.

What the messages have in common is that they are accompanied by urgent words: “See this before they delete it from Facebook”, or, “Share and disseminate”, or “They could no longer remain silent, take a look”, etc.

Because people pay more attention to personal messages than to other social media posts, they are more likely to see such content – and also to spread it, explains Bojan Kordalov, a Skopje-based expert on social networks and new media.

“The way it is set up and designed, fake news gives people a strong incentive to spread it,” he said.

The pandemic was the main topic of misinformation this year, but in North Macedonia this topic intertwines with others, ranging from Euro-Atlantic integration to politics, Aleksoska from F2N2 observes.

“The object of the attack is people’s emotions – to provoke an intense reaction,” she says.

As the year went on, the subject of the messages also changed. At first they focused on the “false” nature of the virus, and later on claims that there was no need to wear masks, observe social distancing or follow other health-protection measures.

After the breakthrough in developing a vaccine, the messages began to focus on the alleged dangers and health risks of vaccination.

“Don’t believe, check” – as we instruct you
The video about the supposed effects of the vaccine that gained traction in North Macedonia is a typical example of what such disinformation looks like. Similar videos are produced every day.

Among the private messages received by social network users are videos of people posing as doctors from the US, Canada, Belgium, Britain or Germany, filming themselves with webcams and warning that vaccines may well be deadly.

One video, which focuses on reading the package instructions of the AstraZeneca vaccine, also makes clear that the creators of fake news use the same slogans as those who fight it, such as: “Don’t believe, check”.

However, they also supply their own guidelines on what to “check”.

“Don’t trust us, investigate for yourself. For example, visit these sites. Or google this term, ChAdOx-1. See here, it says – micro cloning,” the narrator in this video can be heard saying as the inscriptions from the vaccine packaging are displayed.

“They convince us that it is safe, but the traces are here in front of us,” the narrator adds, in a dramatic tone.

Finding new ways to bypass filters
Although outsiders have no direct insight into exactly how social networks’ algorithms detect suspicious content, as these are business secrets, many experts on these technologies told BIRN that certain inferences can be drawn.

As the creators of disinformation can also be technologically savvy, they have likely drawn their own conclusions and seek new ways to bypass known filters.

One common alarm is when content goes viral quickly. This signals to social networks that the content needs to be checked. But if several different messages containing the same main point are sent, instead of one identical message, the protection algorithms may have a harder time detecting the content’s risk.
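How this frustrates detection can be illustrated with a minimal, purely hypothetical Python sketch, which is not a description of any platform’s real filters. It contrasts an exact fingerprint, which only catches identical copies of a message, with a simple word-shingle similarity score that still links a lightly reworded variant; the example messages are invented.

```python
# Hypothetical sketch: why reworded copies of the same message are harder to
# catch than identical ones. Not based on any platform's actual system.
import hashlib


def exact_fingerprint(text: str) -> str:
    """Identical messages share one hash, so exact repeats are trivial to spot."""
    return hashlib.sha256(text.lower().encode("utf-8")).hexdigest()


def word_shingles(text: str, n: int = 3) -> set:
    """Overlapping n-word fragments used for fuzzy comparison."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def jaccard(a: str, b: str) -> float:
    """Overlap between two shingle sets, from 0 (nothing shared) to 1 (identical)."""
    sa, sb = word_shingles(a), word_shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


original = "See this before they delete it from Facebook, the vaccine is dangerous"
reworded = "Watch this before they delete it from Facebook, this vaccine is dangerous"

print(exact_fingerprint(original) == exact_fingerprint(reworded))  # False: exact matching misses the copy
print(round(jaccard(original, reworded), 2))                        # roughly 0.4: fuzzy matching still links them
```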

Apart from masking the content, spreaders of misinformation use different formats to avoid detection.

Print screens of articles and of social media posts may be shared instead of the actual articles or posts. Some users even do this with their own posts, and republish them as photos.

“Print screens are common in conducting disinformation campaigns. This is just one of the mechanisms they use,” Aleksoska explains. “The problem is much bigger, so the answer must be comprehensive and coordinated.”

Print screens are not only more difficult for the software to detect, but also make it harder for people to check the content, especially if the name of the media outlet that published it is omitted or cropped out of the photo.

The Macedonian corner of the internet recently saw a print screen from a Swiss media outlet circulating, its German headline reading: “Currently no vaccine can be approved.” Hundreds of people shared it.

The publisher that first spread this print screen claimed that the Swiss had rejected the German vaccine “because of the risk of death”.

But the real text does not say that Switzerland rejected the German vaccine at all, only that it will first implement a risk-control strategy “to prevent side effects or fatalities”.

This way, those who spread fake news have a clear advantage over those who fight to stop it.

To reach the original article, one first has to retype the German title into a search engine, find the text with an identical title among the results and translate it with an online tool. By the time this is done, ten more people will have received the print screen and simply clicked “Share”.
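In principle, that manual detour can be shortened by an optical character recognition step that pulls the headline out of the image so it can be searched or translated as text. The sketch below is only an illustration of the idea, using the open-source Tesseract engine through the pytesseract and Pillow packages; the file name is made up.

```python
# Illustrative sketch: recover the text of a circulated print screen with OCR,
# so it can be pasted into a search engine or translator instead of being
# retyped by hand. Requires the Tesseract engine, pytesseract and Pillow.
from PIL import Image
import pytesseract


def print_screen_to_text(path: str, lang: str = "deu") -> str:
    """Run OCR on an image file and return the extracted text (German here)."""
    return pytesseract.image_to_string(Image.open(path), lang=lang)


# "swiss_headline.png" is a hypothetical file standing in for the shared print screen.
print(print_screen_to_text("swiss_headline.png"))
```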

Print screens in North Macedonia have also recently been used to spread untrue information about the current dispute between North Macedonia and its neighbour, Bulgaria, which has refused to allow Skopje to start EU accession talks.

Some of these posts present Bulgaria’s demands as something that North Macedonia has already accepted.

Since the main bone of contention is the Macedonian language and identity, the dispute is one of the most sensitive issues currently preoccupying the public.

Another technique used to avoid or baffle filters is mixing Cyrillic and Latin letters that are identical in meaning or form, like the letters a, e, n, x, u, j, s, as well as some others.

When a social media user complains that a post has been removed from their profile, another user will in some cases advise them to mix up the letters next time, making the problematic content harder to detect.
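A short, hypothetical sketch shows why this works: a naive keyword filter misses a word in which look-alike Latin letters have been swapped in, while mapping those letters back to their Cyrillic twins restores the match. The keyword and the partial character mapping are illustrative only.

```python
# Hypothetical sketch of the Cyrillic/Latin letter mixing described above.
# A naive keyword filter misses a word in which look-alike Latin letters have
# been swapped in; mapping them back to Cyrillic restores the match.
LATIN_TO_CYRILLIC = str.maketrans({
    "a": "\u0430", "e": "\u0435", "o": "\u043e", "c": "\u0441",  # Latin letter -> look-alike Cyrillic letter
    "x": "\u0445", "j": "\u0458", "s": "\u0455",                 # (partial, illustrative mapping)
})

BANNED_KEYWORD = "вакцина"  # "vaccine", written entirely in Cyrillic

# Swap the Cyrillic "а" (U+0430) for the Latin "a" (U+0061): the text looks
# identical on screen, but it no longer contains the keyword as a string.
mixed_post = BANNED_KEYWORD.replace("\u0430", "a")

print(BANNED_KEYWORD in mixed_post)                                # False: the naive filter is bypassed
print(BANNED_KEYWORD in mixed_post.translate(LATIN_TO_CYRILLIC))   # True: match restored after normalisation
```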

Ideological foot-soldiers do the hard work
But why would anyone advise others on how to make it harder for social networks to detect their problematic content?

Checking some of the profiles that publish and spread misinformation reveals that, besides the usual suspects – such as thematic profiles with false names that only publish information from one or a few sources, or people who are part of formal or informal organizations and spread their ideology – a large number of users have no known connection to disinformation networks.

Most are ordinary people who do not hide their identities and publish photos of family trips, but who also, from time to time, share some “undiscovered truth” about the coronavirus or a “child abuse plot” – wedged between lunch recipes and pictures of walks in the park.

Fact-checkers and communication technology experts agree that disseminating misinformation is a highly organised activity, often done with malicious intent – but also that many people share such content without hidden motives. They clearly feel a responsibility to be “on the right side”.

“Some people spread fake news because they believe in it and think that by doing so they are contributing to some kind of fight for the truth to come to light,” Kordalov explains.

This makes the fight against misinformation even more difficult, because while organised networks create and spread false news at the top, most of the work of dissemination is done by individuals and micro-communities that have no connection to them, or even to each other.

“All conspiracy theories are just pieces of the master theory that says that certain elites rule the world. The more somebody believes in that, the more likely he or she would read and share content supporting this theory,” Aleksoska notes.

However, there are some solutions. Algorithms, according to Kordalov, can be reprogrammed to recognise new forms of false news. There is no final answer to misinformation, he admits, but the two sides constantly compete, and the one that invests the most effort and resources will lead in the end.

Technological competition, however, is not enough if it is not matched by stronger institutional action, because creating mistrust in institutions is one of the main goals of disinformation campaigns.

Kordalov says it is not enough for the PR services of institutions just to issue announcements rebutting fake news related to their work each time they spot it. They must be actively involved in two-way communication and react to false news quickly.

“This is often called ‘damage control’ but this is not the point. Their [institutions’] job is to serve the citizens, and providing real information is part of that service,” he says.

One way for institutions to protect public trust is to provide high-quality services, he adds. If institutions work well, and if citizens are satisfied with them, it will be harder for disinformation to do damage.
