
Since 2016, the world has been witnessing the Rohingya genocide in Myanmar, and in 2021 we all saw the Capitol riots on TV. What do networks have to do with these events?

These are two completely different geopolitical events, yet they have something in common: before and during both, a storm of misinformation and propaganda was spreading, both offline and online. Social media platforms played a crucial role in this. We should be discussing more openly the responsibility, and the power, that social media platforms have over what we see on our screens. Of course, there is a whole political dimension hanging over social media and the right to express your opinion freely, and it is not an easy topic to discuss. But let us not forget that behind the political questions there are also technological ones. For example, what we see on our screens is often chosen by algorithms, which select which content to show us according to certain criteria, such as our preferences, but also sponsored material. And it is often not us who determine these criteria, at least not directly.

In this article, we will explore what was going on behind the screens during these two events and try to understand how the same social media platforms could be used to prevent such riots, or at least reduce their intensity.

The network behind the screen

Social media sites are networks of people: two people are connected through friendship (Facebook, WhatsApp) or by following one another (Instagram, Twitter). These networks keep growing as new users register. As you can imagine, social media networks get very large. Each profile can be seen as a node of a graph, and links between profiles (e.g., one profile following another) can be viewed as edges between the nodes. If we consider the whole social networking site as a graph built in this way, we get a graph that keeps expanding over time and is extremely large.
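To make the graph picture concrete, here is a minimal sketch of how such a follower network could be stored in code. The profile names and the dictionary-of-sets representation are purely illustrative choices, not how any real platform stores its data.

```python
# A toy follower network as an adjacency list: each key is a profile (node),
# and each entry "a follows b" is a directed edge from a to b.
follows = {
    "alice": {"bob", "carol"},
    "bob": {"carol"},
    "carol": set(),
}

# Every new registration adds a node; every new "follow" adds an edge,
# so the graph keeps growing over time.
follows["dave"] = {"alice", "carol"}

# The in-degree of a profile is its number of followers.
in_degree = {profile: 0 for profile in follows}
for follower, followed in follows.items():
    for profile in followed:
        in_degree[profile] += 1

print(in_degree)  # {'alice': 1, 'bob': 1, 'carol': 3, 'dave': 0}
```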

Theories of random graphs and complex networks are generally used to understand these complex graph structures. In the literature, preferential attachment models, for example, are used to model dynamic random graphs like social media sites. In these models, a new node is more likely to connect to an existing node that already has many connections. If you make an account on Instagram, it is likely that you will follow some famous people first.
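The sketch below simulates one simple variant of a preferential attachment process, assuming the most basic rule (one new node and one new edge per step); it illustrates the idea rather than the exact model studied in the literature or in my research.

```python
import random

def preferential_attachment(n_nodes, seed=0):
    """Grow a graph where each new node attaches to one existing node,
    chosen with probability proportional to that node's current degree."""
    rng = random.Random(seed)
    degree = [1, 1]        # start from two nodes joined by a single edge
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        # pick an existing node with probability proportional to its degree
        target = rng.choices(range(len(degree)), weights=degree)[0]
        edges.append((new, target))
        degree.append(1)   # the newcomer starts with one connection
        degree[target] += 1
    return degree, edges

degree, _ = preferential_attachment(10_000)
print("largest degree:", max(degree), "| median degree:", sorted(degree)[len(degree) // 2])
# Typical outcome: a few old, well-connected hubs with very high degree
# and a long tail of nodes with only one or two connections.
```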

So in the long run, the highly connected nodes tend to gain many more new connections and become more central. This property is known as the rich-get-richer effect. There is one more interesting effect in the literature, namely the rich-by-birth effect, which can be observed in several static random graph models. In the social media setting, the rich-by-birth effect describes profiles and people who immediately get many connections in the network when they are created. In my research, I work on, and try to understand the properties of, a special case of these preferential attachment models in which the rich-by-birth effect is observed.
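As a rough illustration of the rich-by-birth effect, the previous sketch can be modified so that an occasional newcomer arrives with a large initial attachment weight. The boost size and probability below are arbitrary values chosen for the illustration, not parameters of the actual model from my research.

```python
import random

def rich_by_birth(n_nodes, boost=50, boost_prob=0.01, seed=0):
    """Preferential attachment where a rare newcomer is 'born rich':
    it enters the network with a large initial attachment weight."""
    rng = random.Random(seed)
    weight = [1.0, 1.0]            # attachment weights of the existing nodes
    born_rich = [False, False]
    for _ in range(2, n_nodes):
        target = rng.choices(range(len(weight)), weights=weight)[0]
        weight[target] += 1
        rich = rng.random() < boost_prob     # a small fraction is born rich
        weight.append(boost if rich else 1.0)
        born_rich.append(rich)
    return weight, born_rich

weight, born_rich = rich_by_birth(10_000)
rich_avg = sum(w for w, r in zip(weight, born_rich) if r) / sum(born_rich)
ordinary_avg = sum(w for w, r in zip(weight, born_rich) if not r) / (len(weight) - sum(born_rich))
print(f"average final weight: born-rich {rich_avg:.1f} vs ordinary {ordinary_avg:.1f}")
# Nodes that start with a large weight keep attracting new connections,
# even if they join the network relatively late.
```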

Back to real life

Now we move to the real-life problems of propaganda and the spread of misinformation. For this, we need to understand the role social media can play in triggering riots. Riots are often triggered by the spreading of fake news and hate speech. Moreover, specific groups of people may want to trigger riots for personal or political gain, and they use social media sites to spread their misleading ideas. If these wrongdoers want to make a riot happen, they try to make the fake news and hate speech visible to as many people as possible, as soon as possible. They do not reach out to all users themselves; instead, they exploit certain properties of these networking sites.

These networking sites use algorithms that “show you what you want to see”: if you start viewing some specific content, they will show you more similar content, and this process goes on and on. This behaviour is similar to the rich-get-richer process. The wrongdoers keep creating such hate-spreading fake content and link it to taglines, or so-called hashtags. Of course, as a reader and a user, you want to form a balanced impression and read many different views. But, in contrast with real life, it is difficult to seek pluralism when an algorithm decides what you are going to see.
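A classical toy model for this kind of feedback loop is a Pólya-urn-style simulation, sketched below under the simplifying assumption that a feed shows a post about a topic with probability proportional to how often that topic has already been shown. This is my own analogy for illustration, not how any platform's recommendation algorithm actually works.

```python
import random

def simulate_feed(n_impressions, topics=("hashtag_A", "hashtag_B", "other"), seed=1):
    """Polya-urn-style feed: each impression of a topic makes that topic
    slightly more likely to be shown again."""
    rng = random.Random(seed)
    shown = {topic: 1 for topic in topics}   # every topic starts with weight 1
    for _ in range(n_impressions):
        topic = rng.choices(list(shown), weights=list(shown.values()))[0]
        shown[topic] += 1                    # showing it reinforces it
    return shown

print(simulate_feed(10_000))
# Typically one topic ends up dominating the simulated feed purely through
# this reinforcement, mirroring the rich-get-richer effect described above.
```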

During the Capitol riots, protesters used the hashtag #stopthesteal to steal the show. A study shows that on the 7th and 8th of January 2021, more than 70% of the posts in the US on Parler (an American alt-tech social networking service) contained this hashtag. More importantly, most Parler users were driven by a flood of the same hashtag, and most of the other hashtags used at that time were of the same nature as #stopthesteal. You can have a look at this article published on Just Security (an online forum for the analysis of security and democracy based at the Reiss Center on Law and Security at New York University School of Law), and also this article in the Washington Post.

The wrongdoers made the hashtag “rich” through repeated posts of fake news about voter fraud. Afterwards, they did not have to do anything: the rich-get-richer property took over from that point, and the rest is in the news.

Sometimes they target a certain genre of popular pages and start spreading their propaganda from there. When the sources of the fake news are already very popular, the news gets a kick-start and spreads very quickly. These sources are rich from the time of their birth, and the fake news spreads even faster because this rich-by-birth effect is combined with the rich-get-richer effect.

To Myanmar

During the Rohingya genocide in Myanmar, “The military benefited a lot from Facebook,” said Thet Swe Win, founder of Synergy, a group that focuses on fostering social harmony in Myanmar. He was awarded the Human Rights Tulip Award in 2019 by the Dutch government. According to Facebook’s head of cybersecurity policy at that time, Nathaniel Gleicher, Facebook had found “clear and deliberate attempts to covertly spread propaganda that was directly linked to the Myanmar military.” (Source: Business and Human Rights Resource Centre). Facebook later revealed that the Myanmar military was using several “seemingly independent entertainment, beauty and informational pages”, with some 1.3 million followers, to spread hatred against the Rohingya.

This gave them a very good kick-start for their propaganda of sweeping the minority Rohingya Muslims out of the country. Additionally, they created thousands of fake profiles to make this fake news richer (Source: NYT). After that, the rich-get-richer and rich-by-birth effects did their work together, and the result is the Rohingya genocide. In August 2018, a study estimated that more than 24,000 Rohingya people had been killed by the Burmese military and local Buddhists since the “clearance operations” that started on 25 August 2017. The study also estimated that over 18,000 Rohingya Muslim women and girls were raped, 116,000 Rohingya were beaten, and 36,000 Rohingya were thrown into fires. According to United Nations reports, as of January 2018, nearly 690,000 Rohingya people had fled or had been driven out of Rakhine State and sought refuge in Bangladesh (Source: Wikipedia).

Nowadays, these networking sites use algorithms to stop such rumors from spreading. But here we are more interested in how these rumors spread and in how we can be responsible users and do our part by not letting them spread further. Although there is little we can do when the misuse is state-sponsored (e.g. the Rohingya genocide), unless the social networking sites step in and actively stop it, we can draw their attention by reporting as much of such content as possible. Additionally, we can take part in campaigns against the riots and spread that sentiment to counter the wave of misinformation.

The Capitol riot could have turned out even worse had there not been an anti-riot campaign on Twitter. Additionally, Twitter banned @realDonaldTrump “due to the risk of further incitement of violence” (Source: Twitter).

So, as a responsible citizen of your country, you should always report such fake news and hate speech without delay. The social media networks have their own algorithms to verify your report and can block the whole chain of rumor spreading. If you feel that you are being used in the chain, use these opportunities to break it, not only by not spreading the news further, but also by reporting it immediately. More importantly, you can start or spread the truth (only after verifying it) through a campaign that uses the same process the rioters follow. In this way you too can take part in stopping the riots.

To conclude, I hope that this article makes clear how much power and influence social media platforms have over what we see on our screens. My research focuses on the mathematical models, like the preferential attachment model, that are used to understand such complex networks. Although the real-world networks mentioned above are more complex and have many properties beyond what we observe in simple preferential attachment models, these models give us a lot of intuition about, and understanding of, the rich-get-richer and rich-by-birth effects. The field of evolving random graphs is young and needs a lot of research. Our ultimate aim is to get as complete an understanding as possible of these effects and of how they affect our daily lives.


The featured image is taken from Unsplash and was made by Jon Tyson.