Online conspiracy communities are more resilient to deplatforming.
Corrado Monti, Matteo Cinelli, Carlo M. Valensise, Walter Quattrociocchi, Michele Starnini. Published in: PNAS Nexus (2023)
Online social media foster the creation of active communities around shared narratives. Such communities may turn into incubators for conspiracy theories, some spreading violent messages that can sharpen the debate and potentially harm society. To counter these phenomena, most social media platforms have implemented moderation policies, ranging from posting warning labels up to deplatforming, i.e., permanently banning users. Assessing the effectiveness of content moderation is crucial for balancing societal safety with the right to free speech. In this article, we compare the behavioral shift of users affected by the ban of two large communities on Reddit, GreatAwakening and FatPeopleHate, which were dedicated to spreading the QAnon conspiracy and to body-shaming individuals, respectively. Following the ban, both communities partially migrated to Voat, an unmoderated Reddit clone. We estimate how many users migrated, finding that users in the conspiracy community are much more likely to leave Reddit altogether and join Voat. We then quantify the behavioral shift within Reddit and across Reddit and Voat by matching common users. While user activity is generally lower on the new platform, GreatAwakening users who left Reddit entirely maintain a similar level of activity on Voat. Toxicity strongly increases on Voat in both communities. Finally, conspiracy users migrating from Reddit tend to recreate their previous social network on Voat. Our findings suggest that bans of conspiracy communities hosting violent content should be carefully designed, as these communities may be more resilient to deplatforming.
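The cross-platform comparison hinges on matching common users between Reddit and Voat. A minimal sketch of one simple matching strategy, pairing accounts by identical username, is shown below; the paper's actual matching procedure may differ, and all usernames and activity counts here are illustrative assumptions.

```python
# Hypothetical example: match users across two platforms by username and
# compare their activity levels. Data is purely illustrative.

reddit_activity = {"alice": 120, "bob": 45, "carol": 300}  # username -> post count on Reddit
voat_activity = {"alice": 80, "carol": 10, "dave": 5}      # username -> post count on Voat

# Migrants: users present on both platforms (assuming they kept the same handle).
migrants = sorted(set(reddit_activity) & set(voat_activity))

# Behavioral shift: activity on the new platform relative to the old one.
activity_ratio = {u: voat_activity[u] / reddit_activity[u] for u in migrants}

print(migrants)
print(activity_ratio)
```

A ratio near 1 would indicate that a migrating user maintained their previous activity level on the new platform, while a ratio well below 1 would indicate reduced engagement after migration.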