Research on online communities and social media is shifting toward decentralized platforms, where users self-regulate through mechanisms such as blocking. Researchers are examining the relationships among community founder traits, community attributes, and sustainability, as well as the dynamics of moderation, misinformation, and toxicity on these platforms. Studies show that decentralized platforms can maintain low toxicity levels and promote high-credibility sources, but they also pose challenges for balancing community safety with user autonomy. The roles of social correction, content moderation, and deradicalization are also under investigation, with findings suggesting that moderation actions can contribute to the abandonment of fringe movements. Noteworthy papers include:
- A study on self-moderation in decentralized platforms, which provides a comprehensive analysis of user blocking behavior and its implications for community safety.
- A longitudinal analysis of a decentralized social media platform, which reveals its evolution after public launch and the effectiveness of its moderation efforts.
- A paper on content moderation, which finds that banning radical communities can increase participation in recovery communities and act as a catalyst for deradicalization.