Decentralized Online Communities and Social Media Regulation

Research on online communities and social media is shifting toward decentralized platforms, where users self-regulate through mechanisms such as blocking. Researchers are examining how community founders' traits relate to community attributes and sustainability, and how moderation, misinformation, and toxicity play out on these platforms. Studies show that decentralized platforms can maintain low toxicity levels and promote high-credibility sources, but they also complicate the balance between community safety and user autonomy. Social correction, content moderation, and deradicalization are also under investigation, with findings suggesting that moderation actions can contribute to the abandonment of fringe movements. Noteworthy papers include:

  • A study on self-moderation in decentralized platforms, which provides a comprehensive analysis of user blocking behavior on Bluesky and its implications for community safety (a sketch of how such public block records can be read appears after this list).
  • A longitudinal analysis of Bluesky after its public launch, tracing the evolution of misinformation, polarization, and toxicity on the platform and the effectiveness of its moderation efforts.
  • A paper on content moderation, which finds that banning radical communities can increase participation in recovery communities and act as a catalyst for deradicalization.
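
One reason blocking behavior on Bluesky is observable at all is that AT Protocol block records are stored publicly in each user's repository. The snippet below is a minimal sketch of that data-access pattern, not the cited paper's method: it assumes the target account is hosted on the `bsky.social` PDS, and the handle `example.bsky.social` is a placeholder.

```python
# Minimal sketch: read an account's public block records over the AT Protocol.
# Assumptions (not from the paper): the account lives on the bsky.social PDS,
# and "example.bsky.social" is a hypothetical handle used for illustration.
import requests

PDS = "https://bsky.social"  # assumed PDS host


def list_blocks(repo: str, limit: int = 100):
    """Yield (blocked_did, created_at) from an account's public block records."""
    cursor = None
    while True:
        params = {
            "repo": repo,  # handle or DID of the account being inspected
            "collection": "app.bsky.graph.block",
            "limit": limit,
        }
        if cursor:
            params["cursor"] = cursor
        resp = requests.get(
            f"{PDS}/xrpc/com.atproto.repo.listRecords",
            params=params,
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        for record in data.get("records", []):
            value = record["value"]
            # Each block record names the blocked account's DID and a timestamp.
            yield value["subject"], value["createdAt"]
        cursor = data.get("cursor")
        if not cursor:  # no cursor means the last page has been reached
            break


if __name__ == "__main__":
    for did, created_at in list_blocks("example.bsky.social"):
        print(created_at, did)
```

That blocks are public by protocol design (unlike mutes) is precisely what makes this kind of self-moderation research feasible, and it is also one concrete instance of the safety-versus-autonomy tension noted above.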

Sources

Tell me who its founders are and I'll tell you what your online community looks like: Online community founders' personality and community attributes

Self-moderation in the decentralized era: decoding blocking behavior on Bluesky

A longitudinal analysis of misinformation, polarization and toxicity on Bluesky after its public launch

Social Correction on Social Media: A Quantitative Analysis of Comment Behaviour and Reliability

Does Content Moderation Lead Users Away from Fringe Movements? Evidence from a Recovery Community
