“Towards an Epistemic Compass for Online Content Moderation”
New paper published by Abraham Tobi, titled ‘Towards an Epistemic Compass for Online Content Moderation’, in Philosophy & Technology.
Abstract
The internet provides easy access to a wealth of information that can sometimes be false and harmful. This is most apparent on social media platforms. To combat this, platforms have implemented various methods of content moderation to flag or block content that is inaccurate or violates community standards. This approach has limitations, ranging from the epistemic injustices that content moderation practices might produce to concerns about the legitimacy of these for-profit platforms’ epistemic authority. In this paper, I highlight some of the epistemic challenges of online content moderation, focusing on how it harms internet users and moderators. If we are to moderate content effectively and ethically, we must attend to these challenges. Hence, I map out an epistemic compass for online content moderation that aims to address them. I argue for a pluralistic model of content moderation that categorises online content and distributes the task of moderation between human moderators, automated moderators, and community moderators in a way that plays to the strengths of each moderation model. My compass is beneficial for two reasons: first, it allows room for the internet to realise its potential as a democratising force for knowledge, and second, it helps minimise the epistemic downsides of relying on profit-driven companies as epistemic authorities.