Content moderation practices and technologies need to change over time as requirements and community expectations shift. However, restructuring existing moderation practices can be difficult, especially for platforms that rely on their communities to conduct moderation activities, because changes can transform moderators' workflows and workloads as well as contributors' reward systems. Through a study of extensive archival discussions around a prepublication moderation technology on Wikipedia named Flagged Revisions, complemented by seven semi-structured interviews, we identify various challenges in restructuring community-based moderation practices. We learn that while a new system might sound good in theory and perform well on quantitative metrics, it may conflict with existing social norms. Our findings also highlight how the intricate relationship between platforms and self-governed communities can hinder the assessment of any new system's performance and introduce considerable costs related to maintaining, overhauling, or scrapping any piece of infrastructure.