It’s Time to Grant Content Moderators Full Employment
Without their work, Twitter, Facebook, and YouTube would be inundated not just by spam but by bullying, hate speech, and other harmful content.
Facebook’s corporate headquarters campus in Menlo Park, Calif.
Photographer: Josh Edelson/AFP/Getty Images
Content moderation has made headlines lately as Twitter cracked down on several of President Donald Trump’s tweets regarding mail-in ballots and protests against police misconduct. Facebook, meanwhile, has refrained from taking action on contentious presidential posts.
These decisions by the social media giants are exceptional, as they involve top executives groping for ways to handle a singular user who fires off incendiary missives from 1600 Pennsylvania Avenue. Ordinary content moderation—the process of deciding what stays online and what gets removed—looks very different, although it is no less important. Moderators generally evaluate suspect posts that users have reported or automated systems have flagged.
