Moderating content online has become a necessity for social networks as more people experience the negative effects of inappropriate or disturbing content posted on these platforms. Filtering the posts of individual users, however, is an enormous challenge. For Facebook alone, reviewing every post made each day by its 2.3 billion active users would be impossibly time-consuming. Enforcing policies and guidelines also cuts both ways: platforms must weigh free speech against hateful or racist remarks, discrimination, bullying, and other blatantly damaging content. At the same time, they need to consider the context of each post, or they risk backlash from users whose content is removed despite expressing no intent to harm. The life of a Facebook moderator is therefore harsh and difficult, and the stories below detail the risks and potential dangers faced by the people doing this work.
(Image credit: Annie Spratt/Unsplash)