Social media moderation should not be nearly as difficult as it is made out to be. There will always be hard cases, but the process need not be as complex as it is now. More importantly, it should not be difficult for users to understand what they are and are not allowed to post on a given platform.
The New York Times was recently given more than 1,400 pages of Facebook's content rules, which are enforced by some 15,000 moderators worldwide. Those moderators often struggle to decide what should be allowed to stay and what should be removed, which leads to a lot of mistakes.
To be fair, Facebook was originally intended as a way for college students to share photos and stories online. When I first joined back in 2006, you needed an e-mail address ending in .edu to create an account. The company never expected to become the leviathan it is now and has been making up the rules as new challenges arise. It was clearly not prepared for this level of influence.
Of course, there is an obvious solution to Facebook's moderation problems: adopt a simple, easily understandable set of rules rather than a Byzantine handbook of well over a thousand pages, one that is revised every two weeks, making the already opaque standards even less transparent.
The low-hanging fruit is common across most social media platforms: no doxxing, no explicit threats, no porn, no fraud or scams, no copyright violations, and nothing that is illegal in the United States, where the company is based. This is not an exhaustive list, of course, but it is a good start.
What Facebook needs to do is rebuild the rules from the ground up: decide what it wants the platform to be, then go forward from there. Be completely transparent about what is and is not allowed, and make all of the rules public and easy to find. There is no good reason to be secretive about what the platform permits. That transparency would restore trust among users and make the platform easier to use.