Content Moderation
Content moderation is the process of screening and monitoring user-generated content against platform-specific rules and guidelines to decide whether it is approved or rejected for publication. The moderation process is how websites ensure that content submitted by users adheres to the site's rules and is not illegal, inappropriate, or harassing.
Online platforms with rich user-generated content, such as social media platforms, online marketplaces, sharing economies, dating sites, communities, and forums, all use content moderation in some way.
Why Content Moderation?
Scalable content moderation lets you evaluate content in context, determining its authenticity and intent before it is published on the platform. Users' profiles, replies, images, videos, links, and out-of-context terms and language are all checked. If a submission turns out to be inappropriate for posting, it is classified accordingly. This is where content moderation comes in: it aims to remove from the website any text that is abusive, explicit, or full of foul language.
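To make the screening step concrete, here is a minimal sketch of a rule-based text check in Python. Everything here is illustrative: the `BLOCKLIST` terms and the `moderate` function are hypothetical, and real systems typically combine blocklists with machine-learning classifiers and human review.

```python
# Minimal rule-based moderation sketch (illustrative only).
# Real platforms layer blocklists, ML classifiers, and human reviewers.

BLOCKLIST = {"spamword", "badterm"}  # hypothetical banned terms


def moderate(text: str) -> str:
    """Approve or reject a submission based on a simple blocklist check."""
    # Normalize: lowercase and strip common punctuation from each token.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return "rejected" if words & BLOCKLIST else "approved"


print(moderate("Hello, welcome to the forum!"))  # approved
print(moderate("Buy now! spamword inside"))      # rejected
```

A set intersection keeps the check fast regardless of blocklist size, but note that word-level matching like this is easily evaded (misspellings, embedded characters), which is one reason platforms pair such filters with other moderation layers.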