Your company has full control over how community content submissions are moderated. Moderation can be limited to certain parts of the platform, split by topic among several moderators, or turned off entirely. You can require moderators to approve all content before it becomes visible, or only content that has been flagged, whether by users or by your list of keyword triggers. Any changes to these settings appear in the audit log.
Moderators can flag, approve, deny, and take other actions on content in bulk, making the job less time-consuming. Moderators are notified whenever submitted content requires review (notifications are bundled into at most one per hour), and the user receives notifications as well. Moderation is built into our mobile apps, so moderators can respond quickly from wherever they are.
How Our Customers Use Content Moderation
Keep the platform from becoming cluttered with irrelevant content
Ensure that any objectionable content is removed
Stay ahead of potential content issues before they escalate
Protect company interests