Content moderation
Enable your community members to take an active role in keeping conversations safe and on-topic with our new Content Moderation feature. From within any post or comment, users can now flag inappropriate content by choosing from a clear set of predefined reasons—such as spam, hate speech, or self-harm—and, if needed, provide a brief custom explanation. On the admin side, all reports flow into a unified moderation feed where each item displays the reason(s) and counts at a glance, complete with filtering tools to prioritize urgent issues. This structured reporting framework not only speeds up resolution times but also empowers your moderation team with the context they need to enforce community standards and foster a healthier, more trustworthy environment.
- Against community guidelines
- Harassment or bullying
- Self-harm or suicide
- Violence or threatening content
- Selling/restricted items
- Sexual content or nudity
- Spam or scams
- False information/misinformation
- Others (optional details text field, 300-character max)
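To make the reporting flow concrete, here is a minimal TypeScript sketch of how a client could model these predefined reasons and submit a report with an optional explanation. The `ReportReason` type, `ReportPayload` shape, and the `reportContent`/`submitReport` functions are hypothetical names for illustration only, not part of a documented API.

```typescript
// Hypothetical reason codes mirroring the predefined list above.
type ReportReason =
  | "against_community_guidelines"
  | "harassment_or_bullying"
  | "self_harm_or_suicide"
  | "violence_or_threatening_content"
  | "selling_restricted_items"
  | "sexual_content_or_nudity"
  | "spam_or_scams"
  | "false_information"
  | "others";

interface ReportPayload {
  targetId: string;               // ID of the post or comment being flagged
  targetType: "post" | "comment";
  reason: ReportReason;
  details?: string;               // Optional free-text explanation, 300 characters max
}

// Hypothetical helper that validates the optional details before sending.
async function reportContent(payload: ReportPayload): Promise<void> {
  if (payload.details && payload.details.length > 300) {
    throw new Error("Details must be 300 characters or fewer.");
  }
  // submitReport is a placeholder for whatever transport the SDK actually exposes.
  await submitReport(payload);
}

// Placeholder transport; swap in the real SDK or REST call.
async function submitReport(payload: ReportPayload): Promise<void> {
  console.log("Submitting report", payload);
}
```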
| Feature | | | | | |
|---|---|---|---|---|---|
| Report Post | ✅ | ✅ | ✅ | ✅ | ✅ |
| Report Comment | ✅ | ✅ | ✅ | ✅ | ✅ |
| Report Post with Reason | ✅ | ✅ | ✅ | - | - |
| Report Comment with Reason | ✅ | ✅ | ✅ | - | - |
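For the admin side, the sketch below shows one way the unified moderation feed could aggregate incoming reports into per-item reason counts and filter them for prioritization. The `Report` and `FeedItem` shapes and the `buildModerationFeed`/`filterByReason` helpers are assumptions for illustration, not the actual moderation backend.

```typescript
interface Report {
  targetId: string;
  targetType: "post" | "comment";
  reason: string;
  createdAt: Date;
}

interface FeedItem {
  targetId: string;
  targetType: "post" | "comment";
  reasonCounts: Record<string, number>; // e.g. { spam_or_scams: 4, harassment_or_bullying: 1 }
  totalReports: number;
}

// Group raw reports into feed items with per-reason counts.
function buildModerationFeed(reports: Report[]): FeedItem[] {
  const items = new Map<string, FeedItem>();
  for (const r of reports) {
    const key = `${r.targetType}:${r.targetId}`;
    const item = items.get(key) ?? {
      targetId: r.targetId,
      targetType: r.targetType,
      reasonCounts: {},
      totalReports: 0,
    };
    item.reasonCounts[r.reason] = (item.reasonCounts[r.reason] ?? 0) + 1;
    item.totalReports += 1;
    items.set(key, item);
  }
  // Surface the most-reported items first so urgent issues are easy to prioritize.
  return [...items.values()].sort((a, b) => b.totalReports - a.totalReports);
}

// Example filter: only items flagged for a specific reason, e.g. self-harm reports.
function filterByReason(feed: FeedItem[], reason: string): FeedItem[] {
  return feed.filter((item) => (item.reasonCounts[reason] ?? 0) > 0);
}
```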