When you report content on Flip, it goes through a structured review process. To keep our community a safe learning space, all reported content goes through automatic detection, human review, or both.
Additionally, through our Code of Conduct, users commit to practices that align with Flip’s mission to foster a social learning space. When content is automatically detected or reported by a user as inappropriate, we take further action only after our human moderators confirm that it violates our Code of Conduct.
These are the stages we use to moderate content on Flip:
Stage 1: Detection
- If a user tries to submit text containing inappropriate words, they’ll receive a popup that explains why it needs to be changed. We provide a feedback option so that users can let us know if their message was flagged in error.
- Our machine learning service automatically reviews all new videos. If a video is automatically detected as inappropriate, it’s sent to our human moderators for additional review.
- Users can report content at the group, topic, video response, or comment level. All reported content is sent to our human moderators for review. Learn more about how you can report content.
Stage 2: Moderation
When a video is automatically flagged or content is reported by a user, we send it to our human moderators for review. Before we remove any content, our human moderators must confirm that it violates our Code of Conduct.
- Leads can also moderate content in their groups. They can change their settings to hide comments and videos by default, so content is visible only after they’ve approved it. Learn more about how leads can make member videos and comments private.
- Since all new content is reviewed by our internal moderation service, this moderation setting does not change whether the content is moderated by Flip.
Stage 3: Action
Only when content is confirmed as a violation of our Code of Conduct can the following actions occur:
- User removal
- Content removal
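For readers who want a compact view of the flow, the three stages above can be sketched as a simplified pipeline. This is an illustrative sketch only: every function and label name below is hypothetical and does not reflect Flip’s actual internal systems.

```python
# Illustrative sketch of the three-stage moderation flow described above.
# All names here are hypothetical, not Flip's real implementation.

def human_confirms_violation(content):
    """Placeholder for Stage 2 human review: here we simply treat
    content pre-marked as a violation as confirmed by a moderator."""
    return content.get("violation", False)

def moderate(content, flagged_by_ml=False, reported_by_user=False):
    """Return the action taken on a piece of content."""
    # Stage 1: Detection — automatic ML flagging or a user report
    # routes the content to human review; otherwise nothing happens.
    needs_review = flagged_by_ml or reported_by_user
    if not needs_review:
        return "no_action"

    # Stage 2: Moderation — human moderators must confirm a
    # Code of Conduct violation before anything is removed.
    if not human_confirms_violation(content):
        return "no_action"

    # Stage 3: Action — a confirmed violation can lead to content
    # removal (and, in serious cases, user removal).
    return "content_removed"

print(moderate({"violation": True}, reported_by_user=True))  # content_removed
print(moderate({"violation": False}, flagged_by_ml=True))    # no_action
```

The key property the sketch captures is that detection alone never removes anything: both paths into Stage 2 still require human confirmation before Stage 3 actions apply.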