Our platform supports various forms of user-generated content, including text, images, and videos. All uploaded content undergoes moderation.
Moderation Approach
We use a hybrid moderation model that combines advanced automated tools with human oversight. Specifically, we employ Hive Moderation, a leading automated moderation system, to review content immediately upon upload. Content is published on the platform only after it passes this initial automated moderation.
For complex or unclear cases, our moderation team conducts manual reviews to ensure adherence to content guidelines.
Moderation Workflow
Content moderation occurs at multiple stages:
- Initial Screening: Content must pass automated moderation via Hive technology before publication.
- Periodic Reviews: Published content may undergo additional periodic automated or manual checks to ensure continuous compliance.
- User Reports: Users can report inappropriate content or profiles, prompting a review by our moderation team. Each report receives an immediate automated acknowledgment, is prioritized according to the volume of complaints, and is reviewed manually within 48 hours.
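The initial-screening stage above can be sketched as a simple decision function. This is an illustrative sketch only: `hive_check` is a placeholder standing in for the actual Hive Moderation API call, and its name and return values are our assumptions, not the real interface.

```python
from enum import Enum

class Status(Enum):
    PUBLISHED = "published"
    REJECTED = "rejected"
    PENDING_MANUAL = "pending_manual"

def hive_check(content: str) -> str:
    """Placeholder for the automated Hive review.

    Assumed to return 'pass', 'fail', or 'unclear'; the real API
    differs and is not reproduced here.
    """
    return "pass"

def initial_screening(content: str) -> Status:
    """Stage 1: content is published only after passing automated moderation."""
    result = hive_check(content)
    if result == "pass":
        return Status.PUBLISHED
    if result == "fail":
        return Status.REJECTED
    # Complex or unclear cases are routed to the human moderation team.
    return Status.PENDING_MANUAL
```

The key property the sketch captures is that publication is gated on the automated check, with ambiguous results deferred to manual review rather than published by default.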
Service Level Agreements (SLAs)
Our moderation team aims to:
- Provide an initial automated response immediately upon receiving user reports, followed by a manual review within 48 hours.
- Complete initial automated content moderation immediately upon content submission.
- Resolve appeal reviews within 72 hours of receiving the appeal request.
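The SLA windows above reduce to straightforward deadline arithmetic. The constants and helper names below are illustrative, not part of any production system.

```python
from datetime import datetime, timedelta, timezone

REPORT_REVIEW_SLA = timedelta(hours=48)  # manual review of user reports
APPEAL_REVIEW_SLA = timedelta(hours=72)  # resolution of appeal requests

def review_deadline(received_at: datetime) -> datetime:
    """Latest time by which a reported item must be manually reviewed."""
    return received_at + REPORT_REVIEW_SLA

def appeal_deadline(received_at: datetime) -> datetime:
    """Latest time by which an appeal must be resolved."""
    return received_at + APPEAL_REVIEW_SLA

report_time = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
print(review_deadline(report_time))  # 2024-01-03 12:00:00+00:00
```

Using timezone-aware timestamps avoids ambiguity when reports arrive from users in different regions.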
User Notification and Appeals
Users receive clear, timely in-app notifications detailing the moderation outcome, including brief explanations for content removal or approval. Users have the right to appeal moderation decisions, triggering a further manual review by our trained moderation specialists.
Moderation Team and Training
Our moderation activities are carried out exclusively by internal staff operating under strict non-disclosure agreements (NDAs). Team members are trained by experienced specialists and adhere to established moderation best practices.
Compliance and Standards
Our moderation processes comply fully with applicable U.S. laws, including the California Consumer Privacy Act (CCPA), and align with Stripe policies. We adhere strictly to industry best practices for user-generated content moderation.
Record-Keeping and Auditability
We maintain comprehensive moderation records for audit purposes. These records enable effective review and response to inquiries regarding moderation actions, ensuring accountability and transparency.
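An audit record of the kind described above might capture fields like the following. This is a minimal sketch; the field names and values are illustrative assumptions, not our actual schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ModerationRecord:
    """One auditable moderation action (illustrative fields only)."""
    content_id: str
    action: str    # e.g. "approved", "removed", "appeal_upheld"
    reason: str    # brief explanation shown to the user
    reviewer: str  # "automated" or a moderator identifier
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

record = ModerationRecord(
    content_id="c-123",
    action="removed",
    reason="violates content guidelines",
    reviewer="automated",
)
print(asdict(record)["action"])  # removed
```

Keeping the action, the user-facing reason, and the reviewer in one record is what lets an auditor reconstruct why a given moderation decision was made.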