CONTENT MODERATION POLICY

By using this website, you agree to this Policy. Please read it carefully.

⦁ Introduction

This document sets out In2me’s Content Moderation Policy (“Policy”). This Policy explains when, why, and how In2me can moderate Content, and how Content that violates In2me’s Terms of Service is handled. If you are an In2me User, this Policy forms part of your legal agreement with In2me. Please carefully read this Policy in full. Here are a few key things to note:

  • Our Terms of Service and Community Guidelines explain what is and what is not permitted on In2me.

  • Appeals concerning a decision to deactivate content, deactivate an account, or issue a final warning must be addressed by sending an email to compliance@in2me.io, subject to our Appeals Policy.

⦁ Interpretation

Unless specifically defined in this Policy, the meanings given to words defined in the Terms of Service have the same meanings in this Policy.

⦁ When In2me moderates content

While In2me is a space for creators and fans to be who they are and express themselves freely, we moderate Content to check whether it complies with our Terms of Service, which include our Acceptable Use Policy and Community Guidelines.

⦁ Why we moderate content

In2me enables creators to monetise their content and connect with their fanbase. To protect our User community, In2me may, but is not obligated to, moderate Content to ensure it complies with In2me’s Terms of Service.

⦁ What content we moderate

At any time, we can choose to review and remove Content that is shared on In2me, including text, audio, images, direct messages, and videos.

⦁ How we moderate content

In2me uses state-of-the-art digital technologies paired with human moderation to check whether Content is allowed. Our trained moderators identify and remove any Content that they believe violates our Terms of Service.

⦁ What technology In2me uses to moderate content

We use a variety of technology tools to identify Content that may violate our Terms of Service. These include:

  • Image and text scanning technologies: These tools help our human moderators prioritise the content they review. They automatically scan media and text for potential violations of our Terms of Service. These potential violations can then be reviewed by human moderators prior to any decision to deactivate content.

⦁ Consequences of sharing improper Content

If a User shares Content that violates In2me’s Terms of Service, we may:
i. deactivate the Content
ii. issue a warning
iii. issue a final warning
iv. deactivate the account
v. ban the account holder from the In2me platform

If a User commits serious violations or repeatedly posts Content that violates In2me’s Terms of Service, we may suspend or terminate that User’s account(s) and may prohibit that User from opening new accounts on In2me.

⦁ Some of the reasons we may remove content

We most commonly remove Content for violations of our Acceptable Use Policy, for example, Content that:
i. features a creator we cannot verify as 18+
ii. features a creator whose identity we cannot verify, including accounts or content which are wholly AI-generated
iii. features a creator pretending to be under 18, even if this is role-play or fiction
iv. is sexually explicit and includes a person who is not the verified creator and has not confirmed their age, identity, or consent
v. involves nudity in a public place or around animals
vi. is illegal where you live or where the Content was captured
vii. features blood or could be considered violent or extreme
viii. features weapons or illegal drugs
ix. features an everyday object being used in a way that is likely to cause harm, including using it as a sex toy
x. refers to creators and fans meeting in person, including raffles and competitions

⦁ Illegal Content

Where Content is illegal, In2me will deactivate the Content and, where appropriate, make a report to law enforcement. We report all suspected child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children through their CyberTipline. You can read more in our Privacy Policy about how we comply with our data-sharing obligations and how we assist law enforcement.

⦁ Appeal of a Content Moderation decision

If you believe In2me incorrectly deactivated your Content or account, you can appeal the decision by completing an In2me Deactivation Appeal Form in accordance with our Appeals Policy.
