ENSURING CONSENT FOR INTIMATE IMAGES
In2me empowers Creators from all genres to connect with their fans and monetize legal content. Users may not post or share on In2me intimate photos or videos of someone that were produced or distributed without that person’s consent.
Sharing Non-Consensual Intimate Images (NCII) is against In2me’s Terms of Service. In2me will take action against anyone who attempts to abuse the platform in this way.
What is a Non-Consensual Intimate Image?
A Non-Consensual Intimate Image (NCII) is a sexually explicit image or video of someone that is taken and/or shared without their consent. This includes images or videos taken in an intimate setting that were not intended for public distribution.
How does In2me prevent the sharing of NCII on its platform?
Each Creator is personally and legally responsible for the content they share on In2me. It is against In2me’s Terms of Service for Creators to post, upload, or share any explicit content featuring another person without that person’s express consent.
Creators must ensure they have the informed consent of any person featured in the explicit content they wish to share on In2me. Before such content is shared on the platform, In2me takes steps to confirm the age and identity of anyone featured in it and to confirm that they have consented to their image being used.
What happens if In2me finds suspected NCII on its platform?
If we identify suspected NCII on our platform, we remove it. We immediately investigate any User who tries to share NCII on In2me, and we suspend and permanently ban any User found to have shared or attempted to share NCII.
We work with law enforcement agencies to support investigations and prosecutions of anyone who shares NCII on In2me. Unlike on many other online platforms, all In2me Users are verified, which means we can provide law enforcement with actionable information to assist their investigations.
How do I report suspected NCII?
Each post and account on In2me has a report button. If you see any content on In2me that you suspect could be NCII, please click the report button immediately. Alternatively, you can tell us what you saw by emailing support@in2me.io.
What if I previously consented to an intimate image of me appearing on In2me but have changed my mind?
If anyone featured in content on In2me tells us they no longer consent to that content being shared, we take that content down. You can contact support@in2me.io at any time to withdraw your consent to images appearing on In2me.
What else does In2me do to prevent the distribution of NCII?
We partner with StopNCII.org, a tool that enables individuals to ‘hash’ their intimate images or videos on their own device. Hashing creates a unique digital fingerprint of the image or video itself; the image or video never leaves the person’s device. The hash is added to the StopNCII hashlist and shared with industry partners like In2me.
The StopNCII hashlist is integrated with the In2me platform. If someone attempts to upload an image or video whose hash matches an entry on the hashlist, the upload is blocked.
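To illustrate the general idea, here is a minimal conceptual sketch of a hashlist check at upload time. It is not In2me’s or StopNCII’s actual implementation: the function names, the placeholder hashlist, and the use of a cryptographic hash are assumptions made for the example (real systems use perceptual hashes, generated on the reporting person’s device, that survive re-encoding).

import hashlib
from pathlib import Path

# Placeholder for a hashlist shared by an industry partner; the entry is illustrative only.
STOPNCII_HASHLIST: set[str] = {"3f5a9c0e"}

def compute_fingerprint(media_path: Path) -> str:
    # Fingerprint the media file's contents (not its metadata).
    # SHA-256 is used only to keep the sketch self-contained and runnable.
    return hashlib.sha256(media_path.read_bytes()).hexdigest()

def is_upload_allowed(media_path: Path) -> bool:
    # Block any upload whose fingerprint matches an entry on the hashlist.
    return compute_fingerprint(media_path) not in STOPNCII_HASHLIST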
You can find out more at StopNCII.org.
We also provide information about StopNCII and other support services to anyone who reports intimate image abuse to In2me.