Meta, together with the UK-based nonprofit SWGfL, has launched Stop NCII, a tool designed to help people whose intimate photos have been leaked online or manipulated into explicit images without their consent. The tool aims to have such images removed from participating social media platforms.
Stop NCII, short for ‘Stop Non-Consensual Intimate Images’, works by asking users to select the original and any edited photos on their own device and provide some additional information: details about the person in the image, their age, whether they have all the photos/videos in question, and whether nudity is involved.
The process is completely confidential, so users never have to discuss their situation with anyone. Because the tool works with hashed data (digital fingerprints generated from the images), no sensitive images are ever uploaded. Once a case is created, the user receives a case number and a PIN, which must be kept safe because they cannot be recovered. With these, the user can check the status of the case at any time.
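The hash-based design described above can be sketched in a few lines. This is a minimal illustration, not Stop NCII's actual implementation: the real service uses a perceptual hash so visually similar images also match, whereas this sketch uses a SHA-256 digest purely to show the principle that only a fingerprint, never the image itself, leaves the device.

```python
import hashlib

def fingerprint_image(path: str) -> str:
    """Hash an image file locally so only the digest is shared.

    Perceptual hashes (as used by real matching systems) catch
    visually similar images; SHA-256 here matches only byte-identical
    files and serves only to illustrate the privacy model.
    """
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large images don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()
```

Only the returned hex digest would be submitted with the case; the image file itself stays on the user's device.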
Stop NCII then shares the hashes with its partner social media platforms, which scan content for matches. If a platform finds an image that violates its intimate image abuse policy, it works to remove it.
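The platform-side matching step can be sketched as follows. This is a hypothetical illustration rather than any partner's actual pipeline: assume each platform holds the set of reported fingerprints and compares new uploads against it. Because real deployments use perceptual hashes, comparison is typically by Hamming distance (number of differing bits) rather than exact equality; the threshold value below is illustrative, not an actual production setting.

```python
def hamming_distance(h1: str, h2: str) -> int:
    """Count differing bits between two hex-encoded hashes."""
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

def matches_reported(upload_hash: str, reported: set[str],
                     threshold: int = 31) -> bool:
    """Flag an upload whose hash is within `threshold` bits of any
    reported hash, so near-duplicates (recompressed or resized
    copies) are caught as well as exact copies.
    """
    return any(hamming_distance(upload_hash, h) <= threshold
               for h in reported)
```

An upload flagged this way would then be queued for review under the platform's intimate image abuse policy.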
Currently, the partnering companies include Facebook, Instagram, TikTok, Reddit, OnlyFans, Threads, and Bumble. There are hopes that more partners, including Twitter, will join this initiative soon.
One limitation is worth noting: the tool cannot do anything about copies that people have already saved to their own devices. Even so, the introduction of Stop NCII marks a significant step forward in the fight against non-consensual intimate image abuse on social media.