TikTok and Bumble join initiative to prevent image sharing without consent

Social media platforms TikTok and Bumble have joined an initiative to prevent the non-consensual publication of intimate images online.

Both platforms are partnering with StopNCII.org (Stop Non-Consensual Intimate Image Abuse), which hosts a tool developed in collaboration with Meta.

TikTok, Bumble, Facebook, and Instagram will detect and block any images included in the StopNCII.org hash bank, according to Engadget.

The website uses on-device hashing technology: people who are threatened with intimate image abuse can create unique identifiers for their images (also known as “hashes” or digital fingerprints), and this process takes place entirely on their own device. According to the report, to protect user privacy, StopNCII.org receives only this unique string of letters and numbers, not the image files themselves.
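The privacy property described above can be sketched in a few lines of Python. This is a simplified illustration using a cryptographic hash from the standard library; StopNCII's actual tool reportedly uses perceptual hashing, which tolerates small edits like resizing or re-compression, whereas a cryptographic hash does not. The function name is hypothetical.

```python
import hashlib

def hash_image(path: str) -> str:
    """Compute a digital fingerprint of an image file on-device.

    Simplified sketch: hashes the raw bytes with SHA-256. The real
    tool uses perceptual hashing so that near-duplicates still match.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large images don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # Only this string of letters and numbers would leave the device,
    # never the image itself.
    return h.hexdigest()
```

The key point the article makes is that hashing is one-way: the platform can check whether a later upload produces the same fingerprint, but cannot reconstruct the original image from the hash.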

Moreover, hashes submitted to StopNCII.org are shared with participating partners.

If an image or video uploaded to TikTok, Bumble, Facebook, or Instagram matches a hash in the bank and “meets partner policy requirements”, the file is sent to that platform’s moderation team.

If moderators determine that an image violates their platform’s rules, they will remove it, and the other partner platforms will block the image as well, the report says.
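The matching-and-review flow described above can be sketched as a simple lookup against a shared hash bank. This is a hypothetical illustration of the logic, not StopNCII's or any partner's actual API; all names are invented for the example.

```python
# Shared bank of hashes submitted via StopNCII.org (illustrative).
hash_bank: set[str] = set()

def register_hash(h: str) -> None:
    """A user-submitted hash is shared with participating partners."""
    hash_bank.add(h)

def check_upload(upload_hash: str) -> str:
    """Route an upload on a partner platform.

    A match is not removed automatically: per the article, moderators
    first confirm the image violates platform policy, and only then is
    it removed and blocked across partner platforms.
    """
    if upload_hash in hash_bank:
        return "sent to moderation"
    return "allowed"
```

The design choice worth noting is that a hash match only triggers human review; automated removal happens after moderators confirm a policy violation.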

The tool has been available for a year and more than 12,000 people have used it to prevent the sharing of intimate videos and images without permission.

To date, users have created more than 40,000 hashes, the report says.
