Vietnamese NGO Helps Curb the Spread of Non-Consensual Intimate Images
Together with more than 50 non-governmental organization partners around the world, the Institute for Social Development Studies (ISDS) is supporting the UK Revenge Porn Helpline's launch of StopNCII.org to help stop the non-consensual sharing of intimate images (NCII) on the internet.
This new venture of technology offers victims a preventative tool, something which can only further support victims to feel empowered when online.
The Institute for Social Development Studies (ISDS) is a non-profit, non-governmental organization under the Vietnam Union of Science and Technology Associations (VUSTA). Established in 2002, ISDS has become one of the most respected research institutions in Vietnam thanks to the quality of its work and its commitment to applying scientific knowledge to solving national problems.
ISDS is the only Vietnamese organization participating in the platform.
StopNCII.org, which stands for “Stop Non-Consensual Intimate Images,” is the first global initiative of its kind to safely and securely help people who are concerned that their intimate images (photos or videos of a person that feature nudity or are sexual in nature) may be shared without their consent.
The UK Revenge Porn Helpline, in consultation with Meta (formerly Facebook), has developed this platform with privacy and security at every step thanks to extensive input from victims, survivors, experts, advocates, and other tech partners.
“It’s a massive step forward,” said Sophie Mortimer, the helpline’s manager. “The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it.”
The Revenge Porn Helpline (RPH) — established in 2015 — has supported thousands of victims of non-consensual intimate image abuse. With an over 90% removal rate, RPH has successfully removed over 200,000 individual non-consensual intimate images from the internet.
During the submission process, StopNCII.org obtains consent and asks people to confirm that they appear in the image. People can select material on their devices, including manipulated images, that depicts them nude or nearly nude. The photos or videos are then converted into unique digital fingerprints known as “hashes,” which are passed on to participating companies, starting with Facebook and Instagram.
StopNCII.org will not have access to or store copies of the original images. Instead, they will be converted to hashes in users’ browsers, and StopNCII.org will get only the hashed copies.
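The client-side conversion described above can be sketched as follows. This is an illustrative stand-in, not StopNCII.org's actual code: the service reportedly uses perceptual hashing, whereas this sketch uses SHA-256 purely to show the one-way property — only a fixed-length fingerprint, never the image itself, leaves the device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # One-way digest: the original image cannot be reconstructed
    # from this value, so only the hash needs to be transmitted.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical image bytes, for illustration only.
photo = b"\x89PNG\r\n...example image data..."
digest = fingerprint(photo)
print(digest)  # 64 hex characters; reveals nothing about the image content
```

The same bytes always produce the same digest, which is what lets participating platforms recognize the image later without ever receiving a copy of it.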
Other large platforms have expressed an interest in joining the initiative, including social media companies, adult sites, and message boards, Mortimer said, although they are not yet ready to announce their participation in the program.
Each participating company would use hash-matching technology to check whether images matching the hashes had been uploaded to their platforms. If they were to detect matches, content moderators on the platforms would review the images to ensure that they violated their policies and that the tool had not been misused by someone submitting another kind of non-violating image. If platforms were to determine that the images or the videos violated their policies, they would delete all instances of them and block attempts to re-upload them.
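The matching step each participating company performs could look roughly like the sketch below. All names and hash values here are hypothetical; real platforms run hash-matching at scale inside their upload pipelines, and a match only triggers human review, not automatic removal.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Same illustrative digest the client side would compute.
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes received from StopNCII.org (hypothetical example).
submitted_hashes = {fingerprint(b"reported-image-bytes")}

def needs_review(upload_bytes: bytes) -> bool:
    # A match queues the upload for human moderators, who confirm
    # the content actually violates policy before removal.
    return fingerprint(upload_bytes) in submitted_hashes

print(needs_review(b"reported-image-bytes"))   # True  -> sent to moderators
print(needs_review(b"unrelated-image-bytes"))  # False -> published normally
```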
StopNCII.org is a free tool designed to support victims of Non-Consensual Intimate Image (NCII) abuse.
StopNCII.org is for adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent. For people who are under 18, there are other resources and organizations that can offer support.
Meta and the Revenge Porn Helpline recognized that people could abuse the system by submitting non-violating images just to get pictures taken down. But Mortimer said the review process by human content moderators should mean that non-violating images would remain on the platform, reported NBC News.
Karuna Nain, Meta’s director of global safety policy, also flagged that the system can detect only exact image matches, so if people have “very egregious intentions,” they can make small changes to images or videos to evade detection, creating a frustrating game of whack-a-mole for the subjects of the images.
Currently, the platform is only available in English. In the near future, ISDS will produce a video guiding Vietnamese users through the reporting process.