Woman who found her video on Pornhub creates app to help survivors of image-based abuse

After almost taking her own life upon discovering that a video of her had been uploaded to Pornhub without her knowledge, a Chinese woman is fighting back with an app to help survivors of non-consensual pornography.

The 25-year-old, who wishes to be identified only as Tisiphone – the Greek goddess of vengeance – learned of the video, which was made without her knowledge, from a friend. She is now developing an AI-based app to help women find instances of image-based sexual abuse online.

Image-based sexual abuse “is the non-consensual taking, sharing or threatening to share nude or sexual images of a person, including the use of digitally-altered imagery”.

The app, Alecto AI, named after the Greek goddess of anger who punishes those who commit moral crimes, works using facial recognition software: a user’s face is scanned, and images of them are then searched for online.
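The general approach – building a numerical “embedding” of the user’s face and comparing it against faces detected in candidate images – can be illustrated with a short sketch. This is not Alecto AI’s actual code; it assumes the open-source face_recognition Python library, and the file paths and folder of candidate images are hypothetical placeholders standing in for content gathered from the web.

```python
# Illustrative sketch only: matches a reference selfie against a folder of
# candidate images using face embeddings. Assumes the open-source
# `face_recognition` library (pip install face_recognition); the paths below
# are hypothetical placeholders, not part of Alecto AI.
from pathlib import Path
import face_recognition

REFERENCE_SELFIE = "my_face.jpg"        # hypothetical: the user's scanned face
CANDIDATE_DIR = Path("crawled_images")  # hypothetical: images found online
MATCH_TOLERANCE = 0.6                   # lower values mean stricter matching

# Build the 128-dimensional embedding of the user's face.
reference_image = face_recognition.load_image_file(REFERENCE_SELFIE)
reference_encodings = face_recognition.face_encodings(reference_image)
if not reference_encodings:
    raise SystemExit("No face found in the reference image.")
reference_encoding = reference_encodings[0]

# Compare that embedding against every face in every candidate image.
for path in sorted(CANDIDATE_DIR.glob("*.jpg")):
    candidate_image = face_recognition.load_image_file(path)
    for encoding in face_recognition.face_encodings(candidate_image):
        distance = face_recognition.face_distance([reference_encoding], encoding)[0]
        if distance <= MATCH_TOLERANCE:
            print(f"Possible match: {path} (distance {distance:.2f})")
```

A real service would, of course, need to gather images from platforms at scale and build in the consent, privacy and data-security safeguards that a sketch like this omits.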

“Damage control is particularly painful [for survivors]. The infringing content is sometimes hosted on various platforms … It is difficult to search for this content scattered all over the internet while being forced to relive our trauma over and over again,” Tisiphone, who worked for a prominent US tech company, said. “We can’t defend ourselves unless we have access to technology that can help us do so.”

A 2019 study conducted in Australia, New Zealand and the United Kingdom found that more than 1 in 3 people have been victims of image-based sexual abuse, up from 1 in 5 in 2016.

A survey of more than 2,000 Australians, which formed part of the study, found that young people were twice as likely as those over 40 to be victims of image-based sexual abuse, with those aged 20 to 29 the most likely victim group. Men and women reported similar rates of victimisation, though women were more than twice as likely as men to fear for their safety, and men were more likely than women to be perpetrators.

The study’s lead author, Associate Professor Anastasia Powell of RMIT University, said that image-based sexual abuse is not limited to “revenge porn”.

“We found that image-based sexual abuse is used by perpetrators of domestic violence and sexual assault, in stalking and sexual harassment, as well as in threats and bullying by peers and other known people,” Powell said.

“Not only this, but we found high numbers of victims had never consented to having their image taken. Our interviews with victims uncovered cases of people being photographed or filmed without their knowledge in the shower, while sleeping, over Skype and during sex. We also found no increase in people sending consensual sexy selfies. All this suggests it’s not victim behaviour driving the rise in abuse, but rather the actions of perpetrators.”

While most states and territories in Australia now have specific laws criminalising image-based sexual abuse, Tisiphone’s app will be a welcome aid in detecting instances of such abuse.

The app is due to launch by the end of the year, and Tisiphone is working to ensure that users’ data will be kept secure. It will initially be available only on a paid basis, although the hope is that it will eventually be free if companies like Facebook agree to sponsor the technology.

If you have been a victim of image-based abuse, you can report it to the eSafety Commissioner and/or the police.




