Umibot: A Case Study

RMIT Professor and CCSRI Member Nicola Henry has over 20 years of research experience in the field of sexual violence. She utilised her expertise to develop a chatbot designed to help victim-survivors of image-based abuse.

The growth of social media has been accompanied by the growth of online abuse, including cyberbullying, grooming and image-based abuse.

In 2019, 1 in 3 Australians surveyed said they had experienced at least one form of image-based abuse since the age of 16, where someone had taken, shared, or threatened to share nude, semi-nude or sexual images (photos or videos) without consent. Image-based abuse also includes fake or digitally altered images, including “deepfakes” – fake but highly realistic videos created using artificial intelligence.

The impacts of this growing form of online abuse have been studied and are well understood, yet many victims still do not know where to turn for help and support.

Background

RMIT’s Professor Nicola Henry has been a member of the Centre for Cyber Security Research and Innovation (CCSRI) since 2021, bringing over 20 years of research experience in the field of sexual violence. Her work investigates the extent, nature and impacts of sexual violence and harassment, including legal and prevention responses in Australian and international contexts.

In the course of her research into people’s experiences of image-based abuse, Professor Henry found that many people were unaware that what had happened to them was a crime. For many victims, the number one priority was to get the content removed or taken down, but many did not know where to find help and support.

After identifying the need for innovative digital tools to help address image-based abuse, Professor Henry and her colleague, Research Fellow Dr Alice Witt, joined forces through an Australian Research Council (ARC) Future Fellowship project to investigate the role of digital tools, platforms and services for detecting, preventing and responding to image-based abuse.

Using artificial intelligence, Professor Henry and Dr Witt developed a chatbot called Umibot (Umi for short) to provide information, support and general advice to victim-survivors, bystanders and perpetrators of image-based abuse.

“Image-based abuse is a huge violation of trust that’s designed to shame, punish or humiliate. It’s often a way for perpetrators to exert power and control over others,” said Professor Henry.

“We know victim-survivors of image-based abuse face a spectrum of experiences over and above image-based abuse, so we developed Umibot as a fully inclusive and trauma-informed empowerment tool to support people who’ve had diverse experiences and come from different backgrounds.”

Umibot - Case Study

Download and read the full case study for the Umibot project.

25 July 2023
