“Even though the experiences and impacts of the 75 victim-survivors we interviewed were so diverse, there was a common theme – the participants often didn’t know that there was anything that could be done to help them. The few that had tried reporting to the police had terrible experiences in doing so – they were either blamed or had their experiences minimised.”
This sparked something in Henry: there had to be a better way to help victim-survivors. Interested in how digital tools could support people who have experienced image-based abuse, and aware of the evolving technologies, Henry submitted a project application to the ARC in 2019.
“I was aware of chatbots, though my experience was really from a consumer perspective. I also knew of Hello Cass, which is an Australian chatbot for victims and survivors of sexual and domestic abuse.”
“I really wanted to explore how different digital tools might help victim-survivors of image-based abuse, and a chatbot felt like it could be the right format.”
This idea of centralising information and resources for victim-survivors of image-based abuse came to life in Umibot, a chatbot co-created by Henry and RMIT Research Fellow Dr Alice Witt. Umibot is driven by expert research and rooted in understanding. The process of developing Umibot was a long one, “both challenging and rewarding,” Henry says.
Exploring rule-based, hybrid and context-based chatbots
In general, there are three types of chatbots. The first is a rule-based chatbot, where users navigate the conversation by selecting pre-filled questions and responses. The second is a hybrid chatbot, like Umibot, which uses AI language processing alongside a rule-based button interface. The third is a context-based chatbot, which is wholly reliant on AI, like ChatGPT. The main difference between the AI in a hybrid chatbot and the AI in a context-based chatbot is that a hybrid bot is more controlled: it responds using its own built-in knowledge, while a context-based chatbot learns from its interactions with users, which makes it both more intelligent and potentially more risky.
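To make that distinction concrete, the minimal sketch below (illustrative only, and not Umibot’s actual code) shows how a rule-based menu and a hybrid free-text fallback can sit side by side; the menu entries, keywords and function names are invented for the example.

```python
# Illustrative sketch only: a rule-based bot walks a fixed menu of pre-filled
# questions, while a hybrid bot also accepts free text and maps it back to the
# same curated answers instead of generating new ones.

# Rule-based layer: pre-filled questions mapped to curated answers.
MENU = {
    "What is image-based abuse?": "Image-based abuse is the non-consensual taking, "
                                  "sharing, or threatening to share intimate images.",
    "How do I report to a platform?": "Most platforms have a reporting form for "
                                      "non-consensual intimate imagery.",
}

# Hybrid layer: a tiny stand-in for the AI language processing that matches
# free-text queries to one of the curated answers.
INTENT_KEYWORDS = {
    "report": "How do I report to a platform?",
    "what is": "What is image-based abuse?",
}

def respond(user_input: str) -> str:
    """Return a curated answer for either a menu selection or a free-text query."""
    if user_input in MENU:                       # user clicked a pre-filled button
        return MENU[user_input]
    for keyword, question in INTENT_KEYWORDS.items():
        if keyword in user_input.lower():        # crude intent match on free text
            return MENU[question]
    return "I'm not sure yet. Here are some topics I can help with: " + ", ".join(MENU)

print(respond("What is image-based abuse?"))     # rule-based path
print(respond("I want to report a photo"))       # hybrid free-text path
```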
Because Umibot is a hybrid chatbot, users can either select pre-filled questions or type queries of their own. To bring Umibot to life, Henry and Witt worked with Tundra, a Melbourne-based digital agency. One of the most impressive aspects of the project was the dedicated collaboration amongst the teams specialising in research-driven content, technical development and UX design. In short, Umibot is the result of a partnership between content and technical expertise.
Reflecting on the Umibot project
Henry talks about the project with a gentle humour, as if not to boast about the years of time and effort that went into creating Umibot. The bot is built on Amazon Lex, and Henry and Witt compiled a 500-page database of knowledge and resources to train it. “Umibot has been a great challenge and a great joy. We’ve learned so much and we’re now in the process of writing a journal article of best practice guidelines for developing a chatbot. There’s a lot to consider with regard to privacy, safety and ethics, as well as the theoretical frameworks behind the content of a bot.”
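Amazon Lex exposes a hosted bot through a runtime API. As a rough illustration of what “built on Amazon Lex” means in practice (a sketch using the boto3 Lex V2 runtime client, with placeholder identifiers and region rather than Umibot’s real configuration), a client application could forward a visitor’s question like this:

```python
# Illustrative only: forwarding a user's question to a bot hosted on Amazon Lex
# (V2 runtime). The bot ID, alias ID, locale and region below are placeholders,
# not Umibot's real identifiers.
import boto3

lex = boto3.client("lexv2-runtime", region_name="ap-southeast-2")  # example region

response = lex.recognize_text(
    botId="EXAMPLEBOTID",        # placeholder
    botAliasId="EXAMPLEALIAS",   # placeholder
    localeId="en_AU",
    sessionId="anonymous-session-1",
    text="How do I report image-based abuse?",
)

# Lex returns the messages configured for whichever intent it matched.
for message in response.get("messages", []):
    print(message.get("content"))
```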
Thinking back on the years of researching image-based abuse, Henry has noticed quite a shift in public perceptions and in government and platform action. “At the time, the term ‘revenge porn’ was being used, and there wasn’t another term. This was problematic because it only captured a very narrow set of behaviours. There are all sorts of motivations for sharing non-consensual images, and it’s not always related to a relationship or a breakup. We’ve now seen a real shift in thinking about image-based abuse beyond that context, which has helped to change the frame of view. Although it’s still common for people, even well-meaning people, to blame and shame victims, I’m seeing that far less now.”
While there is still work to be done, Henry recognises the importance of the increased public awareness and media coverage. This attention and Henry’s research have helped advocate for real change. There are now specific criminal offences for cases of image-based abuse in all states and territories (with the exception of Tasmania).