Hickin also chairs Microsoft’s Australian Responsible AI Ethics Committee, which seeks to support customers and governments as they navigate the ethical issues of AI around data and the machines, services and the systems that wrap around that data; essentially the human experience of AI.
At a recent RMIT Activator Workforce Innovation in Uncertainty webinar, Hickin shared his thoughts on some of the pressing issues facing the workforce as technology becomes ever more prevalent in our lives.
What skills, tools and techniques can we use to adapt to the challenges and opportunities of online collaborative environments?
Even with the broad digitisation of our lives today, there’s a general shift to invest in the most human of skills.
We need to accentuate the very human skills that we all bring to the table – our adaptability, our curiosity. Learning is something we do uniquely well; machines require huge amounts of data and still don’t come close to the rapid learning of the human brain.
We talk about machine learning, but it is still a very limited domain in its real sense and application, whereas humans have an adaptability and curiosity that is unique to us; we can take a situation and work around it. Take, for example, a small child: shown just a few pictures of cats, they can quickly and easily identify multiple types, breeds and colours of cats from that tiny dataset. A machine learning model would require thousands of images to achieve even a limited degree of accuracy.
When I talk to my kids, or when I do career development with internal people, I focus on their ability to adapt and change to a situation, and to understand the value of continuous learning. Above all, I think that’s the most fundamental skill now: our ability to not stand still, to continue to move at the pace of the world around us and learn with it.
What steps can we take to build and enhance trust in autonomous and AI-based systems?
Of all the tools we use in society today, AI and autonomous systems are often considered the most threatening to humans: AI has the potential to augment our thinking about problems, and autonomous systems may replace our need to be there to do things.
AI and autonomous systems can be thought of as the wheel and spear of our era. They are the tools that will help us advance as a culture and a community; they are the tools that will help us make the next industrial leap forward.
What we shouldn’t forget, looking back over the past 200 years, is that every leap forward in industrialisation may have taken away five opportunities, but it created 15 more for new learning and new growth.
I think most of us would prefer to be in a world where robotic systems help us and work with us, but don’t try to replicate us. The value is there to build, to augment, but not to replace.
How has COVID-19 shaped what you are working on?
The short-term impact is that I am doing a lot of work to ensure that trust remains front and centre in a number of the COVID responses we are seeing across governments and agencies.
When we look at data to drive the analysis, trajectory and management of the pandemic, there’s a whole range of concerns being raised around trusting the data, trusting the systems, trusting the machines – and then around developing the tools to help people get better awareness of that data.
It’s also about understanding how different communities – Indigenous, remote, city-based – are using that technology, and thinking about any compromising issues there might be around people’s basic expectations of privacy, and their trust in that.
Any tips for coping in an increasingly digital world?
Being adaptable is key but, in a more practical sense, I would say to anyone, no matter what your role or whatever interactions you have with technology today: learn to code something.
Learn to understand the constructs of logical coding. It doesn’t matter what you want to build – a bot, a QA system – learn what it is, get comfortable with the idea of it, embrace it, because code is intrinsically becoming a part of many of the systems we interact with, and I think it’s important we all take some time to understand it.
That’s not just because coding is a valuable tool, but because it’s also important to understand its limitations. Code can’t solve everything, and when you learn to code, you start to understand that coding is really about using a tool set to solve a problem – which brings us back to the human skills we value above all: adaptability and problem solving.
Story: Karen Phelan