If you’ve recently applied for a position and secured an interview, you may need to adjust your preparation strategies — this time, your success might be in the hands of artificial intelligence rather than a person.
In 2024, 43 per cent of Australian companies used artificial intelligence (AI) "moderately" in their recruitment processes, and a further 19 per cent relied on it "extensively" to hire new employees, according to the Australian Responsible AI report.
Technology is being incorporated into hiring in several ways, including through the analysis of resumes and the assessment of candidates.
‘Confronting’ interview with AI
But nowadays, AI can be used more directly in the process through so-called “robo-interviews”.
Melbourne-based social enterprise Sisterworks experienced this in late 2024, when a group of its graduates found themselves in a situation neither they nor their teachers had prepared for.
“We actually got caught on surprise, when towards the end of last year, we sent about nine sisters to an interview and they didn’t make it, they didn’t pass the interview,” The group’s CEO, Ifrin Fittock, told SBS News.
“We found out that they’re actually not being interviewed face to face, but actually being interviewed by videos slash AI interviews.
“I think they just fail because they don’t know what to do.”
Sisterworks has been helping migrant and refugee women enter the workforce for more than a decade, and their recent experience with AI is not unique, with job seekers increasingly sharing similar experiences.
Robo-interviews are essentially interviews where a candidate is assessed by an AI system instead of a human. Typically, candidates record themselves and answer a series of questions, which are then evaluated by AI.
The use of technology in recruitment is often praised for its potential to save both time and money, but there are concerns that this type of hiring process can create new barriers for migrants and refugees.
“The challenges with these AI recruitment or AI interview for some of our sisters is really, first of all, English is not their first language, but also the level of digital literacy that they may or may not have,” Fittock said.
“It’s all really quite confronting.”

Concerns about ‘discrimination’
These are not the only concerns about AI hiring systems.
The way that these systems operate “creates serious risks of algorithm-facilitated discrimination”, according to a study published by the Law School at the University of Melbourne.
Natalie Sheard, who led the study, told SBS News: “In my research I heard of systems not being accessible to job seekers with disabilities, heard of CV screening systems using things like gaps in employment history to screen candidates out.
“An employment history gap is often seen as a gender-related indicator since women are more likely to take breaks from work to care for families, parents, or children.”
Previous reports indicate that the new hiring process might discriminate against applicants wearing headscarves, individuals with disabilities, and women.
Sheard’s research also highlights the issue around data sets and algorithms for the AI models.
“A lot of these systems are built overseas, so they might’ve been trained on data on populations that aren’t comparable to the Australian population,” she said.
“The system may not function effectively for certain demographic groups in Australia. This includes refugees, migrant women, and First Nations people.”
There are benefits too
Andreas Leibbrandt, a professor at Monash University, has also researched the role of AI in recruitment.
He says there are benefits to the technology, and if deployed correctly, a system can be less biased than a human.
In one study conducted in the US, he found that women and non-Anglo candidates were more likely to apply for a job if they knew AI tools were being used.
“It’s not that both women or ethnic minorities feel there’s no bias in the AI algorithm, but they feel that this bias is less so than when they’re faced with a human recruiter,” he said.
At the same time, he is concerned about a lack of transparency, a lack of regulation, and potential bias built into algorithms.
“These AI algorithms are fed with training data sets or data sets from corporations. But their training data in itself may be biased. It may come from an organisation where there was, for instance, [a] bias against women,” Leibbrandt said.
While it remains uncertain whether AI hiring systems reduce bias or reinforce discrimination, applicants from various backgrounds appear to be adapting.
Video interview training is now part of the Sisterworks job course, and Fatemeh Zahra Hazrati, originally from Iran, is among those taking part.
“AI has [a] real big effect on our lives nowadays,” she said.
“All of us have to learn it, [and we] just need to adapt ourselves for new things and accept challenges to learn new things.”