One of Movember's founders pulled Edan Haddock aside and asked him, point blank, what on earth he was doing.
"We're humanistic. Why are you bringing in a robot to talk to our candidates? That's not us at all."
Edan Haddock, Head of Talent and People Experience at Movember and one of twelve winners of the HR Influence Awards 2026, had introduced an AI agent named Joel into the talent function. The founder's reaction was understandable. Movember is a movement built on human connection to men's health. Bringing in a chatbot felt like a contradiction.
Then Edan explained why he'd done it. The founder's response? "Oh, absolutely."
That exchange captures the AI conversation in HR right now better than any trend report could. The technology isn't the hard part. The hard part is explaining what it's for, getting people comfortable with it, and making sure the purpose stays human even when the tool isn't.
Having been on the call for all twelve winner interviews as the producer of the awards, I noticed AI came up in almost every conversation, often unprompted. But the discussion was never about replacing people. It was about what becomes possible when the repetitive work gets handled, and what people leaders need to protect in the process.
The story behind Joel starts with a number that bothered Edan. Three hundred people might apply for a single role at Movember. Maybe ten would get a phone conversation. Three or four would go through the full interview process. One would get hired.
The rest? They never got to share their story.
"People come to us for a reason," Edan says. "You don't move into the not-for-profit sector for dollars and bonuses. That's not the core driver. You move into this space because you're connected to the cause."
Many of those applicants have deeply personal connections to Movember's mission. They've lost friends. They've been through cancer. They have someone in their life they want to protect. And with a two-person talent team spread across six time zones, there was no way to hear from all of them.
Joel changed that. The AI agent engages with applicants in a 30-minute, two-way conversation. Not a questionnaire. Not a one-way screening tool. A genuine exchange where people can share their connection to the cause.
"It isn't a question one, how would you do, question two. It's conversational," Edan explains. "What we can do on the back of that, rather than a blanket 'hey, thanks but sorry you haven't met all the criteria for this role,' we can actually acknowledge their experience and their story."
The outcome is that people who don't get the job still feel heard. Some of those stories involve losing someone they love. Being able to acknowledge that, even through an AI agent, matters.
"Using AI technology to be humanistic, which is a very controversial thing for me to say, but I think there is meaning and there is power in that," Edan says.
Edan is the first to admit he got the rollout wrong initially. He assumed everyone would understand the purpose. They didn't. The founder's challenge taught him something that applies to every AI implementation: if you can't articulate the human reason behind the technology, people will fill that gap with their worst assumptions.
Drew Mayhills, Chief Learning and Innovation Officer at AIM WA, approaches AI from a different angle. His 2023 Churchill Fellowship focused on AI for teacher effectiveness and equity in remote communities. He's since helped train hundreds of professionals through AIM WA's Fundamentals of AI in the Workplace course.
What Drew found during his fellowship reaffirmed something he already suspected: AI can accelerate the production side of work dramatically. What it can't accelerate is the human side.
"Make no mistake, learning is a relational, social and creative endeavour," Drew says. "We might be able to 10X the rate at which we produce lessons or mark papers or whatever else. What we can't 10X is the rate at which relationships are built, the rate at which trust is built."
He uses a phrase that's stuck with me since the interview: partner in thinking.
"I think we're unwise to outsource the thinking. But I know that I make better decisions when I test some of my ideas with my partners in thinking. I'm not looking for the easy output or the answer. I'm looking to be challenged, reflect, take different perspectives, draw upon expertise and insight that otherwise wouldn't be available to me."
Drew describes himself as an early adopter who has used AI tools not just to produce more, but to free up time for the things that matter. "I've adopted those tools in a way that has actually freed me up to have more time for people in my team and to listen more effectively and to reflect more deeply."
The framing is important. Drew isn't talking about efficiency for its own sake. He's talking about using technology to create space for the work that only humans can do.
Drew also surfaced something that few people in the AI conversation are willing to name: the gap between who uses AI at work and who admits to using AI at work.
"People are worried they'd be complicit in making themselves redundant," he says.
But that's not the only fear. Drew outlined three others that showed up in his research: the fear of being perceived as lazy, the fear of being perceived as incompetent, and the social risk of outperforming colleagues who haven't adopted the tools yet.
"You can imagine if you're getting all your work done, you're doing an amazing job, you are stretching, you are really contributing, and you're using these tools to help you, and the rest of the team are not quite on the bus yet. You might find yourself socially excluded for shining too brightly."
This is a leadership problem, not a technology problem. If an organisation's culture treats AI adoption as suspicious or threatening, people will keep it hidden. Drew argues it's incumbent on leaders to make it safe for people to explore, adopt, and be honest about how they're working.
"There's a lot of people out there who have just checked out of it because they're overwhelmed by it, they're intimidated by it, it's confusing. The second they sort of feel like they're beginning to understand what it might mean for them, seemingly the landscape completely shifts again."
For people teams thinking about how to engage their workforce through change, the AI disclosure gap is worth paying attention to. The technology is already in use. The question is whether your culture makes it safe to talk about.
Alex Pusenjak, Global VP of People and Culture at Fluent Commerce, frames the AI opportunity in terms of organisational design.
"We're not plugging in add-ons to old systems or tools or HRISs. We're fundamentally building and designing how we transform our work into the future," Alex says. "Across my function and across the company, we're architecting a new relationship between human creativity and machine intelligence."
Where others talk about AI as a tool, Alex talks about it as a catalyst for rethinking how roles are structured. He describes redesigning roles around automation, amplifying what he calls "the human-only elements of work: empathy, strategy, complex problem solving."
"We're not just reacting to tech, we're actually intentionally crafting the future," he says. "What we're building in 2026, it wasn't on the roadmap in 2024."
The implication is that people leaders who are still thinking about AI as an add-on to existing processes are already behind. Alex is using it to question which processes should exist at all.
Matt McFarlane, Founder and Director of FNDN, is bullish on what AI means for the HR profession. Possibly more bullish than anyone else in the group.
"I just genuinely think that AI is the thing that's really gonna help us level up even more as a profession and as a function," Matt says.
His excitement is specific. HR teams have always been sitting on large amounts of people data. The problem was that most HR professionals aren't data analysts. AI closes that gap, giving people teams the ability to pull insights from disparate systems, spot trends, and move faster from question to answer.
"Your ability to go from 'I've got three spreadsheets here, can you help me make sense of the data?' That can now happen so quickly," Matt says. "It's enabled teams to not only make sense of things that they didn't think about before, but also that speed to decision is so much faster now."
But Matt adds a warning that balances the optimism. Understanding what the data says and understanding how the AI arrived at that answer are two different skills.
"If you can't dig beneath AI and understand how it's come to the answer, then gosh, I've seen so many horror stories of people who are like, I realised this dashboard that we built has actually been feeding us the wrong information."
His advice to his younger self included learning data analytics earlier in his career. AI has reduced the barrier to entry, he says, but it hasn't eliminated the need for critical thinking about what the numbers actually mean.
Edan ties the whole conversation together when he talks about what comes after automation handles the repetitive work.
"I did a big piece of people analytics work yesterday with one of my team members, Kiyo, and we were able to use Claude AI to do what would have taken us weeks," Edan says. "Then Kiyo and I could really focus on, well how do we bring this to life and how do we creatively tell these stories to the people that matter."
His argument is that data will become self-serve for leaders within a few years. The analytical work will be handled. What won't be automated is the ability to turn numbers into narratives that move people to action.
"The people that haven't been able to do that or perhaps don't have that skill are going to find it harder to adjust in this kind of automated world that we're moving into."
Teresa Lilly, Founder of Culture Pilot Co, offers a grounding note. AI can make you more efficient, she says, but it can't make your trade-offs for you. "Your time is finite. Hoping that you can just become more efficient is, yes, with AI, you probably can do some stuff more efficiently, but ultimately you have to make trade-offs."
Across these five conversations, the message is consistent. AI is changing the people profession, and the leaders who are already using it well aren't using it to work less. They're using it to work differently. To listen to 300 applicants instead of ten. To spend more time with their teams instead of less. To tell better stories with better data. To redesign work around what humans do best.
For people and culture teams looking at AI adoption, the real question isn't whether to use it. It's whether you've been clear enough about why.
Read the full feature articles for each winner mentioned in this piece.
About the HR Influence Awards
The HR Influence Awards recognise the top 12 HR and people leaders across Australia and New Zealand who are shaping the future of work. Presented by Compono, the awards celebrate leaders who go beyond policy to drive real business and cultural outcomes.
#HRInfluenceAwards