AI & ERS: The human touch required in increasingly automated employment processes

12 Days of ChristmAIs: A TMT insight series

AI-driven restructures are no longer theoretical. Employers are already using AI tools to automate tasks, design new operating models, and even draft redundancy correspondence. The legal framework, however, has not kept pace: the Fair Work Act 2009 (Cth) requires genuine consultation, real consideration of redeployment, and a process grounded in human judgment. The challenge for employers is making these old-world obligations fit fast-moving, AI-enabled operating models.

Genuine redundancy in an AI world

Section 389 of the Fair Work Act provides a three-limb test for a “genuine redundancy”:

  1. The job is no longer required to be performed by anyone due to changes in the operational requirements of the enterprise.
  2. The employer has complied with any consultation obligation in a modern award or enterprise agreement.
  3. It was not reasonable in all the circumstances to redeploy the employee within the employer’s enterprise (or an associated entity).

AI may be driving the “operational requirements” – for example, process automation or AI replacing parts of a role – but it does not dilute limbs (2) and (3).

Lord v Millet Hospitality: AI-written consultation gone wrong

The recent unfair dismissal case of Hayley Lord v Millet Hospitality Geelong Pty Ltd [2025] FWC 2740 is a timely warning for anyone tempted to outsource redundancy communications to a chatbot. In that case, a hospitality business used ChatGPT to draft an email informing an employee that “we have made the difficult decision to remove the Housekeeping Supervisor position,” before offering a “discussion” about alternatives and saying it would “proceed with the removal… as planned” if she did not respond.

The Commission held:

  • the wording showed a final decision had already been made, so any “consultation” offered was illusory; and
  • relying on ChatGPT did not excuse the employer’s failure “to adhere to basic standards of decency” or to have a face-to-face conversation about redundancy.

The takeaway is blunt: AI can help draft, but it cannot own the message. Employers remain responsible for ensuring communications are accurate and framed in a way that leaves space for genuine consultation.

Consultation duties in AI-driven restructures

For employers planning AI-driven restructures:

  • Consult before the decision is locked in. Award and EA consultation clauses typically require information to be shared while options are still genuinely open – not after the automation project is effectively implemented.
  • Explain the “why”. In an AI context, that means being able to describe (at a human level) what the technology change is, why it affects certain roles, and what alternatives have been considered.
  • Engage in two-way dialogue. Employees must have a real opportunity to challenge assumptions, suggest changes, or propose retraining or redesign of duties.

Redeployment and automation displacement

The Helensburgh Coal Pty Ltd v Bartley [2025] HCA 29 decision confirmed that when considering redeployment, the Commission can scrutinise how an employer structures its labour model – including reducing reliance on contractors or labour hire – to create or free up roles for otherwise redundant employees. That is a significant shift for employers running complex automation or outsourcing programs, particularly where a restructure involves replacing internal labour with technology supported by external providers.

For AI-driven restructures, that means that consideration of redeployment should include:

  • Mapping roles likely to be disrupted by AI or automation early and identifying adjacent roles that could be created or expanded.
  • Looking beyond internal vacancies to contractor and labour-hire arrangements – could some of that work be brought back in-house to redeploy impacted employees?
  • Considering retraining and upskilling as part of the redeployment analysis.

A redundancy program that trumpets “efficiency through AI” while ignoring obvious redeployment possibilities – including bringing work back from external providers to redeploy employees – is now more vulnerable to challenge.

Safety and “automation displacement” risks

From a work health and safety perspective, AI restructures clearly engage psychosocial risk duties. Large-scale automation programs create job insecurity, anxiety about skill obsolescence and, in some cases, perceived unfairness or discrimination (for example, where older or disabled workers are disproportionately affected).

Employers will need to identify reasonably practicable control measures for these psychosocial risks. If automation displacement is handled clumsily, employers face potential WHS exposure on top of unfair dismissal, general protections or discrimination claims.

Practical steps

Used well, AI can support lawful restructures. Used lazily, the output becomes Exhibit A in a legal claim. In practice, employers should:

  • Use AI as a drafting assistant only – all consultation and termination documents must be checked against the actual plans, award/EA obligations and the required legal language.
  • Train managers on how to talk to employees about automation and redundancy, and insist on live conversations for significant changes.
  • Build a structured redeployment and retraining review, expressly considering contractor and labour-hire roles.
  • Document the reasoning: why roles are no longer required, what alternatives were assessed, and why redeployment was not reasonable.

AI might drive the restructure, but the Commission will still be looking for fairness, transparency and humanity.

This article forms part of the series, the 12 Days of ChristmAIs: A Technology, Media and Telecommunications series on artificial intelligence and its intersection with the law. You can view all the articles here.