

Most Australian organisations adopted digital work systems without a second thought about safety obligations. Automated rostering, algorithmic performance tracking, Slack notifications at 10pm, AI-driven workload allocation: these tools arrived as productivity investments, and nobody paused to ask whether they might also be generating psychosocial harm. In February 2026, the NSW Parliament answered that question on every employer's behalf. Digital work systems are systems of work, and they carry the same duty of care as any other workplace hazard.
The Work Health and Safety Amendment (Digital Work Systems) Act 2025 (NSW), passed on 12 February 2026, is the first legislation in Australia to expressly regulate the psychosocial risks created by algorithms, artificial intelligence, automation, and online platforms. The Act commences on a date to be set by proclamation, with SafeWork NSW required to publish right of entry guidelines before certain provisions take effect. It doesn't create a new regulatory framework. It closes a gap in an existing one, removing any ambiguity about whether the duty a PCBU (a person conducting a business or undertaking) owes under section 19 of the WHS Act extends to harm caused by digital systems.
The implications run further than most organisations have started to consider.
What the Act covers
The legislation defines "digital work systems" broadly: any algorithm, artificial intelligence, automation, or online platform. That definition captures the obvious candidates like AI-driven scheduling and automated performance management, but it also captures tools most organisations wouldn't immediately connect to a compliance obligation: HR platforms that allocate tasks, software that tracks keystrokes or monitors screen activity, and communication tools that create an always-on expectation.
The psychosocial risks the Act names are the same hazards already recognised in the Model Code of Practice for managing psychosocial hazards at work: excessive or unreasonable workloads, excessive or unreasonable performance monitoring or surveillance, and unlawful discriminatory practices or decision-making. What's different is that the Act makes clear these risks can be generated by the system itself, not only by a manager's decision to use the system poorly.
That distinction matters operationally. Under the legislation, a PCBU can't answer an allegation of psychosocial harm by arguing that the algorithm, not the employer, made the decision. As WHS Minister Sophie Cotsis stated when the bill passed, "If PCBUs delegate tasks to an algorithm which creates harm, the Act clarifies that managers are responsible for the human impact."
The union inspection power that changes the calculus
There's a provision in the Act that has received less attention than the duty itself, but it may prove more consequential in practice. WHS entry permit holders, including union representatives, now have the power to access and inspect digital work systems where a suspected WHS contravention exists. That means permit holders can request access to system data, explanations of how algorithms allocate work, and documentation of how the organisation assessed and controlled the psychosocial risks those systems create.
For organisations that have never documented how their digital tools affect workload distribution, performance pressure, or worker surveillance, this creates immediate exposure. A compliance gap that a regulator might find during a scheduled inspection is one kind of risk; a compliance gap that a union representative can investigate on 48 hours' notice is a different kind entirely.
The Act does place some limits on this power. Permit holders must give at least 48 hours' notice before exercising these rights, and SafeWork NSW must publish guidelines before the entry powers commence. But the direction is unmistakable.
The psychosocial risks hiding in your tech stack
Consider the tools an average mid-sized Australian organisation runs today. An automated rostering system allocates shifts based on demand forecasting, a project management platform tracks task completion rates and flags overdue items, a communication tool sends notifications well outside business hours, and performance management software aggregates productivity metrics into individual scores.
Each of those tools can generate at least one recognised psychosocial hazard. A rostering algorithm that optimises for coverage without factoring in fatigue or recovery time creates unreasonable workloads. A project management tool that ties completion metrics to individual assessments produces excessive performance pressure. Communication tools that operate around the clock erode the boundaries protecting workers from burnout. And productivity scoring systems can amount to intrusive surveillance, one of the 17 psychosocial hazard categories now recognised under the Commonwealth Code of Practice.
The numbers confirm these aren't theoretical concerns. In Victoria, 17% of workplace injury claims to WorkSafe were reported as mental injuries in 2024-25, a proportion that has grown steadily over the past decade. Nationally, mental health claims have increased by 161% over the past 10 years, and they cost approximately four times more than physical injury claims while taking five times longer to resolve. The 2026 Future of Work Outlook found that 56% of Australian workers have experienced burnout in the last three years, with that figure reaching 69% among Gen Z workers.
What makes the burnout data worth pausing on is the contradiction underneath it. The same research found that 90% of Gen Z workers say they like their organisation, yet 69% report burnout. People can be engaged, motivated, and loyal while the systems they work within actively harm them. Engagement surveys won't surface that distinction, but structured psychosocial hazard identification will.
Why this isn't just an NSW story
The Act is NSW-specific, but treating it as an isolated jurisdictional development would be a mistake. Every Australian state and territory already imposes a duty on PCBUs to manage psychosocial risks under their respective WHS or OHS frameworks. The argument that digital systems fall within that duty was available before this legislation passed; what the Act does is remove any remaining ambiguity and hand regulators and unions explicit tools to enforce it.
The trajectory here is consistent with how psychosocial regulation has moved across the country over the past five years. NSW introduced its Code of Practice for managing psychosocial hazards in 2021. Queensland amended its WHS regulations in 2022. South Australia and Western Australia followed. Victoria's Occupational Health and Safety (Psychological Health) Regulations 2025 commenced on 1 December 2025, introducing standalone psychosocial obligations and a modified hierarchy of controls for the first time in that state.
At the federal level, the Commonwealth adopted updated psychosocial regulations through Comcare in 2024, and the Senate's Select Committee on Adopting Artificial Intelligence recommended in its November 2024 final report that the Australian Government extend the WHS framework to cover AI-related safety risks nationally.
For organisations operating across multiple states, the prudent reading isn't "this only applies if we have workers in NSW." It's that NSW has made explicit what was already implicit, and other jurisdictions will follow.
What a defensible compliance posture looks like
The Act doesn't require organisations to stop using digital work systems. It requires them to apply the same risk management approach they'd apply to any other system of work: identify the hazards, assess severity and likelihood, implement proportionate controls, monitor whether those controls work, and maintain documented evidence that this process occurred.
That standard isn't new. It's the same identify-assess-control-review cycle that already applies to every other workplace hazard under the WHS Regulations. What's new is that organisations now need to run that cycle explicitly for their digital tools, and they need to be able to show the working.
This is where the gap between what most organisations have and what a regulator or permit holder would expect becomes visible. Organisations that rely on engagement surveys as their primary psychosocial risk tool will find those surveys don't assess whether a specific digital system is generating a specific hazard. Organisations that manage WHS obligations in spreadsheets will struggle to produce evidence of continuous compliance across multiple systems and locations. And organisations that have never consulted workers about the psychosocial impact of their digital tools won't be able to demonstrate the consultation that both this Act and existing WHS law require.
Where to start
The practical steps aren't conceptually difficult, but they require structure that most organisations don't currently have in place. The following actions are drawn from the obligations in the Act itself, the Model Code of Practice, and the existing duties under the WHS Act 2011 (NSW):
1. Audit every digital work system in your operations. Map every system that allocates, monitors, or manages work across your organisation. That includes the obvious platforms like automated rostering and performance tracking software, but it also includes communication tools, project management systems, and any algorithm that influences how work reaches a worker. The Act defines a digital work system as "an algorithm, artificial intelligence, automation or online platform," so the audit needs to be broad enough to match that definition.
2. Conduct a targeted psychosocial risk assessment for each system. This isn't a general wellbeing check. It's a structured assessment against the specific risks the Act requires PCBUs to consider: does this system create or contribute to excessive or unreasonable workloads? Does it use excessive or unreasonable metrics to assess and track performance? Does it involve excessive or unreasonable monitoring or surveillance? Could its outputs produce unlawful discriminatory practices or decisions? The assessment needs to consider how the system operates in practice, not just how it was designed to operate, because a rostering algorithm that works for a 200-person team may generate unreasonable demands when the same workforce drops to 150 through attrition.
3. Implement, document, and assign controls. Where the assessment identifies a risk, the WHS Regulations require the PCBU to eliminate the risk so far as is reasonably practicable, or if elimination isn't reasonably practicable, to minimise it. Each control needs a named owner and a review date. A control might be configuring the rostering system to enforce minimum rest periods between shifts, setting communication tools to suppress notifications outside business hours, or requiring human review of any automated performance score before it feeds into a formal assessment. The control needs to be proportionate to the risk, and it needs to be something the organisation can prove it actually implemented. A sketch of what one documented register entry might look like follows this list.
4. Consult workers and document the consultation. Section 47 of the WHS Act already requires PCBUs to consult workers on matters affecting their health and safety, and the digital work systems amendments reinforce that expectation. Workers are often the first to experience the psychosocial impacts of a system, whether that's a driver whose algorithm-assigned route leaves no time for rest breaks or an office worker whose project management tool generates automated escalation emails at 9pm. Consultation doesn't need to be complicated, but it does need to be genuine, recorded, and connected to the risk assessment process.
5. Prepare for the expanded right of entry. When a WHS entry permit holder arrives to inspect a digital work system, they'll expect to see documented risk assessments, evidence of controls, records of worker consultation, and an audit trail showing when those controls were last reviewed. The Act requires SafeWork NSW to publish guidelines before the entry powers commence, and permit holders must give at least 48 hours' notice. But building the documentation habit now, before those guidelines are published, is the single highest-value preparation step available.
6. Treat this as ongoing, not one-off. The Act sits within a continuous compliance framework: identify, assess, control, review, repeat. The WHS Regulations require duty holders to review and, if necessary, revise control measures to maintain a work environment without risks to health and safety. Digital systems change, workforces change, and the psychosocial risks a system generates can shift as the context around it shifts. An automated scheduling tool that works well during normal operations might create acute workload hazards during peak periods or staff shortages. The compliance posture needs to be a living system, not a filing cabinet.
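To make the documentation expectation concrete, here is a minimal, purely illustrative sketch of what one entry in a digital work system register might look like if it were kept as structured data rather than scattered across spreadsheets. The field names, hazard labels, example system, dates, and check functions are hypothetical, drawn from the examples in this article rather than from the Act or any particular product; the point is only that every system, hazard, control, owner, review date, and consultation record ends up in one queryable place.

```python
from dataclasses import dataclass
from datetime import date

# Hazard labels mirror the risks named in this article; this is an
# illustrative list, not the statutory wording.
HAZARDS = (
    "excessive or unreasonable workload",
    "excessive or unreasonable performance monitoring or surveillance",
    "unlawful discriminatory practices or decision-making",
)

@dataclass
class Control:
    description: str      # e.g. "enforce minimum rest periods between shifts"
    owner: str            # the named person accountable for the control
    last_reviewed: date   # when the control was last checked for effectiveness
    next_review: date     # the scheduled review date

@dataclass
class SystemRecord:
    name: str                      # e.g. "Automated rostering platform"
    function: str                  # what the system allocates, monitors or manages
    hazards_identified: list[str]  # hazards the assessment found for this system
    controls: list[Control]        # documented controls, each with an owner
    consultation_notes: str        # who was consulted, when, and what they raised

# A hypothetical entry for the rostering example used earlier in the article.
rostering = SystemRecord(
    name="Automated rostering platform",
    function="Allocates shifts based on demand forecasting",
    hazards_identified=[HAZARDS[0]],
    controls=[
        Control(
            description="Enforce minimum rest periods between rostered shifts",
            owner="Operations Manager",
            last_reviewed=date(2026, 2, 1),
            next_review=date(2026, 8, 1),
        )
    ],
    consultation_notes="Shift workers consulted January 2026; fatigue between "
                       "closing and opening shifts raised as the main concern.",
)

def overdue_reviews(register: list[SystemRecord], as_at: date) -> list[str]:
    """Controls whose scheduled review date has passed: evidence for the
    'when were controls last reviewed?' question an inspector is likely to ask."""
    return [
        f"{record.name}: {control.description}"
        for record in register
        for control in record.controls
        if control.next_review < as_at
    ]

def uncontrolled_hazards(register: list[SystemRecord]) -> list[str]:
    """Systems where a hazard was identified but no control is documented yet."""
    return [
        record.name
        for record in register
        if record.hazards_identified and not record.controls
    ]

# Running the checks across the whole register before an inspection or a
# 48-hour entry notice is the kind of evidence trail the Act anticipates.
register = [rostering]
print(overdue_reviews(register, as_at=date(2026, 3, 1)))  # [] while reviews are current
print(uncontrolled_hazards(register))                     # [] if every hazard has a control
```

Whether that register lives in a purpose-built platform or a more modest internal tool matters far less than whether every system has an entry, every control has a named owner, and every review date is actually honoured.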
For organisations wondering what a regulator would actually ask if they walked in tomorrow, the Model Code of Practice makes the expectations clear. Inspectors typically ask: What process do you use to identify psychosocial hazards? Can you show me the records? What controls have you implemented? Who is responsible for each control? When were controls last reviewed? How do you know the controls are effective? If an organisation can answer those questions with documented evidence for every digital work system in its operations, it's in strong shape. Most can't, yet.
This isn't just an organisational obligation, either. Under section 27 of the WHS Act, officers of a PCBU, including directors, CEOs, and company secretaries, have a personal duty to exercise due diligence to ensure the organisation complies with its WHS obligations. That due diligence duty requires officers to keep up-to-date knowledge of WHS matters, understand the hazards and risks associated with the organisation's operations, and ensure appropriate resources and processes are in place. With digital work systems now expressly within scope of WHS duties, officers who can't demonstrate awareness of the psychosocial risks their organisation's technology creates face personal exposure. SafeWork NSW has allocated $127.7 million in new enforcement funding and hired 51 additional inspectors, 20 of whom focus specifically on psychosocial risk, with a 25% year-on-year increase in compliance visits. The regulatory posture isn't theoretical.
The question worth sitting with
Every Australian organisation with more than a handful of workers already uses digital tools that shape how work is allocated, monitored, and performed. Most adopted those tools to increase efficiency, improve visibility, or reduce manual processes. The intent was productive.
The question the NSW Act forces into the open is whether those same tools, operating without psychosocial risk assessment, have been generating harm that nobody was measuring. Not because anyone intended it, but because the systems were never evaluated through a safety lens.
That question applies to every PCBU with a duty under any Australian WHS framework, not just those with workers in NSW. And it will only become harder to avoid as regulators hire more inspectors, unions gain more inspection powers, and the volume of mental injury claims continues to climb.
The organisations that come through this well won't be the ones that reacted fastest. They'll be the ones that built the infrastructure to identify risks across all 17 psychosocial hazard categories, documented their controls with named owners and review cycles, and maintained a governance evidence trail that holds up when the question is finally asked. The gap between having that infrastructure and not having it is where the real exposure sits.
ReFresh is the psychosocial compliance platform purpose-built for this workflow: structured hazard identification, orchestrated controls, and governance evidence from one system. If the questions raised in this article feel relevant to your organisation, we'd welcome the conversation.
This article is general information only and does not constitute legal advice. The regulatory landscape for psychosocial compliance and digital work systems is evolving across Australian jurisdictions. Organisations should seek independent legal and WHS advice specific to their circumstances, industry, and jurisdictions of operation. All data and legislative references cited are current as at the date of publication (March 2026).

