After the Survey: What to Do When Your Psychosocial Risk Data Tells You Something You Don't Want to Hear

Luke Giuseppin

Feb 13, 2026

You ran the survey. You consulted your workers. You did the thing the Code of Practice told you to do. And now the data is sitting in front of you, and it's worse than you expected. Maybe a particular team is reporting sustained exposure to bullying. Maybe your leadership cohort scored poorly on support and communication. Maybe the numbers on workload and burnout aren't a blip; they're a pattern that's been building for years.

This is the moment where psychosocial risk management either becomes real or quietly dies. Because what most organisations do next is the problem.

The Three Ways Organisations Bury Bad Data

We see the same patterns repeat across industries and organisation sizes. When psychosocial risk assessment data comes back with uncomfortable findings, most organisations respond in one of three ways.

They dilute it. The results get averaged across the whole organisation until the problem areas disappear into acceptable-looking aggregates. A team reporting serious harassment concerns becomes a line item in a company-wide report that reads "generally positive with some areas for improvement."

They explain it away. Leaders find reasons the data doesn't mean what it clearly means. "That team was going through a restructure." "The survey landed during a busy period." "Those scores probably reflect one disgruntled person." The data gets contextualised into irrelevance.

They act on the easy parts and ignore the hard parts. The organisation responds to the low-hanging fruit, maybe adjusting workloads or rolling out a new wellbeing initiative, while the findings about leadership behaviour, team conflict, or systemic issues quietly drop off the action plan.

All three approaches have something in common: they feel like a response, but they change nothing for the workers who are actually exposed to harm. And under Australian WHS law, that gap between knowing and acting is exactly where liability lives.

Why the Data Matters More Than You Think

Here's what changes the moment you collect psychosocial risk data: you can no longer say you didn't know.

Your duty of care for psychosocial hazards requires you to identify hazards, assess risks, and implement controls so far as is reasonably practicable. Once survey data, consultation outcomes, or complaint patterns tell you a hazard exists, the clock starts. Regulators won't ask whether you ran a survey. They'll ask what you did with the results.

The Managing Psychosocial Hazards at Work Code of Practice makes this explicit: organisations must act on what they find. And recent prosecutions show that regulators are looking at whether organisations had information available and failed to respond, not just whether they had a policy on paper.

This is also why employee surveys and wellbeing programs alone won't protect you. A survey is an identification tool, not a control measure. Running one without acting on the findings can actually make your legal position worse, because it creates a documented record that you knew about the risk and chose not to address it.

A Practical Framework for Responding to Difficult Findings

When the data tells you something uncomfortable, resist the urge to manage the narrative; manage the risk instead. Here's how.

1. Let the Data Speak Before You Interpret It

Before leadership gets a chance to contextualise the findings, sit with the raw results. Look at team-level breakdowns, not just organisation-wide averages. Identify which of Safe Work Australia's recognised psychosocial hazards are showing up. Note where multiple hazards cluster together, because that's usually where the most serious harm is occurring.

Ask three questions of every finding:

  • What is the specific hazard this data is pointing to?

  • Who is exposed, and how frequently?

  • What controls (if any) already exist, and are they working?

2. Prioritise by Severity and Exposure, Not by Convenience

Not every finding demands the same urgency. A team reporting chronic exposure to bullying or harassment from a manager is a higher priority than a general finding that meeting loads are too high. Both matter, but one carries immediate risk of serious psychological harm.

Use your psychosocial risk register to rate each identified hazard by likelihood and consequence. This isn't about creating bureaucracy. It's about making sure you direct resources where people are most at risk, not where it's easiest to show quick progress.

3. Go Back to Your Workers

The consultation requirements under WHS law don't end when the survey closes. In fact, they arguably become more important at this stage. Workers who reported hazards need to know their input was heard and is being acted on. Workers in affected teams need to be consulted on what controls might actually work in their context.

This step is where organisations build or destroy trust. If people took the risk of being honest in a survey and nothing visibly changes, they won't participate next time. Worse, they'll conclude that the organisation knows about the problem and has chosen to accept it. That perception, justified or not, is itself a psychosocial hazard: poor organisational justice.

4. Design Controls That Target the Source

The hierarchy of controls applies to psychosocial hazards just as it applies to physical ones. Wherever possible, address the hazard at its source rather than asking individual workers to cope with it.

If the data shows that high job demands are causing harm, don't respond with resilience training. Redesign the work. If a team is reporting conflict driven by a specific leader's behaviour, don't send the team to mediation. Address the leader's conduct directly.

Wellbeing software and EAPs are support mechanisms, not controls. They help individuals manage the impact of exposure. They do not reduce the exposure itself. Your action plan needs to include systemic changes, not just individual supports.

5. Document Everything and Set Review Dates

Record the findings, your risk assessment, the controls you've implemented, and your rationale in your risk register. Set specific dates to review whether the controls are working. This documentation serves two purposes: it demonstrates compliance to a regulator, and it gives you a structured way to track whether things are actually improving.

If a worker later raises a complaint, lodges a bullying claim, or resigns citing psychological harm, your documented response to the survey data will be one of the first things examined.

The Hardest Part Isn't the Data. It's the Decision.

The organisations we work with that handle this well share one trait: they treat uncomfortable findings as useful information rather than a threat to manage. They don't shoot the messenger or bury the message. They use it.

That doesn't mean they act perfectly or fix everything at once. It means they make a genuine decision to respond, allocate resources to the highest-priority risks, communicate transparently with their people, and build a cycle of assessment, action, and review that gets better over time.

Your psychosocial compliance checklist is a good place to start pressure-testing your current approach. And if your officers and directors haven't seen the survey results yet, that's your first action item. Due diligence requires that the people with governance responsibility actually engage with the risk data, not just the sanitised summary.

Turn Your Data Into a System That Drives Action

Survey data is only as valuable as the system that sits behind it. If your current process relies on spreadsheets, good intentions, and someone in HR remembering to follow up, the gap between data and action will keep growing.

ReFresh connects your psychosocial risk data to a structured management system: identifying hazards, assessing risks, assigning controls, tracking progress, and generating the compliance evidence you need. Book a 30-minute demo and we'll show you how it works with your data, not a hypothetical.