Building a Safety Culture in Mining: Why Leadership Matters More Than Rules
Dr. Carl Marx, Independent Risk Consultant, Leadership Coach, and Safety Specialist, City of Cape Town, South Africa
Roger Blair, President at On-Site Health and Safety Consulting Ltd., Toronto, Canada
The Pike River Question
On 19 November 2010, an explosion tore through the Pike River coal mine on the West Coast of New Zealand's South Island. Twenty-nine men were underground. None survived. The Royal Commission that followed produced one of the most rigorous investigations of organisational safety failure in the history of the Australasian mining industry, and its findings had almost nothing to do with technical engineering standards.
The Commission found that Pike River Coal Ltd had a corporate culture in which production imperatives consistently overrode safety concerns. Workers reported feeling unable to raise safety issues without fear of consequences. The mine had been operating with a ventilation system that was inadequate for the gas regime in the coal seam. Methane monitoring had been unreliable. The underground management system was chaotic. And critically, the board of directors had minimal knowledge of the operational safety conditions underground; safety reporting to the board was cursory, optimistic, and did not reflect the reality on the ground.
The Pike River disaster was not a failure of regulation: New Zealand had comprehensive coal mining safety regulations. It was not a failure of technical knowledge: the risks of methane in coal mines have been understood for over a century. It was a failure of safety culture: the set of values, beliefs, assumptions, and behaviours within the organisation that determined how safety was actually prioritised when it conflicted with other objectives.
Understanding safety culture, what it is, how it is formed, and what leaders must do to build and sustain it, is not a peripheral concern for mining professionals. It is a core operational competency. The evidence from decades of major incident investigations across industries is unambiguous: when organisations with strong technical safety systems experience catastrophic events, the root cause is almost always found in the cultural and organisational conditions that determined how that technical system was actually used.
Defining Safety Culture with Precision
The term "safety culture" has been used so frequently and so loosely that it has nearly lost its analytical value. Before discussing how to build it, it is worth defining it precisely, because vague definitions produce vague improvement strategies.
The most useful operational definition comes from the work of James Reason, whose organisational accident model underpins much of modern safety management theory: safety culture is the product of individual and group values, attitudes, perceptions, competencies, and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation's health and safety management.
A more operational framing: safety culture is what workers actually believe about safety when no one is watching, and how those beliefs translate into their decisions and behaviour under operational conditions.
This framing is important because it distinguishes safety culture from safety compliance. A worksite can be highly compliant (every form filled in, every procedure followed when observed) while maintaining a culture in which workers privately believe that the procedures are bureaucratic overhead to be minimised when no supervisor is present. The measure of safety culture is not documented behaviour; it is undocumented behaviour.
The Pike River Royal Commission captured this distinction with devastating clarity. The mine had extensive safety documentation, procedures, and formal reporting structures. What it lacked was an environment in which the undocumented reality (the methane readings that went unrecorded, the ventilation concerns that went unreported, the fear workers felt about speaking up) could surface and be acted upon. Understanding why those undocumented realities remained hidden requires examining the concept of psychological safety: the conditions under which people feel able to speak, and the conditions under which they learn to remain silent.
The Psychological Safety Foundation
In the mid-1990s, Harvard researcher Amy Edmondson was studying error rates across hospital nursing teams when she made a counterintuitive finding: the teams with the highest psychological safety, where nurses felt safe to acknowledge mistakes and raise concerns, reported higher error rates than teams with lower psychological safety. Not because they made more errors, but because they were more willing to acknowledge them. The underlying error rate was actually lower; the reported rate was higher because the culture supported honest communication.
The implications for mining are direct and significant. A mine site where near-miss reporting rates are high is not necessarily less safe than one where they are low; it may simply be more honest. Sites where workers feel safe reporting near misses, stopping unsafe work, and raising concerns without fear of blame or ridicule are sites where risk is surfaced and managed before it escalates into incidents. Sites where workers conceal near misses and work around unsafe conditions in silence are sites where risk accumulates invisibly.
Psychological safety is not the same as physical safety. It does not mean that work is comfortable or unchallenging, or that accountability is absent. It means that the social environment allows people to take the interpersonal risks involved in speaking up (flagging a problem, questioning a decision, admitting a mistake, or stopping unsafe production) without expecting punishment or humiliation.
Edmondson's research, refined through studies across teams in manufacturing, healthcare, finance, and technology, has consistently found that psychological safety is among the strongest predictors of team performance on complex tasks, and in high-risk environments, the "performance" variable that matters most is safety.
Building psychological safety in a mining environment is a concrete, operational project, not a philosophical aspiration. The specific behaviours that create and destroy it are identifiable. Leaders who respond to bad news with anger, who dismiss safety concerns raised by frontline workers, who blame individuals for incidents rather than investigating system conditions, and who visibly prioritise production over safety in high-pressure moments are systematically eroding psychological safety, regardless of what the safety policy documents say.
Leaders who ask for input at pre-shift meetings and demonstrably act on what they hear, who thank workers for raising concerns (including concerns that turn out to be unfounded), who investigate near misses with curiosity rather than a search for culpability, and who visibly absorb production pressure without passing it to their team as safety compromise are building psychological safety one interaction at a time.
The absence of psychological safety creates the conditions in which normalisation of deviance can take root and flourish. When workers do not feel safe raising concerns, small deviations from procedures go unreported. When leaders do not actively seek out and respond to frontline intelligence, those deviations accumulate without corrective action. What begins as a single unreported shortcut becomes a tolerated practice, and eventually becomes the new normal. The process by which safe systems become unsafe does not announce itself; it advances through the very silence that psychological safety is designed to break. Interrupting that process requires leaders to understand not only how normalisation operates, but also how their own visibility, curiosity, and responsiveness are the primary mechanisms for stopping it before it is embedded in practice.
Normalisation of Deviance: How Safe Systems Become Unsafe
One of the most important concepts in understanding how major incidents develop in organisations with good safety records is the normalisation of deviance, a term introduced by sociologist Diane Vaughan in her analysis of the 1986 Challenger space shuttle disaster and since applied extensively to industrial accident investigation.
Normalisation of deviance describes the social process by which groups within an organisation gradually come to accept and accommodate risk-taking behaviours that deviate from prescribed standards, not through explicit decision-making, but through repeated small steps, each of which seems minor in isolation.
The sequence typically runs as follows: A rule or procedure is bypassed because it is inconvenient, time-consuming, or seems irrelevant to the specific situation. The expected negative consequence does not occur. The bypass is repeated. Over time, the deviant behaviour becomes the operational norm, something "everyone knows we do", and the prescribed behaviour becomes the unusual exception. The deviation has been normalised, and the negative shift in culture has become embedded.
In the mining context, examples are pervasive: isolation procedures that are "simplified" because the equipment is "always safe at that point in the cycle"; pre-shift safety inspections that become signature exercises rather than genuine assessments; TARP (trigger action response plan) thresholds that are interpreted with increasing latitude as ground engineers become accustomed to elevated readings that have previously been tolerated without incident, eroding what were originally designed as critical intervention points; and confined space entry procedures that are informally waived for "quick" entries. The word "quick" itself undergoes progressive erosion: what begins as a genuine thirty-second visual check becomes a five-minute task without isolation, then a fifteen-minute repair with tools carried into the space. The original justification, that the brevity of the entry made the procedure unnecessary, has long since ceased to hold, yet the waiver persists because the cultural habit has taken root. What was once a violation requiring conscious justification has become simply "how the job is done".
Individually, each of these deviations will seldom lead to an incident. Cumulatively, they represent a systematic erosion of the barriers that stand between normal operations and major events. The Swiss Cheese model of accident causation (also from James Reason's work) captures this precisely: each barrier has holes, and major incidents occur when the holes in multiple barriers align simultaneously. Normalisation of deviance progressively enlarges the holes.
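The force of this image becomes clearer with a little arithmetic. What follows is a minimal sketch with illustrative numbers, not a validated risk model: if each barrier fails independently with some small probability, the chance of all the holes aligning is the product of those probabilities, so even a modest enlargement of each hole multiplies into a dramatic increase in overall exposure.

# A minimal sketch of the Swiss Cheese arithmetic, using illustrative
# probabilities rather than measured ones. It assumes barriers fail
# independently; in practice, a shared culture problem correlates failures
# and makes alignment even more likely than the product suggests.

def alignment_probability(failure_probs):
    """Probability that every barrier fails at once, assuming independence."""
    result = 1.0
    for p in failure_probs:
        result *= p
    return result

# Four barriers in good condition: each fails roughly 1 time in 100.
healthy = [0.01, 0.01, 0.01, 0.01]

# The same barriers after normalised deviance has enlarged the holes:
# each is now bypassed or degraded roughly 1 time in 10.
degraded = [0.10, 0.10, 0.10, 0.10]

print(alignment_probability(healthy))   # ~1e-08: about one in 100 million
print(alignment_probability(degraded))  # ~1e-04: one in 10,000, a 10,000-fold increase

The numbers are invented, but the structure of the argument is not: tolerating a tenfold degradation of each barrier does not produce a tenfold increase in risk; it compounds.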
Interrupting normalisation of deviance requires leaders to maintain active vigilance about procedural compliance, not as a bureaucratic exercise, but as a deliberate effort to maintain the integrity of safety barriers. This includes:
Periodic procedural compliance audits that genuinely test whether the procedures are being followed, not whether the forms are being signed
Creating formal mechanisms that help workers recognise and flag procedures that seem disconnected from actual risk, rather than simply ignoring them
Investigating instances of procedural deviation even when no incident resulted, to understand the conditions that made the deviation seem reasonable
Treating the discovery of normalised deviance as an early warning signal, not as an embarrassment to be managed or a failing to be pinned on the workers involved
Leadership Behaviours That Actually Matter
Safety culture research has identified specific leadership behaviours that reliably influence safety outcomes, not leadership personality types, not leadership styles in the abstract, but specific, observable behaviours that can be developed, practised, and evaluated.
Presence and engagement in the field
Leaders who regularly spend time in operational areas, talking to workers, observing work as it is actually performed, and engaging substantively in conversations about hazards and near misses, send a consistent signal that safety is a live priority. This is distinct from formal safety tours; it is informal, ongoing, conversational engagement. The quality of these interactions matters more than their frequency. A leader who asks "what's the biggest safety risk you're managing right now?" and genuinely listens to the answer creates a qualitatively different cultural signal than one who asks "everything safe today?" while already moving on.
Visible Stop Work Authority in practice
Most mining operations formally assert that any worker has the authority to stop unsafe work. The cultural measure is whether this authority is genuinely exercised and whether its exercise is reinforced or suppressed. At operations where Stop Work Authority is culturally real, workers describe instances where they stopped a job, the supervisor thanked them, and the issue was resolved before work resumed. At operations where it exists only on paper, workers describe knowing that stopping work carries social or professional consequences, and adjusting their behaviour accordingly.
Consistent decision-making under pressure
The most revealing test of a leader's cultural influence is their behaviour when production is at risk. When a piece of critical equipment is down, when a shipment deadline is approaching, when the superintendent is asking for progress, how does the leader make safety decisions? Leaders who consistently hold safety standards under these conditions earn credibility that no safety communication campaign can replicate. Leaders who visibly compromise under pressure, approving work without proper isolation "just this once" or allowing entry into a confined space without completed permits because "it's only for a minute", teach the organisation what safety is actually worth.
Differentiated accountability
Effective safety leaders distinguish between two types of accountability that are fundamentally different in nature. Culpable behaviour (deliberate violations of safety rules, reckless disregard for established procedures) warrants direct individual accountability. But most safety incidents do not involve culpable behaviour. They involve human error within a system that allowed or facilitated the error. Treating both categories identically in one direction, blaming and disciplining the person involved in any incident regardless of the underlying conditions, suppresses reporting and eliminates the learning opportunity. Treating them identically in the other direction, dismissing all individual accountability because "the system is always to blame", removes the personal responsibility that is part of a healthy safety culture.
The skill managers and supervisors require is to make this distinction correctly and consistently. When leaders demonstrate it, they earn the one thing that no procedure can mandate: trust. Workers trust that raising a concern will not result in unfair punishment, and they trust that genuine recklessness will not be overlooked. That trust is the foundation upon which psychological safety and, ultimately, a durable safety culture are built.
The BHP Transformation: A Case Study in Cultural Change
One of the most extensively documented safety culture transformations in mining occurred at BHP in the late 1990s and early 2000s. Following a period with unacceptable fatality rates, the company's leadership, under CEO Paul Anderson, implemented a systematic safety transformation programme that combined changes in structure, metrics, training, and critically, leadership accountability.
The programme introduced Charter behaviours for safety leadership that were explicitly tied to senior leader performance evaluations, making safety leadership a career-consequential matter for executives and general managers. Operational reviews at the executive level gave safety performance equal standing with financial and production performance. The message that personnel safety performance was as important as business performance in determining career outcomes changed the incentive structure within which management decisions were made.
The results, documented in subsequent academic research and industry publications, showed statistically significant reductions in serious injury rates over the following decade. More importantly, the mechanism of improvement was the establishment of a sustainable safety culture: leaders who had previously managed safety as a compliance function began managing it as a personal leadership responsibility, because the organisational incentive structure now made that the rational behaviour for managers.
The BHP example is instructive not because its specific programme is universally applicable (it isn't), but because it demonstrates that safety culture is not immutable. It can be deliberately changed, and the primary lever for change is leadership behaviour, reinforced by accountability systems that make the desired behaviours rational rather than merely aspirational.
If leadership behaviour and accountability systems are the levers of cultural change, the question that follows is how an organisation knows whether those levers are working. BHP's transformation was not driven by intuition; it was guided by metrics that told leaders whether their efforts were translating into changed conditions on the ground. The metrics that mattered, however, were not the traditional lagging indicators of injury rates. They were measures of the conditions that precede incidents: the quality of safety conversations, the volume and specificity of near-miss reporting, the integrity of procedural compliance. Understanding what to measure, and how to distinguish between indicators that predict future performance and those that merely record past failure, is the next essential capability for leaders committed to building a durable safety culture.
Measuring Culture: Moving Beyond Lagging Indicators
Fatal injury frequency rates, total recordable injury frequencies, lost-time injury rates: these are the most commonly used safety metrics in mining, and they share a critical limitation. They measure past failures. They are lagging indicators, informative about what has already happened and useless for predicting what is about to happen.
Safety culture assessment requires leading indicators, measures of the conditions and behaviours that precede incidents, not the incidents themselves. These include:
Near-miss reporting rates and quality: The volume and specificity of near-miss reports indicate workers' willingness to highlight risks in the workplace. A declining report rate in a period without incidents may indicate improving culture (fewer near misses occurring) or deteriorating culture (fewer near misses being reported); trend analysis across multiple indicators is needed to distinguish these.
Safety conversation rates and quality: Frequency and content of formal and informal safety conversations between leaders and workers, measured through direct sampling and worker surveys.
Procedural compliance audits: Systematic observation of whether established procedures are being followed in practice, distinct from documentation reviews.
Psychological safety surveys: Validated instruments (such as those derived from Edmondson's work) that measure workers' perceptions of how safe it is to speak up, with trend tracking over time.
These leading indicators, tracked consistently and transparently, give organisations the ability to detect cultural deterioration before it manifests in injury statistics, a capability that lagging indicators can never provide.
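To make the mechanics concrete, the sketch below shows one way such trend tracking might be implemented. It is a minimal illustration with hypothetical data: the 200,000-hour normalisation base is a common exposure convention, but the figures, thresholds, and variable names are invented for the example.

# A minimal sketch of leading-indicator trend tracking. The data, the
# 200,000-hour normalisation base, and the threshold logic are hypothetical,
# chosen for illustration rather than drawn from any particular standard.

from statistics import mean

def rate_per_200k_hours(reports, hours_worked):
    """Normalise raw report counts so months of different sizes are comparable."""
    return [r / h * 200_000 for r, h in zip(reports, hours_worked)]

def slope(values):
    """Least-squares slope across equally spaced periods (here, months)."""
    xs = range(len(values))
    x_bar, y_bar = mean(xs), mean(values)
    numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    denominator = sum((x - x_bar) ** 2 for x in xs)
    return numerator / denominator

# Six months of illustrative data for a single site.
monthly_reports = [42, 39, 35, 31, 28, 24]   # near-miss reports lodged
monthly_hours = [181_000, 178_500, 183_200, 179_800, 182_400, 180_100]

rates = rate_per_200k_hours(monthly_reports, monthly_hours)
print([round(r, 1) for r in rates])          # normalised monthly rates
print(round(slope(rates), 2))                # negative: reporting is declining

# A falling reporting rate is ambiguous on its own: it may mean fewer near
# misses are occurring, or fewer are being reported. As argued above, it
# must be cross-checked against compliance audits and survey trends.
if slope(rates) < 0:
    print("Reporting rate declining: cross-check audit and survey indicators.")

The deliberate ambiguity flagged in the final comment mirrors the point made above: no single leading indicator is self-interpreting, which is why the analysis must run across several indicators in parallel.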
Conclusion
The engineering and operational knowledge required to manage physical hazards in mining is sophisticated and well-developed. The evidence across decades of major incident investigations, from Pike River to Challenger and beyond, delivers a single, consistent finding: technical systems fail not when the engineering is wrong, but when the organisational conditions in which that engineering operates are allowed to deteriorate. Safety culture is not a soft add-on to safety management; it is the medium through which all safety systems actually function. Procedures, barriers, and controls are only as effective as the cultural environment that determines whether they are followed, questioned, maintained, or silently bypassed. Investing in the leadership capability to build and sustain that environment is not separate from safety investment; it is the most fundamental safety investment an organisation can make.
Keywords: mining safety culture, safety leadership mining, psychological safety mining, Pike River coal mine disaster, normalisation of deviance mining, safety culture measurement, BHP safety transformation, stop work authority, safety culture development mining