Safety culture is the collection of beliefs, perceptions and values that employees share in relation to risks within an organization, such as a workplace or the wider community. Safety culture is part of organizational culture and has been described in various ways; notably, the National Academies of Sciences and the Association of Public and Land-grant Universities published summaries on the topic in 2014 and 2016.
Studies have found that workplace-related disasters are the result of a breakdown in an organization's policies and procedures that were established to deal with safety, and that this breakdown flows from inadequate attention being paid to safety concerns.
A good safety culture can be fostered by senior management's commitment to safety, realistic practices for handling hazards, continuous organizational learning, and care and concern for hazards shared across the workforce.
History of the concept
The Chernobyl disaster demonstrated the importance of safety culture and of the influence of managerial and human factors on safety performance. The term "safety culture" was first used in the IAEA's 1986 summary report on the Chernobyl accident, which described safety culture as:
That assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance.
Since then, a number of definitions of safety culture have been published. The UK Health and Safety Commission developed one of the most commonly used:
The product of individual and group values, attitudes, perceptions, competencies and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organization's health and safety management.
Ladbroke Grove rail crash report
In the inquiry report, Lord Cullen characterised safety culture simply as "the way we typically do things around here". This implies that every organization has a safety culture; some are simply better than others.
The concept of safety culture originally arose in connection with major organizational disasters, where it offers a critical perspective on how multiple barriers against such incidents can fail at the same time. In each catastrophic case, it is the disregard of what is known about the factors that make organizations vulnerable that allows failures to multiply. It became clear that such disasters do not arise solely from "human error", chance environmental factors or technological failures, but rather from entrenched organizational policies and norms that have repeatedly been shown to precede disaster.
However, the term safety culture is also applied (with somewhat less precision) to individual accidents, and has therefore come to cover everything from the wearing of personal protective equipment (PPE) and the quality of management's response to reported faults, to what is often the central issue in major accidents: the degree to which safety considerations influence management decisions. A new starter or a recently arrived subcontractor quickly learns what the local norms are and is strongly influenced by them.
Safety culture and the organization
An organization's safety culture and its management system are closely related, but the relationship is not straightforward, because the culture does not simply mirror the formal arrangements. A safety culture cannot be created or changed overnight; it develops over time as a result of history, the work environment, the workforce, health and safety practices, and leadership. Organizations, like organisms, adapt.
Management system
An organization's safety culture is ultimately reflected in the way safety is managed in its workplaces, whether a boardroom or a workshop. In practice, the management system is not the written set of policies and procedures, but the way those policies and procedures are actually implemented across the enterprise.
Experts at the UK Health and Safety Executive (HSE) note that developing a safety culture is not only (or even mainly) a matter of employee attitudes and behaviour. Many company directors use the term with their employees' tendency to follow rules or to act safely in mind. However, the culture and style of management are even more significant: for example, a natural, unconscious bias towards production over safety, or a tendency to focus on the short term and to be highly reactive.
Pros and cons
Since the 1980s, a large number of studies of safety culture have been conducted, yet the concept remains largely "ill defined". The literature offers a number of different conceptions of safety culture, each with arguments for and against.
The two best-known and most frequently used are the definitions quoted above from the International Atomic Energy Agency (IAEA) and the United Kingdom Health and Safety Commission (HSC).
Other definitions nevertheless share common characteristics: most relate safety culture to shared beliefs, values and attitudes, and an important feature is that the culture is held in common by a group.
Poor culture
The most obvious sign of a poor safety culture can be a major accident, for example an explosion at a plant.
Although there is some ambiguity in how safety culture is defined, there is little doubt about the relevance or significance of the concept.
Mearns et al. stated that "safety culture is an important concept that forms the environment within which individual safety attitudes develop and persist and safety behaviours are promoted." After every major disaster, significant resources are devoted to identifying the factors that may have contributed to it.
Incidents
The detail revealed by investigations into such disasters is invaluable in identifying the common factors that "make organizations vulnerable to failures". From such inquiries a pattern emerges.
Accidents are not the result of coincidental "operator error", environmental factors or technical failures alone. Rather, disasters are the result of a breakdown in the organization's policies and procedures that were established to deal with safety, and this breakdown flows from inadequate attention being paid to safety concerns.
In the UK, incidents such as the capsizing of the passenger ferry MS Herald of Free Enterprise (Sheen, 1987), the fire at King's Cross underground station (1987) and the explosion of the Piper Alpha oil platform (1988) all prompted investigations that highlighted the impact of organizational, managerial and human factors on safety outcomes. Similar issues were identified in the United States as underlying the Space Shuttle Challenger disaster, whose subsequent investigation revealed that cultural factors had influenced numerous flawed decisions at NASA and Thiokol. A lesson drawn from the UK disasters was that "it is essential to create a corporate atmosphere or culture in which safety is understood to be, and is accepted as, the number one priority."
Typical features
Public inquiries have made it apparent that a poor safety culture lay behind many of the major accidents that have occurred around the world over the past 20 years or so. Typical features associated with these disasters are:
"Profit before safety." This means that productivity has always been higher since sustainability was seen as value, not investment.
"Fear". Enterprises try to keep problems hidden as management tries to avoid sanctions or reprimands.
“Ineffective Leadership”
The prevailing corporate safety culture hindered the recognition of risks, leading to the wrong safety decisions being made at the wrong time and for the wrong reasons:
Failure by managers and the workforce to comply with standards, rules and procedures.
"Misunderstanding." When critical safety information has not been communicated to decision makers, or the message was inaccurate.
"Failures in competence." When there were false expectations that direct employees and contractors were well trained and competent.
"Ignoring lessons learned." Safety-critical information from previous incidents was not captured, communicated or acted upon.
Attributes of the "tough guy." Such as reluctance to acknowledge ignorance, to make mistakes, or to ask for help, can undermine design bureaucracy and labor productivity, preventing the exchange of useful information.
Ideal culture
Experts see an ideal safety culture as the "engine" that drives the system towards the goal of sustaining the maximum resistance to its operational hazards, regardless of current commercial pressures or leadership style. This requires constant wariness and respect for anything that could defeat the safety system.
Sophisticated systems with defence in depth (as would be expected in a major hazard enterprise) are opaque to most, if not all, of their managers and operators. They are designed so that no single failure, detected or not, will lead to an accident. Reason argues that in such systems there are too few accidents to sustain, by themselves, the desired state of intelligent and respectful wariness. If that wariness is not sustained by gathering, analysing and disseminating the knowledge gained from incidents and near misses, preventable problems will arise. It is dangerous to assume the organization is safe simply because no information suggests otherwise.
Occupational accidents
Over the years, much attention has been paid to the causes of occupational accidents. When accidents occur in the workplace, it is important to understand what factors (human, technical, organizational) contributed to the outcome, so that similar events can be avoided in the future. By understanding why and how accidents occur, appropriate prevention methods can be developed. In the past, improvements in workplace safety and risk control were achieved by providing safer machinery and processes, better employee training, and formal safety management systems. Consequently, some argue that many of the residual workplace accidents result from operator error, that is, one or more employees performing their duties differently from how they were trained, and that this is what must be controlled.