Risk in the Workplace: 1316259

Human error refers to a mistake made by a person that can cause an accident or an adverse incident. Woods and Branlat (2010) stated that most violations and safety errors in the health sector, the nuclear industry and civil aviation arise from human error. In clinical practice, human error can therefore lead to accidents and create serious risk; in the worst cases it is fatal, and the individual or the system has to bear the consequences. A just culture helps to determine whether such an error was an honest, unintentional mistake or a more serious failing. The present study focuses on understanding and detecting human errors and the approaches used to recover from them. It also sheds light on the perception of unsafe acts and the value of openness and transparency in managing errors, and discusses how a just culture can be applied and how organisational justice can be achieved.

In risk management, human errors play a crucial part and are viewed in two ways: the person approach and the system approach. Healthcare workers must also deal openly with service users and family members when a mistake is made (Maurino et al. 2017). An error can involve a deviation of action from a current intention (a slip or lapse), a deviation from an adequate route towards a goal (a mistake), or a deliberate straying from the ‘path of righteousness’ (a violation). Slips and lapses are among the most common errors and typically occur when attention is diverted or when a task is performed without full attention, whereas mistakes are classified as rule-based or knowledge-based. The case study shows that air traffic controllers in Yugoslavia were working understaffed, and as a result a mid-air collision occurred between two passenger aircraft (Dekker and Breakey 2016). The air traffic organisation had recruited new junior controllers, who worked hard to avoid the accident. Human errors of this kind can be fatal and can also involve violations.

In clinical practice, understanding and coping with the risk of mishaps calls for two practical approaches: the person approach and the system approach. The person approach has a long-standing and widespread tradition. It focuses on unsafe acts, namely errors and procedural violations, committed by people who work at the front line, such as physicians, surgeons, nurses, anaesthetists, pharmacists and other healthcare providers. These unsafe acts are seen as arising primarily from aberrant mental processes such as forgetfulness, poor motivation, recklessness, carelessness, negligence and inattention (Geraghty et al. 2020). The associated countermeasures are directed mainly at reducing unwanted variability in human behaviour and can include poster campaigns that appeal to fear, disciplinary measures, writing additional procedures, threat of litigation, naming, blaming, shaming and retraining. Followers of this approach tend to treat errors as moral issues, holding the view that ‘bad things happen to bad people’ (Selvik and Bellamy 2020). People working at the front line therefore need to be careful in order to avoid accidents and harm to health.

In the system approach, the basic premise is that humans are fallible and errors are to be expected, even in the best organisations. Wiegmann and Shappell (2017) opined that errors are consequences rather than causes; their origins lie not so much in the perversity of human nature as in upstream systemic features, including recurrent error traps in the workplace and the organisational processes that give rise to them. Countermeasures are based on the assumption that although the human condition cannot be changed, the conditions under which humans work can, and that a safer and more supportive environment can be created. System defences can therefore be built to safeguard the people who do the work. According to Loh et al. (2020), hazardous technologies possess barriers and safeguards. When an adverse event or accident occurs, the important issue is to identify how and why the defences failed, rather than who is the culprit.

The ‘Swiss cheese model’ helps to explain and reduce system accidents. In the system approach, barriers, defences and safeguards occupy the primary position (Reason 2000). High-technology systems have many defensive layers: some are engineered (physical barriers, alarms, automatic shutdowns), others rely on people (surgeons, anaesthetists, pilots, control-room operators), and others depend on procedures and administrative controls. The function of these layers is to protect potential victims and assets from local hazards, and ideally each layer would be intact. In reality, the layers resemble slices of Swiss cheese, each containing many holes, and these holes continually open, shut and shift their location. A hole in any one slice does not normally cause a bad outcome; harm occurs only when the holes in many layers momentarily line up so that a hazard passes through all the defences. The holes arise from two sets of factors, active failures and latent conditions, and nearly all adverse events involve a combination of the two. Active failures are the unsafe acts committed by people in direct contact with the patient or the system (Goldratt and Cox 2016).
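As a stylised illustration of this defence-in-depth logic (a simplified sketch, not part of Reason's original formulation), suppose a hazard must pass through k independent defensive layers and each layer happens to have a hole at the critical moment with probability p. The chance of the hazard penetrating every layer is then approximately

\[ P(\text{accident}) \approx p^{k}, \qquad \text{e.g. } p = 0.1,\ k = 4 \;\Rightarrow\; P \approx 0.1^{4} = 0.0001. \]

The assumption of independence between layers is exactly what latent conditions undermine, which is why they are so dangerous: a single upstream decision can open holes in several layers at once.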

Latent conditions are the second set of factors and are the inevitable ‘resident pathogens’ within a system. They arise from decisions made by designers, builders, procedure writers and top-level management. Such decisions may be mistaken and can introduce pathogens into the system. Latent conditions have two kinds of effect. Within the workplace they can translate into error-provoking conditions such as understaffing, fatigue, inexperience, inadequate equipment and time pressure (Mount-Campbell et al. 2019). They can also lie dormant within the system for a long time before combining with active failures and local triggers to create an accident opportunity. Unlike active failures, latent conditions can be identified at an early stage and remedied before an adverse event occurs. Front-line staff should remain careful and attentive so that human and systemic errors are minimised; however, errors can never be avoided entirely, so an organisation's workplace needs proper approaches for handling them when they occur.

The management of an organisation needs to be transparent so that if an accident occurs it can be addressed immediately. In the case study by Maccoby et al. (2013), an error in the medication administration process meant that a patient was given an overdose on a resident's order and died the same day. The hospital president immediately called a press conference, explained the mistake and described how the situation would be addressed. Such transparency helped to minimise future errors, because physicians and nurses could learn from the incident and check the system themselves before ordering a dose of a drug. Transparency in an organisation is linked with just culture, which helps to decide the type of punishment after the type of mistake has been analysed (Yeow et al. 2017). By contrast, the case study describes another incident in which a patient's pressure ulcer did not receive proper treatment; he was transferred to another hospital, where his wound healed completely after surgery. The investigation found that the first hospital's lawyer had advised never acknowledging any mistake for fear of being sued.

As opined by Babaei Pouya et al. (2017), a just culture informs the organisation how to respond to the people involved in an incident in ways that minimise negative impacts and maximise learning. Organisational culture encompasses both safety culture and just culture, which together strengthen the organisation's infrastructure. Retributive justice is concerned with proportional, deserved punishment as a just response to a sanctionable action: once the mistake has been identified, the person or action is punished fairly. The case study by Maccoby et al. (2013) showed that the medication error was made unintentionally and was rooted in the system; nevertheless, it was at-risk behaviour, and such mistakes can be fatal. Restorative justice, by contrast, begins by identifying the victims, who include patients, the surrounding community and the practitioners themselves. In the first case the honest mistake was admitted, whereas in the second the hospital authority did not accept fault. Accidents and mistakes need to be reported properly so that the need for organisational development can be understood.

Following the mid-air collision, the controllers were charged with murder and jailed. The first victims received no assurance of improvement or of prevention of recurrence, and the second victims were singled out unfairly for the failure of an entire complex system. Intrusions and mistiming contributed to the accident, which involved not only human error but also system error. Such errors could be reduced by strengthening the organisation's infrastructure and by having senior staff provide appropriate training to junior staff (Dekker 2018). Although the junior controller tried his best to prevent the accident, it was too late, so proper intervention was needed to stop such potential accidents. The junior staff did their best despite lacking adequate training and experience, which is why the first victims did not support the decision to jail the controllers. Doing what is necessary can be costly and requires proper guidance, and overcoming these challenges and providing appropriate solutions to human and system errors demands knowledge.

Organisational justice helps to maintain organisational policy and regulation in the workplace. Social exchange theory offers insight into the effects of justice and emphasises a reciprocal relationship between employees and employers (Goldratt and Cox 2016). When an organisation is perceived as fair, feelings of commitment, obligation and trust towards the employer grow among employees; this requires a transparent policy and a strong organisational culture. Employees should maintain a positive attitude and be attentive when handling critical work; for example, residents and nurses must be especially careful when administering drugs to patients, because a misplaced decimal point produces a ten-fold overdose. On the other hand, Maurino et al. (2017) stated that communication plays a critical role in the healthcare system: to avoid human errors, employees need strong communication skills, which help to reduce the many mistakes that arise from miscommunication.
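As a minimal sketch of an engineered barrier of this kind (the drug name, dose limits and function names below are illustrative assumptions, not taken from the case study and not intended as clinical guidance), an electronic prescribing system could refuse to accept an order that falls outside an assumed safe dosing range:

    # Hypothetical dose-range check acting as a system defence (illustrative only).
    SAFE_RANGE_MG = {"warfarin": (0.5, 10.0)}  # assumed limits, not clinical guidance

    def check_order(drug: str, dose_mg: float) -> bool:
        """Return True if the ordered dose lies inside the assumed safe range."""
        low, high = SAFE_RANGE_MG[drug]
        return low <= dose_mg <= high

    print(check_order("warfarin", 5.0))   # True: intended dose is accepted
    print(check_order("warfarin", 50.0))  # False: a misplaced decimal point (ten-fold overdose) is blocked

The point is not the specific numbers but the principle of the system approach: the defence does not depend on any single individual being infallible.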

Moreover, strong communication within organisational justice strengthens the work culture and the bonds among employees, as well as the relationship between employees and employers. Equity theory plays a significant role in developing organisational justice. According to this theory, people need to be treated fairly in the organisation, because every employee is invested in the workplace both physically and psychologically (Eib 2015), and this is directly related to health outcomes. For example, if for the same work one nurse who gives less effort and makes more mistakes is paid more, while another nurse who gives more effort and makes fewer mistakes is paid less, that is an injustice. A just culture helps to stop this kind of injustice so that people are treated fairly.
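Adams' equity theory is often summarised as a comparison of outcome-to-input ratios between two people; using hypothetical figures that match the nurse example above (the salaries and effort scores are illustrative assumptions, not data from any cited study):

\[ \frac{O_A}{I_A} \;\text{versus}\; \frac{O_B}{I_B}, \qquad \text{e.g. } \frac{30{,}000}{4} = 7{,}500 \;>\; \frac{25{,}000}{8} = 3{,}125, \]

so nurse B, whose ratio is lower despite the greater input, perceives inequity, which the theory predicts will erode motivation, effort or trust.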

Based on the above discussion, it can be said that accidents in a workplace can occur intentionally or unintentionally. Whatever the reason behind an accident, appropriate precautions should always be in place so that people are not harmed, because accidents can be fatal or leave lifelong mental trauma or physical problems. People should have sound knowledge of organisational culture and justice in the workplace, which is only possible when staff receive proper training and gain the theoretical knowledge and practical experience needed to handle such adverse situations. In this way, a safe organisational work culture can be created for employees. Employees in the healthcare sector need to be especially attentive and confident while performing their duties so that they can take the necessary steps before an adverse event occurs. Transparency in the workplace also helps to reduce workplace risks and hazards.

References

Babaei Pouya, A., Hazrati, S., Mosavianasl, Z. and Habibi, E., (2017). Systematic Human Error Reduction and Prediction Approach: Case Study in Cement Industry Control Room. Occupational and Environmental Health, 2(4), pp.272-284.

Dekker, S., (2018). Just culture: restoring trust and accountability in your organization. CRC Press.

Dekker, S.W. and Breakey, H., (2016). ‘Just culture’: Improving safety by achieving substantive, procedural and restorative justice. Safety Science, 85, pp.187-193.

Eib, C., (2015). Processes of organizational justice: Insights into the perception and enactment of justice (Doctoral dissertation, Department of Psychology, Stockholm University).

Geraghty, A., Ferguson, L., McIlhenny, C. and Bowie, P., (2020). Incidence of Wrong-Site Surgery List Errors for a 2-year period in a single national health service board. Journal of Patient Safety, 16(1), p.79.

Gluyas, H. and Morrison, P., (2014). Human factors and medication errors: a case study. Nursing Standard, 29(15), p.37.

Goldratt, E.M. and Cox, J., (2016). The goal: a process of ongoing improvement. Routledge.

Loh, T.Y., Brito, M.P., Bose, N., Xu, J. and Tenekedjiev, K., (2020). Human Error in Autonomous Underwater Vehicle Deployment: A System Dynamics Approach. Risk Analysis.

Maccoby, M., Norman, C.L., Norman, C.J. and Margolies, R., (2013). Transforming health care leadership: A systems guide to improve patient care, decrease costs, and improve population health. John Wiley & Sons.

Maurino, D.E., Reason, J., Johnston, N. and Lee, R.B., (2017). Beyond aviation human factors: Safety in high technology systems. Routledge.

Mount-Campbell, A.F., Evans, K.D., Woods, D.D., Chipps, E.M., Moffatt-Bruce, S.D. and Patterson, E.S., (2019). Value and usage of a workaround artifact: A cognitive work analysis of “brains” use by hospital nurses. Journal of Cognitive Engineering and Decision Making, 13(2), pp.67-80.

Reason, J., (2000). Human error: models and management. BMJ, 320(7237), pp.768-770.

Selvik, J.T. and Bellamy, L.J., (2020). Addressing human error when collecting failure cause information in the oil and gas industry: A review of ISO 14224:2016. Reliability Engineering & System Safety, 194, p.106418.

Wiegmann, D.A. and Shappell, S.A., (2017). A human error approach to aviation accident analysis: The human factors analysis and classification system. Routledge.

Woods, D.D. and Branlat, M., (2010). Hollnagel’s test: being ‘in control’ of highly interdependent multi-layered networked systems. Cognition, Technology & Work, 12(2), pp.95-101.

Yeow, J.A., Khan, M.K.B.J. and Ng, P.K., (2017). Enforcement of Safety and Health Policy Reduces Human Error in SMEs in the Manufacturing Industry. Advanced Science Letters, 23(11), pp.10656-10659.