Statement - ERC quality management in teaching life support: survey across National Resuscitation Councils
As many studies on quality management in ERC courses were based on low-quality evidence, a survey was launched across the National Resuscitation Councils (NRCs) to determine whether they had implemented any quality management in ERC-certified courses and how it was performed at the national level, including the use of feedback devices, high-fidelity manikins, simulation and external factors.
SEPTEMBER 14, 2018 – European Resuscitation Council (ERC) quality management in teaching life support: a survey across National Resuscitation Councils – INTRODUCTION
Authors
- Renier Walter S [1], Khalifa Gamal Eldin [2], Krawczyk Paweł [3], Truhlář Anatolij [4] and Raffay Violetta [5]
- 1 Department of Public Health and Primary Care (General Practice), KU Leuven, University of Leuven, Leuven, Belgium and Belgian Resuscitation Council, Brussels, Belgium
- 2 Emergency and Disaster Medicine, Military Production Hospital and Egyptian Resuscitation Council, Cairo, Egypt
- 3 Department of Anaesthesiology and Intensive Care Medicine, Jagiellonian University Medical College, Cracow, Poland
- 4 Emergency Medical Services of the Hradec Králové Region, Hradec Králové, Czech Republic and Department of Anaesthesiology and Intensive Care Medicine, University Hospital Hradec Králové, Hradec Králové, Czech Republic
- 5 Serbian Resuscitation Council, Novi Sad, Serbia
The history and roots of quality can be traced back centuries, to when craftsmen began organizing into unions called guilds. With the Industrial Revolution, early quality management systems were used as standards that controlled product and process outcomes. As more people had to work together to produce results and production quantities grew, best practices were needed to ensure quality results.
This is also true for European Resuscitation Council (ERC) courses. The 2015 ERC Guidelines and the Consensus on Science with Treatment Recommendations (CoSTR) on Education, Implementation and Teams [2] emphasise the critical lifesaving steps of Basic Life Support (BLS) and the importance of high-quality cardiopulmonary resuscitation (CPR): namely compression rate, depth, recoil, and minimal pauses in chest compressions. Nearly all studies focusing on the quality of pre-hospital resuscitation skills reported low to very low quality of evidence, and heterogeneity between studies was also present in nearly all of them. Recent studies confirmed these findings.
Different factors have an impact on the quality of teaching and consequently on the quality of CPR delivery, and also on quality improvement. Feedback devices, e.g. those measuring chest compressions during resuscitation simulations, give a better end-of-course result than instructor assessment alone, although they can still be distracting. Recently, Pavo et al. observed that human-led feedback is as good as mechanical (device-delivered) feedback. Cheng et al. concluded that the use of high-fidelity manikins for advanced life support training is associated with only moderate benefits for improving skill performance at course conclusion, confirming the CoSTR findings. For lower-income countries, which cannot afford high-quality manikins, quality management of courses and of instructors may improve outcomes. Simulation improves skills and knowledge, but does not improve quality, as only 50% of participants across the three study groups achieved high-quality chest compressions. The CoSTR authors also asked whether students receiving self-instruction courses would show better skill performance in actual resuscitations and further improve the rate of return of spontaneous circulation (ROSC) and survival to hospital discharge compared with those receiving traditional courses. Recently, however, Yeung et al. showed that results are better with a combination of self-instruction and face-to-face teaching.
Quality is also affected by external factors: the quality of skills decreases within one year. Therefore, the ERC recommends retraining on a regular basis, although the optimal interval between trainings is not actually known. Improving the weak links of the local Chain of Survival (e.g. more bystander CPR, more trained bystanders, better transmission of emergency call information), together with improving the quality of ALS and post-resuscitation care, was associated with increased survival after out-of-hospital cardiac arrest (OHCA). That study, which compared the effect of the ERC Guidelines changes between 2005 and 2015, found that shockable arrests declined, fewer arrests were witnessed and response intervals increased, but that overall survival improved, especially in the sub-group of bystander-witnessed VF/VT arrests of cardiac aetiology. More recent studies confirmed that tendency. We could not identify any paper on quality management of ERC courses or on the quality of the teaching and feedback of instructors and course directors, although the impact of that quality on skills training during courses is as important as the use of simulation, high-fidelity manikins, feedback devices, etc. The accountability of teachers depends on their certification and testing. Human interaction also influences the qualitative performance of a candidate. Because of the differences in results according to the above-mentioned factors and the absence of evidence concerning the quality of the teaching and training themselves, quality management, rather than quality control, is mandatory. We aimed to determine whether the National Resuscitation Councils (NRCs), partners of the ERC, perform any quality management on courses and, if so, how it is executed, who is responsible, how it is recorded and what can be done to improve the quality of ERC courses and facilitate the instructors’ development in courses.
Survey across National Resuscitation Councils on life support management – METHODS
In January 2017, all 33 NRCs were invited by direct emails sent to NRC contact persons to fill in an online survey about quality control in their country. The survey was hosted on the ERC website. Consensus on the final version of the survey was reached using the Delphi method before it was sent to the NRCs.
After identification of the NRC and the responding person, the survey included fifteen questions covering eight topics. Four questions were purely quantitative, ten purely qualitative, and one combined quantitative and qualitative elements. The following questions were used:
- Does your NRC perform any quality control on ERC courses?
- On which courses do you perform quality control?
- Who is responsible for quality control?
- How is it organised?
- Who is entitled to perform the quality control?
- How is it recorded?
- Does your NRC have any specific set of documents for quality control records?
- To what extent is quality control performed for each course type in your country? Please estimate the ratio of quality controls to all sessions organized (in %).
- Would you like to have a quality control online tool included in the Course System (CoSy) (e.g. feedback by candidates and/or instructors)?
- Comments and suggestions about a quality control online tool included in CoSy (e.g. feedback by candidates and/or instructors)?
- Do you think that relevant structures of the ERC (e.g. international course committees) should be informed about results of quality control performed at the national level?
- Is your NRC organizing any instructor days or workshops for ERC instructors and/or course directors?
- Comments and suggestions about instructor days or workshops for ERC instructors and/or course directors?
- How could ERC facilitate quality control in your country? Do you have any suggestions for ERC?
- Any other comments?
All answers were included in the analysis. For each question, similar answers from different NRCs were added together and plotted according to their importance (number of similar answers). If an NRC contact person gave access to the survey to more than one person, all answers were included but combined into one answer for that country. If there were contradictory answers, the NRC was asked again to explain the details. The researchers compared the qualitative answers in order to obtain interpretable results. This comparison was mainly performed by AT. In case of any doubt, a third researcher was involved.
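As a purely illustrative aid, the short Python sketch below shows one way such multi-form answers could be merged per NRC and tallied; the data, field names and normalisation rule are assumptions for illustration only and do not reproduce the authors' actual analysis.

```python
# Illustrative sketch: merging multiple answer forms per NRC and tallying
# similar qualitative answers. The field names ("nrc", "answer") and the
# normalisation rule are assumptions, not the survey's actual pipeline.
from collections import Counter, defaultdict

# Hypothetical raw responses: several forms may come from one NRC.
raw_forms = [
    {"nrc": "Country A", "answer": "No volunteers"},
    {"nrc": "Country A", "answer": "no volunteers "},   # second form, same country
    {"nrc": "Country B", "answer": "Never thought about it"},
    {"nrc": "Country C", "answer": "No format or tool"},
]

# Step 1: combine all forms from one NRC into a single set of answers.
per_nrc = defaultdict(set)
for form in raw_forms:
    per_nrc[form["nrc"]].add(form["answer"].strip().lower())

# Step 2: count how many NRCs gave each (normalised) answer.
tally = Counter(answer for answers in per_nrc.values() for answer in answers)

for answer, n in tally.most_common():
    print(f"{answer}: n={n}")
```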
Survey across National Resuscitation Councils on life support management – RESULTS
Twenty-six out of the 33 NRCs (79%) (Figure 1) sent back 31 answer forms: one NRC sent three answer forms and three NRCs sent two forms each. These were, as explained above, merged into one form for the respective NRC.
Quality management: It was not performed in nine NRCs (35% of the 26 included NRCs). The reasons were: never thought about it (n=2), no volunteers (n=2), planned but not implemented (n=1), no format or tool (n=1), it was never stated that they should do so (n=1), and two NRCs gave no comment (Table 1). Seventeen NRCs declared quality management activity in ERC courses (Table 1). ALS and BLS courses were covered most often.
Organisation and responsibility for quality management: Responsibility was generally taken by the NRC (n=5), sometimes by a course coordinator (n=1), an educator (n=1), a specific committee (n=2) or a director of quality (n=1). In some NRCs the national course director (NCD) was the responsible person (n=3), or the course director (CD), alone (n=3) or together with the course organiser (CO) (n=1). Comparing the number of course types supervised with the responsible body, a director of quality, an NCD, a committee or an educator supervised between five and six course types, while CDs and NRCs supervised only two to three.
Documentation used: Five NRCs used specific forms (feedback forms or quality control forms based on the CD’s observation). Other NRCs used a quality feedback questionnaire (n=1) or a course report (n=3). In one NRC, the NCD checked the course reports. Reports were recorded on paper only (n=9), on paper combined with an electronic record (n=2) or combined with video (n=2), or electronically only (n=1). One NRC did not save any records and two gave no answer. Only nine of those 17 NRCs (53%) had a specific document for quality management of ERC courses.
Number of courses supervised: Fifteen out of 16 NRCs gave an estimation, in percentages, of the number of courses of each type that they supervised for quality (Table 2). Only one country did not supervise the BLS courses it organised. Nevertheless, BLS and ALS are the most frequently supervised courses (by 9 and 14 NRCs, respectively). Fourteen NRCs supervise 62% of all ALS courses (range: 10-100%) and 8 NRCs supervise 43% of all Generic Instructor Courses (GIC) (range: 50-100%). In other countries the amounts are lower (Table 2). The German NRC gave no estimation of the courses organised; therefore, it is impossible to know whether they supervise them or not. The estimation of the Dutch NRC was not included because the answer was unclear: they entered only the number one for all courses except the European Paediatric Intermediate Life Support (EPILS) course.
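As an illustration only, the sketch below shows how per-NRC coverage estimates could be summarised as a count, mean and range per course type; the values are invented placeholders rather than the survey data, and it is an assumption (not confirmed by the paper) that the reported percentages are means of the per-NRC estimates.

```python
# Illustrative sketch of summarising per-course-type supervision coverage.
# The per-NRC percentages below are invented placeholders, not survey data.
from statistics import mean

# Hypothetical estimates: % of courses of a given type supervised, per NRC.
als_estimates = [10, 40, 60, 80, 100]   # placeholder values
gic_estimates = [50, 75, 100]           # placeholder values

def summarise(label, estimates):
    print(f"{label}: n={len(estimates)} NRCs, "
          f"mean={mean(estimates):.0f}% (range: {min(estimates)}-{max(estimates)}%)")

summarise("ALS", als_estimates)
summarise("GIC", gic_estimates)
```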
Feedback tool: Sixteen out of 26 responding NRCs (94%) mentioned the need for an online tool. Two NRCs answered “no comment” and three gave no response. Six out of 11 NRCs indicated that they would like an online feedback form for candidates, COs, CDs, Full Instructors (FI), Instructor Trainers (IT), instructor candidates (IC) or Instructor Trainer Candidates (ITC). Four NRCs wanted an electronic form available in CoSy, or an easy-to-use evaluation grid or form, as was available in the old course management system (CMS). One NRC asked specifically for an online quality management tool.
Supervision: Seven NRCs agreed that the Science and Education Committee (SEC) should be informed about the quality of courses; eight NRCs proposed doing so only if problems occur. Two NRCs answered negatively.
National or international instructor or CD day: Thirteen NRCs themselves organised an instructor day or workshop for ERC instructors and/or course directors. Two NRCs had planned one. One NRC mentioned a lack of available persons to organise it and one had no instructors yet. When organised, meetings took place between twice a year and once every two years, as one-day or two-day meetings. Organisers were the NRC or a specific committee. The content focused mainly on skill demonstrations, practice, updates and discussions, in order to homogenise the skills of all CDs, FIs and instructor candidates. Additionally, NRCs mentioned that they need instructor days at the European level, and asked for more workshops during ERC congresses. Three NRCs did not comment.
Could ERC facilitate quality management? The answers were very diverse. Five NRCs asked for a specific tool and for promotion of quality management, and three of them wished for a regular audit by an international faculty or external auditors (from other countries, in order to maintain high quality and objective feedback) together with NRC representatives. Two NRCs stated that the involvement of NRCs in quality management is mandatory.
Suggestions by NRCs: ERC should have a course-specific quality management coordinator and easy-to-use facilities in CoSy (feedback forms, etc.) or a tool to compare course performance within NRCs and also between NRCs (as existed before). They also suggested that the poor response rate of participants to the course feedback form could be improved by linking completion of the feedback form to the download of the course certificate.
Survey across National Resuscitation Councils on life support management – DISCUSSION
This survey demonstrated that quality management has been suboptimal in ERC courses. Only half of the NRCs used any quality management tool for ERC courses. Only two NRCs used a structured quality management form.
We reached a 79% response rate among the NRCs that are members of the ERC. Therefore, we can conclude that these survey results represent the overall opinion.
Quality control was mainly performed on BLS, ALS and Generic Instructor Courses. Quality management within ERC member countries is not uniform. Each NRC used different paper or electronic forms, or a multitude of persons, to manage quality; these persons were frequently involved in the courses themselves. Unfortunately, we had no opportunity to see examples of the tools or content used for quality control, or other forms covering all course types or the quality of teaching.
There seems to be a relationship between the level of the quality management system in the NRCs and the number of course types supervised: the higher the level (quality committee or similar), the more course types were supervised. Most countries thought that appointing a director of quality should be a future step. The Dutch NRC reported no percentages; therefore, we could not determine how many courses of each type were supervised.
The documents provided gave information about how the course was run, the venue and/or the course content. They contained no objective information about the performance of the CO, the CD or the instructors, although their performance in teaching and/or skill demonstration has an important impact on the candidates’ skills and knowledge.
Nearly all NRCs expect an electronic form for their quality control, and there is a need for the ERC to create such a tool. It should be a high-quality, easy-to-use form, adapted for each course type and integrated into CoSy. The results should be available to the relevant NRC or NCD, with optional access for other parties if needed, e.g. the relevant SEC. NRCs expect external assistance only in problematic cases. If quality is managed rather than controlled, this will lower the NRCs’ fear of excessive interference.
Some NRCs have experience with auditing ERC courses using specific people from their own country or from abroad. Unfortunately, they did not use a uniform, specific form. Some of the results of these audits and reports are comparable from one NRC to another. Most of the reported cases were not audited in situ by independent observers, which is not in accordance with audit rules. NRCs mentioned different feedback forms or reports, giving a good but partial picture of quality. Enthusiasm is an important measure of quality [16,17], but the enthusiasm or dissatisfaction of directors, instructors or candidates colours the course reports with subjectivity, even though it is an important element for measuring their view of the delivered quality [19]. Because of that duality, independent observers are needed.
An audit systematically examines the quality system built into a course and should be carried out by internal or external auditors or an audit team, at predefined intervals and based on validated assessment criteria [19]. An audit is one of the most powerful monitoring techniques and an effective way to avoid complacency and to highlight slowly deteriorating conditions, especially when the auditing focuses not just on compliance but on effectiveness [20]. Therefore, the development and implementation of new structures for quality control, based on written documents and clearly listed criteria, must be initiated. Those who apply for such a role would not be directly involved in conducting courses but would act as an independent body in the role of internal and external quality supervisors.
A quality management tool was already prepared by the ERC quality management working group and discussed with the NCDs present at the ERC Course Director Day in 2017. Afterwards, comments were requested and the working group subsequently created a final document. To achieve homogeneous quality in all NRCs, a coordinator is needed: an expert responsible for quality management, capable of understanding and performing the duties of a quality officer, in order to help each NRC grow its own quality management function. This person should liaise with the ERC quality management working group until each NRC is able to have its own coordinator or quality group.
The strength of this study is that we were able to demonstrate the need for a tool and for homogenisation of quality management of ERC courses. There are also some limitations. We had no opportunity to see examples of the tools or content used for quality control, or other forms covering all course types or the quality of teaching. Only 13 NRCs organised an instructor day or a workshop for ERC instructors and/or course directors. This question has a strong limitation, because it was not asked of the NRCs that did not organise any quality control but may still organise such days or refresher meetings. We also do not know what the non-responding NRCs are doing.