What is the inter-rater agreement of injury classification using the WHO minimum data set for emergency medical teams?
- Record Type:
- Journal Article
- Title:
- What is the inter-rater agreement of injury classification using the WHO minimum data set for emergency medical teams?
- Main Title:
- What is the inter-rater agreement of injury classification using the WHO minimum data set for emergency medical teams?
- Authors:
- Jafar, Anisa Jabeen Nasir
Sergeant, Jamie C
Lecky, Fiona
- Abstract:
- Background: In 2017, the WHO produced its first minimum data set (MDS) for emergency medical team (EMT) daily reporting during sudden-onset disasters (SODs), following expert consensus. The MDS was deliberately designed to be simple in order to improve the rate of data capture; however, it is new and untested. This study assesses the inter-rater agreement between practitioners when performing the injury aspect of coding within the WHO EMT MDS.
Methods: 25 clinical case vignettes were developed, reflecting potential injuries encountered in an SOD. These were presented online from April to July 2018 to practitioners with experience of, or training in, managing patients in SODs. The practitioners were drawn from UK-Med's members, the Australian Medical Assistance Team's Northern Territory members and New Zealand Medical Assistance Team members. Practitioners were asked to code injuries according to the WHO EMT MDS case classifications. Randolph's kappa statistic for free-marginal multirater data was calculated for the whole dataset, as well as for subgroups, to ascertain inter-rater agreement.
Results: 86 practitioners responded (20.6% response rate), giving >2000 individual case responses. Overall agreement was moderate at 67.9%, with a kappa of 0.59 (95% CI 0.49 to 0.69). Although the subgroups of paramedics (kappa 0.63, 95% CI 0.53 to 0.72), doctors (kappa 0.61, 95% CI 0.52 to 0.69) and those with disaster experience (kappa 0.62, 95% CI 0.52 to 0.71) suggested slightly higher agreement, their CIs (and those of other subgroups) indicate overall similar and moderate levels of practitioner agreement in classifying injuries according to the MDS categories.
Conclusions: An inter-rater agreement of 0.59 is moderate at best; however, it gives ministries of health some sense of how tightly they may interpret injury data derived from daily reports using the WHO EMT MDS. Furthermore, this kappa is similar to those of established but more complex (and thus more contextually impractical) injury scores. Similar studies, with weighting for injury likelihood using sample data from SODs, would further refine the level of expected inter-rater agreement.
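For readers unfamiliar with the statistic used above: Randolph's free-marginal multirater kappa assumes raters are not constrained to fixed category margins, so chance agreement is simply 1/k for k categories, giving kappa = (P̄_obs − 1/k) / (1 − 1/k), where P̄_obs is the mean pairwise agreement across cases. A minimal sketch of that calculation (the category labels and ratings below are illustrative, not the study's data):

```python
from collections import Counter

def randolph_kappa(ratings, k):
    """Randolph's free-marginal multirater kappa.

    ratings: list of cases; each case is a list of the category label
             each rater assigned to that case.
    k:       number of possible categories raters could choose from.
    """
    per_case = []
    for case in ratings:
        n = len(case)  # raters for this case
        counts = Counter(case)
        # proportion of agreeing rater pairs among all n*(n-1) ordered pairs
        agree = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
        per_case.append(agree)
    p_obs = sum(per_case) / len(per_case)   # mean observed agreement
    p_chance = 1 / k                        # free-marginal chance agreement
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical example: 2 vignettes, 3 raters, 4 possible categories.
kappa = randolph_kappa([["wound", "wound", "wound"],
                        ["fracture", "fracture", "wound"]], k=4)
```

Unlike Fleiss' kappa, this estimate does not penalise raters for an uneven distribution of categories across cases, which suits a study where the true injury mix is unknown in advance.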
- Is Part Of:
- Emergency medicine journal. Volume 37:Issue 2(2020)
- Journal:
- Emergency medicine journal
- Issue:
- Volume 37:Issue 2(2020)
- Issue Display:
- Volume 37, Issue 2 (2020)
- Year:
- 2020
- Volume:
- 37
- Issue:
- 2
- Issue Sort Value:
- 2020-0037-0002-0000
- Page Start:
- 58
- Page End:
- 64
- Publication Date:
- 2020-01-07
- Subjects:
- disaster planning and response -- data management -- global health
Emergency medicine -- Periodicals
616.02505
- Journal URLs:
- http://www.bmj.com/archive
https://emj.bmj.com/
- DOI:
- 10.1136/emermed-2019-209012
- Languages:
- English
- ISSNs:
- 1472-0205
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 18149.xml