Education Research: Competency-Based EEG Education

Background and Objectives
We recently published expert consensus-based curricular objectives for routine EEG (rEEG) interpretation for adult and child neurology residents. In this study, we used this curriculum framework to develop and validate an online, competency-based, formative and summative rEEG examination for neurology residents.

Methods
We developed an online rEEG examination consisting of a brief survey and 30 multiple-choice questions covering EEG learning objectives for neurology residents in 4 domains: normal, abnormal, normal variants, and artifacts. Each question contained a deidentified EEG image, displayed in 2 montages (bipolar and average), reviewed and optimized by the authors to address the learning objectives. Respondents reported their level of confidence (LOC, 5-point Likert scale) with identifying 4 categories of EEG findings independently: states of wakefulness/sleep, sleep structures, normal variants, and artifacts. Accuracy and item discrimination were calculated for each question and LOC for each category. The test was disseminated by the International League Against Epilepsy and shared on social media.

Results
Of 2,080 responses, 922 were complete. Respondents comprised clinical neurophysiologists/experts (n = 41), EEG/epilepsy clinical fellows (n = 211), EEG technologists (n = 128), attending neurologists (n = 111), adult neurology residents (n = 227), child neurology residents (n = 108), medical students (n = 24), attending non-neurologists (n = 18), and others (n = 54). Mean overall scores (95% CI) were 82% (77–86) for clinical neurophysiologists, 81% (79–83) for clinical fellows, and 72% (70–73) for adult and child neurology residents. Experts were more confident than clinical fellows in all categories but sleep structures. Experts and clinical fellows were more confident than residents in all 4 categories. Among residents, accuracy and LOC increased as a function of prior EEG weeks of training. Accuracy improved from 67% (baseline/no prior EEG training) to 77% (>12 prior EEG weeks). More than 8 weeks of EEG training was needed to reach accuracy comparable with that of clinical neurophysiologists on this rEEG examination. The increase in LOC was slower and less robust than the increase in accuracy. All but 3 questions had a high discrimination index (>0.25).

Discussion
This online, competency-based rEEG examination, mapped to a published EEG curriculum, has excellent psychometrics and differentiates experienced EEG readers from adult and child neurology residents. This online tool has the potential to improve resident EEG education worldwide.


Introduction
EEG is one of the cornerstones in the diagnosis and management of patients with seizures and epilepsy. From a diagnostic standpoint, EEG has a critical role in establishing a diagnosis of epilepsy, for example, in cases where paroxysmal events are of moderate clinical suspicion for epileptic seizures 1,2 or in cases where there is a single unprovoked epileptic seizure. 3 In addition, EEG is helpful in classifying epilepsy and guiding workup and treatment. 1 Finally, EEG can assist in understanding prognosis and stratifying the risk of seizure recurrence upon antiseizure drug withdrawal. 1,4 The important clinical role of EEG requires accurate and reliable EEG interpretation. If this condition is not met, the care of patients with seizures and epilepsy is compromised and deleterious outcomes may ensue. 5 It is, therefore, critical to ensure that all those who read EEG in clinical practice are fully competent. 7-9 This practice dictates that adult and child neurology residency training provide optimal EEG training, thus allowing residents to achieve EEG competency by the time of graduation.
Unfortunately, multiple studies have identified gaps in resident EEG education. 11-14 There have been multiple efforts to understand this educational gap and address barriers to resident EEG education. One of these efforts sought to standardize the resident EEG learning process through the curation of curricular objectives for teaching and assessing routine EEG (rEEG) interpretation. 15 These curricular objectives consist of a list of "must-know" rEEG findings for adult and child neurology residents. 15 Notably, interpretation of other types of EEGs, such as those performed in critically ill patients, was not addressed. In this study, we used this curriculum framework 15 to develop and validate an online, multiple-choice rEEG examination for adult and child neurology trainees, which may be used for both formative and summative assessment.

Methods
We developed an online rEEG examination (Figure 1) based on a previously published rEEG curriculum content map. 15 The examination was hosted on SurveyMonkey and consisted of a brief survey and 30 multiple-choice questions covering 4 EEG domains: normal (n = 8/30), abnormal (n = 7/30), normal variants (n = 7/30), and artifacts (n = 8/30) (Table 1). Multiple-choice questions tested rEEG findings considered high yield by a large group of multinational EEG/epilepsy experts based on the importance for adult and child neurology residents to learn these findings during residency training. 15 In the normal domain, we tested the top 10 high-yield EEG findings. 15 We used 1 question (question 10) to test 3 EEG findings associated with drowsiness: slowing of the posterior dominant rhythm, diffuse irregular delta-theta slowing of the background, and slow roving lateral eye movements.
In the artifacts domain, we tested the top 7 high-yield EEG findings in addition to the ninth EEG finding on the rank list. 15 We skipped the eighth EEG finding on the list (pulse artifact) because this finding is less often seen in rEEG clinical practice. We arbitrarily elected to test 50 Hz artifact rather than 60 Hz artifact given their electrographic similarity; however, we note that they are typically seen in Europe and North America, respectively.
In the normal variants domain, we tested the top 5 high-yield EEG findings in addition to the seventh and eighth EEG findings on the rank list. 15 We skipped the sixth EEG finding (hypnopompic hypersynchrony) because this finding is also less often seen in rEEG clinical practice and because of its electrographic similarity with hypnagogic hypersynchrony, which was included in the test.
In the abnormal domain, we tested the top 5 high-yield EEG findings in addition to the seventh and eighth EEG findings on the rank list. 15 We skipped the sixth EEG finding (generalized seizure, absence) because this EEG pattern was electroencephalographically represented by the first EEG finding on the list (generalized epileptiform discharge, 3 Hz).
The survey included questions regarding participants' country of origin, profession (medical student, adult neurology resident, child neurology resident, EEG/epilepsy clinical fellow, or other), year of training (if applicable), number of weeks dedicated to EEG learning during training (0, 1-4, 5-8, 9-12, or >12) (if applicable), and level of confidence (LOC, 5-point Likert scale: from "not at all confident" to "very confident") identifying 4 categories of EEG findings independently/without supervision: states of wakefulness/sleep, sleep structures, normal variants, and artifacts. The survey did not include a question addressing participants' country of practice/training.
Each question included a deidentified, 10- to 20-second EEG image displayed in 2 montages (bipolar and common average). These EEGs were obtained from 2 authors' (F.A.N., S.B.) personal collections and were deemed unequivocal examples of the findings tested by 5 EEG experts authoring this work (F.A.N., R.M., S.R., W.O.T., S.B.). All 30 questions had 5 answer choices. All but 2 questions had 1 correct and 4 incorrect answers. Questions 3 and 9 had 2 correct answers: "focal epileptiform discharge, spike" and "focal epileptiform discharge, sharp." We considered both answer choices as correct because it was virtually impossible for respondents to estimate the sharp transient duration without an interactive ruler, hence being unable to correctly differentiate a sharp wave (duration of 70-200 milliseconds) from a spike (duration of <70 milliseconds). Incorrect answers were judiciously selected to represent reasonable and important electroencephalographic differentials of the EEG finding tested.
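As an aside, the spike vs sharp wave distinction described above is purely a duration criterion, so it can be expressed as a tiny rule. The function below is an illustrative sketch only (not part of the examination software), using just the cutoffs stated in the text:

```python
# Illustrative only: classify a focal epileptiform transient by duration,
# using the cutoffs stated in the text (spike < 70 ms; sharp wave 70-200 ms).
def classify_epileptiform_transient(duration_ms: float) -> str:
    """Return the conventional duration-based label for a focal discharge."""
    if duration_ms < 70:
        return "spike"
    if duration_ms <= 200:
        return "sharp wave"
    return "outside the conventional epileptiform range"
```

This also illustrates why an on-screen ruler matters: a reader who cannot measure `duration_ms` cannot choose between the first two branches.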
Questions were reviewed and improved, and correct and incorrect answers were adjudicated by 5 EEG experts (F.A.N., R.M., S.R., W.O.T., S.B.).

Results
Mean overall percentage scores and mean percentage scores in each EEG domain for each major group are summarized in Table 2. Comparisons of mean overall and domain-specific scores among the 3 major groups are discussed below. Regarding mean overall scores, there was a statistically significant difference between clinical neurophysiologists and neurology residents (p = 0.0002) and between EEG/epilepsy clinical fellows and neurology residents (p < 0.0001). There was no statistically significant difference between clinical neurophysiologists and EEG/epilepsy clinical fellows (p = 0.9347).
We also compared LOC identifying 4 categories of EEG findings independently/without supervision among the 3 major groups.
In the states of wakefulness/sleep, normal variants, and artifacts categories, clinical neurophysiologists were more confident than EEG/epilepsy clinical fellows, who were more confident than neurology residents. In the sleep structures category, clinical neurophysiologists were as confident as EEG/epilepsy clinical fellows, but both groups were more confident than neurology residents. These data are summarized in Table 3.

Questions With Highest and Lowest Accuracy
The top 10 questions with overall lowest accuracy are listed in eTable 1 (links.lww.com/NE9/A46). The top 10 questions with overall highest accuracy are listed in eTable 2.

Discussion
We developed a competency-based, online rEEG examination based on a published EEG curriculum for adult and child neurology residents 15 using expert-certified, unequivocal EEG examples of findings considered educationally high yield (aka "must-know") for adult and child neurology residents. The examination was completed by approximately 1,000 participants across the globe, including clinical neurophysiologists (n = 41), EEG/epilepsy clinical fellows (n = 211), and adult and child neurology residents (n = 335). Our results support that this examination has excellent overall psychometrics and accurately differentiates experienced EEG readers (attending neurologists, EEG technologists, EEG/epilepsy clinical fellows, and clinical neurophysiologists) from adult and child neurology residents and attending non-neurologists based on accuracy.
The examination's ability to distinguish neurology residents from experienced EEG readers is promising because it allows trainees to undergo a tangible, competency-based model of assessment. The utilization of this examination becomes even more appealing because a large portion of neurology residency programs in the United States (64%) 14 and in Europe (47%) 8 do not currently use objective measures to assess EEG competency. Importantly, this learner-centered, competency-based model has been guiding medical education over the past decades 16 and is arguably an optimal framework in the field of EEG to ensure that neurology residents achieve competency by graduation. 15

Specifically concerning the group of adult and child neurology residents in this study, we learned that their overall accuracy in identifying "must-know" EEG findings for neurology residents 15 does not become statistically comparable with the mean overall accuracy among clinical neurophysiologists until after residents have had more than 8 weeks of EEG training. Given that the examination includes unequivocal examples of EEG findings considered "must-know" for adult and child neurology residents, 15 we recommend that residency programs provide more than 8 weeks of dedicated EEG training. Ensuring that these recommendations are implemented is crucial, especially in countries where neurologists without postresidency training in clinical neurophysiology/EEG or epilepsy typically read EEGs in clinical practice, such as the United States 6,7,9 and select European countries. 8 In the United States, for instance, it has been shown that EEG is the most common procedure performed by neurologists (~60%). 6 Similarly, survey data on the US child neurology workforce have shown that more than half of child neurology/neurodevelopmental disabilities specialists who manage patients with epilepsy and read EEGs had no formal EEG training other than what was received during residency. 9 Nevertheless, EEG training is not a requirement for graduation in neurology residencies in the United States, 17,18 leading to some programs requiring no EEG training altogether. 14

This target of more than 8 weeks of EEG training is higher than the mean number of EEG weeks required to graduate in neurology programs in the United States (6.8 weeks, range 0-16) 14 and in European countries where general neurologists are not among the providers who typically read EEGs in clinical practice (7.4 weeks). 8 Nonetheless, it is comparable with the mean number of EEG weeks required to graduate in European countries where general neurologists are either among the providers or the only providers who read EEGs in clinical practice (9.2 weeks). 8

In this study, we also found that it required more than 12 weeks of prior EEG training for at least 36% of residents to report feeling very confident identifying EEG findings related to states of wakefulness/sleep, sleep structures, and artifacts without supervision. Conversely, only a minority of residents (20%) felt very confident identifying EEG findings in the normal variants category, even after more than 12 weeks of prior EEG training.

Finally, we learned that clinical neurophysiologists and EEG/epilepsy clinical fellows performed similarly to each other but better than adult and child neurology residents on the examination addressing "must-know" EEG findings for adult and child neurology residents, 15 both overall and in its normal, abnormal, and normal variants domains. In the artifacts domain, however, there was no significant difference in accuracy among the 3 groups, except for EEG/epilepsy clinical fellows performing better than adult and child neurology residents. In addition, we learned that the accuracy among attending non-neurologists was significantly lower than that of experienced EEG readers and adult and child neurology residents.
Regarding LOC identifying 4 categories of EEG findings without supervision, clinical neurophysiologists were more confident than EEG/epilepsy clinical fellows, who were more confident than adult and child neurology residents in the categories of states of wakefulness/sleep, normal variants, and artifacts. In the sleep structures category, clinical neurophysiologists were as confident as EEG/epilepsy clinical fellows, and both groups were more confident than adult and child neurology residents.
These data highlight that although clinical neurophysiologists performed as well as EEG/epilepsy clinical fellows on this rEEG examination, the former group had a relatively higher LOC compared with the latter group. This discrepancy may be explained by the fact that the rEEG examination was targeted at a trainee level (low overall difficulty level for experienced EEG readers). Consequently, both clinical neurophysiologists and EEG/epilepsy clinical fellows performed similarly. However, the overall LOC obtained for both groups was unrelated to the rEEG examination itself and rather referred to their overall perceptions of identifying EEG findings independently.
Regarding psychometric validity, all but 3 questions in the examination had a discrimination index of >0.25. These data support the competency-based assessment purpose of the examination, wherein experienced EEG readers are separated from nonexperienced EEG readers based on accuracy (aka EEG proficiency and skills). The 3 questions with a lower discrimination index and higher overall accuracy (>90%) are nevertheless vital because they address important EEG findings (electrode pop, eye blinks, and myogenic artifact) that must be mastered at an early stage of training by all EEG readers, irrespective of their experience.
Our study has several limitations. First, owing to the anonymous nature of the examination, there may have been more than 1 recorded answer for the same participant. Similarly, we could not confirm participants' profession, level of training, or past EEG experience (if neurology residents). We did not obtain participants' country of practice/training although, given that our rEEG examination was disseminated on the internet by the authors and the ILAE, we assume that our participants were from many different countries. This limitation should be taken into account when interpreting our results and recommendations because adult and child neurology residency as well as EEG/epilepsy fellowship training vary considerably across countries. 8,19,20 Second, for participants who were neurology residents, we were provided with the number of weeks of prior EEG training only. We acknowledge that EEG training relies greatly on the quality of education in addition to the quantity. We believe this limitation may have been mitigated by the large number of neurology residents included in the study. Third, the examination was "open book," and participants did not have time constraints to complete the test. These characteristics may have influenced their scores. Fourth, although the EEG examples included in the test were believed to be unequivocal by the 5 authors who are EEG experts, we identified expert inter-rater variability (IRV) based on the scores of other clinical neurophysiologists/EEG experts who also completed the examination. This phenomenon is a known limitation of expert EEG interpretation using qualitative visual analysis. 21

Another area that requires further research is the unexpectedly large expert inter-rater variability in the rEEG examination. Increased expert IRV is a well-known phenomenon in the realm of interictal epileptiform discharge (IED) identification 21,22 and seizure and rhythmic and periodic pattern identification. 23 Nonetheless, expert IRV has not been fully explored in other rEEG findings, such as abnormal findings other than IEDs, normal findings, normal variants, and artifacts.
We believe our rEEG examination may be used by educators, administrators, and regulators in several ways. In its version with feedback, it can be used as a self-paced online teaching tool (formative assessment) that includes the highest yield "must-know" rEEG findings for adult and child neurology residents. In its version without feedback, it can be used as a competency-based assessment tool. This tool can be used longitudinally throughout residency training, where a target accuracy would depend on the number of EEG weeks residents have had. Educators may tailor individual and collective EEG educational activities based on residents' examination scores. This tool can also be used at the time of residency training completion, where an accuracy comparable with expert-level accuracy would be a surrogate of trainee EEG competency within the realm of rEEG. Finally, upon implementation of our rEEG examination on the ILAE Academy online learning platform, learners will be able to use this tool as a formative and summative assessment concomitantly.
Finally, we recommend that adult and child neurology residents undergo more than 8 weeks (ideally >12 weeks) of EEG training to ensure minimal rEEG competency by the time they graduate. We also suggest that educators account for the observation that residents typically improve their accuracy before their LOC increases throughout their training, especially regarding EEG normal variants. We believe that the rEEG examination will be a useful resource in EEG education and that it will help EEG teaching and assessment become more objective and standardized.

Figure 1
Figure 1 Representative Screenshot of the Online Routine EEG Examination

Figure 2

Figure 3
Figure 3 Mean Overall Scores Stratified by Number of Weeks of Prior EEG Training Among Adult and Child Neurology Residents (n = 334)

Figure 4
Figure 4 Level of Confidence Stratified by Number of Weeks of Prior EEG Training Among Adult and Child Neurology Residents (n = 334)

Table 1
Questions Included in the Routine EEG Examination Stratified by Domain Along With Correct Answers and Item Psychometrics
Abbreviations: HV = hyperventilation; NREM = non-REM; NV = normal variant; PDR = posterior dominant rhythm; POSTS = positive occipital sharp transients of sleep; RMTD = rhythmic mid-temporal theta of drowsiness; SSS = small sharp spike.

The rEEG examination was disseminated on social media from the authors' personal accounts (F.A.N., R.M., R.S., S.B.) and through email and social media by the International League Against Epilepsy (ILAE). Data were collected from June 2022 to December 2022. After data collection and study completion, the authors created a new version of the rEEG examination with instant feedback for self-paced online formative assessment. Feedback is given for all answer choices, correct and incorrect. Examination content remained unchanged, except for questions 3 and 9, where we replaced the EEG images with easier-to-visualize examples of a sharp wave and a spike, respectively. In addition, we included a more detailed scale, which was placed closer to the sharp wave/spike. These modifications, derived from what we learned from our study results, were made to ensure that respondents can measure the sharp transient duration and, therefore, determine whether the focal epileptiform discharge is a sharp wave (70-200 milliseconds) or a spike (<70 milliseconds). The formative assessment version of the rEEG examination with feedback will be made publicly available at the time of publication of this study and integrated into the online educational portfolio of the ILAE Academy (eAppendix 1, links.lww.com/NE9/A42).

Analysis
Item analysis (difficulty/accuracy and discrimination index) was performed using the Statistical Analysis System. Regression models were used to compare 3 groups (clinical neurophysiologists, EEG/epilepsy clinical fellows, and adult and child neurology residents) regarding accuracy, overall and stratified by each test domain (normal, abnormal, normal variants, and artifacts). Logistic regression models were used to compare these 3 groups regarding LOC identifying 4 categories of EEG findings (states of wakefulness/sleep, sleep structures, normal variants, and artifacts) independently/without supervision. Finally, we examined the relationships between the number of weeks of prior EEG training among (adult and child) neurology residents and their (1) mean overall scores and (2) LOC identifying the 4 categories of EEG findings. In the former analysis, we compared the mean overall scores in each group based on prior EEG weeks of training (0, 1-4, 5-8, 9-12, and >12) with the mean overall score among clinical neurophysiologists. Statistical significance was assumed for p < 0.05.

No identifying information was collected from participants. This study was approved by the institutional review board and data safety officer at the senior author's (S.B.) institution (EMN-2023-02938).

Data Availability
Study data will be made available by request from any qualified investigator. The rEEG examination with (formative) and without (summative) feedback will be publicly available on the internet.

Test Content Validity
Test content maps to a published EEG curriculum 15 and as such is divided into 4 domains: normal, abnormal, normal variants, and artifacts (Table 1). Within each domain, we tested EEG findings with the highest educational yield. 15

Test Psychometric Validity
The accuracy and discrimination index for each question are summarized in Table 1. All but 3 questions (questions 17, 18, and 24) had a discrimination index of at least 0.25.

Mean overall percentage scores (95% CI) among adult and child neurology residents stratified by number of prior EEG weeks of training were 67 (59-74) at baseline/no prior EEG training, 66 (62-69) for 1-4 weeks of prior EEG training, 72 (69-75) for 5-8 weeks, 73 (69-77) for 9-12 weeks, and 77 (74-79) for >12 weeks. The mean overall percentage score in each group was compared with the mean overall percentage score (mean [95% CI]) among clinical neurophysiologists (n = 41, 82 [77-86]).
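The item statistics reported here were computed in SAS. As a language-agnostic illustration, the sketch below computes item difficulty (the proportion answering correctly) and a discrimination index using the classical upper/lower-group method; the exact index formulation used in the study is not specified in this section, so treat this as an assumption:

```python
# Illustrative sketch only (the study used SAS): classical item analysis.
# Difficulty = proportion of all respondents answering the item correctly.
# Discrimination = accuracy in the top-scoring group minus accuracy in the
# bottom-scoring group (upper/lower 27% method, an assumed formulation).

def item_analysis(responses, group_frac=0.27):
    """responses: list of per-respondent lists of 0/1 item scores.
    Returns a list of (difficulty, discrimination) tuples, one per item."""
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    # Respondent indices ordered from lowest to highest total score.
    order = sorted(range(len(responses)), key=lambda i: totals[i])
    k = max(1, int(round(group_frac * len(responses))))
    low, high = order[:k], order[-k:]
    stats = []
    for j in range(n_items):
        difficulty = sum(r[j] for r in responses) / len(responses)
        p_high = sum(responses[i][j] for i in high) / k
        p_low = sum(responses[i][j] for i in low) / k
        stats.append((difficulty, p_high - p_low))
    return stats
```

An item everyone answers correctly (like the electrode pop and eye blink questions discussed later) has a discrimination near 0 by construction, which is why a low index can coexist with high accuracy.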

Table 2
Mean Overall and Domain-Specific Scores Among Clinical Neurophysiologists, EEG/Epilepsy Clinical Fellows, and (Adult and Child) Neurology Residents
Abbreviation: NV = normal variant. Mean score (95% CI).
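The scores in Table 2 are reported as means with 95% CIs. As a minimal sketch of how such an interval can be computed (assuming a z-based normal approximation; the authors' exact method is not stated in this section):

```python
import statistics

# Minimal sketch, assuming a normal-approximation (z-based) 95% CI;
# a t-quantile would be more appropriate for small samples.
def mean_ci95(scores):
    """Return (mean, lower, upper) for a list of percentage scores."""
    n = len(scores)
    m = statistics.mean(scores)
    se = statistics.stdev(scores) / n ** 0.5  # standard error of the mean
    return m, m - 1.96 * se, m + 1.96 * se
```

Note that the interval width shrinks with the square root of the sample size, which is why the large resident group (n = 335) has a much tighter CI (70-73) than the small expert group (n = 41, 77-86).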

Table 3
Level of Confidence Identifying 4 Categories of EEG Findings Independently/Without Supervision Among Clinical Neurophysiologists (n = 41), EEG/Epilepsy Clinical Fellows (n = 211), and (Adult and Child) Neurology Residents (N = 335)

The ramifications of these data are twofold. First, we identified a mismatch between residents' increase in accuracy and increase in LOC as they undergo EEG training in residency: the increase in LOC seems to be slower than the increase in accuracy.

Moreover, the examination included only 1 example of each EEG finding tested, although, in clinical EEG practice, each type of EEG finding may be represented by a multitude of nonidentical waveforms. Fifth, our study was not designed to assess retention of skill because all respondents were tested only once.

Future directions for our research include improving our competency-based rEEG examination by adding more examples of each of the "must-know" findings, conceivably of different levels of difficulty (based on the level of expert agreement). Furthermore, we plan to have our EEG examples labeled by a larger cohort of experts, thereby expanding the expert consensus-based gold standard. Moreover, rather than showing EEG examples as static images, we plan to create an online, publicly accessible EEG platform where learners can interact with the EEGs shown, for example, by changing the montage and modifying the sensitivity. We will also include longer EEGs containing findings of interest, where learners are able to scroll and find those EEG patterns before interpreting them. Finally, it would be helpful to have similar competency-based EEG examinations focused on other types of EEGs, such as critical care EEG, epilepsy monitoring unit EEG, and intracranial EEG.