Fidelity Assessment and Improvement Cycles

On any school day, one person in every five in the United States (20% of the population) is in school.  In this massive enterprise, education happens when adult educators interact with students and with each other in education settings.

To be useful to students and functional for teachers, Implementation Teams need to know what to train, what to coach, and what performance to assess in order to make full and effective use of a selected effective practice.  Implementation Teams need to know WHAT is intended to be done (the innovation components or instructional practices) so they can efficiently and effectively develop processes to ensure high-fidelity use of the innovation now and over time.

Note that the need to specify WHAT is intended applies to current practices as well as to evidence-based or innovative practices.  Current practices may be “standard practices,” but WHAT are they?  Can they be done consistently?  Do they produce desired student outcomes when done as intended?  Standard practices must meet the same test if they are to be used, improved, and retained in education.

The PDSA Cycle

The process of establishing a fidelity assessment takes time and focused effort.  To establish fidelity assessments, Implementation Teams make intentional use of the plan, do, study, act (PDSA) cycle.  The benefits of the PDSA cycle in highly interactive environments have been evaluated across many domains including manufacturing, health, and substance abuse treatment.  As an improvement cycle, the PDSA trial-and-learning approach allows Implementation Teams to identify the essential components of the innovation itself.  The PDSA approach can help Implementation Teams evaluate the benefits of innovation components, retain effective components, and discard or de-emphasize non-essential components. 

  • Plan: The “plan” is the innovation or instruction as educators intend it to be used in practice. 
  • Do: The “plan” needs to be operationalized (what we will do and say to enact the plan) so it is doable in practice.  This compels attention to the core innovation components and provides an opportunity to begin developing a training and coaching process (e.g. here is how to do the plan) and to create a measure of fidelity (e.g. did we “do” the plan as intended?). 
  • Study: As a few newly trained educators begin working with students or others in an actual education setting, the budding fidelity measure can be used to interpret the outcomes in the “study” part of the PDSA cycle (e.g. did we do what we intended; did doing what we intended result in desired outcomes?). 
  • Act: The Implementation Team uses the experience to help develop a new plan where the essential components are even better defined and operationalized.  In addition, the fidelity assessment is adjusted to reflect more accurately the essential components and the items are modified to make the assessment more practical to conduct in the education setting.
  • Cycle: The PDSA process is repeated until the innovation and the fidelity assessment are specified well enough to meet the Usable Innovation criteria.  At that point, the innovation is ready to be used by multiple educators, the fidelity assessment is deemed practical, and the correlation between the essential components and intended outcomes is high.
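The five steps above can be sketched as a simple loop.  This is a minimal illustration only, not an implementation of any published tool: every name in it (Innovation, pdsa, meets_usable_innovation_criteria, the sample components) is hypothetical, and the “study” step is simulated rather than drawn from real classroom data.

```python
# Illustrative sketch of the PDSA improvement cycle described above.
# All class, function, and component names are hypothetical.

from dataclasses import dataclass


@dataclass
class Innovation:
    components: list       # essential components, refined each cycle
    fidelity_items: list   # draft fidelity-assessment items


def plan(innovation):
    """Plan: specify the innovation as educators intend it to be used."""
    return list(innovation.components)


def do(planned_components):
    """Do: operationalize the plan -- what educators will do and say."""
    return {c: f"operational steps for {c}" for c in planned_components}


def study(operationalized, fidelity_items):
    """Study: use the draft fidelity measure to interpret outcomes.

    Here the study is simulated: components flagged as non-essential
    are dropped, and the fidelity items are revised to match.
    """
    keep = [c for c in operationalized if "non-essential" not in c]
    revised = [i for i in fidelity_items if i in keep] or keep
    return keep, revised


def act(innovation, keep, revised_items):
    """Act: revise the plan and fidelity assessment for the next cycle."""
    innovation.components = keep
    innovation.fidelity_items = revised_items


def pdsa(innovation, meets_usable_innovation_criteria, max_cycles=10):
    """Repeat Plan-Do-Study-Act until the Usable Innovation criteria are met."""
    for cycle in range(max_cycles):
        planned = plan(innovation)
        enacted = do(planned)
        keep, revised = study(enacted, innovation.fidelity_items)
        act(innovation, keep, revised)
        if meets_usable_innovation_criteria(innovation):
            return cycle + 1  # number of cycles needed
    return max_cycles


# Hypothetical usage: one cycle discards the non-essential component.
innovation = Innovation(
    components=["explicit instruction", "guided practice",
                "non-essential worksheet"],
    fidelity_items=[],
)
cycles_needed = pdsa(
    innovation,
    lambda inn: "non-essential worksheet" not in inn.components,
)
```

In a real effort the “study” step is the slow, expensive part: it means training a few educators, observing them with the draft fidelity measure, and examining student outcomes before acting on the results.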

Implementation Teams may employ the PDSA cycle in a Usability Testing format to arrive at a functional version of an innovation that is effective in practice and can be implemented with fidelity on a useful scale (e.g. Akin et al., 2013; Fixsen et al., 2001; McGrew & Griss, 2005; Wolf et al., 1995).  Once the components of an innovation have been identified, functional analyses can be done to determine empirically the extent to which key components contribute to significant outcomes.  As noted above, the vast majority of standard practices and innovations do not meet the Usable Innovation criteria.  Implementation Teams will need to make use of PDSA improvement cycles and Usability Testing to establish the core innovation components before they can proceed with broader scale implementation.