Implementation Frameworks Applied: The Status Quo in Child, Youth and Family Services

Saturday 10:30 – 11:45 Breakout D1

Presenter: Bianca Albers

Bianca Albers, Parenting Research Centre; Robyn Mildon, Parenting Research Centre; Aron Shlonsky, University of Melbourne; Aaron Lyon, University of Washington


In recent years, growing interest in implementation across the different sectors of human services has led many researchers to develop conceptual frameworks, which can be defined as coherent sets of interlinked concepts that together constitute a generic structure for describing, understanding or guiding complex implementation processes. Examples of these frameworks include the Active Implementation Frameworks (Metz & Bartley, 2012), the EPIS Implementation Conceptual Model (Aarons, 2012), the Consolidated Framework for Implementation Research – CFIR (Damschroder, 2009) and the Quality Implementation Framework – QIF (Meyers, Durlak & Wandersman, 2012). However, the application of these frameworks in practice settings is still rare, and our knowledge about the evidence behind them is limited.

This presentation will give an overview of the results of a scoping review that aimed to

  • Identify studies related to the field of child and youth services that apply an implementation framework
  • Summarise the results generated through these studies in order to describe the current evidence behind implementation frameworks that have been applied in child and youth services
  • Discuss the common and inherent weaknesses of existing implementation frameworks that likely prevent them from rigorous empirical testing through scientific studies

The scoping review is based on a total of 831 papers, whose titles and abstracts were screened to determine whether they related to evaluations or other applications of implementation frameworks in the child, youth and family service sector.

Tracking Implementation Strategies Prospectively: A Practical Approach

Saturday 10:30 – 11:45 Breakout D1

Presenter: Alicia C. Bunger

Alicia C. Bunger, Ohio State University; Byron J. Powell, University of Pennsylvania; Hillary Robertson, Ohio State University



Descriptions of implementation strategies in the literature often lack precision and consistency, which limits replicability and slows the accumulation of knowledge. Recent publication guidelines for implementation strategies call for improved description of the activities, dose, rationale, and expected outcome of strategies. However, capturing implementation strategies with this level of detail can be challenging, as responsibility for implementation is often diffuse and strategies may be flexibly applied as new barriers and challenges emerge. Few (if any) tools are available for capturing implementation in real time. We present a practical approach to tracking implementation strategies, which we piloted in an evaluation of a multi-component intervention to improve children’s access to behavioral health services in a county-based child welfare agency. This tracking method gathers monthly accounts of the implementation team’s activity and categorizes it based upon Powell et al.’s (2012) taxonomy of implementation strategies. In addition to type of strategy, the tool captures intent, frequency of use, timing, and individuals involved. This approach allows us to monitor implementation over time, estimate “dose,” and describe the temporal ordering of implementation strategies. We will highlight the utility of this approach for implementation research and practice, discuss its limitations, and present avenues for future development.

Trained but Not Implementing: The Need for Effective Implementation Planning Tools

Saturday 10:30 – 11:45 Breakout D1

Presenter: Christopher Botsko

Christopher Botsko, Altarum Institute



One of the most common and inefficient occurrences when implementing evidence-based practices (EBPs) is that a large number of trained providers never end up delivering the practice. This presentation explores the phenomenon of failure to implement through an evaluation of the Triple P parenting support program. Two communities were provided with resources to implement Triple P through a partnership of Local Health Departments and Federally Qualified Health Centers. Data assessing progress on implementation were collected through surveys of advisory group members and providers, three years of annual interviews with key informants, focus groups with parents and providers, and program tracking. The presentation will use these data to show the progress that was made on implementation and the challenges that were faced. Project leadership was provided with an implementation framework that drew upon implementation research. Project leadership developed a better understanding of what implementation entails beyond training, but needed more practical tools and information to use the framework effectively. An implementation planning tool will be presented that was developed from feedback from the implementing organizations and other evaluation data. More efforts are needed to evaluate implementation tools that support agencies and communities implementing EBPs.