A Model for Providing Methodological Expertise to Advance Dissemination and Implementation of Health Discoveries in Clinical and Translational Science Award (CTSA) Institutions

Saturday 1:00 – 2:15 Breakout E5

Presenter: Donald Gerke

Donald Gerke, MSW, Washington University in St. Louis; Beth Prusaczyk, MSW, Washington University in St. Louis; Ana Baumann, Ph.D, Washington University in St. Louis; Ericka Lewis, LMSW, Washington University in St. Louis; Enola Proctor, Ph.D, Washington University in St. Louis

Background: Institutions supported by CTSAs are tasked with advancing translational science. The Dissemination and Implementation Research Core (DIRC) at Washington University’s CTSA provides methodological expertise to advance scientific agendas and grant writing related to the dissemination and implementation (D&I) of health discoveries.

Methods: Strategies employed by DIRC include: providing consultation to investigators during one-on-one appointments and a weekly walk-in clinic; creating “toolkits” for each area of D&I to assist DIRC members during consultations and provide investigators with tools to strengthen their own capacity to conduct D&I research; and working with a strong team comprising master’s- and doctoral-level research assistants, each with a focus area in D&I. DIRC team-building activities include semi-monthly meetings for quality assurance and mentoring of each member’s own work in D&I research.

Results: Since its inception in 2011, the number of DIRC customers has steadily increased. In 2011, 19 investigators sought DIRC resources, followed by 29 in 2012 and 30 in 2013. Although there was a slight decrease in 2014 (N=24), as of February 2015, DIRC had assisted 50% more customers than were seen during the first two months of 2014.

Discussion: DIRC may serve as a model for other CTSAs supporting investigators in the development of translational research proposals.

Establishing a Research Agenda for the Triple P Implementation Framework

Saturday 1:00 – 2:15 Breakout E5

Presenter: Jenna McWilliam

Jenna McWilliam, Triple P International; Jacquie Brown, Triple P International

The Triple P Implementation Framework supports communities in establishing Community Implementation Systems that build the capacity for effective, sustainable program implementation. This means that programs scale up at a pace that allows for maximum community benefit.

The challenge now is how to evaluate the effectiveness of the Framework. This presentation focuses on two research projects that represent the beginning of a research agenda to evaluate the Framework.

The first research project evaluated the impact of a key component of Triple P implementation support, the Pre-Accreditation Workshop, on uptake and effective delivery. This workshop was introduced as a strategy to improve practitioners’ completion of Triple P accreditation, a critical element of implementing Triple P, with previous research showing that practitioners who complete accreditation are more likely to deliver the program.

The second research project explored the implementation experiences of practitioners, examining the relationships among practitioners’ implementation experiences within their organisations, their perceptions of implementation climate, and their use of Triple P in the last twelve months.

Implications of these research findings will be discussed both in the context of supporting the implementation of evidence-based programs and developing a research agenda to evaluate the effectiveness of implementation strategies.

Cheap and Fast, but What is “Best”? Examining Implementation Outcomes Across Sites in a State-Wide Scaled-Up Evidence-Based Walking Program

Saturday 1:00 – 2:15 Breakout E5

Presenter: Kathleen Conte

Kathleen Conte, Oregon State University

Scaling up programs through established delivery systems can accelerate dissemination and reduce costs; however, research guiding best practices for scaling up and evaluating outcomes is lacking. This mixed-method study examines outcomes of a two-year, state-wide scale-up of a simple, evidence-based walking program in relation to cost, speed, and effectiveness of implementation.

To facilitate implementation and share resources, multi-sector community partnerships were established. Partners contributed volunteer/staff time to program delivery in exchange for training and materials. Participant outcomes (n=598) were assessed via registration/satisfaction forms; scale-up outcomes were assessed via interviews with leaders (n=39), administrative reports, and observations.

In-person leader trainings (versus online) accelerated leader recruitment and initiation. Scale-up outcomes (e.g., fidelity, leader recruitment and initiation, class size, and participant retention) will be presented to describe variation in effectiveness by site type. Classes implemented by staff [OR=3.1, p<.05] and at senior centers [OR=3.0, p<.05] best retained program participants. Interviews indicated that implementation was enhanced in sites whose leaders demonstrated a clear understanding of program goals and saw the program as a good fit.

Maximizing partnerships contributed to fast and inexpensive wide-scale implementation. Engaging volunteers through personal interaction at in-person trainings improved the scope, speed, and quality of implementation. We discuss implications for managing and evaluating scaled-up program delivery.