Instrument Review Project

An update on the Society for Implementation Research Collaboration Instrument Review Project

Saturday 8:00 – 8:45

Cara C. Lewis, Cameo Stanick, Bryan J. Weiner, Heather Halko, Caitlin Dorsey

Indiana University, Bloomington, University of Montana, University of North Carolina at Chapel Hill

Significant gaps related to measurement issues are among the most critical barriers to advancing implementation science. Notably, stakeholders have had little involvement in defining pragmatic measure qualities, and the psychometric and pragmatic strength of existing measures is largely unknown. The Society for Implementation Research Collaboration Instrument Review Project aims to address these gaps by first generating a stakeholder-driven operationalization of the pragmatic measures construct. The preliminary dimensions of the pragmatic construct were delineated via inductive and deductive methods. First, a systematic literature review was conducted. All synonyms of the ‘pragmatic’ construct (e.g., ‘usefulness’) and/or dimension terms/phrases (e.g., ‘ease of scoring’) were included. Second, interviews with seven stakeholder representatives from a variety of mental health settings (e.g., inpatient, outpatient, residential, school) were conducted and qualitatively coded. The results from both methods were combined to reveal preliminary pragmatic dimensions. The literature review revealed 32 unique domains/dimensions, whereas the interviews revealed 25 domains (e.g., cost) and 11 dimensions (e.g., less than $1.00 per measure) of the pragmatic construct, as well as 16 antonyms (e.g., costly). A final list of 47 items (both domains and dimensions) was retained after removing redundant and/or confusing items. Results from the inductive and deductive methods revealed substantially more numerous and diverse pragmatic measure qualities than those articulated in the recent literature. The next phase of the project will clarify the internal structure of the pragmatic construct using concept mapping, followed by stakeholder prioritization using Delphi methodology.



Implementation Development Workshop (IDW)

SIRC implementation development workshop: Results from a new methodology for enhancing implementation science proposals

Saturday 8:00 – 8:45

Presenters: Sara J. Landes, Cara C. Lewis, Allison L. Rodriguez, Brigid R. Marriott, Katherine Anne Comtois

National Center for PTSD, University of Arkansas for Medical Sciences, Indiana University, University of Washington School of Medicine

There is a dearth of training and technical assistance opportunities in the field of implementation science. The Society for Implementation Research Collaboration (SIRC) developed the Implementation Development Workshop (IDW) to provide critical and rich feedback that enhances the rigor and relevance of proposals in development. This highly structured and facilitated IDW is based on the Behavioral Research in Diabetes Group Exchange (BRIDGE) model and was modified by SIRC for delivery in two formats, face-to-face and virtual. A mixed methods approach was used to evaluate the effectiveness and acceptability of the IDW and compare the two formats. IDW participants (N=38) completed an anonymous quantitative survey assessing perceptions of the IDW. Presenters (N=13) completed a funding survey to assess grant submission and funding success. Qualitative interviews were conducted with IDW participants who participated in both formats (N=8). Face-to-face and virtual participants agreed that they had a better understanding of implementation science principles and methods and thought they could apply what they learned. Of the presenters who completed the survey, 83% submitted their proposal for funding; of those who submitted, 40% received funding and 27% plan to resubmit. There was a preference for the face-to-face format; however, both formats were deemed acceptable and satisfactory. Qualitative interviews indicated that the structured process of the IDW appeared to impact acceptability (e.g., clear structure, facilitator, note taker). Results indicated that participants found IDWs helpful and both formats acceptable. SIRC will continue to host and evaluate IDWs in both formats.