  • Study protocol
  • Open Access
  • Open Peer Review

Using NIATx strategies to implement integrated services in routine care: a study protocol

BMC Health Services Research201818:431

https://doi.org/10.1186/s12913-018-3241-4

  • Received: 2 May 2018
  • Accepted: 27 May 2018
  • Published:

Abstract

Background

Access to integrated services for individuals with co-occurring substance use and mental health disorders is a long-standing public health issue. Integrated treatment services are both more effective and preferred by patients and families compared with parallel or fragmented care. National policy statements and expert consensus guidelines underscore the benefits of integrated treatment. Despite decades of awareness, adequate treatment for individuals with co-occurring substance use and mental health disorders occurs infrequently. The underlying disease burden associated with alcohol, illicit and prescription drug problems, as well as mental health disorders such as depression, posttraumatic stress disorder and schizophrenia, is substantial.

Methods

This cluster randomized controlled trial (RCT) is designed to determine if the multi-component Network for the Improvement of Addiction Treatment (NIATx) strategies are effective in implementing integrated services for persons with co-occurring substance use and mental health disorders. In this study, 50 behavioral health programs in Washington State will be recruited and then randomized into one of two intervention arms: 1) NIATx implementation strategies, including coaching and learning sessions over a 12-month intervention period to implement changes targeting integrated treatment services; or 2) wait-list control. Primary outcome measures include: 1) fidelity - a standardized organizational assessment of integrated services (Dual Diagnosis in Addiction Treatment [DDCAT] Index); and 2) penetration - proportion of patients screened and diagnosed with co-occurring disorders, proportion of eligible patients receiving substance use and mental health services, and psychotropic or substance use disorder medications. Barriers and facilitators, as determinants of implementation outcomes, will be assessed using the Consolidated Framework for Implementation Research (CFIR) Index. Fidelity to and participation in NIATx strategies will be assessed utilizing the NIATx Fidelity Scale and Stages of Implementation Completion (SIC).

Discussion

This study addresses an issue of substantial public health significance: the gap in access to an evidence-based practice for integrated treatment for individuals with co-occurring mental health and substance use disorders. The study utilizes rigorous and reproducible quantitative approaches to measuring implementation determinants and strategies, and may address a longstanding gap in the quality of care for persons with co-occurring disorders.

Trial registration

ClinicalTrials.gov NCT03007940. Registered 2 January 2017 (retrospectively registered).

Keywords

  • Co-occurring disorders
  • Integrated treatment
  • NIATx implementation strategies
  • DDCAT

Background

Access to integrated treatment services for individuals with co-occurring substance use and mental health disorders is a longstanding problem in behavioral health care [1, 2]. Providing mental health and substance use services during the same treatment episode, by the same clinical provider, reflects national policy statements and expert consensus guidelines that underscore the benefits of integrated treatment [1–3]. Integrated care is also preferred by patients and families [4]. However, a significant gap remains between the availability of “one-stop” integrated services and their actual receipt by individuals with co-occurring disorders. Despite increased awareness, adequate integrated treatment for these individuals occurs infrequently [5, 6]. In the United States, only 18% of specialty addiction programs and 9% of mental health programs offer integrated services [7]. Moreover, availability does not guarantee receipt: consumers with co-occurring disorders report receiving integrated services only 7 to 9% of the time [8, 9], and it is unclear whether even these individuals had both disorders addressed in treatment at the same time.

The current system represents an undesired but chronic, systemic problem for policymakers and treatment providers, and even more so for families and individuals suffering from co-occurring disorders [6, 10–12]. The resulting fragmented system of care requires multiple provider interactions, and integrated care is almost non-existent. The disease burden associated with co-occurring disorders represents a substantial public health concern [13–20], and inadequate access to effective integrated treatment results in poorer public health and societal outcomes [21–25]. The impact on the United States healthcare system is significant: by 2020, annual US expenditures for co-occurring substance use and mental health disorders are projected to reach $281 billion [26].

Despite these facts, integrated services for individuals with co-occurring disorders are not being widely implemented in behavioral health organizations [27, 28]. Research on effective implementation of evidence-based approaches to integrated treatment for co-occurring disorders is sorely needed [29–31]. This study addresses that gap using an implementation science approach.

Conceptual model and theoretical justification

Implementation science holds the methodological key to the effective implementation of evidence-based approaches to integrated treatment for co-occurring disorders. This research utilizes objective measures across three types of frameworks (determinant, evaluative and process) outlined by Per Nilsen [32] to create a conceptual unified implementation research model (Fig. 1).
Fig. 1

Unified Conceptual Model. The model outlines the integration and use of objective measures across three frameworks: determinant (Consolidated Framework for Implementation Research); evaluative (Proctor’s implementation outcome taxonomy); and process (Stages of Implementation Completion [SIC]), with NIATx implementation strategies, to implement integrated services for co-occurring disorders in community addiction treatment programs

The Consolidated Framework for Implementation Research (CFIR), a determinant framework, articulates factors impacting the success or failure of an implementation strategy [33–39]. In this study, we will focus on four CFIR dimensions (Outer Setting, Inner Setting, Characteristics of the Intervention, and Characteristics of the Individual) that are particularly salient during pre-implementation of organizational change. However, the absence of a quantitative measure is a limitation of the CFIR [40, 41]. Therefore, for this study, we will use a quantitative instrument we developed, the CFIR Index, which operationalizes CFIR items across these four dimensions to assess potential facilitators of or barriers to the implementation process.

Proctor’s implementation taxonomy represents an evaluative framework that differentiates between implementation, service and patient outcomes [42]. This study will focus on implementation (fidelity and penetration) and patient care outcomes. The final component examines participation in implementation strategies, an area that has suffered from imprecise definitions, descriptions, documentation and terminology [43–47]. The study will use the Stages of Implementation Completion (SIC) as a process framework to assess implementation strategy participation by tracking a list of milestone activities and measuring the proportion of completed activities and the duration (time) to completion [48–50]. This study will adapt the SIC to assess program completion of the NIATx implementation strategy.

NIATx implementation strategy

The NIATx implementation strategies will be incorporated into this conceptual model (see Step 4 in Fig. 1) to determine the effectiveness of NIATx in implementing integrated services for persons with co-occurring substance use and mental health disorders. NIATx combines process improvement tools and techniques (e.g., the consumer-centered walk-through and PDSA rapid change cycles) with quality improvement interventions such as coaching, learning sessions, and “interest circle” calls [51–53]. NIATx implementation strategies have been widely adopted and successfully utilized to improve and sustain access to care and addiction medications [54–59]. The NIATx200 study dismantled the NIATx implementation strategy into three components (learning sessions, coaching, and interest circle calls), delivered alone or in combination, to determine the key elements for improving wait time, admissions and retention [60]. This dismantling found important differences by implementation strategy and outcome; however, it was in effect an “intent-to-treat” analysis [61]: fidelity to, and variation within, each component was not assessed.

Two studies provide evidence for NIATx as an effective implementation strategy to improve access to integrated treatment for individuals with co-occurring disorders. The use of unspecified “NIATx-like” implementation strategies (e.g., PDSA cycles, change champion and team, coach, and process/outcome measurement) in 54 treatment agencies across five states significantly predicted changes in DDCAT Total Score [62]. In an “open-label” single-group repeated measures design, eight community addiction treatment agencies received expert NIATx support over a six-month timeframe. Measures included pre and post DDCAT assessments and changes in Addiction Severity Index (ASI) substance use and psychiatric severity scores. Seven of the eight agencies made significant improvements in integrated service capacity over six months (change in DDCAT Total Score: 0.5 to 0.8), and patient-level data (n by program: 19 to 588) revealed corresponding positive changes in ASI drug, alcohol and psychiatric severity composite scores [63]. These studies provide a compelling signal, but more rigorous and controlled research is needed to establish, with scientific confidence, the use of NIATx to integrate services. This study addresses that gap by determining whether NIATx strategies are effective in implementing integrated services for persons with co-occurring substance use and mental health disorders.

Methods/study design

Overview

The project represents a collaboration between Stanford University, University of Wisconsin-Madison, and the Division of Behavioral Health and Recovery (DBHR) located in the Washington State Department of Social and Health Services. The study uses a cluster randomized wait-list control group design. Fifty community-based addiction treatment programs located in the State of Washington will be assigned to one of two cohorts (Fig. 2) during an index 12-month period: 1) NIATx implementation strategies (Cohort 1), or 2) wait-list control (Cohort 2).
Fig. 2

NIATx Implementation Strategy Study Design. The community programs are randomized to NIATx (Cohort 1) or wait-list control (Cohort 2), with four data collection time points

The programs will use the NIATx implementation strategy to implement changes targeting integrated treatment services. The study will assess the effectiveness of the NIATx implementation strategies in improving integrated services for persons with co-occurring substance use and mental health disorders. We hypothesize that, relative to the wait-list, NIATx strategies will improve implementation (penetration and fidelity) and patient care outcomes (Aims 1 and 2). Variation in the extent of, and fidelity to, the NIATx implementation strategies will be examined across the entire sample (Aim 3). The specific aims and hypotheses are detailed in Table 1. Figure 3 shows the overall study timeline. Recruitment began in April 2016, and the active intervention period for Cohort 2 ends in June 2018.
Table 1

Study Specific Aims and Hypotheses

Aim | Hypotheses
Specific Aim 1: Relative to wait-list, to determine if NIATx strategies improve implementation fidelity outcomes. | H1: NIATx strategies will produce increased integrated service fidelity at the program level.
Specific Aim 2: Relative to wait-list, to determine if NIATx strategies improve implementation penetration outcomes. | H2: NIATx strategies will produce increased penetration rates in integrated services, evidenced by the proportion of program patients screened, diagnosed and receiving integrated medication and psychosocial services.
Specific Aim 3: Across the entire sample, to evaluate variation in the extent of and fidelity to NIATx strategies. | H3: Programs with more facilitating factors, as articulated by the Consolidated Framework for Implementation Research (CFIR) Index dimensions, will be more likely to complete the requisite tasks of the NIATx protocol and to do so with greater fidelity.

Fig. 3

NIATx Implementation Study Project Timeline. The study project timeline is organized by activities associated with a) project launch, b) cohort 1, c) cohort 2, and d) overall project activities over the five-year study period

NIATx implementation strategy

Programs in the NIATx intervention will be assigned a NIATx trained process improvement coach who leads the active implementation phase. Over a 12-month active implementation period, the coach works with executive directors, change leaders and teams. Coaching includes a one-day site visit and individual monthly phone conferences (10 h total) with each program.

Prior to the site visit, the coach will introduce the project, review initial DDCAT results, discuss how to conduct a walk-through [64] and set the stage for the site visit. The site visit will use a standardized agenda to ensure fidelity. During the visit, the coach will meet with executive leadership, review the walk-through and DDCAT assessment results, and train staff on the use of the NIATx implementation strategies. With their coach, the program will utilize results from the DDCAT assessment to identify areas for improvement, implement change projects and assess their impact.

After the site visit, the coach will conduct twice-monthly coaching calls with their assigned programs during the first quarter of the implementation period, and monthly calls thereafter. On the individual calls, the coach and change team will review change projects, discuss successes, and identify new change projects. In addition to individual coaching calls, the coach provides support through learning sessions and group coaching calls.

Two group calls, moderated by the coach, will involve change leaders from multiple programs and provide an opportunity for peer-to-peer sharing. On these calls, the change leaders will discuss common change-related issues, progress, and exchange innovative implementation strategies with their peers. The calls will also allow the coach to share new strategies and discuss implementation issues such as sustainment of organizational change.

The study will include two one-day, coach-led learning sessions for all programs within a cohort. Learning sessions promote peer-to-peer sharing about specific goals and objectives using a tailored agenda. The first learning session will teach programs how to use NIATx process improvement strategies through skill development activities, such as identifying change opportunities, developing PDSA cycles, and using data effectively to drive change. The second learning session will include program presentations about change efforts and discuss how to develop sustainment plans to continue improving integrated services.

Coach supports will ensure that NIATx is delivered with fidelity to all participating programs. Supports include: a one-day coach training session at the study start to review objectives, provide a NIATx refresher, and review how to interpret DDCAT results when designing change projects; a standardized site visit agenda; and a standardized coach report to capture program interactions. In addition, the coaches will participate in monthly calls with the PI (Dr. Ford) to review progress, discuss issues, receive advice from peers, share promising practices, and clarify any research issues.

Eligibility and recruitment

Programs will be recruited from the population of 486 licensed addiction treatment programs in Washington State. Eligibility criteria include: offering outpatient and/or intensive outpatient services; tax-exempt or government status, or at least 50% publicly funded; and no prior participation in NIATx research studies. Public mental health and private addiction treatment programs were excluded because they are not required to use the state clinical information system and therefore cannot provide the necessary standardized project data. A study recruitment letter will be created and distributed by Division of Behavioral Health and Recovery staff to all eligible programs.

Randomization

The randomization sequence will be generated by the study biostatistician and concealed from the researchers conducting study assessments. An equal number of programs will be randomized to each study arm. After the baseline DDCAT assessment is completed, each program will be notified of its assignment to the intervention (Cohort 1) or the wait-list control group (Cohort 2). The coaches will not be blinded to the results of randomization, as they are assigned to Cohort 1 programs once the baseline DDCAT assessment is completed.

Data collection/variables

The proposed research will explore the impact of the NIATx implementation strategies on positive changes in an implementation fidelity outcome (Aim 1) assessed by the Dual Diagnosis Capability in Addiction Treatment (DDCAT) Index. The DDCAT (Version 4.0) is a 35-item observational benchmark measure of program-level dual diagnosis capability. Items are rated on a 5-point scale by degree of integration to generate a total score and scores on seven dimensions [62, 65, 66]. Two studies provide evidence that improvements in, or higher, DDCAT scores affect patient outcomes. In a study of 185 substance abuse providers, individuals receiving treatment in clinics with higher DDCAT scores had significantly longer lengths of stay and attended, on average, four additional treatment sessions (the latter not statistically significant) [66]. Results from an “open-label” single-group repeated measures design (n = 8 community addiction treatment agencies) found that overall DDCAT scores increased by an average of 0.56 points, with corresponding changes in standardized Addiction Severity Index composite severity scores for the psychiatric (μ = 0.034 ± 0.075), alcohol (μ = 0.007 ± 0.120) and drug (μ = 0.014 ± 0.091) problem categories [63]. The DDCAT will be assessed for all participating programs at four distinct time points (Table 2).
Table 2

Implementation and Fidelity Measures and Frequency of Data Collection

Aim | Construct | Measure | Baseline | Post-Implementation | Sustainment Period 1 | Sustainment Period 2
1 | Integrated Services: Fidelity | Dual Diagnosis Capability in Addiction Treatment (DDCAT) Index1 | X | X | X | X
2 | Integrated Services: Patients Screened | Proportion of program patients screened using the GAIN Short Screener (GSS) | X | X | X | X
2 | Integrated Services: Medications | Number of patients receiving a psychotropic or substance use disorder medication | X | X | X | X
2 | Integrated Services: Chemical Dependency Services | Number of patients receiving chemical dependency services | X | X | X | X
2 | Integrated Services: Mental Health Services | Number of patients receiving mental health services | X | X | X | X
3 | Program Facilitators and Barriers to Implementation | Consolidated Framework for Implementation Research (CFIR) Index1 | X | X | X | X
3 | NIATx Stages of Implementation Completion2 | NIATx strategy fidelity and extent of and duration to complete activities (SIC) | | X | |
3 | NIATx Fidelity Scale3 | NIATx strategy fidelity and extent of and duration to complete activities (SIC) | | X | |
1 | Program characteristics1 | Program Size (Admissions), Program Type, ASAM Levels of Care, Payment Sources | X | X | X | X
2 | Patient Characteristics | Age, Gender, Race, Ethnicity | X | X | X | X

1 State of Washington staff will be trained by the co-PI (McGovern) to conduct DDCAT and CFIR Index assessments. Two-person teams will schedule and conduct each DDCAT assessment. Additional training or consultation will help answer questions identified at the site visits. Program characteristics are collected during the DDCAT assessments

2 Data will be collected by State of Washington staff and NIATx coaches using standardized instruments for each program participating in the study

3 NIATx Fidelity scoring (a total score and 7 subscale scores organized by preparation, implementation and sustainment phases) will be assessed by two-person teams at the end of the active implementation period for each program. Data sources will include a composite of interviews, review of walk-through results, change project forms, coach notes and sustainability plans

At the time of the study, Washington State was transitioning to Managed Care Organizations (MCOs) to pay for the delivery of substance use disorder (SUD) and mental health (MH) services. The transition involved integrating data from two systems, 1) the Treatment and Assessment Reports Generation Tool (TARGET), covering SUD clients and services, and 2) the Mental Health Consumer Information System (MH-CIS), covering community MH clients and services, into a new Behavioral Health Data System (BHDS). The study will leverage the State of Washington’s experience in utilizing standardized state-wide clinical management information databases for addiction and mental health treatment [67–70]. Implementation penetration outcomes will assess changes in the proportion of patients screened, diagnosed and receiving integrated psychosocial or medication services (Aim 2). The services data will be extracted from the BHDS as well as the TARGET and MH-CIS legacy systems, and will include de-identified client-level data for all patient admissions to study programs within a 45-day window before and after each DDCAT assessment date (Additional file 1). The information will be transferred to the study team using appropriate security protocols.
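As an illustrative sketch only (not the study's actual extraction code), selecting admissions within the 45-day window around each program's DDCAT assessment date might look like this in pandas; the program IDs and dates are hypothetical:

```python
import pandas as pd

# Hypothetical admissions extract and per-program DDCAT assessment dates.
admissions = pd.DataFrame({
    "program_id": [1, 1, 1, 2],
    "admit_date": pd.to_datetime(
        ["2017-01-10", "2017-03-20", "2017-06-01", "2017-02-01"]),
})
ddcat_dates = {1: pd.Timestamp("2017-03-01"), 2: pd.Timestamp("2017-02-10")}

# Keep admissions within 45 days before or after the program's index date.
index_date = admissions["program_id"].map(ddcat_dates)
in_window = (admissions["admit_date"] - index_date).abs() <= pd.Timedelta(days=45)
cohort = admissions[in_window]
print(len(cohort))  # 2
```

The absolute-difference comparison captures both sides of the window in one expression, mirroring the protocol's "45 days before and after" rule.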

The CFIR Index operationalizes four CFIR dimensions (Characteristics of the Intervention, Outer Setting, Inner Setting, and Characteristics of Individuals) as an objective rating scale for evaluating pre-implementation factors as moderators of the implementation process over time, and as factors in sustainability (Aim 3). The index has good preliminary psychometric properties [71, 72]. Summary ratings from the CFIR Index dimensions may predict fidelity to, and extent of completion of, the NIATx strategies (Aim 3). Data collection for the CFIR Index will follow the same schedule as the DDCAT assessment.

Fidelity to and participation in the NIATx implementation strategies will be assessed using two exploratory scales developed specifically for this study: the NIATx Fidelity Scale and the NIATx Stages of Implementation Completion (NIATx SIC). The NIATx Fidelity Scale includes 19 items designed to assess adherence to the NIATx model on a 5-point scale from 1 (no evidence) to 5 (extensive evidence). The NIATx SIC is a modified version of the SIC and is organized into three phases: Pre-implementation, Implementation and Sustainment (Table 3). Program-driven activities will be scored and count toward both duration (number of days) and proportion (number of scored activities completed / total number of scored activities possible) within a given phase of the NIATx SIC. These scales will be used to assess variation in the extent of, and fidelity with which, NIATx strategies are delivered.
Table 3

Overview of the NIATx Stages of Implementation Completion

NIATx SIC Phases and Stages in Each Phase | # of Items | Examples of NIATx SIC Elements
Program Characteristics | 17 | Program Size, Type, Primary Focus
Pre-Implementation Phase, Stage 1: Engagement | 6 | Invite Date, Contacts before Accept
Pre-Implementation Phase, Stage 2: Consideration of Feasibility | 8 | DDCAT Assessment Date, Contacts
Pre-Implementation Phase, Stage 3: Readiness Planning | 18 | Initial Coach Engagement & NIATx Webinar, Change Leader Appointed
Implementation Phase, Stage 4: Staff Hired and Intro Training | 5 | Change Team Identified, Coach Site Visit
Implementation Phase, Stage 5: Fidelity Monitoring & Tracking in Place | 4 | Review of Walkthrough, Project Selection
Implementation Phase, Stage 6: Services & Consultation to Services Begin | 4 | Collect Baseline Data, Start Change Project
Implementation Phase, Stage 7: Model Fidelity & Staff Competence & Adherence Tracked | Varies by Program | Change Projects, Change Cycles per Project, Coaching Calls, Peer-to-Peer Meeting Attendance
Sustainability Phase, Stage 8: Competency | 10 | Continued Use of NIATx Implementation Strategies, NIATx Fidelity Score
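The SIC's proportion and duration scoring (completed activities divided by possible activities, and days elapsed between the first and last completed activity in a phase) can be sketched as follows; the activity names and dates are hypothetical, not actual NIATx SIC items:

```python
from datetime import date

# Hypothetical activity log for one program's Implementation phase.
# None marks a scored activity that was never completed.
activities = {
    "change_team_identified": date(2017, 2, 1),
    "coach_site_visit": date(2017, 2, 15),
    "walkthrough_reviewed": date(2017, 3, 1),
    "first_change_project_started": None,
}

completed = [d for d in activities.values() if d is not None]
proportion = len(completed) / len(activities)            # completed / possible
duration_days = (max(completed) - min(completed)).days   # time to completion
print(proportion, duration_days)  # 0.75 28
```

Scoring each phase separately, as the protocol describes, would simply repeat this calculation over three such activity dictionaries.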

A self-reported program operational survey will collect information about average staff hourly salaries (baseline and follow-up) and staff time and costs spent on NIATx implementation, as well as information about the impact on operational revenues and costs (follow-up). The survey information will be used in the economic analysis evaluating implementation activity and resource costs.

Data and power analysis

The analyses will include a quantitative assessment of Aims 1 to 3 and an economic cost analysis. Table 1 outlines the study aims and hypotheses. The data and power analysis approaches are presented in sequence for each aim by hypothesis for the experimental comparison only, as the pooled-group analyses have inferential limitations. Power was assessed using SamplePower 3.0 [73].

Specific aim 1

For Hypothesis 1, the program is the unit of analysis; DDCAT (fidelity) scores are the dependent variable and cohort assignment (NIATx vs. wait-list) is the independent variable. A two-tailed analysis of covariance (α = 0.05) with 25 programs per cohort will compare post-implementation mean fidelity scores between groups, with pre-implementation scores as the covariate. Assuming a correlation of 0.5 between pre- and post-implementation DDCAT scores, this analysis has 86% power to detect a large effect. Prior research concerning one-year changes in DDCAT scores suggests that a large effect can be expected [62].
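The ANCOVA described above can be sketched on simulated data; this is a minimal illustration with hypothetical effect sizes, not the study's analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated DDCAT scores: 25 programs per arm, large hypothetical effect.
rng = np.random.default_rng(42)
n = 25
df = pd.DataFrame({
    "group": np.repeat(["niatx", "waitlist"], n),
    "pre_ddcat": rng.normal(2.5, 0.5, 2 * n),
})
effect = np.where(df["group"] == "niatx", 0.6, 0.0)  # intervention boost
df["post_ddcat"] = 0.5 * df["pre_ddcat"] + effect + rng.normal(0, 0.3, 2 * n)

# ANCOVA: compare post-implementation scores by group,
# adjusting for the pre-implementation covariate.
fit = smf.ols("post_ddcat ~ pre_ddcat + C(group)", data=df).fit()
print(fit.params["C(group)[T.waitlist]"])  # negative: wait-list scores lower
print(fit.pvalues["C(group)[T.waitlist]"])
```

Modeling the group contrast with the baseline score as a covariate is the standard regression formulation of ANCOVA.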

Specific aim 2

The date of the DDCAT assessment will serve as the index date for Hypothesis 2 (Aim 2). All patients admitted to each program within 45 days before and 45 days after the DDCAT assessment date will be extracted from the state administrative databases. The outcomes are the proportions of program patients: 1) screened, 2) diagnosed, and 3) receiving integrated medication and psychosocial services, compared with ad hoc strategies (wait-list comparison sites). Each outcome has a value of 1 (yes) or 0 (no), and interest lies in the difference in the rate of each outcome, accounting for the clustering of observations within sites. This calls for a multi-level logistic regression model. The observations at each time point are independent, so this is not a longitudinal (repeated measures) analysis. Instead, there are four independent groups (NIATx pre-implementation, NIATx post-implementation, wait-list pre-implementation, and wait-list post-implementation) in a 2 (group) by 2 (time) analysis, with primary interest in the group-by-time interaction. Given the large number of observations, a logistic regression has power of 0.75 to detect a small effect (OR = 1.5; 0.10 difference in proportions) and power of 1.0 to detect a medium effect (OR = 2.33; 0.20 difference in proportions). These power estimates are based on standard logistic regression. We will apply a correction to the standard errors to adjust for the intraclass correlation at the site level, avoiding Type I errors due to the dependence of clustered observations [74].

Specific aim 3

Aim 3 will evaluate variation in the extent of, and fidelity to, NIATx strategies. The specific hypothesis is that programs with more facilitating factors, evaluated using the CFIR Index dimensions, will be more likely to complete the NIATx protocol and to do so with greater participation and fidelity. The programs are the unit of analysis, and the primary predictor variable is the number of factors that support implementation. There are two dependent variables: a continuous variable indicating the proportion (%) of the 22 NIATx tasks completed (SIC), and a continuous variable indicating the degree of fidelity to the NIATx protocol (1 to 5-point scale). Multiple regression analyses will be used for both outcome variables. Site characteristics associated with the outcomes will be added as covariates in the regression models to evaluate the effect of the CFIR Index dimensions (e.g., Perceptions of the System and Community Score) after controlling for other predictors. For a two-tailed multiple regression analysis (α = 0.05) across 50 programs, with 5% of the variance explained by the covariates, there is 79% power to detect a change in R² of 15% and 82% power to detect a change in R² of 16% when adding the primary predictor to the model.
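The change-in-R² test described above, adding the primary CFIR predictor to a covariate-only model, can be sketched as follows; the data are simulated and the variable names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated program-level data (n = 50 programs); effect sizes hypothetical.
rng = np.random.default_rng(1)
n = 50
df = pd.DataFrame({
    "program_size": rng.normal(0, 1, n),        # covariate
    "cfir_facilitators": rng.normal(0, 1, n),   # primary predictor
})
df["sic_completion"] = (60 + 5 * df["program_size"]
                        + 8 * df["cfir_facilitators"] + rng.normal(0, 10, n))

# Hierarchical regression: change in R-squared from adding the primary predictor.
base = smf.ols("sic_completion ~ program_size", df).fit()
full = smf.ols("sic_completion ~ program_size + cfir_facilitators", df).fit()
print(round(full.rsquared - base.rsquared, 3))
```

The same two-model comparison would be repeated with the NIATx fidelity score as the second dependent variable.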

Cost analysis

The economic cost analysis consists of two components. First, we will measure the costs required to support participation in the implementation strategies (active and wait-list). Second, the cost analysis will examine potential changes in program finances (revenue and expenses) associated with delivering more integrated services. This program-level perspective contrasts with the full economic costs of integrating services, which include costs not faced directly by the program; savings that accrue to outside entities are not part of any program’s operating budget and thus do not make interventions more feasible unless those savings are shared with the program. The cleanest and most comprehensive measure of the cost to a program of providing integrated treatment is the pre-post (NIATx) change in total costs, netting out any increased revenue or adding in any lost revenue. We will replicate our earlier successful collection of cost, revenue, and admission information from each program [75]. For both study arms, we will collect archival cost information for two years before the intervention (pre-randomization) and two years from the start of intervention implementation (post-randomization). The additional cost of NIATx will thus be the difference between the pre-post change in costs for NIATx programs and the pre-post change in costs for control programs. The change in costs will be calculated as:

Pre-post change in net costs = [(Total Costs)post − (Total Costs)pre] − [(Total Revenue)post − (Total Revenue)pre]

Improved integrated services could generate additional revenue (reflected as decreased net costs) or lost revenue (reflected as additional net costs). Dividing each component by the number of admissions in that period yields the net cost per admission. The economic analysis will identify important sustainability implications and could influence future stakeholder implementation decisions [76].
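A minimal sketch of the net-cost-per-admission calculation implied by the formula above, using hypothetical figures in which integration raises both costs and billable revenue:

```python
def net_cost_change_per_admission(costs_pre, costs_post,
                                  revenue_pre, revenue_post,
                                  admissions_pre, admissions_post):
    """Pre-post change in net cost per admission: each period's total
    costs, net of revenue, is normalized by that period's admissions."""
    net_pre = (costs_pre - revenue_pre) / admissions_pre
    net_post = (costs_post - revenue_post) / admissions_post
    return net_post - net_pre

# Hypothetical program figures (dollars and admission counts).
change = net_cost_change_per_admission(
    costs_pre=500_000, costs_post=540_000,
    revenue_pre=450_000, revenue_post=510_000,
    admissions_pre=400, admissions_post=420)
print(round(change, 2))  # -53.57: net cost per admission fell
```

A negative value indicates the revenue gain outpaced the cost increase, the scenario in which integration improves, rather than burdens, program finances.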

Dissemination policy

Irrespective of the magnitude or direction of NIATx strategy effect, we will disseminate study findings. Dissemination efforts will include presentations at professional scientific conferences and publication in peer-reviewed journals with the highest impact factor possible. Additionally, we will seek to ensure the project’s publications are open access (i.e., available online to readers without financial, legal, or technical barriers beyond those inseparable from gaining access to the internet).

Discussion

The use of NIATx implementation strategies has spread beyond efforts to improve access and retention, to organizational change efforts to reduce psychiatric readmissions [77], support implementation of evidence-based practices such as Seeking Safety [78] and trauma-informed care [79], and improve no-show rates [80]. In addition, NIATx or NIATx-like implementation strategies have been used to improve processes of care in drug courts [81] and to explore the impact of feedback reports in residential treatment organizations [82]. Like the original NIATx studies, these efforts apply NIATx implementation strategies in a single setting (e.g., mental health) to target a specific outcome (e.g., no-shows).

Recent studies have explored how NIATx or NIATx-like implementation strategies support organizational change efforts for more complex patients in specialized environments. Examples include changes targeting HIV treatment in correctional settings [83–86] and the implementation of evidence-based prevention practices for older adults in community health settings [87, 88]. Other studies have integrated NIATx implementation strategies with external policy and regulatory levers to improve access to medications for alcohol and opioid use disorders [59, 89, 90].

The proposed study represents a substantial advance in addressing a gap in the existing NIATx implementation research. It is the first true test of NIATx for implementing complex, rather than simple, treatment services in substance use programs, and it evaluates how NIATx implementation strategies improve services for individuals with co-occurring disorders. The study accomplishes this objective by unifying and operationalizing objective measures across three types of implementation frameworks (determinant, evaluative, and process) to address a longstanding gap in the quality of care for persons with co-occurring disorders. Specifically, it explores the relationship between the use of NIATx strategies and both implementation and patient-level outcomes. It is also the first study of NIATx implementation strategies to include a specific aim to precisely document the fidelity with which NIATx is delivered: the modified NIATx Stages of Implementation Completion will measure the extent to which the purportedly essential components of NIATx are completed. Neither the NIATx research platform nor community users have thus far been subjected to so rigorous a level of scientific inquiry. Findings from this research can be immediately applied to improve clinical services, advance implementation research, and expand and guide research in other systems and settings.

Trial status

The trial has been determined to not involve human subjects research. As of 18 January 2018, 53 addiction treatment agencies volunteered or were recruited to be in the study, and 49 were randomized.

Abbreviations

ASI: Addiction Severity Index

BHDS: Behavioral Health Data System

CFIR: Consolidated Framework for Implementation Research

DBHR: Division of Behavioral Health and Recovery

DDCAT: Dual Diagnosis Capability in Addiction Treatment

GSS: GAIN Short Screener

MCO: Managed Care Organization

MH: Mental Health

MH-CIS: Mental Health Consumer Information System

NIATx: Network for the Improvement of Addiction Treatment

NIDA: National Institute on Drug Abuse

RCT: Randomized Controlled Trial

SIC: Stages of Implementation Completion

SPIRIT: Standard Protocol Items: Recommendations for Interventional Trials

SUD: Substance Use Disorder

TARGET: Treatment and Assessment Reports Generation Tool

Declarations

Acknowledgements

We acknowledge the staff of the Washington Department of Social and Health Services who conducted the DDCAT and CFIR assessments for this study. We express our appreciation to the substance abuse clinics in Washington State that are participating in the study. We also thank members of the original research team (Mark Zehner and Chantal Lambert-Harris) as well as the NIATx coaches (Elizabeth Strauss, Denna Vandersloot, and Janet Bardossi) for their contributions to the study.

Funding

The work was supported by the National Institute on Drug Abuse (NIDA; R01 DA037222-01A1; PIs McGovern and Ford). NIDA had no role in the design of this study and will not have any role during its execution, analyses, interpretation of the data, or decisions to submit results. The content is solely the responsibility of the authors and does not necessarily represent the official views of the government.

Availability of data and materials

The data related to the implementation outcomes that support the findings of this study are available from the Washington Department of Social and Health Services but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. The datasets generated during and/or analyzed during the current study related to the DDCAT, CFIR or NIATx SIC are available from the corresponding author on reasonable request.

Authors’ contributions

Study conceptualization and design including efforts to design and implement data collection tools were led by MPM and JHF. MA, EO and AK contributed to the acquisition of data. KC contributed to the conception and design of the patient data collection and developed the programming to support the acquisition of the study patient outcome data. AM contributed to the conceptual design of the NIATx intervention approach. All authors were involved in developing and editing the manuscript and have given final approval of the submitted version.

Ethics approval

The study has been categorized as exempt by the Institutional Review Board at Stanford University, the Health Sciences Institutional Review Board at the University of Wisconsin-Madison (#2016–0438) and the State of Washington Department of Social and Health Services Institutional Review Board (E-040716-S).

Competing interests

Dr. Ford is the sole proprietor of JHFHealthcare Consulting, LLC, which provides consultancy services to the California Department of Justice. The remaining authors have no conflicts of interest related to this work.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
School of Pharmacy – Social and Administrative Sciences Division, University of Wisconsin – Madison, 777 University Ave, Madison, WI 53705, USA
(2)
Office of Behavioral Health and Managed Care, Division of Behavioral Health and Recovery, Washington State Department of Social and Health Services, Olympia, WA 98504, USA
(3)
Center for Behavioral Health Services and Implementation Research, Division of Public Health & Population Sciences, Department of Psychiatry & Behavioral Sciences, Stanford University School of Medicine, 1520 Page Mill Road, Palo Alto, CA 94304, USA
(4)
Office of Behavioral Health and Prevention, Division of Behavioral Health and Recovery, Washington State Department of Social and Health Services, Olympia, WA 98504, USA
(5)
Washington State Health Care Authority, Olympia, WA 98501, USA
(6)
Division of Public Mental Health & Population Sciences, Department of Psychiatry & Behavioral Sciences, Division of Primary Care and Population Health, Department of Medicine, Stanford University School of Medicine, 1520 Page Mill Road MC5265, Palo Alto, CA 94304, USA

References

  1. Clark HW, Power AK, Le Fauve CE, Lopez EI. Policy and practice implications of epidemiological surveys on co-occurring mental and substance use disorders. J Subst Abus Treat. 2008;34:3–13.
  2. Institute of Medicine, Committee on Crossing the Quality Chasm: Adaptation to Mental Health and Addictive Disorders. Improving the quality of health care for mental and substance-use conditions. Washington, DC: The National Academies Press; 2006.
  3. U.S. Department of Health and Human Services (HHS). Facing addiction in America: the Surgeon General's report on alcohol, drugs, and health. Washington, DC: HHS, Office of the Surgeon General; 2016.
  4. Schulte SJ, Meier PS, Stirling J. Dual diagnosis clients' treatment satisfaction - a systematic review. BMC Psychiatry. 2011;11:64.
  5. Substance Abuse and Mental Health Services Administration. Key substance use and mental health indicators in the United States: results from the 2016 National Survey on Drug Use and Health. NSDUH Series H-52, HHS Publication No. SMA 17-5044. Rockville, MD: Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration; 2017.
  6. Croft B, Parish SL. Care integration in the Patient Protection and Affordable Care Act: implications for behavioral health. Adm Policy Ment Health Ment Health Serv Res. 2013;40:258–63.
  7. McGovern MP, Lambert-Harris C, Gotham HJ, Claus RE, Xie H. Dual diagnosis capability in mental health and addiction treatment services: an assessment of programs across multiple state systems. Admin Pol Ment Health. 2014;41:205–14.
  8. Han B, Compton WM, Blanco C, Colpe LJ. Prevalence, treatment, and unmet treatment needs of US adults with mental health and substance use disorders. Health Aff. 2017;36:1739–47.
  9. U.S. Department of Health and Human Services (HHS), Office of the Surgeon General. Facing addiction in America: the Surgeon General's report on alcohol, drugs, and health. Washington, DC: HHS; November 2016.
  10. Knickman J, Krishnan KRR, Pincus H, Blanco C, Blazer D, Coye M, Krystal J, Rauch S, Simon G, Vitiello B. Improving access to effective care for people who have mental health and substance use disorders. Discussion Paper, Vital Directions for Health and Health Care Series. Washington, DC: National Academy of Medicine; 2016. https://nam.edu/improving-access-to-effective-care-for-people-who-have-mental-health-and-substance-use-disorders-a-vital-direction-for-health-and-health-care/.
  11. Lewis VA, Colla CH, Tierney K, Van Citters AD, Fisher ES, Meara E. Few ACOs pursue innovative models that integrate care for mental illness and substance abuse with primary care. Health Aff. 2014;33:1808–16.
  12. Knickman J, Krishnan R, Pincus H. Improving access to effective care for people with mental health and substance use disorders. JAMA. 2016;316:1647–8.
  13. Whiteford HA, Ferrari AJ, Degenhardt L, Feigin V, Vos T. The global burden of mental, neurological and substance use disorders: an analysis from the Global Burden of Disease Study 2010. PLoS One. 2015;10:e0116820.
  14. Abajobir AA, Abate KH, Abbafati C, Abbas KM, Abd-Allah F, Abera SF, Abraha HN, Abu-Raddad LJ, Abu-Rmeileh NME, Adedeji IA, et al. Global, regional, and national under-5 mortality, adult mortality, age-specific mortality, and life expectancy, 1970–2016: a systematic analysis for the Global Burden of Disease Study 2016. Lancet. 390:1084–1150.
  15. Wang H, Naghavi M, Allen C, Barber RM, Bhutta ZA, Carter A, Casey DC, Charlson FJ, Chen AZ, Coates MM, et al. Global, regional, and national life expectancy, all-cause mortality, and cause-specific mortality for 249 causes of death, 1980–2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet. 388:1459–544.
  16. Lai HMX, Cleary M, Sitharthan T, Hunt GE. Prevalence of comorbid substance use, anxiety and mood disorders in epidemiological surveys, 1990–2014: a systematic review and meta-analysis. Drug Alcohol Depend. 2015;154:1–13.
  17. McGovern MP, Xie H, Segal SR, Siembab L, Drake RE. Addiction treatment services and co-occurring disorders: prevalence estimates, treatment practices, and barriers. J Subst Abus Treat. 2006;31:267–75.
  18. Hasin DS, Grant BF. The National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) waves 1 and 2: review and summary of findings. Soc Psychiatry Psychiatr Epidemiol. 2015;50:1609–40.
  19. Grant BF, Stinson FS, Dawson DA, Chou SP, Dufour MC, Compton W, Pickering RP, Kaplan K. Prevalence and co-occurrence of substance use disorders and independent mood and anxiety disorders: results from the National Epidemiologic Survey on Alcohol and Related Conditions. Arch Gen Psychiatry. 2004;61:807–16.
  20. Watkins KE, Hunter SB, Wenzel SL, Tu W, Paddock SM, Griffin A, Ebener P. Prevalence and characteristics of clients with co-occurring disorders in outpatient substance abuse treatment. Am J Drug Alcohol Abuse. 2004;30:749–64.
  21. Priester MA, Browne T, Iachini A, Clone S, DeHart D, Seay KD. Treatment access barriers and disparities among individuals with co-occurring mental health and substance use disorders: an integrative literature review. J Subst Abus Treat. 2016;61:47–59.
  22. Walker ER, Druss BG. Cumulative burden of comorbid mental disorders, substance use disorders, chronic medical conditions, and poverty on health among adults in the U.S.A. Psychol Health Med. 2017;22:727–35.
  23. Clark RE, Samnaliev M, McGovern MP. Impact of substance disorders on medical expenditures for Medicaid beneficiaries with behavioral health disorders. Psychiatr Serv. 2009;60:35–42.
  24. Mangrum LF, Spence RT, Lopez M. Integrated versus parallel treatment of co-occurring psychiatric and substance use disorders. J Subst Abus Treat. 2006;30:79–84.
  25. Xie H, McHugo GJ, Helmstetter BS, Drake RE. Three-year recovery outcomes for long-term patients with co-occurring schizophrenic and substance use disorders. Schizophr Res. 2005;75:337–48.
  26. Mark TL, Levit KR, Yee T, Chow CM. Spending on mental and substance use disorders projected to grow more slowly than all health spending through 2020. Health Aff (Millwood). 2014;33:1407–15.
  27. Sacks S, Chaple M, Sirikantraporn J, Sacks JY, Knickman J, Martinez J. Improving the capability to provide integrated mental health and substance abuse services in a state system of outpatient care. J Subst Abus Treat. 2013;44:488–93.
  28. Center for Substance Abuse Treatment. Substance abuse treatment for persons with co-occurring disorders. DHHS Publication No. (SMA) 05-3992. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2005.
  29. Perron BE, Bunger A, Bender K, Vaughn MG, Howard MO. Treatment guidelines for substance use disorders and serious mental illnesses: do they address co-occurring disorders? Subst Use Misuse. 2010;45:1262–78.
  30. McGovern MP, McLellan AT. The status of addiction treatment research with co-occurring substance use and psychiatric disorders. J Subst Abus Treat. 2008;34:1–2.
  31. Flynn PM, Brown BS. Co-occurring disorders in substance abuse treatment: issues and prospects. J Subst Abus Treat. 2008;34:36–47.
  32. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.
  33. Rogers EM. Diffusion of innovations. New York: Free Press; 2003.
  34. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  35. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.
  36. Simpson DD. A conceptual framework for transferring research to practice. J Subst Abus Treat. 2002;22:171–82.
  37. Simpson DD, Flynn PM. Moving innovations into treatment: a stage-based approach to program change. J Subst Abus Treat. 2007;33:111–20.
  38. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.
  39. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  40. Damschroder LJ, Hagedorn HJ. A guiding framework and approach for implementation research in substance use disorders treatment. Psychol Addict Behav. 2011;25:194–205.
  41. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8:51.
  42. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011:1–12.
  43. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus SE. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5:16.
  44. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4:40.
  45. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–57.
  46. Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Res Soc Work Pract. 2014;24:192–212.
  47. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.
  48. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: the stages of implementation completion (SIC). Implement Sci. 2011;6:116.
  49. Saldana L, Chamberlain P, Wang W, Brown CH. Predicting program start-up using the stages of implementation measure. Adm Policy Ment Health Ment Health Serv Res. 2012;39:419–25.
  50. Chamberlain P, Roberts R, Jones H, Marsenich L, Sosna T, Price JM. Three collaborative models for scaling up evidence-based practices. Adm Policy Ment Health Ment Health Serv Res. 2012;39:278–90.
  51. Capoccia VA, Cotter F, Gustafson DH, Cassidy EF, Ford JH 2nd, Madden L, Owens BH, Farnum SO, McCarty D, Molfenter T. Making "stone soup": improvements in clinic access and retention in addiction treatment. Jt Comm J Qual Patient Saf. 2007;33:95–103.
  52. Hoffman KA, Green CA, Ford JH 2nd, Wisdom JP, Gustafson DH, McCarty D. Improving quality of care in substance abuse treatment using five key process improvement principles. J Behav Health Serv Res. 2012;39:234–44.
  53. Gustafson D, Johnson K, Capoccia V, Cotter F, Ford J, Holloway D, Lea D, McCarty D, Molfenter T, Owens B. The NIATx model: process improvement in behavioral health. Madison, WI: University of Wisconsin-Madison; 2011.
  54. Hoffman KA, Ford JH 2nd, Choi D, Gustafson DH, McCarty D. Replication and sustainability of improved access and retention within the Network for the Improvement of Addiction Treatment. Drug Alcohol Depend. 2008;98:63–9.
  55. Hoffman KA, Ford JH, Tillotson CJ, Choi D, McCarty D. Days to treatment and early retention among patients in treatment for alcohol and drug disorders. Addict Behav. 2011;36:643–7.
  56. McCarty D, Gustafson DH, Wisdom JP, Ford J, Choi D, Molfenter T, Capoccia V, Cotter F. The Network for the Improvement of Addiction Treatment (NIATx): enhancing access and retention. Drug Alcohol Depend. 2007;88:138–45.
  57. Quanbeck A, Wheelock A, Ford JH 2nd, Pulvermacher A, Capoccia V, Gustafson D. Examining access to addiction treatment: scheduling processes and barriers. J Subst Abus Treat. 2013;44:343–8.
  58. Schmidt LA, Rieckmann T, Abraham A, Molfenter T, Capoccia V, Roman P, Gustafson DH, McCarty D. Advancing recovery: implementing evidence-based treatment for substance use disorders at the systems level. J Stud Alcohol Drugs. 2012;73:413–22.
  59. Ford JH 2nd, Abraham AJ, Lupulescu-Mann N, Croff R, Hoffman KA, Alanis-Hirsch K, Chalk M, Schmidt L, McCarty D. Promoting adoption of medication for opioid and alcohol use disorders through system change. J Stud Alcohol Drugs. 2017;78:735–44.
  60. Quanbeck AR, Gustafson DH, Ford JH 2nd, Pulvermacher A, French MT, McConnell KJ, McCarty D. Disseminating quality improvement: study protocol for a large cluster-randomized trial. Implement Sci. 2011;6:44.
  61. Gustafson DH, Quanbeck AR, Robinson JM, Ford JH 2nd, Pulvermacher A, French MT, McConnell KJ, Batalden PB, Hoffman KA, McCarty D. Which elements of improvement collaboratives are most effective? A cluster-randomized trial. Addiction. 2013;108:1145–57.
  62. McGovern MP, Lambert-Harris C, McHugo GJ, Giard J, Mangrum L. Improving the dual diagnosis capability of addiction and mental health treatment services: implementation factors associated with program level changes. J Dual Diagn. 2010;6:237–50.
  63. Ford JH 2nd, Bardossi J, Vandersloot D, Pulvermacher A. Using NIATx process improvement technology to enhance addiction treatment services for persons with co-occurring disorders: results from a State of Washington pilot. Addiction Health Services Research, Lexington, KY, October 25–27, 2010.
  64. Ford JH 2nd, Green CA, Hoffman KA, Wisdom JP, Riley KJ, Bergmann L, Molfenter T. Process improvement needs in substance abuse treatment: admissions walk-through results. J Subst Abus Treat. 2007;33:379–89.
  65. McGovern MP, Lambert-Harris C, Gotham HJ, Claus RE, Xie H. Dual diagnosis capability in mental health and addiction treatment services: an assessment of programs across multiple state systems. Adm Policy Ment Health Ment Health Serv Res. 2014;41:205–14.
  66. Chaple M, Sacks S, Melnick G, McKendrick K, Brandau S. The predictive validity of the dual diagnosis capability in addiction treatment (DDCAT) index. J Dual Diagn. 2013;9:171–8.
  67. Campbell KM. Impact of record-linkage methodology on performance indicators and multivariate relationships. J Subst Abus Treat. 2009;36:110–7.
  68. Estee S, Wickizer T, He L, Shah MF, Mancuso D. Evaluation of the Washington State screening, brief intervention, and referral to treatment project: cost outcomes for Medicaid patients screened in hospital emergency departments. Med Care. 2010;48:18–24.
  69. Lipsky S, Krupski A, Roy-Byrne P, Lucenko B, Mancuso D, Huber A. Effect of co-occurring disorders and intimate partner violence on substance abuse treatment outcomes. J Subst Abus Treat. 2010;38:231–44.
  70. Sears JM, Krupski A, Joesch JM, Estee SL, He L, Shah MF, Huber A, Dunn C, Ries R, Roy-Byrne PP. The use of administrative data as a substitute for individual screening scores in observational studies related to problematic alcohol or drug use. Drug Alcohol Depend. 2010;111:89–96.
  71. Assefa MT, McGovern MP. Operationalizing the Consolidated Framework for Implementation Research (CFIR) to systematically assess context. Addiction Health Services Research, Madison, WI.
  72. Crnich CJ, Nace D, Ramly E, Ford JH 2nd, Wetterneck T. Improving antibiotic prescribing in nursing homes through work system redesign. AMDA - The Society for Post-Acute and Long-Term Care Medicine, Orlando, FL.
  73. Sample Power 3.01. Chicago, IL: IBM SPSS; 2010.
  74. Killip S, Mahfoud Z, Pearce K. What is an intracluster correlation coefficient? Crucial concepts for primary care researchers. Ann Fam Med. 2004;2:204–8.
  75. McCarty D, McGuire TG, Harwood HJ, Field T. Using state information systems for drug abuse services research. Am Behav Sci. 1998;41:1090–106.
  76. Landsverk J, Brown CH, Chamberlain P, Palinkas LA, Ogihara M, Szaja S, Goldhaber-Fiebert JD, Roll-Reutz JA, McCue HS. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York, NY: Oxford University Press; 2012. p. 225–60.
  77. Molfenter T, Connor T, Ford JH 2nd, Hyatt J, Zimmerman D. Reducing psychiatric inpatient readmissions using an organizational change model. WMJ. 2016;115:122–8.
  78. Roosa M, Scripa JS, Zastowny TR, Ford JH 2nd. Using a NIATx based local learning collaborative for performance improvement. Eval Program Plann. 2011;34:390–8.
  79. Brown VB, Harris M, Fallot R. Moving toward trauma-informed practice in addiction treatment: a collaborative model of agency assessment. J Psychoactive Drugs. 2013;45:386–93.
  80. Molfenter T. Reducing appointment no-shows: going from theory to practice. Subst Use Misuse. 2013;48:743–9.
  81. Wexler HK, Zehner M, Melnick G. Improving drug court operations: NIATx organizational improvement model. Drug Court Rev. 2012;8:80–95.
  82. Daley M, Shepard DS, Tompkins C, Dunigan R, Reif S, Perloff J, Siembab L, Horgan C. Randomized trial of enhanced profiling in substance abuse treatment. Adm Policy Ment Health Ment Health Serv Res. 2011;38:96–104.
  83. Belenko S, Visher C, Pearson F, Swan H, Pich M, O'Connell D, Dembo R, Frisman L, Hamilton L, Willett J. Efficacy of structured organizational change intervention on HIV testing in correctional facilities. AIDS Educ Prev. 2017;29:241–55.
  84. Pearson FS, Shafer MS, Dembo R, Del Mar Vega-Debien G, Pankow J, Duvall JL, Belenko S, Frisman LK, Visher CA, Pich M, Patterson Y. Efficacy of a process improvement intervention on delivery of HIV services to offenders: a multisite trial. Am J Public Health. 2014;104:2385–91.
  85. Visher CA, Hiller M, Belenko S, Pankow J, Dembo R, Frisman LK, Pearson FS, Swan H, Wiley TR. The effect of a local change team intervention on staff attitudes towards HIV service delivery in correctional settings: a randomized trial. AIDS Educ Prev. 2014;26:411–28.
  86. Pankow J, Willett J, Yang Y, Swan H, Dembo R, Burdon WM, Patterson Y, Pearson FS, Belenko S, Frisman LK. Evaluating fidelity to a modified NIATx process improvement strategy for improving HIV services in correctional facilities. J Behav Health Serv Res. 2017:1–17.
  87. Ford JH 2nd, Abramson B, Wise M, Dattalo M, Mahoney JE. Bringing healthy aging to scale: a randomized trial of a quality improvement intervention to increase adoption of evidence-based health promotion programs by community partners. J Public Health Manag Pract. 2017;23:e17–24.
  88. Dattalo M, Wise M, Ford JH 2nd, Abramson B, Mahoney J. Essential resources for implementation and sustainability of evidence-based health promotion programs: a mixed methods multi-site case study. J Community Health. 2017;42:358–68.
  89. Molfenter T, Sherbeck C, Zehner M, Quanbeck A, McCarty D, Kim J-S, Starr S. Implementing buprenorphine in addiction treatment: payer and provider perspectives in Ohio. Subst Abuse Treat Prev Policy. 2015;10:13.
  90. Molfenter T, Sherbeck C, Starr S, Kim J-S, Zehner M, Quanbeck A, Jacobson N, McCarty D. Payer policy behavior towards opioid pharmacotherapy treatment in Ohio. J Addict Med. 2017.

Copyright

© The Author(s). 2018
