The implementation of change model adds value to value-based healthcare: a qualitative study

Abstract

Background

Value-based healthcare (VBHC) is a concept that focuses on outcome measurement to contribute to quality improvement. However, VBHC does not offer a systematic approach for implementing improvement as implementation science does. The aim is, firstly, to investigate the implementation of improvement initiatives in the context of VBHC and, secondly, to explore how implementation science could be of added value for VBHC and vice versa.

Methods

A case study with two cases in heart care was conducted; one without the explicit use of a systematic implementation method and the other one with the use of the Implementation of Change Model (ICM). Triangulation of data from document research, semi-structured interviews and a focus group was applied to evaluate the degree of method uptake. Interviews were held with experts involved in the implementation of Case 1 (N = 4) and Case 2 (N = 7). The focus group was held with experts also involved in the interviews (N = 4). A theory-driven qualitative analysis was conducted using the ICM as a framework.

Results

In both cases, outcome measures were seen as an important starting point for the implementation and for monitoring change. Several themes were identified as most important: support, personal importance, involvement, leadership, climate and continuous monitoring. Success factors included intrinsic motivation for the change, speed of implementation, complexity and continuous evaluation.

Conclusion

Application of the ICM facilitates successful implementation of quality-improvement initiatives within VBHC. However, the practical use of the ICM shows an emphasis on processes. We recommend that monitoring of outcomes be added as an essential part of the ICM. In the discussion, we propose an implementation model that integrates the ICM within VBHC.

Contributions to the literature

  • Value-based healthcare (VBHC) focuses on outcome measurement to contribute to quality improvement. VBHC does not offer an implementation methodology for improvement initiatives as the Implementation of Change Model (ICM) does.

  • Outcome measurement is an important starting point.

  • Support, personal importance, involvement, leadership, climate and continuous monitoring are important for implementation.

  • Success factors included intrinsic motivation for the change, speed of implementation, complexity and continuous evaluation.

  • An Integrated Implementation Model is proposed for the implementation of VBHC improvement initiatives that incorporates monitoring of outcome measures and integrates the ICM.

Background

Improving the quality of care while reducing costs is a major goal on many hospital agendas [1, 2]. The goal of value-based healthcare (VBHC) is to reorganize health care in order to increase value for patients [3]. ‘Value’ in VBHC is defined as patient-relevant health outcomes relative to costs [3]. Porter suggests that this goal can be achieved by measuring outcomes and costs per medical condition, which will allow for the identification of variation in outcomes across the full cycle of care [4]. Experts suggest that, based on this insight into outcomes, improvement potential can be identified and quality of care improved [5]. In current practice, VBHC is used as a concept leading to improvement by measuring outcomes in registries and supporting more efficient coordination of care through benchmarking and reporting [6]. However, the current application of VBHC lacks a systematic approach for the implementation of improvements. The concept is sometimes presented as the sole solution for improving outcomes and reducing costs, but how improvements should be implemented remains unclear. In the literature, a lack of a systematic approach for using VBHC and specifically a method for the implementation of improvement initiatives was identified [1]. Measurement of outcomes and costs has been shown to provide valuable insights into practice variation and waste, which can lead to process improvement [7, 8]. Literature on the implementation of improvement initiatives in the context of VBHC is scarce. One example was identified in the context of a project for orthopaedics, in which the identification of variation in hospital stay led to improvement [7]. Another example, which involved prostate cancer care, showed that improvement based on outcomes led to a relevant decrease in incontinence rates [9]. Moreover, within heart care several improvement initiatives were implemented based on identified variation in outcomes [1]. 
How the improvements were implemented was, however, not described. Therefore, we aimed to investigate the implementation of improvement initiatives in the context of VBHC and whether a systematic implementation method has added-value for VBHC. The resulting insight could enrich the concept of VBHC [10].

In order to investigate whether systematic implementation could add value to VBHC, a suitable framework needed to be identified. A previous review identified implementation frameworks, models and theories for the process of implementation [11]. The most commonly cited frameworks include the PARIHS [12], the Conceptual Model [13], the Implementation of Change Model [14], the Ecological Framework [15] and the CFIR [16]. Based on the results of this review, the Implementation of Change Model (ICM) seemed to be the most suitable for the purpose of offering a systematic approach for the implementation of improvements, since it specifies practical steps for the process of implementation [14]. Several quality improvement projects have applied the ICM or parts of it [17,18,19].

This paper describes how improvement initiatives which were selected based on insights into outcomes were implemented. To show the added-value of a systematic implementation approach for VBHC, we selected two cases. The goal was to use VBHC as a guideline for both projects in the identification and selection of an improvement intervention. Both interventions emerged from a VBHC improvement cycle. In an earlier systematic literature review only very few improvement interventions based on insights into outcomes were identified [20]. Therefore, the aim was to compare two improvement interventions that used the same starting point to compare the implementation process. The first case was implemented without the explicit use of a systematic implementation approach, while the second case was implemented with the explicit use of a systematic implementation approach, i.e. the ICM. By analysing and comparing the two cases, the goal of this paper was to learn what went well and what could be improved in order to give recommendations on how to implement improvement initiatives in the context of VBHC. The analysis was not intended to evaluate the improvement on outcomes, but to explore the implementation process of two improvement initiatives.

Theoretical framework

ICM

The ICM was developed based on examples from the practice of implementing change in health care and examples from the literature [21]. The ICM consists of seven steps for guiding the implementation of improvement (Table 1). The first step of the model is development of a proposal and target for change, which includes a detailed analysis of the characteristics of the possible innovation and/or change. Secondly, actual performance or outcome variation at baseline has to be assessed in order to gain insights into the current situation and indications for change [21]. The following step of the ICM is the problem analysis, which is seen as a crucial step to the implementation of an improvement initiative [14]. The analysis of barriers and facilitators should include a structured analysis of relevant stakeholders, determinants of change, and subgroups in the target population [22]. Based on the analysis of possible barriers, implementation strategies can be identified [21]. This step is followed by a pilot implementation and the integration into routine care [21]. The last step of the model is the evaluation of the change, which could lead to modifications and a return to earlier steps of the model [21].

Table 1 Seven steps of the ICM


Methods

Design: case study

A collective case study design was chosen to test the ICM for two cases. Cases in collective case studies are similar, yet can have a different context [23]. The goal of a collective case study is to compare two or more cases [24]. For this analysis, a within-site collective case study was conducted. According to Creswell (1998), theory can be employed in different ways in a case study design: before or after data collection [23]. For the purposes of exploring the application of the ICM, theory was employed both for supporting the interview guide and for comparing both cases for interpretation after the interviews. For this study the Consolidated criteria for Reporting Qualitative research (COREQ) were applied.

Case selection and setting

The two cases were selected according to the principles of purposive sampling [23]. Purposive sampling can be used to identify cases that show different perspectives on the same problem [23]. To provide a clear comparison of the implementation approaches used in the two cases, deviant case sampling was applied. In deviant case sampling, cases are selected that are contrasting in some way [25]. For our purposes, one case was chosen that implemented an improvement initiative without the explicit application of a systematic implementation method, while in the other case there was explicit use of the ICM. Both cases emerged based on insights into outcomes according to the VBHC concept. The starting point for the development of both improvement initiatives was the same set of outcome measures and both initiatives share the goal of improving these outcomes. Therefore, both cases are comparable due to their context, yet they can also be contrasted. Creswell (1998) suggests that the more cases are studied, the less depth the cases have [23]. Only two cases were chosen to ensure that they were “information rich”; it was not the purpose of the study to achieve statistical generalization [26]. The research was carried out at a Dutch hospital from June 2017 until January 2018. The first selected case concerns a pre-incision checklist for cardiac surgery to improve cultural behaviour in the operating theatre and reduce 120-day mortality rate (Table 2). The second case is about a protein-enriched diet given to patients two weeks before the operation in order to improve their fitness before cardiac surgery and prevent postoperative complications (Table 2).

Table 2 Description of the cases

Data collection methods

Triangulation of data sources was applied. Using multiple sources for data collection is advised for case studies [23]. First, a document analysis of minutes, presentations and memos was conducted. The documents were made available by a member involved during each of the implementation processes per case. Second, interviews were conducted with professionals involved in the implementation process of the two selected cases. Interviews were semi-structured with a length of approximately 20 min. An interview guide based on the theoretical framework including the ICM was used (see Additional file 1). All interviews were audio-recorded and transcribed verbatim. Third, a focus group interview was conducted in order to recapitulate the results from the interviews. The focus group was intended for feedback purposes and gathering perceptions, and it allowed participants to make additional comments on each other’s opinions [25]. The focus group was audio-recorded and transcribed. The interviews and focus group were conducted by a researcher. The language of the interviews and focus group was Dutch and transcripts were translated into English.

Sampling of participants

Participants for the interviews were selected through a mix of criterion sampling and snowball sampling. Criterion sampling is a method of choosing all participants that meet a predefined criterion [25]. The criterion for selection was that participants must have had an active role in the implementation of the case. Additional snowball sampling [25] was applied by asking participants whether other people who could provide more information were involved in the implementation process. Participants were asked to participate via e-mail. For the first case, four professionals were chosen (N = 4): a cardio-thoracic surgeon, a perfusionist, an anaesthesiologist and a data manager. This was the maximum number fulfilling the criterion, including participants suggested through snowball sampling, because no other participants were involved in the implementation of the intervention. For the second case, seven interviews were conducted (N = 7) with two cardiologists, a cardio-thoracic surgeon, a nurse, a researcher and two secretaries. For this case, too, the maximum number of participants was chosen through both sampling strategies. The sample size was chosen because it was not the purpose of the cases to draw externally generalizable conclusions, but instead to collect all possible viewpoints, opinions and thoughts of relevant stakeholders about the case [25]. Informed consent was obtained from all participants before the start of the interviews.

The same participants from both cases were also invited to participate in the focus group in order to comprise a multidisciplinary group of experts from both cases. Convenience sampling was applied. This setup was chosen because participants could relate to comments made by colleagues since they shared experiences during the implementation process [36]. Four experts agreed to participate in the focus group (N = 4). Two experts were involved in the implementation of Case 1 and two of Case 2.

Data analysis

We analysed the results in three steps: 1) Chronological case description with a within-case analysis from the documents and interviews, 2) Cross-case analysis from the interviews, and 3) Focus group analysis.

As recommended by Creswell (1998), the detailed case description is done chronologically [23]. The advantage of this approach is that each case can be described separately in order to understand it as a holistic entity [25]. For this analysis, both the documents and the interviews were used. The interviews were coded by one researcher. Subsequently, a within-case analysis was conducted with a detailed description of each case and the themes within it [23]. Each case can be seen separately as holistic and context sensitive [25]. A holistic perspective, according to Patton (2002), is one in which the whole context is seen as a complex system [25]. Thus, the whole case is formed only when all interviews and all sources from the document analysis are combined. Context sensitivity refers to comparative case analysis and identifying patterns for transferability to a different setting [25].

A thematic analysis across cases was then carried out, which is known as a cross-case analysis [23]. The data were analysed using deductive analysis techniques based on the theoretical framework of the ICM [25]. In order to contrast and compare the cases, constant comparative analysis was applied [25]. Qualitative comparative analysis seeks to compare cases in order to generate explanations. For the analysis, a so-called truth table was developed in order to test the absence or presence of each step of the ICM [25]. The goal of this analysis approach was not to force the data into predetermined categories, but to show that the ICM enhances the knowledge of the implementation process of both cases. For this analysis the interviews were used. Subsequently, the focus group was analysed by comparing discussions of similar themes [36]. Both cases were interpreted in terms of success factors. Successful implementation was defined as a positive experience by participants.
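The truth-table logic of the cross-case analysis can be sketched as a small presence/absence matrix. The following Python snippet is a hypothetical illustration only: the step labels paraphrase the ICM steps described in the theoretical framework, and the presence/absence values are invented for illustration; the study's actual findings are reported in Table 3.

```python
# Hypothetical sketch of a truth table for a cross-case analysis:
# each ICM step is marked as present (True) or absent (False) per case.
# Step labels and values are illustrative, not the study's actual data.

ICM_STEPS = [
    "1. Proposal and target for change",
    "2. Analysis of actual performance at baseline",
    "3. Problem analysis of the target group and setting",
    "4. Development/selection of implementation strategies",
    "5. Testing and execution of the implementation plan",
    "6. Integration of changes into routine care",
    "7. Evaluation and (if needed) adaptation of the plan",
]

def truth_table(cases: dict[str, set[str]]) -> list[tuple[str, bool, bool]]:
    """Return one row per ICM step: (step, present in case A, present in case B)."""
    names = list(cases)
    return [
        (step, step in cases[names[0]], step in cases[names[1]])
        for step in ICM_STEPS
    ]

# Invented presence/absence values for two contrasting cases:
cases = {
    "Case A": {ICM_STEPS[4], ICM_STEPS[5], ICM_STEPS[6]},
    "Case B": set(ICM_STEPS[:5] + ICM_STEPS[6:7]),
}

for step, a, b in truth_table(cases):
    print(f"{step:<55} {'x' if a else '-'}  {'x' if b else '-'}")
```

Each row of the printed table shows whether a given ICM step was observed in each case, which is the comparison the deductive, theory-driven analysis rests on.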

Analyses were performed in ATLAS.ti 7.0.

The results of the case analysis are presented in three steps: 1) a chronological case description with a within-case analysis, 2) a cross-case analysis, and 3) the focus group analysis resulting in success factors for the implementation.

Results

The interviews and focus group were held by a researcher (the primary researcher of the study).

Case descriptions

A reconstruction of both cases was made.

Chronological steps of Case 1 with a within-case analysis (introduction of a pre-incision checklist for cardiac surgery in a hospital) (Fig. 1):

  a.

    The proposal for change was derived from results of another partnering hospital. At the partnering hospital a larger project was initiated, which included a pre-incision checklist. That hospital presented favourable outcomes in a benchmarking analysis with other Dutch heart centres, which was underpinned by the results of the document analysis. Insights into explanations for the differences in patient-relevant outcome measures showed that a pre-incision checklist could contribute to a reduction of 120-day mortality rate.

  b.

    Case 1 initially started with a pilot phase, without a requirement to comply with the intervention. No clear implementation team was in place to inform potential participants about the use of the intervention, which left users unaware of its existence and added-value.

  c.

    It was reported that an intervention was started, but it was not carried out as a standard part of the care process. An analysis through questionnaires was carried out at the beginning to investigate whether the initiative was considered important and whether the culture and context would be open to it. The document analysis showed that the questionnaire included questions on the importance of the checklist to the users and on the climate for implementing the checklist. Questions focused on how long employees had been working at the hospital, whether they thought that colleagues in the ward were treated with respect, whether they felt that they could tell when something was not going well in the operating room, whether they agreed with the introduction of the pre-incision checklist and whether the timing of the check just before incision of the patient was right. These questions were comparable to the validated Team Climate Inventory (TCI), an instrument used to measure organizational climate and team building and development [37].

  d.

    It was reported that the implementation took place very fast and that more time would have been needed to carry out further analyses and develop appropriate implementation strategies. A brief implementation plan was offered by an external partnership that had previously developed and implemented the initiative.

  e.

    Next, the checklist was announced during a team meeting of cardio-thoracic surgeons and disseminated via e-mail to operation assistants. The dissemination was supposed to happen from within: doctors and an anaesthesiologist were the primary contact persons, in order to lower the threshold for asking questions and to prevent resistance from the people applying the checklist.

  f.

    A period of voluntary participation with regard to applying the checklist for cardiac surgery was established for about one to two months.

  g.

    In the subsequent step, the checklist was implemented in routine care. The implementation took place with a simple start and communication among involved colleagues.

  h.

    An evaluation followed, which led to the conclusion of cardio-thoracic surgeons in the hospital that the initiative did not add value to their work. An e-mail was sent to all involved parties and the checklist was stopped. However, some questions that are part of the checklist were integrated into the standard time-out form of the hospital.

Fig. 1

Flowchart of the implementation process of Case 1

The implementation process of Case 1 took place between December 2015 and February 2016.

Chronological steps of Case 2 with a within-case analysis (preoperative protein-enriched diet) (Fig. 2):

  a.

    The implementation process started with an outcome analysis as a basis for the target. The analysis was based on an outcome registry. The results of the analysis of the hospital's outcome measures gave participants a sense of urgency to change, leading to the development of an improvement intervention. The analysis resulted in a clear target. The target was considered feasible, but difficult to combine with the aims and wishes of patients, who want to receive an operation as fast as possible.

  b.

    The protein-enriched diet and the number of patients with undernutrition were analysed and discussed in a multidisciplinary implementation team. The team consisted of a researcher, a cardiologist, a cardio-thoracic surgeon, an anaesthesiologist, a dietician, the head of the hospital kitchen and a nurse.

  c.

    The target was refined and a context analysis conducted. The context analysis included an analysis of the current preoperative process for older patients; conducting this analysis could delay the implementation. The context analysis was discussed in the implementation team but was not disseminated further to all participants involved with the initiative, although doing so could have increased support for the implementation. The context analysis was done in several steps to gain as much insight as possible into the current process of care. It formed the basis for the problem analysis to identify possible barriers and develop an implementation plan.

  d.

    The implementation plan was drafted. The implementation plan included a financial plan. It also led to the development of implementation strategies. The implementation plan was adjusted based on feedback from the implementation team. After adjustments, each team member disseminated the implementation plan to the broader involved team in the hospital. Individuals were offered training on their future tasks concerning the implementation.

  e.

    Subsequently, the protein-enriched diet was implemented in the form of a pilot aimed at including five patients.

  f.

    After these inclusions, an evaluation meeting was organized with the implementation team. The implementation plan and inclusion criteria were adjusted and long-term goals were formulated.

  g.

    In the subsequent step, continuous feedback was given via e-mail followed by another evaluation meeting with the implementation team where first results and outcomes were monitored. Following this meeting, adjustments to the plan were made and implementation in routine care was prepared.

Fig. 2

Flowchart of the implementation process of Case 2

The implementation of Case 2 took place between April 2016 and February 2017.

Cross-case analysis

Each case was tested against the ICM. Table 3 summarizes to what extent both cases were implemented according to the steps of the ICM.

Table 3 Checklist whether the steps of ICM have been applied per case

The first case was implemented without the application of a specific implementation approach, unlike the second case, which used the ICM. Differences in the processes of both cases, as described in the case descriptions, included the development of a proposal for change, elaboration of an implementation plan, development of implementation strategies, testing and execution of the implementation plan, and implementation into routine care. For the first case, the proposal for change was imposed by an external hospital through a network of hospitals [1], whereas for the second case a detailed outcome analysis was conducted together with the healthcare professionals, who proposed to implement change based on the results of the analysis. The implementation of Case 1 started directly with a pilot phase, followed by a culture and context analysis. An implementation plan from the external hospital was used, which was transferred to the current setting without adjustments. In Case 2, after the formation of an implementation team, a detailed context analysis was conducted, followed by the drafting of an implementation plan suited to the context. Concerning the implementation strategies, in Case 1 an announcement of the intervention during a team meeting and dissemination via e-mail were considered sufficient. In Case 2, the implementation plan was offered to the individuals who would be affected by the intervention; they were given the chance to comment and to receive training on their tasks for the execution of the intervention. Furthermore, Case 1 was implemented into routine care after a short period of voluntary participation. The second case had not yet been implemented into routine care, since evidence on the effect on outcomes was desired before the intervention would be implemented into routine care. In Case 1, no interim evaluations took place; only an end evaluation, at which it was decided to stop the intervention, was organized. In comparison, for Case 2 continuous feedback was given, followed by an evaluation meeting.

In the case comparison, a number of themes were identified as most important for the implementation of improvement interventions with a focus on monitoring value (Table 4). These themes showed that the steps of the ICM enhanced the implementation process of both cases.

  • Support: Support is important in the beginning of the implementation and includes support for the proposal for change, but also for execution of the implementation plan. Support can also be linked to other steps of the ICM later on in the process, such as involvement and leadership.

  • Personal importance of the target: Respondents mentioned that when an initiative feels important to them, the implementation process is improved.

  • Involvement: For the problem analysis, involvement has been identified as a theme. For the first case, involvement was lacking and neither outcomes nor progress were shared with all participants involved in the initiative. That led to frustration and less uptake of the initiative.

  • Leadership: Participants mentioned that there was no clarity on how to use the intervention in Case 1. This should have been resolved by having one leader in the operation room. That leader was not clearly defined and did not clearly perform his tasks. Therefore, the initiative lacked uptake.

  • Climate: Development and maintenance of a positive climate were mentioned as being important for successful implementation. Room for critique and adjustment should be present.

  • Monitoring: Monitoring, as part of the last step of the ICM, was identified as important. Monitoring in the first case would have supported uptake as well. As R2 mentioned, if the frequency of checklist use had been monitored, it would have been possible to intervene faster.

Table 4 Results of the cross-case analysis

These themes are linked to steps of the ICM in order to see whether the ICM has added-value for the implementation of improvement initiatives in the context of VBHC. However, for one step of the ICM, namely step 6 (integration of changes into routine care), no important theme across cases was identified.

Focus group analysis: success factors for the implementation of improvement interventions

A focus group interview was conducted with four professionals who were also involved in the earlier interviews to critically reflect on the results of the interviews. Several success factors were identified: intrinsic versus extrinsic motivation, a multi-centre intervention compared to a single-centre intervention, the name of the intervention, speed of the implementation process, complexity, continuous feedback and output.

Firstly, participants reflected on the fact that the motivation for successful implementation differs between an extrinsically motivated intervention, i.e. one adapted from another hospital, and an intervention that the hospital developed itself. Adapting an external improvement intervention could potentially lead to social pressure for implementation, which could impact success.

“That of course makes a difference whether you invented something yourself and have time to roll it out or if you adapt something from outside. If you really want to participate, then you have to start before a certain date. Otherwise we are too late. That is missing here.” (R4)

Secondly, the name of the intervention, which included the name of another hospital, had an impact on the success of the implementation, as elaborated by R7:

“Yes, or what you call it. I hear you call it differently. What is the difference from the original name? So then it is just what you call the intervention, because maybe you do it the right way, but you just call it a number of things under a different name. If it would have been called a different name, maybe we would have been more willing to apply it.”

Thirdly, the speed of an implementation depends on whether the intervention involves a multidisciplinary team or a smaller team. Participants mentioned that a systematic implementation model would be applicable for straightforward interventions, but that for interventions requiring more extensive research, decisions have to be made about which steps to follow.

“The first case is something you have to implement with a whole team; it’s multidisciplinary. You have to get the anaesthesiologist, all participants of the time-out, the perfusion, the nurse, everyone has to support it. So many people need to say yes; I don’t see this happening.” (R4)

Fourthly, the complexity of an intervention influences the success and speed of the implementation. An intervention that is less complex would not need to follow all the steps of the model.

“I think if you do something with some kind of work agreement – so this is a work agreement that, for example, you only let members of the medical staff operate – then you need to follow fewer of those steps. I mean, it’s something you do that you agreed on with the whole team. But if you do something like Case 2 where you also have to measure things, then you have to start with the measurement. You have to organize something for recording the outcome (…) and have good data, and then yes, develop a proposal for an improvement. Yes, that sometimes starts before step 1.” (R4)

Fifthly, continuous evaluation of outcome measures can be time-consuming, but it is also crucial for successful implementation and support. Participants also discussed continuous feedback.

“But shouldn’t you let the proposal for change come back continuously because that is at step 3, then data analysis, problem analysis. If you are going to implement strategies, then you actually want to see what effect it has. (…) Because if you have implemented your number of things, then you actually want to know what is the effect of that. And maybe it has no effect. So I would repeat the proposal for change more frequently.” (R4)

The participants noted that it could also be necessary to return from one step to the beginning in a systematic implementation model.

Sixthly, output was also mentioned as being important for successful monitoring. Output, the goal of a successful implementation, should be defined before implementation and, alongside outcome measures, be evaluated continuously.

“This is not even outcome, it is output. In terms of input, throughput, output, outcome. I always make the comparison with a vaccination program. Output is how many people you vaccinate, and the outcome is the observed decrease in the prevalence of a condition in an area.” (R7)

The focus group interview identified six success factors for the implementation of improvement initiatives in the context of VBHC.

Discussion

The study had three objectives: to investigate the implementation of improvement initiatives in the context of VBHC, to explore how implementation science could be of added value for VBHC and vice versa, and to investigate what we can learn from the implementation of two cases in the context of VBHC. To accomplish these objectives, we compared two cases, one that used the ICM and one that did not. In this study, we showed that the use of an implementation model such as the ICM contributed to a more positive experience of the implementation team and better uptake.

Our study identified important themes for the implementation of improvement initiatives in the cross-case analysis. The factors identified in this study are in line with previous research from implementation science. Grol et al. also identified incentives for uptake in relation to the ICM steps, which include conveying a positive attitude towards change and motivation [38]. The literature identifies practitioners’ or users’ support for the change as a leading factor in guideline adherence [39]. In our study, personal and professional involvement in the design of the intervention played an important role in the success of the implementation. Previous research also identified involvement as a success factor for uptake of clinical practice guidelines [39, 40]; implementation of guidelines is comparable to implementation of improvement interventions. Furthermore, participants mentioned the importance of strong leadership and a climate in which users feel free to discuss issues. Incentives for change can be established at various levels, such as in the social context [38]. In the literature, the social context includes culture, leadership and collaboration [38]. The absence of social norms can hinder uptake [41]. Moreover, the composition of an improvement team should be diverse and include all relevant healthcare professionals, as noted in a systematic review of factors influencing guideline implementation [42].

Based on the interview results, the implementations themselves were considered successful in terms of process. The implementation process was experienced more positively for Case 2, even though it had not yet been implemented into routine care during the exploration of the current study, and did not yet show an effect on relevant outcome measures. The themes that emerged from the cross-case analysis indicate that the implementation of Case 2 was experienced more positively because support for the implementation was created through involvement, the improvement initiative was of personal importance, leadership was present and a positive climate was created. After completion of the current study, the improvement initiative laid out in Case 2 led to continued work: the results showed a significant improvement in protein intake and an indication of an improvement in hospital length-of-stay (results to be published). Based on these results, we expect that a preoperative protein-enriched diet will become part of a bundle of improvement initiatives targeting frail elderly people undergoing surgery. In contrast, in Case 1 participants felt uninvolved and that their needs were ignored, since no room for evaluation was created. Whether the implementations were successful in terms of impact on patient-relevant outcomes could not be determined with this exploration of how improvement initiatives focussing on monitoring value were implemented. The goal of this study was not to reach a certain goal in quantitative terms, but rather to explore what went well and what could be improved in the implementation process of VBHC improvement initiatives, based on two cases.

Based on the results of this study, we built a framework for the implementation of improvement interventions. From the analysis of the success factors for the implementation of improvement initiatives, it appears that the ICM can add value to VBHC and the implementation of value-based improvement initiatives. We, therefore, propose an implementation model that integrates new steps, identified through the interviews, that are unique to VBHC, in order to add value to the ICM and vice versa (Fig. 3). The Integrated Implementation Model (IIM) consists of two additional steps next to the steps of the ICM: outcome registry as a basis, and benchmarking.

Fig. 3 Integrated Implementation Model (IIM) for improvement projects. The steps adapted from the ICM are framed in black; the other steps are new additions in the IIM. The arrows on the side indicate the possibility of repeating steps

Currently, the ICM places insufficient focus on the measurement and application of patient-relevant outcome measures. The final step of the ICM is ‘Continuous feedback and evaluation’, which includes evaluation of performance [14]. Grol et al., however, do not further define performance. Therefore, a focus on outcome measurement is necessary when applying the ICM in the context of VBHC. It is important to consider the proposed implementation model as a roadmap for implementation in which, at every step of the process, adaptations may need to take place and earlier steps may need to be repeated. There is support in the literature for our proposal of continuous monitoring of outcomes and adaptation where needed [14].

In the context of VBHC, an implementation approach to guide the implementation of improvement interventions was lacking. Whether the IIM adds value to VBHC and vice versa is yet to be determined, and future research should focus on validation of the IIM. In the literature, however, the benefit of new models compared to parallel approaches is discussed [43]. The application of an existing, suitable implementation approach is preferable [44]; however, depending on the context and needs, a combination of frameworks may be necessary [44].

Despite our efforts to rigorously follow the steps of qualitative research, this study has limitations. Firstly, complexity was identified as a limitation. In the first case, doctors needed to change behaviour by adding questions to their usual time-out procedure, which is less complex than targeting the patients as in the second case. Whether an improvement requires doctors or patients to change can impact support and uptake. As with guideline adherence, guidelines that can be easily understood, and are thus less complex, have a greater chance of uptake [42]. In our study, strong support and involvement before implementation were identified as important for ensuring successful implementation of complex improvement interventions. Secondly, the first case was part of a larger improvement initiative initiated by another hospital. The initial project included various aspects beyond a pre-incision checklist, e.g. the implementation of additional information from actual transoesophageal echocardiography images immediately after induction of anaesthesia [28]. At the current hospital, only a small part of the larger improvement project was implemented, which could have impacted the results and the motivation for this initiative. Thirdly, comparability of the cases could have influenced the cross-case analysis. The intention was to choose two cases that could be contrasted, yet were also comparable. The cases can be contrasted given that in the first case no structural implementation method was used, whereas in the second case the ICM was used for the implementation. The cases are comparable because both initiatives emerged based on outcome measures. However, substantial differences in the nature of the cases, including the involvement of the healthcare professionals, the impact on workload, and the type and timing of outcome measures, could impact the results. Both cases were implemented as value-based improvement initiatives in the form of an organizational intervention.
Fourthly, the composition of the focus group could also have impacted the results. The focus group consisted of doctors, nurses, data managers and secretaries. Hierarchy could have affected the data [36], as participants may have felt inhibited by the presence of a doctor. Fifthly, the number of interviewed participants was relatively small. Including more participants could have enhanced the description of the cases; however, all possible participants were included in the study. Sixthly, to quantitatively determine the success of the implementations, we would ideally have measured the number of safety checks completed in Case 1 and the number of patients in Case 2 who received a protein-enriched diet. However, we did not follow the implementations prospectively, but retrospectively analysed the implementation process based on document analysis and interviews. The goal of this study was not to measure success in quantitative terms, but in terms of participants’ experience of the implementation process. Seventhly, the data coding was conducted by a single researcher (NZ), which could have posed a threat to the reliability of the coding. However, the codes were discussed with three researchers (SG, GW and PvdN) to increase the reliability of the results. The coding method may have impacted the results of the analysis. Lastly, contamination between the cases may have influenced the implementation process of Case 2, as both interventions were implemented in the same setting. However, the interventions were implemented at different times (Case 1 between December 2015 and February 2016, and Case 2 between April 2016 and February 2017). Since Case 1 preceded Case 2, possible lessons from Case 1 may have influenced the process of Case 2. However, the teams involved in the two implementations were substantially different.

We aimed to illustrate how a systematic implementation method could support the implementation of improvement interventions based on outcomes. In order to determine the degree of successful implementation, we recommend further studies to evaluate the effect of each case on the health outcomes relevant to the case. We also recommend testing the suggested IIM in a different setting.

Conclusion

Applying an implementation method such as the ICM, which offers guidance for the implementation, was found to be valuable for successful implementation. The primary focus for implementation of improvement interventions should be outcome measures, because insights into outcomes (that are relevant for patients) give an actual picture of the added value of the improvement initiative. This focus is applicable to the ICM in general, not only in the context of VBHC. We, therefore, propose using the IIM for interventions with the aim of quality improvement. Further research needs to be conducted to evaluate the use of the integrated model.

Availability of data and materials

The interview and focus group data analysed are not publicly available due to the length and relevance of the interview transcripts, but are available from the corresponding author on reasonable request.

Abbreviations

ICM: Implementation of Change Model

SAVR: Surgical Aortic Valve Replacement

TAVR: Transcatheter Aortic Valve Replacement

TEE: Transesophageal echocardiography

VBHC: Value-based health care

References

  1. van der Nat P, van Veghel D, Daeter E, Crijns H, Koolen J, Houterman S, Soliman M, de Mol B, Meetbaar Beter Study Group. Insights on value-based healthcare implementation from Dutch heart care. Int J Healthc Manag. 2017:1–4.

  2. Kaplan RS, Witkowski M, Abbott M, Guzman AB, Higgins LD, Meara JG, Padden E, Shah AS, Waters P, Weidemeier M. Using time-driven activity-based costing to identify value improvement opportunities in healthcare. J Healthc Manag. 2014;59(6):399–413.

  3. Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477–81.

  4. Porter ME. Value-based health care delivery. Ann Surg. 2008;248(4):503–9.

  5. Porter ME, Teisberg EO. Redefining health care: creating value-based competition on results. Harvard Business Press; 2006.

  6. van Deen WK, Esrailian E, Hommes DW. Value-based health care for inflammatory bowel diseases. J Crohn's Colitis. 2015;9(5):421–7.

  7. Haas DA, Helmers RA, Rucci M, Brady M, Kaplan RS. The Mayo Clinic model for running a value-improvement program. Harv Bus Rev. October 2015.

  8. Moriates C, Mourad M, Novelero M, Wachter RM. Development of a hospital-based program focused on improving healthcare value. J Hosp Med. 2014;9(10):671–7.

  9. Porter ME, Deerberg-Wittram J, Marks C. Martini Klinik: prostate cancer care. Harvard Business School Case. 2014:714–471.

  10. Bammer G. Integration and implementation sciences: building a new specialization. Ecol Soc. 2003;10(2):95–107.

  11. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.

  12. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. BMJ Quality & Safety. 1998;7(3):149–58.

  13. Greenhalgh T, Robert G, Bate P, Macfarlane F, Kyriakidou O. Diffusion of innovations in health service organisations: a systematic literature review. John Wiley & Sons; 2008.

  14. Grol R, Wensing M, Eccles M, Davis D. Improving patient care: the implementation of change in health care. John Wiley & Sons; 2013.

  15. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3–4):327.

  16. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  17. Porcheret M, Main C, Croft P, McKinley R, Hassell A, Dziedzic K. Development of a behaviour change intervention: a case study on the practical application of theory. Implement Sci. 2014;9(1):1.

  18. Driessen MT, Groenewoud K, Proper KI, Anema JR, Bongers PM, van der Beek AJ. What are possible barriers and facilitators to implementation of a participatory ergonomics programme? Implement Sci. 2010;5(1):1.

  19. Hamilton S, McLaren S, Mulhall A. Assessing organisational readiness for change: use of diagnostic analysis prior to the implementation of a multidisciplinary assessment for acute stroke care. Implement Sci. 2007;2(1):1.

  20. Kampstra NA, Zipfel N, van der Nat PB, Westert GP, van der Wees PJ, Groenewoud AS. Health outcomes measurement and organizational readiness support quality improvement: a systematic review. BMC Health Serv Res. 2018;18(1):1005.

  21. Grol R, Wensing M. Effective implementation of change in healthcare: a systematic approach. In: Improving patient care: the implementation of change in health care. 2nd ed. 2013. p. 40–63.

  22. Wensing M, Grol R. Determinants of effective change. In: Improving patient care. 2005. p. 94–108.

  23. Creswell JW. Qualitative inquiry and research design: choosing among five traditions. 1998.

  24. Baxter P, Jack S. Qualitative case study methodology: study design and implementation for novice researchers. Qual Rep. 2008;13(4):544–59.

  25. Quinn PM. Qualitative research and evaluation methods. Sage Publications; 2002.

  26. Curtis S, Gesler W, Smith G, Washburn S. Approaches to sampling and case selection in qualitative research: examples in the geography of health. Soc Sci Med. 2000;50(7):1001–14.

  27. World Health Organization. Surgical safety checklist. http://www.who.int/patientsafety/safesurgery/checklist/en/.

  28. Spanjersberg S, Ottervanger JP, Nierich A, De B, Bruinsma GBB. A detailed checklist in cardiothoracic surgery: the Isala safety check. Journal of Cardiology & Cardiovascular Therapy. 2016;2(1).

  29. Spanjersberg A, Ottervanger J, Nierich A, Hoogendoorn M, Van Veghel D, Houterman S, Stooker W, Speekenbrink R, Brandon Bravo Bruinsma G. Implementation of a specific safety checklist in cardiac surgery is followed by lower postoperative mortality. Eur Heart J. 2018;39(suppl_1):ehy563.3268.

  30. Mullen JL, Buzby GP, Matthews DC, Smale BF, Rosato EF. Reduction of operative morbidity and mortality by combined preoperative and postoperative nutritional support. Ann Surg. 1980;192(5):604–13.

  31. Anonymous. Mayo Clinic Proceedings. Elsevier; 2001.

  32. Eneroth M, Olsson U, Thorngren K. Nutritional supplementation decreases hip fracture-related complications. Clin Orthop Relat Res. 2006;451:212–7.

  33. Le Cornu KA, McKiernan FJ, Kapadia SA, Neuberger JM. A prospective randomized study of preoperative nutritional supplementation in patients awaiting elective orthotopic liver transplantation. Transplantation. 2000;69(7):1364–9.

  34. Warnold I, Lundholm K. Clinical significance of preoperative nutritional status in 215 noncancer patients. Ann Surg. 1984;199(3):299–305.

  35. Deutz NE, Bauer JM, Barazzoni R, Biolo G, Boirie Y, Bosy-Westphal A, Cederholm T, Cruz-Jentoft A, Krznaric Z, Nair KS, Singer P, Teta D, Tipton K, Calder PC. Protein intake and exercise for optimal muscle function with aging: recommendations from the ESPEN expert group. Clin Nutr. 2014;33(6):929–36.

  36. Kitzinger J. Qualitative research: introducing focus groups. BMJ. 1995;311(7000):299–302.

  37. Anderson NR, West MA. Measuring climate for work group innovation: development and validation of the team climate inventory. J Organ Behav. 1998:235–58.

  38. Grol R, Wensing M. What drives change? Barriers to and incentives for achieving evidence-based practice. Med J Aust. 2004;180(6 Suppl):S57.

  39. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PC, Rubin HR. Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282(15):1458–65.

  40. Vaughn VM, Saint S, Krein SL, Forman JH, Meddings J, Ameling J, Winter S, Townsend W, Chopra V. Characteristics of healthcare organisations struggling to improve quality: results from a systematic review of qualitative studies. BMJ Qual Saf. 2018.

  41. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003;362(9391):1225–30.

  42. Francke AL, Smit MC, de Veer AJ, Mistiaen P. Factors influencing the implementation of clinical guidelines for health care professionals: a systematic meta-review. BMC Med Inform Decis Mak. 2008;8(1):38.

  43. Colldén C, Gremyr I, Hellström A, Sporraeus D. A value-based taxonomy of improvement approaches in healthcare. J Health Organ Manag. 2017;31(4):445–58.

  44. Moullin JC, Sabater-Hernández D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13(1):5.

  45. Central Committee on Research Involving Human Subjects. Medical Research Involving Human Subjects Act (WMO). 1998.


Acknowledgements

We would like to thank all staff of the St. Antonius Hospital, Nieuwegein, the Netherlands for their contributions to the interviews and focus group interview.

Funding

The Netherlands Organisation for Health Research and Development (ZonMw) grant number 842001005. The funder had no influence in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

NZ, PvdN, GW and SG designed the study. BR and ED contributed to the case selection and participant sampling. NZ conducted data collection. NZ analysed and interpreted the data. SG, GW and PvdN critically revised data interpretation. NZ drafted the manuscript. PvdN, SG, GW, BR and ED revised the manuscript. All authors read and approved the final manuscript.

Correspondence to Nina Zipfel.

Ethics declarations

Ethics approval and consent to participate

All interview participants signed written consent. Ethics approval for this study was not required under Dutch law according to the Medical Research Involving Human Subjects Act (WMO), because the interview questions were not incriminating, intimate or radical [45]. A non-medical scientific research declaration was obtained from the Medical Research Ethics Committees United (MEC-U) of the St. Antonius Hospital with the following reference number: W15.006.

Consent for publication

Not Applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Interview guide based on the ICM and VBHC concept. Describes the interview topic guide and questions. (DOCX 15 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Zipfel, N., Nat, P.B., Rensing, B.J.W.M. et al. The implementation of change model adds value to value-based healthcare: a qualitative study. BMC Health Serv Res 19, 643 (2019) doi:10.1186/s12913-019-4498-y


Keywords

  • Implementation science
  • Implementation of change model
  • Value-based healthcare
  • Outcome improvement
  • Outcome measurement