
Table 6 Factors influencing the SHARE process of selecting disinvestment projects

From: Sustainability in Health care by Allocating Resources Effectively (SHARE) 6: investigating methods to identify, prioritise, implement and evaluate disinvestment projects in a local healthcare setting

External environment

Positive:

▪ The SHARE program was adequately funded (until the final phase of the program).

▪ Two proposals that received state health department funding and endorsement were considered favourably.

▪ Two proposals were triggered by new national guidelines, one by an editorial in the Medical Journal of Australia, and others by journal articles, email bulletins, attendance at conferences and proposers’ awareness of practice elsewhere.

Negative:

▪ The state health department withdrew funding for the final phase of the SHARE program, resulting in a reduction of the proposed evaluation activities.

▪ One project was rejected due to difficulties implementing change during the national accreditation process for this department’s services.

Organisational environment (Monash Health)

Positive:

▪ Monash Health encourages and supports innovation.

▪ High-level expertise was available from CCE and Clinical Information Management.

Negative:

▪ Waiting for responses to email correspondence and for appointments with key personnel, time lags due to annual and long service leave, and decisions by committees that meet only monthly delayed the processes of identification, prioritisation, decision-making and project development. Delays in deciding that unsuitable projects would not go ahead prevented other potentially suitable projects from being investigated.

▪ The proposer of one project was unaware of an existing organisational review into the problem.

▪ Delays related to the introduction of a new computer database and electronic ordering system contributed to one project being rejected.

Identification process

Positive:

▪ The ‘bottom up’ Expression of Interest process was the only systematic approach used, resulting in two projects being received and accepted (although both were later rejected).

Negative:

▪ The ‘top down’ evidence-based catalogue of disinvestment opportunities was not utilised in identifying potential projects.

▪ The ‘ad hoc’ process of nominations and decision-making dominated.

▪ Most proposals were made by ‘outsiders’ not involved in the nominated clinical pathway. Only two proposals were made by the potential adopters, although one subsequently withdrew their application.

Prioritisation and decision-making process

Positive:

▪ All discussions were held within meetings and documented in the minutes; there were no attempts to be covert or to follow hidden agendas.

▪ Conflict of interest was addressed as a routine agenda item.

▪ All clinical programs, health professional disciplines, consumers and technical experts in evidence, data, legal, ethics, finance, purchasing, biomedical engineering and information technology were represented in decision-making.

Negative:

▪ There were no explicit processes for risk assessment, deliberation or appeal. It was not always clear how decisions had been made.

▪ The SHARE Steering Committee did not have authority to direct change. Proposals were put to department heads, who declined to follow them up (based on reasoned arguments that they should not go ahead).

Rationale and motivation

Positive:

▪ Safety and effectiveness were the primary reasons for nominating TCPs for disinvestment; cost savings were a secondary benefit.

Negative:

(none)

Proposal for change

Positive:

▪ Six proposals were submitted based on guidelines, systematic reviews or health technology assessments; the four accepted projects were in this group.

▪ Four proposals had supporting data: two regarding unnecessary diagnostic imaging tests and the two VPACT projects.

▪ The two VPACT projects presented defined objectives.

▪ One project had a clear reinvestment plan, which allowed operating theatre time previously used by patients now undergoing the new non-surgical procedure to be used by other patients on the waiting lists; this was the implemented pilot project.

Negative:

▪ In 13 proposals, the nominator did not provide supporting evidence.

▪ Many of the proposals did not clearly define the TCP, patient population group, circumstances of restriction, etc. This is difficult to quantify, as clarification may have been forthcoming, but the proposals were not investigated further.

Potential adopters

Positive:

▪ Three nominations were made by the potential adopters: one was the pilot project accepted and implemented, one was accepted as a pilot project but was subsequently withdrawn by the applicants, and the other was nominated too late to be included in the SHARE timeframe.

Negative:

▪ Eight proposals were declined by the heads of the departments responsible for the proposed TCP. Reasons included lack of clarity of the problem, lack of supporting evidence, or evidence that was not relevant to local patient groups.

▪ In two of the accepted projects, the key adopters reversed their decisions about the supporting evidence and withdrew.

Potential patients

Positive:

(none)

Negative:

▪ Two proposals were rejected when it became clear that the evidence did not apply to the Monash Health population.

Implementation and evaluation plans and resources

Positive:

▪ The CCE/SHARE support staff had appropriate expertise and knowledge of methods and tools for implementation and evaluation.

▪ The CCE team provided access to research literature and liaised on behalf of the clinical project teams with the Clinical Information Management (CIM) unit, who were happy to provide access to data and assistance with analysis.

▪ All implementation activities within the control of the SHARE project team were completed.

▪ Detailed evaluation plans were developed in consultation with an external health program evaluator and a health economist.

▪ One proposal had the assistance of a research fellow to undertake the project work (but this project did not go ahead for other reasons).

▪ The clinical project leads of two accepted projects attended workshops in evidence-based change, implementation and evaluation.

Negative:

▪ Lack of evaluation funding precluded understanding of the barriers that prevented implementation of the planned systematic evidence-based processes.

▪ Lack of evaluation funding limited evaluation activities in the last year of the program.

▪ One project was rejected by the department head because they could not provide backfill for the clinical duties of the project leader.