

  • Research article
  • Open Access
  • Open Peer Review

Stroke units, certification, and outcomes in German hospitals: a longitudinal study of patient-based 30-day mortality for 2006–2014

BMC Health Services Research 2018, 18:880

  • Received: 13 September 2017
  • Accepted: 30 October 2018
  • Published:



Abstract

Background

Treatment of stroke patients in stroke units has increased and studies have shown improved outcomes. However, a large share of patients in Germany is still treated in hospitals without a stroke unit. The effects of stroke unit care, stroke unit certification, and total hospital quality certification on outcomes remain unclear.


Methods

We employ annual hospital panel data for 1100–1300 German hospitals from 2006 to 2014, which include structural data and 30-day standardized mortality. We estimate hospital- and time-fixed effects regressions with three main independent variables: (1) stroke unit care, (2) stroke unit certification, and (3) total hospital quality certification.


Results

Our results confirm the trend of decreasing stroke mortality ratios, although to a much lesser degree than previous studies. Descriptive analysis illustrates better stroke outcomes for hospitals with non-certified and certified stroke units and for hospitals with total hospital quality certification. In a fixed effects model, having a stroke unit has a significant quality-enhancing effect, lowering stroke mortality by 5.6%, while there is no significant improvement effect for stroke unit certification or total hospital quality certification.


Conclusions

Patients and health systems may benefit substantially from stroke unit treatment expansion, as installing a stroke unit appears more meaningful than getting it certified or obtaining a total hospital quality certification. Health systems should thus prioritize investment in stroke unit infrastructure and centralize stroke care in stroke units. They should also prioritize patient-based 30-day mortality data, as it allows a more realistic representation of mortality than admission-based data.


Keywords

  • Stroke
  • Stroke unit
  • Hospital specialization
  • Certificate
  • Accreditation
  • Stroke outcomes


Background

Stroke is the second-leading cause of death worldwide [1]. Recent data show an incidence of about 16 million first-ever strokes annually, resulting in 5.7 million deaths, substantial long-term disabilities and significant long-term care costs [2]. Worldwide, the substantial stroke incidence, associated deaths and resulting medical and economic costs make it a truly global disease burden [3, 4].

The latest OECD Health Care Quality Indicator data suggest that admission-based mortality rates for ischemic stroke have decreased in the past decade, but stark cross-country outcome differences (6-fold variation) remain [5]. In Germany, an analysis of admission-based data showed an almost 20% reduction of raw and standardized mortality ratios (SMR) between 2005 and 2010. The study's authors concluded a possible relation to improved primary and secondary prevention as well as increased treatment in specialized stroke units (SUs) [6]. The latter provide specialized acute and rehabilitation care with co-located and dedicated interdisciplinary teams of neurologists, internists, neuro- and vascular surgeons, and radiologists; 24/7 access to radiology (e.g. CT scanners) and thrombectomy equipment is also often included. SU care has been shown to improve both short- and long-term stroke outcomes [6–8] and to reduce overall stroke treatment costs [9, 10].

In contrast, the evidence of a positive relationship between total hospital quality (THQ) certification and outcomes is mixed and incomplete. For stroke, acute myocardial infarction (AMI), deliveries, and hip fractures, a 2014 study found a positive association between certified THQ management systems and clinical leadership, systems for patient safety, and clinical review, but not for clinical practice [11]. Similarly, a study of Joint Commission on Accreditation of Healthcare Organizations (JCAHO) certification found that risk-adjusted mortality rates improved in a cross-sectional analysis of 965 hospitals in 1996 and 1997 [12]. However, most studies find a weak or non-existent association between THQ certification and hospital outcomes [11, 13], and a stronger association between service line quality systems and quality indicators (e.g. for stroke and AMI) [7, 13, 14].

Studies with a robust fixed-effects framework, a large hospital panel, and patient-based outcome data - including for the period after hospital discharge - are rare. Further, while certification schemes continue to grow, the relationship between certification and hospital care outcomes remains inconclusive [15]. Studies have often examined the link between certification and process measures of care, but found no (or only a weak) association between certification and outcome measures of care. To our knowledge, no study exists that differentiates outcomes for stroke care in a (i) conventional model, (ii) non-certified SU model, (iii) certified SU model, and (iv) hospitals with certified SU and/or additional THQ certification, based on a large patient-based panel dataset.

To examine the influence of SU infrastructure and process specialization and certification on quality of stroke care, we rely on Donabedian’s structure, process, and outcome framework, in which outcomes are influenced by hospital structures and processes [16]. Stroke care is a particularly apt example to test this relationship since SU set-up and certification require substantial structural and process standards to be met. Therefore, we explore whether treatment of stroke in specialized facilities (i.e. SUs) improves quality and thereby warrants substantial investment at hospital and health system level. Likewise, we ask whether an additional SU certification further improves stroke care outcomes. We also examine if THQ certification and case volumes influence the relationship between SU specialization, certification, and stroke outcomes.



Methods

We linked hospital data from different sources based on standardized institutional codes, which are unique mandatory identifiers for each hospital in Germany. First, we obtained structural hospital data (e.g. case volume, hospital teaching status, type of ownership) for the available years 2006, 2008, 2010, 2012, 2013, and 2014 from the German mandatory quality monitoring system, operated by the executive authority of the German health care system, the Federal Joint Committee (Gemeinsamer Bundesausschuss, G-BA). The G-BA provides publicly available hospital report cards for research purposes upon request via XML files at the hospital-year level.

Second, we integrated risk-adjusted, patient-based stroke outcome data (for the stroke diagnoses intracerebral hemorrhage, ICD code I61; ischemic stroke, I63; and stroke not specified as hemorrhage or ischemic, I64) from the Quality Assurance with Routine Data (Qualitätssicherung mit Routinedaten, QSR) program. The QSR is operated by the AOK, the largest German sickness fund, and employs routine in- and outpatient data of AOK-insured patients. It provides a risk-adjusted 30-day SMR, comparing observed vs. expected events. For risk-adjustment purposes, the QSR calculates 30-day expected mortality by means of logit regressions, which include patient-specific risk factors like age, gender, and a set of comorbidities [17, 18]. To ensure comparability across years, we applied the 2014 logit risk-adjustment model to the AOK patient data for all data years.
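The observed-vs-expected logic behind the QSR indicator can be illustrated with a small sketch. This is not the actual QSR 2014 model: the coefficients, risk factors, and patient records below are hypothetical, and the real model uses a much richer set of comorbidities.

```python
import math

def predicted_mortality(age, male, comorbidity_score, coef):
    """Illustrative logit risk model: P(death within 30 days) for one patient.
    Coefficients are hypothetical, not the actual QSR 2014 model."""
    z = (coef["intercept"] + coef["age"] * age
         + coef["male"] * male + coef["comorbidity"] * comorbidity_score)
    return 1.0 / (1.0 + math.exp(-z))

def smr_30day(patients, deaths_observed, coef):
    """30-day SMR: observed deaths divided by the sum of
    patient-level expected mortality probabilities."""
    expected = sum(predicted_mortality(p["age"], p["male"], p["comorbidity"], coef)
                   for p in patients)
    return deaths_observed / expected

# Hypothetical coefficients and a small hospital cohort
coef = {"intercept": -4.0, "age": 0.035, "male": 0.1, "comorbidity": 0.4}
patients = [
    {"age": 72, "male": 1, "comorbidity": 2},
    {"age": 80, "male": 0, "comorbidity": 3},
    {"age": 65, "male": 1, "comorbidity": 1},
]
print(round(smr_30day(patients, deaths_observed=1, coef=coef), 2))  # -> 0.92
```

An SMR below 1 indicates fewer observed deaths than the risk model predicts for that hospital's case mix.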

Third, we included information on SU certification from the German Stroke Society (Deutsche Schlaganfall Gesellschaft, DSG), the premier German SU certification scheme [19]. The data provides information on which hospitals have DSG-certified SUs and the period of certification. A DSG certificate, granted for three years, requires a minimum patient volume, a minimum volume of certain interventions, staff level resources, and training obligations. Hospitals with non-certified SUs were identified by two specific procedure codes (OPS 8-891 and 8-89b), which capture the provision of complex stroke care [20]. We assumed the existence of a SU when a hospital reported at least ten such procedures per year [6]. Structural standards are generally higher for DSG certification than for documenting complex stroke procedures.
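The identification rule for non-certified SUs amounts to a one-line classifier; a sketch using the codes and threshold stated above (the data structure and function name are hypothetical):

```python
def has_stroke_unit(ops_counts, threshold=10):
    """Flag a hospital-year as operating a (possibly non-certified) SU when
    at least `threshold` complex stroke care procedures (OPS codes 8-891
    and 8-89b) were documented in that year."""
    n = ops_counts.get("8-891", 0) + ops_counts.get("8-89b", 0)
    return n >= threshold

print(has_stroke_unit({"8-891": 4, "8-89b": 7}))  # 11 procedures -> True
print(has_stroke_unit({"8-891": 3}))              # 3 procedures -> False
```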

Fourth, we integrated data for the THQ certificate of the Cooperation for Transparency and Quality in Health Care (Kooperation für Transparenz und Qualität im Gesundheitswesen, KTQ), comparable to JCAHO accreditation. Central components include continuous quality improvement in patient orientation, employee orientation, patient safety, quality management, communication, transparency, and leadership [21]. Like the DSG SU certificate, the KTQ certificate is granted for 3 years. Hospital-specific information on both certification schemes was provided by the mentioned organizations and integrated via standardized institutional codes and address information.

Empirical strategy

Based on Donabedian’s quality framework [16], we hypothesize better stroke outcome quality for hospitals that organize care through (ii) a dedicated SU facility, (iii) SU certification, and (iv) total hospital quality (THQ) certification, relative to the (i) conventional, non-SU care model. We employ a fixed effects model with a within-regression estimator at hospital level. To quantify the influence of (certified) SU care on stroke outcomes, we regress the log of the stroke 30-day SMR (\( SMR_{it} \)) on separate dummy variables specifying the existence of a SU (\( SU_{it} \)), a DSG-certified SU (\( certSU_{it} \)), and a THQ certification (\( certTHQ_{it} \)). We add the log of stroke case volume (\( stroke\_CV_{it} \)) to model stroke treatment experience and a flattening learning curve. We include the share of stroke patients relative to all patients treated to account for the relative importance of, and organizational focus on, stroke care. Hospital beds (\( beds_{it} \)), dummy variables for hospital teaching status and ownership type, and a category medical specialization (CMS) index [22] reflect important time-variant characteristics. For time-variant trends that affect each hospital equally, such as technological advances, regulatory changes, and judicial decisions, we specify time effects (\( \tau_t \)), excluding 2006 as the reference year. To accommodate the optimal level of stroke quality of care with an SMR value of 0 (0 observed mortality), we adapt Battese’s (1997) approach: we include a dummy explanatory variable (\( {D}_{it}^{SMR} \)), which takes on the value of 1 when the SMR is 0, and add \( {D}_{it}^{SMR} \) to \( SMR_{it} \) before taking the log [23]. We further adjust for the fact that hospitals treat variable numbers of stroke patients by using AOK stroke patient case volume as analytical weights. The main model is specified in Eq. 1:
$$ \log \left({SMR}_{it}\right)={\beta}_0+{\beta}_1{D}_{it}^{SMR}+{\beta}_2{SU}_{it}+{\beta}_3{certSU}_{it}+{\beta}_4{certTHQ}_{it}+{\beta}_5\log \left({stroke\_CV}_{it}\right)+{\beta}_6{\left(\frac{stroke\ cases}{all\ cases}\right)}_{it}+{\beta}_7{beds}_{it}+{\beta}_8{CMS}_{it}+{\beta}_9{teach}_{it}+{\beta}_{10}{private}_{it}+{\beta}_{11}{public}_{it}+{\alpha}_i+{\tau}_t+{\varepsilon}_{it} $$

In addition to the variables specified above, β0 is the intercept, αi captures time-invariant hospital-fixed effects, and εit is the error term. To assess the robustness of the results, we further estimate the model using the log of the number of SU complex procedures instead of the dummy indicator variable for stroke units. The data comprise repeated measurements at the hospital level, which may involve autocorrelation in the error term εit. A Hausman test indicates that a random effects specification would likely yield inconsistent estimates. We therefore use hospital-fixed effects αi to control for unobserved hospital characteristics and avoid inconsistencies. Testing the time-fixed effects τt for joint significance indicates systematic differences in mortality across years. All statistical inferences are based on heteroscedasticity- and autocorrelation-consistent estimates of the standard errors.
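The within (fixed effects) transformation can be sketched in a few lines. This is an illustrative toy example, not the authors' estimation code: the data and variable names are hypothetical, and analytical weights, year dummies, and HAC standard errors are omitted for brevity.

```python
import numpy as np

def within_demean(x, groups):
    """Subtract each hospital's own mean from every observation, removing
    the time-invariant hospital fixed effect from outcome and regressors."""
    out = x.astype(float).copy()
    for g in np.unique(groups):
        mask = groups == g
        out[mask] -= out[mask].mean(axis=0)
    return out

# Toy panel: 3 hospitals observed over 3 years each.
hospital = np.repeat([0, 1, 2], 3)
# y = log(30-day SMR); columns of X = [SU dummy, log stroke case volume]
y = np.array([0.10, 0.05, -0.02, 0.00, -0.06, -0.08, 0.12, 0.11, 0.09])
X = np.array([[0, 4.0], [1, 4.1], [1, 4.3],
              [1, 5.0], [1, 5.2], [1, 5.1],
              [0, 3.0], [0, 3.1], [0, 3.2]])

# Demean within hospitals, then run OLS on the transformed data.
y_d = within_demean(y.reshape(-1, 1), hospital).ravel()
X_d = within_demean(X, hospital)
beta, *_ = np.linalg.lstsq(X_d, y_d, rcond=None)
print(beta.shape)  # one within-estimate per regressor
```

In production analyses, packages such as statsmodels or linearmodels perform this transformation (and the clustered/robust inference) internally.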


Results

Between 2006 and 2014, our sample includes on average 1243 hospitals per year (Table 1). Because of hospital closures and mergers, the number of hospitals within our sample decreased by 13%, from 1331 in 2006 to 1162 in 2014. In 2014, 726 stroke-treating hospitals had no SU and 436 hospitals did, of which 222 SUs were DSG-certified; 280 hospitals were THQ-certified. On average, hospitals treat 227 stroke patients per annum and have a 30-day stroke SMR of 0.99, a reduction of approximately 13% since 2006. In 2014, our hospital sample includes 86% of all hospitals that recorded at least 2 stroke diagnoses. The discrepancy (Table 1) is due to QSR data availability and the G-BA’s 2010 shift to reporting at site level, which increased the number of hospitals and sites in the overall, non-QSR sample.
Table 1

Overview of main variables over time from 2006 to 2014

Variable | 2006 | 2008 | 2010 | 2012 | 2013 | 2014
Number of hospital observationsa | | | | | |
Average 30-day stroke SMRb | | | | | |
Number of hospitals with SUc,d | | | | | |
Number of hospitals with certified SUd,e | | | | | |
Number of hospitals with THQ certificationd | | | | | |
Average stroke case volumef per hospital | | | | | |
Share stroke cases/inpatient cases (in %) | | | | | |
Number of hospitals with teaching status (%) | 511 (38) | 557 (43) | 581 (47) | 619 (50) | 625 (52) | 642 (55)
Average number of hospital beds | | | | | |
Average CMS specialization indexf | | | | | |
Number of private, for-profit hospitals (%) | 225 (17) | 237 (18) | 245 (20) | 257 (21) | 255 (21) | 255 (22)
Number of private, non-profit hospitals (%) | 592 (44) | 574 (44) | 551 (44) | 527 (43) | 514 (43) | 477 (41)
Number of public hospitals (%) | 514 (39) | 481 (37) | 448 (36) | 444 (36) | 434 (36) | 430 (37)
Number of hospitals with stroke diagnosesg | | | | | |

a all observations that have QSR SMR stroke outcome data

b weighted by the AOK stroke patient volume for each hospital

c based on more than 10 documented complex stroke procedures (OPS codes 8-891 and 8-89b); for 2014, 461 SUs exist in the full sample, independent of whether QSR data is available

d and have QSR SMR stroke outcome data (especially for THQ, there are overall more certified hospitals in Germany)

e DSG SU certification was suspended in 2008 and part of 2009, which led to a backlog of (re-)certification applications and a reduction in DSG-certified hospitals in data year 2010

f based on ICD stroke diagnoses I61 (hemorrhage), I63 (ischemic) and I64 (not further specified)

g All hospitals which have coded 2 or more stroke ICD cases. Discrepancy in number of observations due to QSR data availability and G-BA reporting for multiple sites, starting in 2010 and becoming mandatory in 2012 and 2013

Figure 1 presents the weighted median and standard deviation (SD) of the SMR for the respective hospital sub-groups with conventional stroke care (‘No SU’), a dedicated SU care model (‘SU’), a certified SU (‘Cert SU’) and a certified SU within a hospital with a KTQ THQ certificate (‘Cert SU + KTQ’).
Fig. 1

Median and standard deviation (above and below median) for the 30-day stroke SMR and hospitals with a conventional care model (‘No SU’), a SU facility (‘SU’), a certified SU (‘Cert SU’) and a certified SU within a hospital with a KTQ THQ certificate (‘Cert SU + KTQ’). Note: 1. QSR stroke volume applied as analytical weights; 2. Number of hospitals and associated hospital sites; 3. Mean annual stroke ICD case volume including diagnoses I61 (hemorrhage), I63 (ischemic) and I64 (not further specified)

Hospitals that treat stroke patients in a conventional model have the highest SMR and the largest outcome variation (i.e. SD). Their number fell from 1047 hospitals in 2006 to 721 in 2014, and their average stroke patient volume declined from 69 to 42 patients; however, in 2014, 30,000 stroke patients were still treated at hospitals with a subpar care model and a substantially higher risk of death.

Compared to the conventional model, outcome quality improves for patients treated in a stroke unit. Both the median SMR and the outcome variation are substantially reduced. Over time, the median SMR for all subgroups improves; outcome variation, however, remains roughly constant.

In 2006 and 2008, the SMR is lower in both certified SU care models relative to the non-certified SU. However, from 2010 to 2012, the median SMR for hospitals with a non-certified SU decreased from 1.07 to 0.98, while for hospitals with a certified SU or both SU and THQ certifications it increased to 1.05 and 1.03. More than 30 larger hospitals with a relatively high 30-day SMR received a SU certification between 2010 and 2012 and decreased their 30-day SMR, which lowered the overall average in the following years, but pushed up the SMRs for the certification subgroups.

Table 2 presents descriptive statistics for the relevant empirical model variables, pooled across all years.
Table 2

Descriptive statistics, all years (mean, standard deviation, minimum, maximum)

Number of observationsa (2006–2014)
Average stroke 30-day SMRb
 Log average stroke 30-day SMRb
Hospitals with a specialized SU
Average complex stroke procedures
 Log average complex stroke proceduresb
Hospitals with DSG-certified SU
Hospitals with KTQ-THQ certification
Average case volume stroke cases
 Log average case volume stroke casesb
Share stroke cases/inpatient cases
Hospitals with teaching status
Average number of hospital beds
CMS specialization index
Private, for-profit hospitals
Private, non-profit hospitals
Public hospitals

a all hospitals that have QSR SMR stroke outcome data

b QSR stroke patient volume applied as analytical weights; 0.000000001 or 1 added before taking the log to avoid losing observations with 0 values; results are similar for both

Table 3 presents regression results of our main model (M1). SU care is associated with a 5.6% lower 30-day SMR, while SU or THQ certification shows no significant additional effect on stroke outcomes. Neither stroke volume nor the share of stroke cases relative to all inpatient cases has a significant effect on SMR. The time fixed effects for years 2013 and 2014 have negative and significant coefficients (− 0.05***, − 0.08***). We consider M1 our main model as it implements our empirical strategy and has the lowest Bayesian Information Criterion (BIC) [24].
Table 3

Regression results main model M1 (beta, lower and upper confidence interval)

Log dummy: −0.059* (−0.112, −0.007)
Stroke unit (SU): −0.056** (−0.092, −0.021)
SU certification: −0.005 (−0.036, 0.026)
THQ certification: 0.016 (−0.019, 0.051)
Log stroke case volume: −0.003 (−0.010, 0.003)
Share stroke ICD/all ICD: −0.054 (−0.743, 0.635)
Hospital beds: −0.000 (−0.000, 0.000)
Teaching hospital status: 0.007 (−0.041, 0.056)
CMS ICD specialization index: −0.003 (−0.039, 0.033)
Private, for-profit hospitalsa: −0.022 (−0.144, 0.101)
Public hospitalsa: −0.011 (−0.138, 0.117)
Log all OPS stroke procedures
Interaction SU and THQ certification: 0.182* (0.030, 0.333)
Intraclass correlation
Number of observations

QSR stroke patient volume applied as analytical weights; *p < 0.05, **p < 0.01, ***p < 0.001; time-fixed effects not displayed separately (in M1, β2008 = −0.02, β2010 = −0.03, β2012 = −0.03, β2013 = −0.05***, β2014 = −0.08***); test for joint significance of time effects in M1 with F-statistic of 5.59

a private (non-profit) hospitals serve as reference category

For model robustness, we confirm that our results are consistent across alternative variable, sample, and model specifications M2 to M9 (see Additional file 1).

Discussion and limitations


Our analysis confirms the positive trend over time of SMR reduction after stroke in Germany, although to a much lower degree than prior studies have shown [6]. This can be attributed to the use of patient-based 30-day mortality data, including time after patient discharge. This data enables a cross-sectoral perspective on stroke care and demonstrates the shortcomings of admission-based data.

The descriptive stroke SMR trends for the different hospital sub-groups suggest progressively better stroke outcomes in hospitals with SU infrastructure, a SU that is also DSG-certified, and a certified SU within a THQ certified hospital. Results of the fixed effects regression models also show that having a SU alone significantly enhances outcome quality of care. The results align with previous research and confirm the benefits of treating patients in a dedicated SU facility [7, 8, 14].

Conversely, both certifications do not show significant effects. The structural and process differences between non-certified and certified SUs might be too small to show significant impact, and the overall hospital quality management improvements associated with the THQ certification might not be meaningful enough to influence outcomes in emergency medical conditions such as stroke.

On a health system level, our results question why a large share of German stroke patients is still treated in non-specialized facilities and, relatedly, why the shift towards a centralized stroke treatment model is sluggish [6]. Our findings suggest that treating all stroke patients at hospitals with a SU may decrease absolute 30-day stroke mortality by 5.6%, from 16.2% to 15.3%, even after adjusting for case volume and share of stroke cases. For the roughly 50,300 stroke patients currently treated at hospitals without SUs, this would correspond to about 460 fewer annual stroke-related deaths. Considerable reductions in stroke-related disabilities and in medical and economic costs are additional expected benefits [7].
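The arithmetic behind these figures can be reproduced directly; a sketch using the baseline mortality, relative SMR reduction, and patient count reported above:

```python
baseline_mortality = 0.162      # absolute 30-day mortality without SU care (from text)
relative_reduction = 0.056      # estimated SU effect on the 30-day SMR (from text)
patients_without_su = 50_300    # stroke patients treated in hospitals without a SU

with_su = baseline_mortality * (1 - relative_reduction)
deaths_avoided = patients_without_su * (baseline_mortality - with_su)

print(round(with_su * 100, 1))  # -> 15.3 (%)
print(round(deaths_avoided))    # -> 456, i.e. roughly 460 fewer deaths per year
```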

Experience in other European countries demonstrates the positive outcome impact of stroke care centralization in SUs [25, 26]. Underpinning the centralization argument is the positive volume-outcome relationship, which has also been shown to hold for stroke [27]. In the mid-term, national and regional policy makers should ensure that all stroke patients are treated in SUs by requiring SU infrastructure for stroke care and centralizing stroke care with hospitals that already operate a well-performing SU.

The German certification of SUs sets high procedural, personnel, and infrastructural standards; however, as above, in contrast to expectations, the SU service line certification shows no additional significant improvement with 30-day stroke SMR when non-certified SU existence is controlled for. Several explanations are possible. First, DSG certification confirms the SU set-up externally, with some additional staffing and process requirements. These enhancements might not have a large enough additional effect on the 30-day mortality compared to the standard SU characteristics.

Second, mortality is a valid and well-accepted outcome parameter [28], but it is only one of the outcomes that matters in stroke care [29]. Others, such as readmissions, degree of disabilities, and quality of life are also important [7, 29]. Standardized and risk-adjusted data for these outcome parameters are not currently available in Germany. Certified SUs, however, might have better outcomes for these indicators because the DSG certification takes a holistic approach, focusing on reducing disabilities after stroke [19]. Third, certified SU might have improved outcomes over a longer timeframe than the 30 days after hospital admission examined here.

Likewise, certified SUs might provide care for more severe patients, as they have on average substantially higher case volumes (Fig. 1). While the standardized 30-day stroke mortality is adjusted for co-morbidities, stroke severity (e.g. National Institutes of Health Stroke Scale from 0 to 42) is not fully reflected by administrative data [30]. However, the impact of severity adjustment on risk-adjusted indicators that already are adjusted for co-morbidities, age and other patient characteristics has been shown to be limited [31]. Lastly, the suspension of the DSG SU certification process in 2008 and first months of 2009, which resulted in delays for about 100 re- or new stroke unit certifications [32], might have also reduced the effectiveness of the DSG certification for the time span 2008–2012 and the amount of 30-day stroke SMR improvement attributable to the DSG certification.

THQ certification showed no additional significant effect on 30-day stroke mortality, in line with previous studies in other countries [11, 13]. The primary purpose of this certification is the general improvement of hospital quality management; its achievement might not be appropriately reflected by 30-day mortality in one specific emergency condition. Other measures, such as patient safety, patient and employee responsiveness and satisfaction, and operational efficiency at the hospital level, might be more affected by THQ certification. For example, Lindlbauer et al. (2016) show improved technical efficiency for THQ-certified hospitals. A downward bias of the THQ effect is possible because no consolidated and standardized data are available on ISO 9001 certification, a universal quality certificate also applied in hospitals. Hospitals without a KTQ certification might alternatively hold an ISO 9001 THQ certification even though they appear without THQ certification in our dataset. However, the number of ISO 9001 certifications is likely substantially smaller than the number of KTQ-certified hospitals [22].

Lastly, there are benefits from certification schemes that are not captured by outcome data. Both the SU and the THQ certification provide quality signals for patients, emergency teams, and admitting physicians, which can facilitate hospital choice decisions.

Limitations of this study

Besides the limitations mentioned above, the results of this study should be viewed in light of some data and methodological limitations. The validity of self-reported hospital data might be compromised by hospitals' reputational concerns and differing coding practices. Annual, random validity checks and cross-checks with administrative patient data demonstrated, for 5% of hospital reports, validity issues affecting 15–60% of the examined reporting data [26, 57].

The analyzed post-discharge timeframe of 30 days for stroke mortality provides substantial information on outcome quality, but an extended period like 365 days might provide additional insights. While the AOK QSR indicators have some advantages, they rely only on data for patients insured by the AOK sickness fund. This might lead to biased outcome indicators, but the high share of AOK-insured patients in all German hospitals (35% average market share) and results from previous studies [58] demonstrate the representativeness of the AOK QSR data.

Even though the outcome data is risk-adjusted for a large set of comorbidities and age, some bias might be affecting the results as the outcome data is not fully adjusted for severity. This might especially affect certified stroke unit hospitals as they could receive more severe cases, also via transfer from non-certified stroke units, leading to higher mortality that is not accounted for in the patient-based risk adjustment. Therefore, the effect of a SU certification or a full hospital certification is possibly underestimated in our data.


Conclusions

Our results substantiate the positive effect of SU treatment on stroke outcomes, based on a fixed effects model and a large multi-year hospital sample, suggesting that hospital and health system investment in SUs improves stroke outcomes. SUs may help save numerous life-years, reduce stroke-associated disabilities, and lower long-term stroke treatment costs considerably. Germany can learn from other countries' examples regarding centralization and (mandatory) emergency protocols for stroke treatment. As the first study to distinguish the potential effects of SU existence, SU certification, and THQ certification, we do not find a significant effect for SU certification or THQ certification on top of the large and significant effect for SU specialization.

Our research contributes to the literature on outcomes and operational research and how hospital quality of care can be improved through structural and process enhancements. The results have implications for the organization of stroke care in other countries as well as the academic and professional debate around the benefits of infrastructure specialization and certification in health care. Additional research can examine the effect of specialization and service line certification on other stroke outcome measures (e.g. disabilities) and outcomes in other treatment areas, such as cardiology or oncology specialized treatment units. Likewise, the effect of THQ can also be examined with other outcome indicators, with additional information on other THQ certifications and for other more elective treatment areas, where a THQ certification might possibly show a higher impact.


Abbreviations

AMI: Acute myocardial infarction
BIC: Bayesian Information Criterion
CMS: Category medical specialization
DSG: Deutsche Schlaganfall Gesellschaft (German Stroke Society)
G-BA: Gemeinsamer Bundesausschuss (Federal Joint Committee)
JCAHO: Joint Commission on Accreditation of Healthcare Organizations
KTQ: Kooperation für Transparenz und Qualität im Gesundheitswesen (Cooperation for Transparency and Quality in Health Care)
QSR: Qualitätssicherung mit Routinedaten (Quality Assurance with Routine Data)
SD: Standard deviation
SMR: Standardized mortality ratio
SU: Stroke unit
THQ: Total hospital quality (management)
WIdO: Wissenschaftliches Institut der AOK (Research Institute of the AOK SHI fund)



Acknowledgements

We thank the Research Institute of the AOK SHI fund (WIdO), and specifically Christian Günster, for granting us access to the QSR outcome data. We are also grateful to Prof. Dr. med. Otto Busse from the German Stroke Society and Mrs. Gesine Dannenmaier and her team at the Kooperation für Qualität und Transparenz im Gesundheitswesen for their willingness to share their data and expertise with our research team. Furthermore, we thank Katherine M. Polin, a colleague at the department of Health Care Management at Berlin University of Technology, for her help in editing this article.


Funding

The study was supported by the German Research Foundation (DFG) within the project Outcome Transparency and its Effects on Quality of Care in Hospitals, project number 323809466. CP was supported by a general PhD scholarship from the Konrad-Adenauer-Foundation. Financial support for MS from the German Ministry of Education and Research (BMBF), funding number 01EH1604A, is gratefully acknowledged. None of the funding bodies had any role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Availability of data and materials

The data that support the findings of this study are available from Research Institute of the AOK SHI fund (WIdO) but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are however available from the authors upon reasonable request and with permission of WIdO.

Authors’ contributions

Lead authors were CP and EB. CP and AG initiated and drafted the study idea, outline and implementation strategy. MS supported data analyses and method selection. AG and RB supported the study design and methods selection, methodologies and contributed to the writing of the article. All authors read and approved the final manuscript.

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

Department of Health Care Management, Berlin University of Technology, Administrative office H80, Str. des 17. Juni 135, 10623 Berlin, Germany
Department of Empirical Health Economics, Berlin University of Technology, Berlin, Germany
European Observatory on Health Systems and Policies, Brussels, Belgium


References
  1. GBD 2015 Mortality and Causes of Death Collaborators. Global, regional, and national life expectancy, all-cause mortality, and cause-specific mortality for 249 causes of death, 1980–2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet. 2016;388(10053):1459–544.
  2. Strong K, Mathers C, Bonita R. Preventing stroke: saving lives around the world. Lancet Neurol. 2007;6(2):182–7.
  3. Mozaffarian D, Benjamin EJ, Go AS, et al. Heart disease and stroke statistics-2015 update: a report from the American Heart Association. Circulation. 2015;131(4):e29–322.
  4. Nichols M, Townsend N, Scarborough P, Rayner M. European cardiovascular disease statistics, 2012 edition. Brussels and Sophia Antipolis: European Heart Network and European Society of Cardiology; 2012.
  5. OECD. Health at a Glance 2015: OECD indicators. Paris: OECD Publishing; 2015.
  6. Nimptsch U, Mansky T. Stroke unit care and trends of in-hospital mortality for stroke in Germany 2005–2010. Int J Stroke. 2014;9(3):260–5.
  7. Stroke Unit Trialists' Collaboration. Organised inpatient (stroke unit) care for stroke. Cochrane Database Syst Rev. 2013;9:CD000197.
  8. Seenan P, Long M, Langhorne P. Stroke units in their natural habitat: systematic review of observational studies. Stroke. 2007;38(6):1886–92.
  9. Grieve R, Porsdal V, Hutton J, Wolfe C. A comparison of the cost-effectiveness of stroke care provided in London and Copenhagen. Int J Technol Assess Health Care. 2000;16(2):684–95.
  10. Saka O, Serra V, Samyshkin Y, McGuire A, Wolfe CCDA. Cost-effectiveness of stroke unit care followed by early supported discharge. Stroke. 2009;40(1):24–9.
  11. Shaw CD, Groene O, Botje D, et al. The effect of certification and accreditation on quality management in 4 clinical services in 73 European hospitals. Int J Qual Health Care. 2014;26(Suppl 1):100–7.
  12. Joshi M. Hospital quality of care: the link between accreditation and mortality. Journal of Clinical Outcome Management. 2003;10(9):473–80.
  13. Hinchcliff R, Greenfield D, Moldovan M, et al. Narrative synthesis of health service accreditation literature. BMJ Qual Saf. 2012;21(12):979–91.
  14. Lichtman JH, Allen NB, Wang Y, Watanabe E, Jones SB, Goldstein LB. Stroke patient outcomes in US hospitals before the start of the Joint Commission primary stroke center certification program. Stroke. 2009;40(11):3574–9.
  15. Brubakk K, Vist GE, Bukholm G, Barach P, Tjomsland O. A systematic review of hospital accreditation: the challenges of measuring complex intervention effects. BMC Health Serv Res. 2015;15:280.
  16. Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. 1966;44(3, Suppl):166–206.
  17. WIdO: Methods. Accessed 20 Feb 2015.
  18. WIdO: Indicator handbook for treatment areas not covered in AOK Hospitalnavigator: QSR Verfahren. Accessed 20 Feb 2015.
  19. Nabavi DG, Ringelstein EB. Stroke units. In: Jungehülsing GJ, Endres M, editors. Komplikationen und Folgeerkrankungen nach Schlaganfall. Stuttgart and New York: Thieme; 2015. p. 158–61.
  20. Schilling M, Kiefer R, Busse O, Ferbert A, Grond M, Ringelstein EB. Kodierleitfaden Schlaganfall der DSG und DGN 2008; 2008.
  21. KTQ-GmbH. KTQ-Manual/KTQ-Katalog: ab 2009/Version, vol. 2. 2nd ed. Berlin: Fachverlag Matthias Grimm; 2012.
  22. Lindlbauer I, Schreyögg J, Winter V. Changes in technical efficiency after quality management certification: a DEA approach using difference-in-difference estimation with genetic matching in the hospital industry. Eur J Oper Res. 2016;250(3):1026–36.
  24. Schwarz G. Estimating the dimension of a model. Ann Stat. 1978;6(2):461–4.
  25. Douw K, Nielsen CP, Pedersen CR. Centralising acute stroke care and moving care to the community in a Danish health region: challenges in implementing a stroke care reform. Health Policy. 2015;119(8):1005–10.
  26. Ramsay AIG, Morris S, Hoffman A, et al. Effects of centralizing acute stroke services on stroke care provision in two large metropolitan areas in England. Stroke. 2015;46(8):2244–51.
  27. Saposnik G, Baibergenova A, O'Donnell M, Hill M, Kapral M, Hachinski V. Hospital volume and stroke outcome: does it matter? Neurology. 2007;69(11):1142–51.
  28. Walsh K, Gompertz PH, Rudd AG. Stroke care: how do we measure quality? Review. Postgrad Med J. 2002;78:322–6.
  29. AQUA - Institut für angewandte Qualitätsförderung und Forschung im Gesundheitswesen GmbH: Versorgungsqualität bei Schlaganfall: Konzeptskizze für ein Qualitätssicherungsverfahren 2015. _AQUA_Abnahme-Konzeptskizze-Schlaganfall.pdf. Accessed 22 Feb 2015.
  30. Wiedmann S, Heuschmann PU, Hillmann S, et al. Qualität der Behandlung des akuten Schlaganfalls: Auswertung evidenzbasierter Indikatoren von 260 000 Patientendaten. Deutsches Ärzteblatt. 2014;111(45):759–65.
  31. Keyhani S, Cheng E, Arling G, et al. Does the inclusion of stroke severity in a 30-day mortality model change standardized mortality rates at Veterans Affairs hospitals? Circ Cardiovasc Qual Outcomes. 2012;5(4):508–13.
  32. Busse O. Mitteilungen der Deutschen Schlaganfallgesellschaft. Nervenarzt. 2009;6:748.


© The Author(s) 2018