Key Performance Indicators to Benchmark Hospital Information Systems - A Delphi Study

© Schattauer 2009 · Original Articles

      Key Performance Indicators to
      Benchmark Hospital Information
      Systems – A Delphi Study
      G. Hübner-Bloder; E. Ammenwerth
      Institute for Health Information Systems, UMIT – University for Health Sciences, Medical Informatics and Technology,
      Hall in Tyrol, Austria

Keywords
Benchmarking, hospital information systems, medical informatics, Delphi technique, quality indicators, key performance indicators, user satisfaction

Summary
Objectives: To identify the key performance indicators for hospital information systems (HIS) that can be used for HIS benchmarking.
Methods: A Delphi survey with one qualitative and two quantitative rounds. Forty-four HIS experts from health care IT practice and academia participated in all three rounds.
Results: Seventy-seven performance indicators were identified and organized into eight categories: technical quality, software quality, architecture and interface quality, IT vendor quality, IT support and IT department quality, workflow support quality, IT outcome quality, and IT costs. The highest ranked indicators are related to clinical workflow support and user satisfaction. Isolated technical indicators or cost indicators were not seen as useful. The experts favored an interdisciplinary group of all the stakeholders, led by hospital management, to conduct the HIS benchmarking. They proposed benchmarking activities both at regular (annual) intervals and at defined events (for example after an IT introduction). Most of the experts stated that no HIS benchmarking activities are currently being performed in their institutions.
Conclusion: In the context of IT governance, IT benchmarking is gaining importance in the healthcare area. The identified indicators reflect the view of health care IT professionals and researchers. Research is needed to further validate and operationalize key performance indicators, to provide an IT benchmarking framework, and to provide open repositories for a comparison of the HIS benchmarks of different hospitals.

Correspondence to:
Univ.-Prof. Dr. Elske Ammenwerth
Institute for Health Information Systems
Eduard Wallnöfer-Zentrum I
6060 Hall in Tirol
Austria
E-mail: elske.ammenwerth@umit.at

Methods Inf Med 2009; 48: 508–518
doi: 10.3414/ME09-01-0044
received: May 20, 2009
accepted: July 29, 2009
prepublished: November 5, 2009

1. Introduction

Technological and medical evolution and organizational changes affect health care. As a consequence, health care has become increasingly complex, and is today "characterized by more to know, more to do, more to manage, more to watch, and more people involved than ever before" [1].

To manage these increasing requirements, hospitals require an efficient hospital information system (HIS). A HIS should support the information logistics within a hospital, making the appropriate information – the appropriate knowledge – available at the appropriate time, at the appropriate location, to the appropriate individuals, and in an appropriate and usable form [2]. In this sense, a HIS can be defined as the socio-technical subsystem of an enterprise which consists of the information-processing activities and the responsible human and mechanical components in their information-processing roles [3]. The central task of HISs is to support high-quality and cost-effective patient care [4].

As a result of the increasing importance of efficient information logistics, systematic information management, comprising planning, directing and monitoring, becomes a central mission for hospitals [3]. While the planning and directing of information systems seem to be supported by several standards and guidelines (for example [3, 5–7]), the monitoring of information systems is often seen as more complex and not well supported [8, 9]. Monitoring comprises, among others, the definition and application of those criteria that should reflect the quality and efficiency of information logistics. In this sense, monitoring stands in close relation with a systematic benchmarking of HISs.

According to the Joint Commission [10], benchmarking can be defined as the "continuous measurement of a process, product, or service compared to those of the toughest competitor, to those considered industry leaders, or to similar activities in the organization in order to find and implement ways to improve it". The Joint Commission differentiates between internal benchmarking (similar processes within the same organization are compared) and competitive benchmarking (the organization's processes are compared with best practices within the industry).

Benchmarking in general comprises the choice of an appropriate target, the definition of the related performance indicators, and the collection of the relevant data that can then be used for comparison purposes [11].

Several approaches exist to develop performance indicators for hospitals. For example, the Joint Commission's Annual Report on Quality and Safety 2008 provides evidence of U.S. hospitals' performance regarding National Patient Safety Goals [12].


This report describes how well hospitals follow the pre-defined treatment guidelines.

For benchmarking the information systems of a hospital, fewer approaches seem to exist. The information management standards of JCAHO [13] define ten major standards with several sub-standards, focusing on the general issues of information management within a hospital (including, for example, the usage of standardized codes and classifications, the availability of a complete medical record, or access to knowledge resources). These standards, however, do not define objective, quantitative indicators that can be used for benchmarking.

Health IT researchers, therefore, have developed quantitative instruments trying to assess the quality of a HIS, based on user surveys [14, 15]. These instruments, however, do not comprise objective performance indicators.

Other approaches that define IT performance indicators are COBIT (for the planning, acquisition, operation, support and monitoring of IT) [16] and ITIL (for the planning and monitoring of IT service management) [6]. Both provide objective and subjective performance indicators (for example, the percentage of IT projects that can be derived from the IT strategy, the percentage of users satisfied with IT training, the number of user complaints, or the costs of IT non-compliance). These approaches, however, primarily focus on IT operation and IT support quality and not on the overall HIS.

Summarizing, while several established attempts exist to define indicators for the quality and performance of hospitals in general and of the IT management processes, systematic approaches to objectively benchmark the quality of HISs are lacking. Consequently, in most hospitals, no regular HIS monitoring activities based on an objective and quantified assessment of HIS quality are conducted. Overall, it seems necessary to first define what useful HIS performance indicators are before adequate validated methods to measure them can be developed.

The objective of the present paper is to develop a prioritized list of useful performance indicators for HISs, based on a Delphi survey of HIS experts. The study questions were:
● What are and what are not useful performance indicators for HISs?
● Who should carry out HIS benchmarking, and when?
● Are there any HIS benchmarking activities being performed in hospitals?

2. Methods

2.1 The Approach: A Delphi Study

We decided to use the Delphi method to address our study questions. The Delphi method allows for a systematic, interactive, iterative collection of expert opinions [17, 18]. After each round, the experts receive an anonymous summary of all the experts' opinions from the previous round. Afterwards, the participants are encouraged to revise their earlier answers in view of the replies of the other experts. Compared to other methods, such as group discussions or expert interviews, a Delphi study enables the inclusion of a larger number of experts, who are allowed to revise their opinions in view of the other experts' opinions [19]. From the four different types of Delphi studies described by Häder [20], we chose Type 3 (Fig. 1), comprising one qualitative and two quantitative rounds:
1. First qualitative round: A written survey with five open-ended questions was used to collect ideas for useful performance indicators for HIS benchmarking.
2. Second quantitative round: Based on the results of the first round, a standardized questionnaire with closed questions was developed, requesting the experts to rate the importance of each proposed indicator on a 4-point Likert scale [21].
3. Third quantitative round: The results of the second round were sent back to the experts, requesting them to reconsider their voting in light of the opinions of the overall expert panel.

The first round was submitted to the expert panel in September 2006. The following two quantitative rounds started in February 2007 and were finalized in December 2007.

Our Delphi study was conducted based on an online questionnaire (developed with the help of the software 2ask.at).

2.2 Selection of the Appropriate Expert Panel

We established a panel with experts from Austria, Germany and Switzerland. For this, we invited 152 experts, who came either from the university field of medical informatics (typically researchers also involved in the IT management of university hospitals) or from practical hospital-based health care IT (for example, CIOs or heads of hospital IT departments). This combination should help to combine the opinions of experts from both research and practice.

Fig. 1  Steps of the Delphi study


2.3 First Round: Qualitative Survey

The first qualitative round, focusing on "What could be useful performance indicators for HIS?", was used to collect ideas and to obtain a broad range of opinions. In this first round, we contacted 152 experts and posed the following questions to them in free-text form:
1. Please provide at least eight criteria or performance indicators that you find useful to assess a hospital information system (HIS) (for example, criteria referring to satisfaction, IT diffusion, functionality, architecture, interfaces, etc.).
2. Please provide at least three key performance indicators that you do NOT find useful to assess a HIS.
3. Which departments or persons should be responsible – in your opinion – for HIS benchmarking?
4. When should the HIS benchmarking be carried out (for example after certain events, in regular intervals, etc.)?
5. Is there any form of HIS benchmarking in your current organization? If yes, in which form?

The free-text answers to those questions were systematically analyzed by using qualitative content analysis [22]. The answers were coded based on a system of categories. First, we defined the dimension of the categories and the level of abstraction, and we determined the screening criteria for each category. In the next step, we went line-by-line through the material and assigned each text passage to a category (subsumption). If a text passage did not match the established categories, we defined a new category (inductive categorization). After passing 50% of the material, the whole system of categories was revised and adapted with regard to the subject and aims of the survey, before finalizing the analysis. The resulting list of categories reflected the possible performance indicators for HIS. The overall categorization was done by two researchers; any differences in opinion were resolved by discussion.

2.4 Second Round: Quantitative Survey (Part 1)

Based on the results of the first round, a standardized questionnaire was developed that presented the list of proposed indicators for HIS benchmarking. We requested the experts to rate the importance of each indicator, offering them a four-point Likert scale (very important – rather important – rather not important – not important).

2.5 Third Round: Quantitative Survey (Part 2)

The results of the standardized questionnaire were analyzed by using descriptive statistics and then sent back to the experts. The experts were asked to reconsider their choices in light of the aggregated results, using the identical questionnaire as in the second round. The descriptive data analysis for the second and third rounds was realized with SPSS 13.0.

3. Results

3.1 First Round: Qualitative Survey

From the 152 invited experts, 35 responded (response rate: 23%). The distribution of the experts is shown in Table 1.

From those 35 experts, we received more than 400 proposals for performance indicators for HIS benchmarking. By using qualitative content analysis, we aggregated these expert answers into 77 items, organized into eight categories: technical quality, software quality, architecture and interface quality, IT vendor quality, IT support and IT department quality, workflow support quality, IT outcome quality, and IT costs (for details of the items in each category see Appendix).

Experts also provided 74 comments on the indicators that they did not find useful. The most significant comments were as follows (in brackets the number of experts commenting on the given topic, in descending order):
1. Acquisition costs for hardware and software, the IT budget or IT costs per user are not useful indicators if they are not related to the outcome obtained (n = 10).
2. The availability of specific types of computer systems, operating systems or database systems is not useful as an indicator (n = 9).
3. The number of computer systems or the number of IT staff members as the only indicator is not useful (n = 6).
4. Only a combination of indicators and a combination of points of view (for example, different user groups) can provide a good picture of HIS quality; individual indicators may be misleading (n = 5).
5. The usage of technological buzzwords such as "open", "component-based", "SOA" or "workflow engine" does not say much about the efficiency of patient care and is, therefore, not useful (n = 4).
6. The overall HIS architecture (monolithic, distributed, etc.) and the software architecture (3-tier architecture, etc.) are not good criteria, as "there are good and bad HISs, independent of the architecture" (n = 4).
7. Finally, the popularity of an IT product or the number of installations of a given IT software product are also not good indicators of HIS quality (n = 4).

Table 1  Number of experts participating in each of the three rounds of the Delphi study, and their affiliations

Affiliation of the participants     First round   Second round   Third round
Hospital/health organization        20 (57.1%)    32 (53.3%)     22 (50%)
Research organizations/university   13 (37.1%)    25 (41.7%)     20 (45.5%)
Industry/consulting companies        2 (5.7%)      3 (5%)         2 (4.5%)
Sum                                 35 (100%)     60 (100%)      44 (100%)

3.2 Second Round: Quantitative Survey (Part 1)

In the second round, we invited 159 experts (108 experts from hospitals/health organizations and industry/consulting companies, and 51 experts from research organizations or universities). We invited the entire expert panel of the first round (n = 152) and seven additional

experts. Sixty experts answered in this second round (rate of return: 37.7%). The distribution of the experts is shown in Table 1.

The questionnaire that the experts received contained 77 questions, reflecting the 77 performance indicators identified in the first qualitative round (see Appendix 1 for the complete list of questions).

3.3 Third Round: Quantitative Survey (Part 2)

In this last round, we invited the 60 experts who had participated in the second round, of whom 44 responded (rate of return: 73.3%). The distribution of the experts is shown in Table 1. The questionnaire contained the same 77 questions as in the second round (Appendix).

Figure 2 shows the results for the 15 indicators judged as "very important" by at least 70% of the participants in this round. The detailed results for all 77 indicators are shown in the Appendix. The overall inter-rater reliability for all items is 0.35, calculated based on the formula provided by Gwet [23].

3.4 Who Should Perform Benchmarking, and When?

We asked the experts which departments or persons should be responsible for HIS benchmarking. From the 35 participants, we received 35 open-ended answers that we aggregated. Table 2 shows the results.

We also asked "When should the HIS benchmarking be carried out?". Here, we received 31 answers (Table 3).

3.5 Do Hospitals Have HIS Benchmarking?

As a final question, we asked whether the experts have any form of HIS benchmarking in their respective institutions. Here, we received answers from 31 experts (Table 4).

4. Discussion

The aim of this Delphi study was to develop a comprehensive set of quantitative performance indicators for benchmarking hospital information systems.

Fig. 2  Results for the 15 performance indicators that more than 70% of the 44 experts judged as "very important" (results of the third round of the Delphi study)
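The third-round analysis reported above reduces to two computations per indicator: the median rating and the share of "very important" votes (the quantity plotted in Fig. 2), plus one overall agreement statistic, Gwet's first-order agreement coefficient AC1 [23]. The study performed this analysis with SPSS 13.0; purely as an illustrative sketch, the same quantities could be computed as below. The ratings shown are hypothetical placeholder data, not the study's data, and the 1–4 coding of the Likert scale is an assumption for the example.

```python
from collections import Counter
import statistics

# Assumed coding of the paper's 4-point scale:
# 1 = "very important" ... 4 = "not important"
CATEGORIES = [1, 2, 3, 4]

def summarize(ratings):
    """Median rating and share of 'very important' votes for one indicator."""
    counts = Counter(ratings)
    return {
        "median": statistics.median(ratings),
        "pct_very_important": 100.0 * counts[1] / len(ratings),
    }

def gwet_ac1(items, categories=CATEGORIES):
    """Gwet's first-order agreement coefficient AC1 for multiple raters.

    items: list of indicators, each a list of ratings (one per expert).
    """
    n, q = len(items), len(categories)
    pa = 0.0                              # observed pairwise agreement
    pi = dict.fromkeys(categories, 0.0)   # summed category proportions
    for ratings in items:
        r = len(ratings)
        counts = Counter(ratings)
        # fraction of agreeing rater pairs on this item
        pa += sum(c * (c - 1) for c in counts.values()) / (r * (r - 1))
        for k in categories:
            pi[k] += counts[k] / r
    pa /= n
    # chance agreement under Gwet's AC1 model
    pe = sum((pi[k] / n) * (1 - pi[k] / n) for k in categories) / (q - 1)
    return (pa - pe) / (1 - pe)

# hypothetical votes of 10 experts on 3 indicators
items = [
    [1, 1, 1, 2, 1, 1, 2, 1, 1, 3],
    [2, 2, 1, 2, 3, 2, 2, 4, 2, 2],
    [1, 1, 1, 1, 2, 1, 1, 1, 1, 1],
]
print(summarize(items[0]))      # median 1.0, 70% "very important"
print(round(gwet_ac1(items), 2))  # → 0.49
```

Unlike Cohen's or Fleiss' kappa, AC1 stays stable when one category dominates the ratings, which is why it is a common choice for expert panels that largely agree.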


4.1 Answers to the Study Questions

4.1.1 What Are Useful HIS Performance Indicators?

Our study developed a list of 77 performance indicators that are suitable for HIS benchmarking. Of those 77 indicators, 15 were regarded by more than 70% of the experts as "very important" (Fig. 2). Of those 15 most important items, only three are related to technical issues, while five are related to clinical workflow support, three to IT management issues, and two to user satisfaction and user interface design (Fig. 2). HIS quality thus seems to be understood in a strongly user- and workflow-oriented way. This is supported by the finding that the item with the highest support by the experts was "user satisfaction with HIS systems".

Key performance indicators need to be operationalized to be of use. When looking at the list of indicators (Appendix), the majority seem fairly well quantifiable (for example, indicators focusing on time, number, effort or costs). However, the indicators for workflow support quality may be much more difficult to quantify (this may explain why, despite the high importance of this issue, no established benchmarking frameworks for health IT exist).

The opinions of the experts were quite stable between the second and third rounds. Only for two items was the median changed (item H9 "Costs of clinical documentation" changed from rather important to rather not

agreement of both groups may reflect that all were health IT specialists, either from practice or academia. Other groups such as users and hospital administration may have a completely different view. Their view, however, has not been assessed in this study.

4.1.2 What Are Not Useful HIS Performance Indicators?

The experts stated that individual IT cost indicators, indicators on the number or type of hardware or software systems, buzzword-oriented indicators ("SOA") or indicators on the type of HIS architecture are not useful. These indicators seem not to provide sufficient information on the performance of a HIS and should therefore only be used in combination with other indicators.

Table 2  Answers to the question: Who should be responsible for HIS benchmarking? (n = 35 experts)

                                                                                    n        %
Only CIO or IT department                                                           9    25.7%
Only hospital management, controlling or quality management                         8    22.9%
Only user representatives (for example a physician or a ward manager)
as the process owner                                                                2     5.7%
Interdisciplinary group of users (physicians, nurses, administrative staff)
together with the management and IT staff                                          12    34.3%
External, independent persons (for example consultants, universities)
as supporters                                                                       4    11.4%
Total                                                                              35     100%

4.1.3 Who Should Carry out HIS Benchmarking?

The experts had different opinions (Table 2): one-third of the experts favored an interdisciplinary group of representatives of all the relevant professions (IT users, IT department, and hospital management). One-fourth of the experts favored either the IT department or the hospital management as responsible for HIS benchmarking. The advantage of an interdisciplinary group is that the different points of view can be integrated when developing benchmarking criteria – the need to combine indicators that reflect different points of view was mentioned in several comments. On the other hand, the IT department has the best access to the needed data but is not independent, so the benchmarking outcome may be biased. Therefore, some experts favored making the hospital management responsible, or requested support from external, independent experts. Summarizing, an interdisciplinary group of all stakeholders, led by hospital management and supported by external experts, may be an appropriate way to organize benchmarking.

Table 3  Answers to the question: "When should the HIS benchmarking be carried out?"

4.1.4 When Should the
      important, and item D2 “Participation in          (n = 31 experts)                                   HIS Benchmarking Be Performed?
      standard setting bodies” changed from rather
      important to rather not important).                                                  n    %          Experts stated that both regular benchmark-
          Our group of experts comprised both re-       Quarterly                           3       9.7%   ing as well as benchmarking after defined
      searchers as well as hospital IT staff members.                                                      events (for example introduction of a new IT
                                                        Annually                           12   38.7%
      A subgroup analysis revealed largely com-                                                            system, larger updates, or new legal regu-
      parable opinions on the significance of all in-   Any 2–3 years                       4   12.9%      lations) are necessary (씰Table 3). For regular
      dicators between those two groups, with three     At defined events (HIS updates,     5   16.1%      benchmarking, most experts favored annual
      exceptions: The hospital IT experts judged        changes in HIS architectures,                      benchmarking; probably as the effort for
      the two performance indicators C2 “Number         organizational changes)                            shorter periods seems too high. Two experts
      of interfaces” and E11 “Training effort per       Regularly (for example yearly)      5   16.1%      proposed to combine shorter reports on a
      user” higher than the experts from research.      and also at defined events (for                    quarterly basis with more detailed annual re-
      This may reflect the fact that interface man-     example after IT introduction)                     ports. Some comments indicated that it is
      agement and training organization make up a                                                          helpful when the data for HIS benchmarking
                                                        Short reports quarterly, more       2       6.5%
      larger part of their work. On the other side,     detailed (strategic) reports                       can be derived automatically, for example by
      the researchers judged the indicator G1 “Pa-      annually                                           using data warehousing approaches.
      tient satisfaction with patient care” higher
                                                        Total                              31 100%
      than the hospital IT experts. The overall
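The automatic derivation of benchmarking data mentioned in the comments can be illustrated with a minimal sketch: a single SQL query over a hypothetical data-warehouse table yields one of the indicators the experts named, the number of discharge letters per day. The table and column names are invented for illustration and do not come from the study.

```python
import sqlite3

# Hypothetical mini-schema: a "discharge_letter" fact table as it might
# appear in a hospital data warehouse (names invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE discharge_letter (letter_id INTEGER, sent_date TEXT)")
conn.executemany(
    "INSERT INTO discharge_letter VALUES (?, ?)",
    [(1, "2009-06-01"), (2, "2009-06-01"), (3, "2009-06-02")],
)

# KPI: number of discharge letters per day, derived automatically
# from the warehouse instead of being counted by hand.
rows = conn.execute(
    "SELECT sent_date, COUNT(*) FROM discharge_letter "
    "GROUP BY sent_date ORDER BY sent_date"
).fetchall()
print(rows)  # [('2009-06-01', 2), ('2009-06-02', 1)]
```

In a real setting the same query would run against the warehouse's staging tables on a schedule, so that a quarterly or annual report can be generated without manual data collection.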

      Methods Inf Med 6/2009                                                                                                          © Schattauer 2009
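The round-to-round stability check used in this discussion – comparing each item's median importance rating between the second and third Delphi round – amounts to a few lines of arithmetic. The ratings below are invented for illustration (1 = not important to 4 = very important); they are not the study data.

```python
from statistics import median

# Invented expert ratings per indicator for two quantitative rounds.
round2 = {"H9": [3, 3, 2, 3, 2], "C2": [4, 3, 4, 4, 3]}
round3 = {"H9": [2, 2, 2, 3, 2], "C2": [4, 4, 3, 4, 4]}

# An item counts as "stable" if its median rating did not change
# between the two rounds; here only the median of H9 shifts.
changed = [item for item in round2
           if median(round2[item]) != median(round3[item])]
print(changed)  # ['H9']
```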

4.1.5 Do Hospitals Have HIS Benchmarking?

Two-thirds of the experts stated that no systematic benchmarking is carried out in their institutions, or that it is only performed in informal ways (▶Table 4). Seven experts stated that benchmarking is performed at least partly. Indicators that are already in use comprise IT usage, IT coverage, data quality, user satisfaction, or the number of discharge letters or diagnoses per day. Some experts also mentioned dedicated evaluation or impact studies. Here, however, it should be noted that such studies would typically not be considered benchmarking, as benchmarking comprises the regular assessment of a pre-defined set of standardized indicators.

Table 4  Answers to the question: "Is there any form of HIS benchmarking in your organization?" (n = 31 experts)

  No: 16 (51.6%)
  Planned or in development: 4 (12.9%)
  Not in a systematic way, but for example by informal user feedback or informal user interviews: 4 (12.9%)
  Yes, or yes partly: 7 (22.6%), with the following examples:
  ● Since 2001, comprehensive benchmarking according to defined criteria, results available to all users
  ● Yearly report on usage, coverage, data quality, plus regular user satisfaction and impact studies
  ● During system acquisition according to defined user criteria
  ● Number of discharge letters/day, number of diagnoses/day, number of appointments/day
  ● User survey by external company
  ● Regular reports in IT project steering committees and IT strategy committees
  ● During system acquisition, use of a requirement catalog
  Total: 31 (100%)

4.2 Strengths and Weaknesses of the Study

For the development of performance indicators, we used a Type 3 Delphi study, combining one qualitative and two quantitative rounds. The qualitative round helped to explore first ideas on the indicators, while the quantitative surveys served to obtain a quantitative opinion from the expert panel. This combination of qualitative and quantitative methods is an example of multi-method triangulation, where the qualitative part aims at identifying the relevant variables, which are then thoroughly studied in the quantitative part [24].

Forty-four experts participated in all three rounds of our study. Reliable outcomes of a Delphi study can already be obtained with an even smaller number of participants, as for example Akins [25] showed; therefore, this number seems adequate to attain valid conclusions. However, a formal validation of the completeness of the indicator list was not performed.

The expert panel consisted of experts from academia and hospital IT practice. Most participants held leading positions within their institutions and had extensive experience in health IT (for example professor of health informatics, CIO, head of IT department, IT project manager). Around half of the participants from academia also had responsibilities in the IT management of their local university hospitals (for example as a member of the IT strategic committee); thus this expert panel represented experts with strong practical health IT experience. We feel that this combination is an adequate representation of the needs and requirements of hospital IT with regard to HIS benchmarking.

Different groups (such as IT experts, hospital administration, clinicians, patients) may have different perceptions of useful HIS performance indicators. We did not include a larger group of user representatives (for example physicians, nurses, or administrative staff). Users may focus on different aspects; for example, they may concentrate on the quality of IT support for their respective tasks. A Delphi study with those groups would therefore probably have found a stronger emphasis on indicators from categories F (workflow support) and G (IT outcome quality), and less emphasis on technical issues such as HIS architecture.

Our panel consisted of experts from Austria, Germany and Switzerland, with an over-representation of Austrian participants (of the 44 participants in the third round, 21 were from Austria, 17 from Germany, and 6 from Switzerland). These countries have several similarities with regard to language, culture and the organization of health care. The results may therefore not be transferable to countries with different organizational or cultural systems.

The return rates were satisfactory, with 23% in the first, qualitative round, 37.7% in the second round and 73.3% in the third round. Those experts who had already participated in the first round were thus sufficiently motivated to participate in further rounds.

One limitation of our study is that the understanding of the term "HIS benchmarking" could vary between the experts. First, a "HIS" can be understood as the overall information-processing subsystem of a hospital (i.e. including paper-based tools and the workflow), as only the computer-based software systems, or as only the clinical systems. In addition, the term "benchmarking" may be understood as a regular assessment of quantitative performance criteria, but also as a synonym for "IT evaluation", which would include single, dedicated IT impact and evaluation studies. This different understanding of those terms is partly reflected in the answers and comments of the experts. While we attempted to describe all the performance indicators in an unambiguous way, this may not have been successful for the entire list. This partly reflects the ongoing discussion of clear definitions of the major health informatics terms (see for example [26]).

4.3 Results in Relation to Other Studies

Many approaches for IT benchmarking exist, which often focus on cost issues [27]. Our study shows that costs are just one important issue among several other aspects such as workflow support, user satisfaction or outcome quality. This supports the notion of IT governance with IT as a business enabler – and not only as a cost driver – also for health care organizations [28]. In fact, the most important quality indicator that our experts defined was user satisfaction. This is supported by the vast literature on this issue, stating in turn that low user acceptance can lead to user frustration and project failure [29–32]. While COBIT [27] and ITIL [6] assess processes and systems from the point of view of IT, our results highlight the users' point of view. For example, COBIT defines around 340 performance indicators for IT management, concentrating on aspects such as the quality of IT service and IT support. These primarily correspond to our categories A (technical quality) and E (quality of IT support and IT department), and do not cover, for example, outcome quality or quality of workflow support. This also means that, in turn, important technical issues such as maintainability and expandability of the IT systems are not reflected in our results.

Also focusing on the users' point of view, Otieno et al. [11] and Ribière [33] developed and validated survey-based instruments for benchmarking HIS quality. Both groups based their primary selection of items on a systematic literature survey. We chose another approach, collecting and prioritizing the opinions of experts directly working in this field through a systematic Delphi approach.

Other authors have presented benchmarking approaches focusing on objective HIS data. For example, Dugas et al. [34] presented a benchmarking system focusing on the number of discharge letters per month, the number of appointments made per day, etc. These numbers are easy to gather and aggregate. In our survey, however, those data were not among the most important items, probably because those indicators alone do not reflect HIS quality very well. Müller and Winter [8] presented a project in which indicators focusing on system usage were extracted quarterly and presented to the hospital staff. Aspects other than system usage were not covered.

We structured our items according to a qualitatively developed system of eight categories. If we understand the hospital information system as a service of an IT department that is delivered to the users as customers, we could also adopt the Balanced Scorecard (BSC) [35] approach to structure the items. From this point of view, our 77 indicators relate to all four BSC dimensions: the internal business process perspective (most of categories A, B, C, D and E), the customer perspective (most of categories F and G), the financial perspective (category H), and the learning-and-growth perspective (partly reflected in C and E, for example the flexibility of the overall IT architecture, or the qualifications of the IT staff).

James Martin [36] identified four major levels of business processes: the operational level, the monitoring and control level, where the correct running of processes is monitored, the planning and analysis level, where correct processes are defined, and the strategic level. In this pyramid, benchmarking links the strategic level to the planning and analysis level by helping to transform strategic objectives into operation. Benchmarking is thus an activity to be conducted at regular, though larger intervals (for example quarterly or yearly), which corresponds to our experts' opinion.

4.4 Meaning and Generalizability of the Study

Our results present, to our knowledge, the first attempt to systematically develop HIS performance indicators by a Delphi approach. The list of indicators was quite stable in our Delphi study, which reflects the many different issues (technology, data quality, workflow support, IT management) that should be tackled by HIS benchmarking projects. Obviously, while the list may not be exhaustive, it seems infeasible for a hospital to include all of those indicators in a HIS benchmarking project. Instead, a thorough selection based on the benchmarking objectives, taking into account both the feasibility and usefulness of the chosen indicators, may be carried out in an interdisciplinary group.

4.5 Unanswered and New Questions

The developed list of potential HIS benchmarking criteria can be seen as a starting point for the development of a HIS benchmarking framework. This framework should provide performance indicators for different points of view (such as IT management, users, or patients). What has to be done now is to complete the list by adding the views of other groups. Then, the most important indicators should be selected, and clear definitions, operationalizations, target values, and adequate instruments should be developed [27]. For certain indicators, instruments have already been developed, such as for user satisfaction [14, 33, 37], usage of systems [38], usability evaluation [39], quality of HIS architecture [40], quality of IT management [27], and composite approaches [11]. Based on the defined indicators, a health IT benchmarking framework may then be developed.

Benchmarking does not only comprise the definition and measurement of indicators, but also a comparison with the best competitor. Therefore, open repositories need to be developed to gather HIS benchmarking data from various hospitals and to provide an (anonymous) comparison with other hospitals. This would allow IT management to assess the strengths and weaknesses of its respective hospital information system, and help to systematically improve it.

5. Conclusion

Albert Einstein is often quoted as saying "not everything that can be measured is important, and not everything that is important can be measured" (see for example [41]). HIS management is thus advised to identify a subset of items that seem to best represent local HIS performance. In addition, besides pure quantitative benchmarking, further important insight into HIS quality may be achieved from more qualitative approaches, which help to complement the picture, to explain the quantitative findings, and to propose improvements [24]. Given this constraint, HIS benchmarking can be seen as one important contribution to IT governance, in turn helping to professionally manage and steadily improve hospital information systems.

Acknowledgment
We thank all experts for participating in the Delphi study, and Roland Blomer for his critical comments on an earlier version of the paper.


References

1. Committee on Quality Health Care in America – Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington: National Academy Press; 2001. http://www.nap.edu/books/0309072808/html/
2. Haux R, Winter A, Ammenwerth E, Brigl B. Strategic Information Management in Hospitals – An Introduction to Hospital Information Systems. New York, Berlin, Heidelberg: Springer-Verlag; 2004.
3. Winter A, Ammenwerth E, Bott O, Brigl B, Buchauer A, Gräber S, et al. Strategic Information Management Plans: The Basis for Systematic Information Management in Hospitals. Int J Med Inform 2001; 64 (2–3): 99–109.
4. Kuhn K, Guise D. From Hospital Information Systems to Health Information Systems – Problems, Challenges, Perspectives. Methods Inf Med 2000; 40: 275–286.
5. Ward J, Griffiths P. Strategic Planning for Information Systems. 2nd ed. Chichester: John Wiley & Sons; 1996.
6. ITIL. Information Technology Infrastructure Library. 2009 (cited May 2009). Available from: http://www.itil-officialsite.com/home/home.asp
7. Köhler P. Prince 2. Heidelberg: Springer; 2006.
8. Müller U, Winter A. A monitoring infrastructure for supporting strategic information management in hospitals based on key performance indicators. In: Hasman A, Haux R, van der Lei J, De Clercq E, France F, editors. Ubiquity: Technologies for Better Health in Aging Societies. Proceedings of MIE2006. Studies in Health Technology and Informatics vol. 124. Amsterdam: IOS Press; 2006. pp 328–332.
9. Friedman C, Wyatt JC. Evaluation Methods in Medical Informatics. 2nd ed. New York: Springer; 2006.
10. JCAHO. Sentinel Event Glossary of Terms. 2009 (cited May 2009). Available from: http://www.jointcommission.org/SentinelEvents/se_glossary.htm
11. Otieno GO, Hinako T, Motohiro A, Daisuke K, Keiko N. Measuring effectiveness of electronic medical records systems: towards building a composite index for benchmarking hospitals. Int J Med Inform 2008; 77 (10): 657–669. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=18313352
12. JCAHO. Improving America's Hospitals: The Joint Commission's Annual Report on Quality and Safety 2008. 2008 (cited May 2009). Available from: http://www.jointcommissionreport.org/introduction/introduction.aspx
13. JCAHO. Joint commission for accreditation for healthcare organizations (JCAHO): Information […]
14. […] Satisfaction Assessment Tool. In: El-Rewini H, editor. Proc 32nd Hawaii International Conference on System Sciences (HICSS-32), Maui, Hawaii, January 5–8, 1999. IEEE Computer Society Press; 1999. pp 1–9.
15. Ammenwerth E, Ehlers F, Hirsch B, Gratl G. HIS-Monitor: an approach to assess the quality of information processing in hospitals. Int J Med Inform 2007; 76 (2–3): 216–225. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=16777476
16. ISACA. Information Systems Audit and Control Association (ISACA): Control Objectives for Information and Related Technology (COBIT). 2004 (cited May 2009). Available from: http://www.isaca.org/cobit.htm
17. Linstone HA, Turoff M. The Delphi Method: Techniques and Applications. 2002 (cited May 2009). Available from: http://is.njit.edu/pubs/delphibook/
18. Dalkey NC. The Delphi Method: An Experimental Study of Group Opinion. 1969 (cited May 2009). Available from: http://www.rand.org/pubs/research_memoranda/RM5888/
19. Green KC, Armstrong JS, Graefe A. Methods to Elicit Forecasts from Groups: Delphi and Prediction Markets Compared. Foresight: The International Journal of Applied Forecasting 2007 (issue 8, Fall 2007): 17–21.
20. Häder M. Delphi Befragungen – Ein Arbeitsbuch (Delphi surveys). Wiesbaden, Germany: Westdeutscher Verlag GmbH; 2002.
21. Bortz J, Döring N. Forschungsmethoden und Evaluation für Human- und Sozialwissenschaftler (Research methods and evaluation). 3rd ed. Berlin: Springer; 2002.
22. Dey I. Qualitative Data Analysis. A User-Friendly Guide for Social Scientists. London, Great Britain: Routledge; 1993.
23. Gwet K. Handbook of Inter-Rater Reliability. Gaithersburg, MD: STATAXIS Publishing Company; 2001.
24. Barbour RS. The case for combining qualitative and quantitative approaches in health services research. J Health Serv Res Policy 1999; 4 (1): 39–43.
25. Akins RB, Tolson H, Cole BR. Stability of response characteristics of a Delphi panel: application of bootstrap data expansion. BMC Med Res Methodol 2005; 5: 37. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=16321161
26. Hersh WR. A Stimulus to Define Informatics and Health Information Technology. BMC Med Inform Decis Mak 2009; 9: 24. doi: 10.1186/1472-6947-9-24. http://www.biomedcentral.com/1472-6947/9/24
27. Kütz M. Kennzahlen in der IT – Werkzeuge für Controlling und Management (IT performance indicators). 3rd ed. Heidelberg, Germany: dpunkt.ver- […]
28. […] Preview. Journal of Health Care Compliance 2005; 7 (6): 27–30.
29. Dewan N, Lorenzi N. Behavioral Health Information Systems: Evaluating Readiness and User Acceptance. MD Computing 2000; 17 (4): 50–52.
30. Brender J, Ammenwerth E, Nykanen P, Talmon J. Factors influencing success and failure of health informatics systems – a pilot Delphi study. Methods Inf Med 2006; 45 (1): 125–136. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=16482383
31. Beynon-Davies P, Lloyd-Williams M. When health information systems fail. Topics Health Inform Manage 1999; 20 (1): 66–79.
32. Southon G, Sauer C, Dampney K. Lessons from a failed information systems initiative: issues for complex organisations. Int J Med Inform 1999; 55 (1): 33–46.
33. Ribière V, La Salle A, Khorramshahgol R. Measuring Customer Satisfaction: A Case Study in Hospital Information Systems Evaluation. In: The First World Customer Service Congress, Oct 29–31, 1997; Tysons Corner, VA; 1997.
34. Dugas M, Eckholt M, Bunzemeier H. Benchmarking of hospital information systems: monitoring of discharge letters and scheduling can reveal heterogeneities and time trends. BMC Med Inform Decis Mak 2008; 8: 15. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=18423046
35. Kaplan R, Norton D. The Balanced Scorecard – Measures that Drive Performance. Harvard Business Review 1992; January–February: 71–79.
36. Martin J. Strategic Information Planning Methodologies. 2nd ed. Englewood Cliffs: Prentice Hall; 1989.
37. Ohmann C, Boy O, Yang Q. A systematic approach to the assessment of user satisfaction with health care systems: constructs, models and instruments. In: Pappas C, editor. Medical Informatics Europe '97. Conference proceedings. Amsterdam: IOS Press; 1997. pp 781–785.
38. Laerum H, Ellingsen G, Faxvaag A. Doctors' use of electronic medical record systems in hospitals: cross sectional survey. BMJ 2001; 323 (7325): 1344–1348. http://www.pubmedcentral.nih.gov/tocrender.fcgi?action=cited&artid=61372
39. Nielsen J. Usability Engineering. New York: Academic Press; 1993.
40. Brigl B, Huebner-Bloder G, Wendt T, Haux R, Winter A. Architectural quality criteria for hospital information systems. AMIA Annu Symp Proc 2005. pp 81–85. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Citation&list_uids=16779006
41. Protti D. A proposal to use a balanced scorecard to evaluate Information for Health: an information […]
    management standards. 2009 (cited May 2009).
                                                               lag GmbH; 2009.                                              strategy for the modern NHS (1998–2005). Com-
    Available from: http://www.jcaho.org
                                                           28. Lutchen M, Collins A. IT Governance in a Health              put Biol Med 2002; 32 (3): 221–236.
14. Ribière V, LaSalle A, Khorramshahgool R, Gousty Y.         Care Setting: Reinventing the Health Care Industry.
    Hospital Information Systems Quality: A Customer

© Schattauer 2009                                                                                                                                 Methods Inf Med 6/2009
516   G. Hübner-Bloder; E. Ammenwerth: Key Performance Indicators to Benchmark Hospital Information Systems

      Appendix
      Importance of the HIS performance indicators, as judged by 44 experts (third round of the Delphi study). Median values are highlighted in grey.

      Cat.     Item                                                                           N valid Very      Rather    Rather not Not
                                                                                                      important important important important
                                                                                                            (Valid %) (Valid %) (Valid %)  (Valid %)
      A        Technical Quality
      A1       Availability of the HIS systems (e.g. down times per year).                    42       40 (95.2%)      2 (4.8%)    –               –
      A2       Performance of the HIS systems (amount of data that is processed within a      43       21 (48.8%)     21 (48.8%)       1 (2.3%)    –
               given time period)
      A3       Response time of the HIS systems (time period between user action and sys- 43           37 (86.0%)      6 (14.0%) –                 –
               tem reaction)
      A4       Duration of user authentication (time until the functions are available)       42       27 (64.3%)     13 (31%)       2 (4.8%)      –
      A5       Number of new hardware acquisitions per year                                   42         1 (2.4%)     10 (23.8%) 27 (64.3%)        4 (9.5%)
      A6       Hardware equipment (e.g. sufficient number, sufficient performance)            43         8 (18.6%)    31 (72.1%)       4 (9.3%)    –
      A7       Data loss rate and the restore time of the HIS systems per year                42       27 (64.3%)     13 (31%)         2 (4.8%)    –
      A8       Independence and mobility of the tools for data entry and information re-      42         5 (11.9%)    24 (57.1%) 12 (28.6%)            1 (2.4%)
               trieval (e.g. notebook, tablet PC, PDA, etc.)
      B        Software Quality
      B1       Functional coverage of the HIS software                                        43       11 (25.6%)     29 (67.4%)       3 (7.0%)    –
      B2       Support of legal guidelines by the software (e.g. ICD 10, DRG, data trans-     42       37 (88.1%)      5 (11.9%) –                 –
               mission laws)
      B3       Ergonomics and uniformity of the user interface of the HIS systems as well     42       31 (73.8%)     10 (23.8%)       1 (2.4%)    –
               as intuitive usage
      B4       Time needed for standard functions (e.g. patient admission) – how many         42       25 (59.5%)     16 (38.1%)       1 (2.4%)    –
               “clicks” are necessary
      B5       Possibility to adapt software to the local conditions, also by the customer    43       25 (58.1%)     15 (34.9%)       3 (7.0%)    –
               (e.g. mean effort for the initial adaptation)
      B6       Level of the maturity of the software, as indicated by the number of service   42       14 (33.3%)     27 (64.3%) –                   1 (2.4%)
               calls of the IT department to the HIS vendor
      B7       Effort for updates/upgrades of the software (e.g. duration, instability)       43         9 (20.9%)    23 (53.5%) 11 (25.6%)        –
      B8       Support of the market standards (e.g. standards of development and data-       42         9 (21.4%)    24 (57.1%)       8 (19%)         1 (2.4%)
               base system, operating systems, client software)
      C        Architecture and Interface Quality
      C1       Homogeneity and heterogeneity of the HIS systems                               44       13 (29.5%)     24 (54.5%)       7 (15.9%)   –
      C2       Number of interfaces between the HIS systems                                   43         2 (4.7%)     30 (69.8%) 10 (23.3%)            1 (2.3%)
      C3       The relation of HIS systems connected by interfaces to those without inter-    42         7 (16.7%)    23 (54.8%) 10 (23.8%)            2 (4.8%)
               faces
      C4       Support of interface standards (e.g. HL7, DICOM)                               42       33 (78.6%)      8 (19%)         1 (2.4%)    –
      C5       Number of clinical departments that use their own subsystem for documentation 42        4 (9.5%)     23 (54.8%) 12 (28.6%)            3 (7.1%)
      C6       Time effort and costs when connecting subsystems that have standard inter- 43           12 (27.9%)     26 (60.5%)       5 (11.6%)   –
               faces
      C7       Number of double interfaces (e.g. one message is sent directly to two appli- 41           5 (12.2%)    18 (43.9%) 15 (36.6%)            3 (7.3%)
               cation systems)
      C8       Number of external interfaces, to illustrate the support of co-operative pa-   42         6 (14.3%)    24 (57.1%) 11 (26.2%)            1 (2.4%)
               tient care (e.g. query of medical patient documents by other health care in-
               stitutions)

C9      Level of service orientation of the architecture (IT Infrastructure         44        13 (29.5%)    22 (50.0%)       8 (18.2%)       1 (2.3%)
        aligned to business processes)
C 10    Compatibility of the whole IT infrastructure (e.g. operating systems,       43        17 (39.5%)    25 (58.1%)       1 (2.3%)    –
        application systems)
D       IT Vendor Quality
D1      References of the HIS vendor                                                40         6 (15%)      31 (77.5%)       3 (7.5%)    –
D2      Participation of the HIS vendor in standard setting bodies (e.g. HL7)       40         5 (12.5%)    14 (35%)     18 (45%)            3 (7.5%)
D3      Sustainability of the HIS vendor (e.g. assured further development)         40        30 (75%)       9 (22.5%)       1 (2.5%)    –
D4      Implementation and operation support and a good update and bug              40        27 (67.5%)    13 (32.5%)   –               –
        fixing management by the HIS vendor
D5      Preparation of HIS handbooks and HIS trainings by the HIS vendor            40         4 (10%)      32 (80%)         4 (10%)     –
D6      Sufficient qualified staff at the HIS vendor (for development, support      40        29 (72.5%)    11 (27.5%)   -               –
        and adaptation)
E       IT Support and IT Department Quality
E1      Number of IT staff in relation to the number of users, beds, outpatient     43        12 (27.9%)    24 (55.8%)       7 (16.3%)   –
        cases and workstations
E2      Qualification of the staff in the IT department                             43        26 (60.5%)    17 (39.5%)   –               –
E3      Availability of process definitions in the IT department for error man-     41         8 (19.5%)    27 (65.9%)       6 (14.6%)   –
        agement, updates, documentation, etc.
E4      Availability and quality of an IT failure and emergency management          43        39 (90.7%)     4 (9.3%)    –               –
        concept
E5      Availability and quality of a data protection and authorisation concept 43            38 (88.4%)     5 (11.6%)   –               –
E6      Number of data protection violations per year                               42        11 (26.2%)    25 (59.5%)       6 (14.3%)   –
E7      Number of hotline calls per user and the mean duration of incident          43         7 (16.3%)    33 (76.7%)       3 (7%)      –
        and problem solving
E8      Number of calls that do not come in through the hotline or first-level      42         6 (14.3%)    23 (54.8%)   11 (26.2%)          2 (4.8%)
        support
E9      Number of calls that are successfully solved within a set timeframe         42        13 (31%)      23 (54.8%)       5 (11.9%)       1 (2.4%)
        (e.g. 6 hours)
E 10    Overall number of HIS users/user groups and number of new HIS users per     42         5 (11.9%)    25 (59.5%)   11 (26.2%)          1 (2.4%)
        year that must be supported
E 11    Training effort per user.                                                   43         8 (18.6%)    29 (67.4%)       6 (14.0%)   -
E 12    Number of successfully completed HIS projects                               41         6 (14.6%)    28 (68.3%)       7 (17.1%)   -
E 13    Percentage of discontinued IT projects in relation to all the IT projects   41         3 (7.3%)     22 (53.7%)   13 (31.7%)          3 (7.3%)
E 14    Fulfilment of service levels within the service level agreements (SLA)      39         4 (10.3%)    25 (64.1%)   10 (25.6%)      -
F       Workflow Support Quality
F1      Number of departments, users, professional groups that use the HIS          43        13 (30.2%)    26 (60.5%)       4 (9.3%)    -
        systems routinely, and the respective frequency of use
F2      User satisfaction of different user groups with the HIS systems             44        42 (95.5 %)    2 (4.5 %)   -               -
F3      Functional range of the HIS systems (the necessary functions for            43        35 (81.4%)     8 (18.6%)   -               -
        administration and patient care are available)
F4      Coverage of the functionality desired by the users                          44        24 (54.5%)    20 (45.5%)   -               -

      F5      Level of information of the user about the provided functionality of               44     11 (25.0%)   32 (72.7%)    1 (2.3%)    –
              the HIS systems
      F6      IT sophistication with regard to the functions (relation of the IT                 43     15 (34.9%)   25 (58.1%)    3 (7.0%)    –
              supported tasks to the non-IT supported tasks)
      F7      The continuity of workflow support by HIS systems                                  42     31 (73.8%)   11 (26.2%)   –            –
      F8      Functional redundancy (number of enterprise functions that are                     43      9 (20.9%)   18 (41.9%)   13 (30.2%)   3 (7.0%)
              supported by more than one system)
      F9      Redundancy during data collection (must the same data item be                      43     34 (79.1%)    8 (18.6%)   –            1 (2.3%)
              documented more than once?)
      F 10    Time needed for clinical documentation per staff and time period                   43     22 (51.2%)   18 (41.9%)    2 (4.7%)    1 (2.3%)
      F 11    Number of discharge letters, medical reports, appointments,                        43     15 (34.9%)   22 (51.2%)    6 (14%)     –
              orders, diagnoses/procedures, operation reports, pictures per period
              in relation to clinical key data (number of cases)
      F 12    Amount of departmental documentation that is regularly documented in the 42                4 (9.5%)    24 (57.1%)   14 (33.3%)   –
              (main) HIS system
      F 13    Completeness of the electronic patient record in relation to the total number 42          22 (52.4%)   19 (45.2%)    1 (2.4%)    –
              of patient documents (relation of the electronic documents to the remaining
              paper-based documents)
      F 14    Coverage of medical knowledge bases                                                42      1 (2.4%)    21 (50.0%)   18 (42.9%)   2 (4.8%)
      F 15    Frequency of the usage of medical knowledge bases                                  43      1 (2.3%)    15 (34.9%)   23 (53.5%)   4 (9.3%)
      G       IT Outcome Quality
      G1      Patient satisfaction with patient care                                             41     24 (58.5%)   16 (39%)      1 (2.4%)    –
      G2      Completeness and correctness of the clinical documentation in the                  42     36 (85.7%)    6 (14.3%)   –            –
              HIS systems
      G3      Timely availability of clinical documents in the HIS systems                       42     37 (88.1%)    5 (11.9%)   –            –
      G4      Contribution of the HIS systems to the hospitals’ success                          43     23 (53.5%)   16 (37.2%)    4 (9.3%)    –
      G5      Contribution of the HIS systems to the strategic goals of the medical, nurs-       41     27 (65.9%)   13 (31.7%)    1 (2.4%)    –
              ing and administrative management
      G6      Duration between the patient discharge and completion of the discharge             42     16 (38.1%)   21 (50%)      4 (9.5%)    1 (2.4%)
              letter and accounting
      H       IT Cost
      H1      Total costs of the HIS systems (acquisition, operation, IT staff, training etc.)   43     11 (25.6%)   25 (58.1%)    6 (14.0%)   1 (2.3%)
              per year
      H2      Total costs of the HIS systems in relation to the hospital turnover                40     12 (30%)     21 (52.5%)    7 (17.5%)   –
      H3      Total costs of the HIS systems in relation to the offered functionality and        42     13 (31%)     26 (61.9%)    3 (7.1%)    –
              usage in the departments
      H4      Cost effectiveness of the HIS systems (cost versus benefit)                        42     25 (59.5%)   13 (31%)      3 (7.1%)    1 (2.4%)
      H5      Operating costs of the HIS systems per year                                        42     14 (33.3%)   24 (57.1%)    4 (9.5%)    –
      H6      Monetary benefits by the HIS systems (e.g. reduction of staff, paperwork)          41     10 (24.4%)   22 (53.7%)    7 (17.1%)   2 (4.9%)
      H7      Costs of IT hardware in relation to IT support staff                               40      5 (12.5%)   19 (47.5%)   14 (35%)     2 (5%)
      H8      Costs of IT software and IT diffusion in relation to the number of users           42      6 (14.3%)   24 (57.1%)   11 (26.2%)   1 (2.4%)
      H9      Costs of clinical documentation in relation to the total income of the hospital 41         4 (9.8%)    14 (34.1%)   20 (48.8%)   3 (7.3%)
      H 10    Yearly increase in IT investments for HIS systems                                  42      6 (14.3%)   25 (59.5%)   11 (26.2%)   –
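      The table caption notes that the median importance rating was determined for every item. Those medians can be recomputed directly from the response counts reported above. The following is a minimal sketch, not part of the original study: it maps the four answer categories to an ordinal scale (1 = very important … 4 = not important) and takes the median of the expanded ratings; the helper name `median_rating` is illustrative.

      ```python
      # Illustrative sketch (not from the paper): recover the median importance
      # rating of one indicator from its response counts on the 4-point scale
      # used in the table (1 = very important ... 4 = not important).
      import statistics

      def median_rating(counts):
          """counts: responses as [very, rather, rather not, not important]."""
          ratings = [score for score, n in enumerate(counts, start=1)
                     for _ in range(n)]
          return statistics.median(ratings)

      # Item A1 (availability), third Delphi round: 40 / 2 / 0 / 0 of 42 answers
      print(median_rating([40, 2, 0, 0]))  # -> 1.0, i.e. "very important"
      ```

      For items with an even number of valid answers, `statistics.median` averages the two middle ratings, so a median of e.g. 1.5 indicates an item sitting between "very important" and "rather important".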
