
Information technology capacities assessment tool in hospitals: Instrument development and validation

Published online by Cambridge University Press:  06 January 2009

Mirou Jaana
Affiliation:
University of Ottawa
Guy Paré
Affiliation:
HEC Montreal
Claude Sicotte
Affiliation:
University of Montreal

Abstract

Objectives: This research integrates existing literature on information technology (IT) in hospitals, and proposes and validates a comprehensive IT capacities assessment tool in these settings.

Methods: A comprehensive literature review was conducted on Medline until September 2006 to identify studies that used specific IT measures in hospitals. The results were mapped and used as a basis for the development of the proposed instrument, which was tested through a survey of Canadian healthcare organizations (N = 221).

Results: A total of seventeen studies provided indicators of clinical and administrative IT capacities in hospitals. Based on the mapping of these indicators, a comprehensive IT capacities assessment instrument was developed including thirty-four items exploring computerized processes, thirteen items assessing contemporary technologies, and eleven items investigating internal and external information sharing. A time frame was inserted in the tool to reflect “plans for” versus “current” implementation of IT; in the latter, the extent of current use of computerized processes and technologies was measured on a (1–7) scale. Overall, the survey yielded a total of 106 responses (52.2 percent response rate), and the results demonstrated a good level of reliability and validity of the instrument.

Conclusions: This study unifies existing work in this area, and presents the psychometric properties of an IT capacities assessment tool in hospitals. By developing scores for capturing IT capacities in hospitals, it is possible to further address important research questions related to the determinants and impacts of IT sophistication in these settings.

Type
Research Reports
Copyright
Copyright © Cambridge University Press 2009

Since the release of the reports “To Err is Human” (19) and “Crossing the Quality Chasm” (18), health information technologies have been recognized as essential components of an improved health system. The framework for strategic action prepared by the national coordinator for health information technology (IT) in the United States (28) emphasized the value of IT in supporting consumer-centered care and outlined strategic steps for achieving interoperability and supporting electronic health records. In Canada, a national initiative for IT adoption is led by Health Infoway, a federally funded organization that aims to foster fast deployment of technologies through national funding of projects in various IT-related areas (10).

Hospitals in the United States and Canada are continuously exploring opportunities for investment in technologies that would improve clinical processes and efficiency, and promote patient safety and better quality of care (3;4;11). Several large-scale surveys indicate that the use of health IT has recently increased in hospitals, with a special emphasis on technologies and applications promoting patient safety (2;17;25).

However, despite the progress in IT adoption in hospitals, the level of IT capacities remains variable across healthcare settings and challenging to gauge. Prior research has made progress in identifying the main technology-related measures that reflect key IT functionalities/applications in hospitals (e.g., 7;8;24;27), but these efforts remain scattered and constrained by limitations associated with the characteristics of the measures and tools used.

This research addresses these issues and proposes a comprehensive instrument for assessing IT capacities in hospitals, including various applications and technologies and the level of integration among different systems. For this purpose, a comprehensive literature review was conducted to identify prior efforts that attempted to assess IT in hospitals. The results were mapped using the original conceptual framework developed by Paré and Sicotte (26) that defined IT sophistication along three dimensions (functional, technological, and integration). Subsequently, an IT capacities assessment tool, which produces IT scores for hospitals, was developed and validated through a survey of healthcare organizations in two Canadian provinces. This study focuses on the instrument development process and assessment of its psychometric properties. Specifically, it presents the literature review and mapping process, describes the proposed IT capacities assessment tool and the scoring approach, outlines the steps used in its development, and presents the results of its validation.

LITERATURE REVIEW

A comprehensive literature review was conducted on Medline until September 2006 to identify studies in the field that used specific measures/tools to assess IT in hospitals. The search was done using three keywords (hospital information systems, clinical information systems, and information technology), in conjunction with the terms survey, hospitals, instrument, and measures. A total of seventeen studies were found that provided indicators of clinical and administrative IT capacities.

Overview of Early Efforts to Capture IT Capacities in Hospitals

Six studies used different approaches for capturing IT capacities in hospitals, and discussed various information systems (IS) functionalities (1;6;7;9;15;16). These studies varied in scope, and used different terminologies (e.g., information systems, information technology, hospital information system, health information technology) to refer to specific IT capacities.

The earliest efforts to capture IT capacities in hospitals appeared in a study by Haruki et al. (16), who surveyed hospital managers in Japan about various hospital IS: “dedicated management systems” (e.g., billing, personnel); “order entry systems for outpatients” (e.g., prescriptions, physiological tests); “order entry systems for inpatients” (e.g., laboratory tests, appointments); and “reference systems and other applications” (e.g., medication history, test results).

Two years later, another study by Goldberger and Kremsdorf (15) identified 54 clinical IS functionalities in an effort to prioritize and assess current technological capabilities in a large hospital system in the United States. A methodology was developed for ranking the clinical functionalities (e.g., admission/discharge/transfer, lab results reviews, pharmacy department functions) that relate to one of five work processes: results retrieval, clinical care delivery and documentation, department operations, inpatient care management, or administrative procedures (15).

Brown et al. (6) also described the functionalities included in an information system by presenting a study on the historical development of the Veterans Administration health information system and technology architecture (VistA) and its functionalities. A list of applications was provided along the dimensions of infrastructure (e.g., master patient index), financial (e.g., accounts receivable), administrative (e.g., incident reporting), and clinical (e.g., computerized patient record system) areas to represent the features and capabilities of the system (6).

Assessment of clinical IT capacities in four U.S. hospitals was also performed by Amarasingham et al. (1), who developed an instrument that measures the automation and usability of information transactions in hospitals from providers’ perspectives. The instrument focused on specific medical diagnoses and procedures (e.g., nephrology consults, colonoscopy results). Sixty-nine items were identified to assess automation along four subdomains: test results (e.g., lab data), notes and records (e.g., vital statistics), order entry (e.g., medications), and various processes (e.g., event monitoring); the measures of usability (twenty-one items) were related to effectiveness, ease, and support (1).

Last, two studies conducted by Burke et al. (9) and Burke and Menachemi (7) classified IT functions into administrative, clinical, and strategic applications. They used the Dorenfest database to study IT capacities in relation to organizational and market variables. The scores for each of the three categories of IT applications were based on the number of automated applications reported by each hospital (7).

Despite these efforts to gauge IT capacities in hospitals, limitations appeared in relation to the measures used. Examples include the absence of a conceptual framework and arbitrary categorization of measures, the focus on a single dimension of IT capacities (e.g., only applications and not technologies), the crude nature of indicators (e.g., using counts), the lack of validation of the measures used, and the limited generalizability resulting from a focus on specific cases/settings.

Recent Streams of Research in Relation to IT Capacities in Hospitals

More recently, and in light of the increasing interest in capturing the level of IT capacities in hospitals, two streams of research have evolved in this field, each relying on a distinct IT measurement instrument.

The first stream of research is represented by five studies that surveyed hospitals in the State of Florida (5;8;22–24). These studies varied in scope from examining IT in relation to systems affiliation and financial performance (23;24), to studying IT in relation to patient safety issues (5;8;22). The same IT measurement instrument was used, including items that assess planned IT adoption in hospitals, adoption issues, patient safety issues, and the current use of administrative, clinical, and strategic IT applications (8;23;24). Overall, twenty-five clinical IT applications (e.g., electronic health records, computerized physician order entry), twenty-one administrative applications (e.g., patient billing, payroll), and ten strategic IT applications (e.g., case-mix analysis, enterprise resource planning) were investigated (24). In these cases, a count of IT applications was used to represent the actual IT capacities (5;8;24). Nevertheless, no formal validation of the instrument was performed, and the crude nature of the measures and scoring approach were important limitations.

On the other hand, Paré and Sicotte (26) proposed an instrument that measures three dimensions of IT sophistication (functional, technological, and integration) in hospitals in four administrative and clinical domains: patient management; patient care activities (medical-physician, nursing, emergency, surgery); clinical support activities (laboratory, radiology, pharmacy); and administrative functions (financial resources, human resources, and materials management). Computerized applications (functional capacities) were assessed as present/absent, and the percent of hospitals reporting these processes was computed (e.g., 21;26). The extent of use of various technologies in clinical areas (technological IT capacities) was measured on a (1–7) scale (barely used to extensively used) (26). The level of IT integration was also assessed on a (1–7) scale representing the level of internal and external integration of various applications (26). Following the work by Paré and Sicotte (26), who applied the instrument in a survey of hospitals in Canada, five other studies appeared in the literature that were based on the same tool (13;14;20;21;29). This survey instrument, which was used in hospitals in the United States and Canada, was validated in both settings and demonstrated good psychometric properties. Nevertheless, IT has significantly evolved since 2001, which necessitates revisiting the original instrument. Furthermore, the length of the original instrument and the absence of a consistent scale to reflect the implementation of various applications and technologies represent two critical issues that must be addressed.

The recent efforts to gauge IT in hospitals provide solid ground for the development of a new generation of instruments that are more comprehensive and user-friendly. This research addresses this need by integrating the existing literature in this field and proposing an IT capacities assessment tool for hospitals that is tested through a survey of healthcare organizations in two Canadian provinces.

METHODS

Instrument Development Steps

As a first step in the development of the instrument, we relied on the conceptual model originally proposed by Paré and Sicotte (26), which represents a solid and holistic approach for examining multiple dimensions of IT capacities in hospitals, and has been previously applied and validated in studies in the United States and Canada. Therefore, we divided the instrument into three sections measuring: (i) the extent of implementation of computerized processes/applications; (ii) the extent of implementation of technological devices; and (iii) the integration of internal administrative and clinical information and the extent of sharing of information with other external entities.

Second, based on the literature review, we extracted all items that had been used in prior studies as measures of IT in hospitals. Overall, six studies in the literature clearly presented measures of IT/IS capabilities in hospitals that could be used for the purpose of this research (6–8;15;16;24;26). These measures were closely examined to identify overlaps and redundancies (e.g., different terminologies to indicate the same technology; measures of computerized processes with areas of overlap). The earliest study by Haruki et al. (16) was not considered because most of the applications examined at that time focused on order entry and results reporting, in addition to very few basic administrative applications. The study by Amarasingham et al. (1) was also excluded because it focused on IS automation and usability for specific medical diagnoses and procedures from the perspectives of healthcare providers, which is beyond the scope of this research. To ensure comprehensiveness in addressing the objectives of this study, we decided to consider the Healthcare Information and Management Systems Society (HIMSS) survey (17) as an additional reference in the development of the proposed IT assessment tool. Table 1 presents the thirty-nine major computerized processes and twelve contemporary technologies identified.

Table 1. Mapping of Computerized Processes and Contemporary Technologies Identified Based on the Literature Reviewa

Third, based on the mapping of IT/IS indicators, we developed the IT capacities assessment tool, which included thirty-two major computerized processes/applications and thirteen contemporary technologies. Specific items were added to the measures identified in the mapping to capture advanced processes and technologies not presented in prior studies (e.g., clinical dashboards, robots for medication dispensing). To better evaluate the current IT capacities and the organizational strategy on this matter, a time frame was inserted in the tool along four categories: no plan for implementation, planning to implement, began implementation, and implemented. Hospitals reporting “implemented” were also asked to report the extent of use of computerized processes and technologies on a (1–7) scale. To represent the integration dimension, questions were developed to assess the extent of information sharing with other external entities (e.g., insurance companies, medical clinics) on a (1–7) scale. Finally, the last section of the instrument assessed the profile of the respondents and surveyed organizations.

Fourth, pretesting was performed among five IT experts in Canada and two in the United States. Their feedback was integrated to improve the content and structure of the instrument (e.g., adding examples in the instrument, rewording items). In addition, based on the results of the pretest, the measures assessing the implementation of Enterprise Resource Planning (ERP) systems and the modules deployed, and the items measuring the implementation of electronic medical records and the systems with which they are integrated, were moved to the section assessing IT internal integration. These items were considered more as indicators of internal clinical and administrative integration than as computerized processes. The final instrument included thirty-four items exploring the implementation of computerized processes, thirteen items assessing the current status of contemporary technologies, and eleven items investigating internal and external information sharing in hospitals (see Appendix; a copy of the instrument is available upon request from the authors).

Sample and Data Collection

In order to apply the IT assessment tool and evaluate its psychometric properties, we conducted a survey of healthcare organizations in two Canadian provinces (Québec and Ontario) between June and September 2007. All hospitals in these provinces, which represent the largest health jurisdictions in Canada in terms of population served and health infrastructures, were invited to participate in this study.

As a first step, we contacted IT directors/administrators by phone, excluding those who had participated in the pretest, to introduce the current study and solicit their participation (Québec, N = 92; Ontario, N = 129). Five IT directors in Quebec and twelve in Ontario refused to participate due to reported time constraints; they were excluded from the study. A hard copy of the questionnaire was then sent with a cover letter and a return envelope to all remaining organizations. Four weeks after the initial mailing, a reminder letter was mailed to organizations that had not responded. In total, sixty and forty-six responses were received in Québec and Ontario, respectively. The overall response rate was 52.2 percent (106 hospitals).

Scoring and Variables

The developed IT assessment tool includes fifty-eight items divided into eight dimensions (Figure 1):

  • D1 = Administrative systems (nine items)

  • D2 = Patient management systems (eight items)

  • D3 = Clinical support systems (four items)

  • D4 = Clinical systems (thirteen items)

  • D5 = Emerging technologies (thirteen items)

  • D6 = Internal integration – Administrative (Enterprise Resource Planning system)

  • D7 = Internal integration – Clinical (Electronic Medical Record system)

  • D8 = External integration (nine items)

Figure 1. Conceptual model representing IT capacities in hospitals.

For each of the eight subsections, a score was computed over 100 based on the weights (points) assigned to the respondents’ answers. First, items under the dimensions D1–D5 were assigned the following weights: (i) no plan for implementation = 0 points; (ii) planning to implement = 1 point; (iii) began implementation = 3 points; (iv) implemented with weak utilization, as indicated by answers within the [1–4] interval on the Likert scale = 4 points; and (v) implemented with strong utilization, as indicated by answers within the [5–7] interval on the Likert scale = 5 points. Four items measuring the implementation of advanced computerized processes (remote monitoring applications, on-line consumer health information, on-line patient appointment system, and clinical and support staff workload management), which showed no variability in our sample (<1 percent of the respondents reported having them in place), were excluded. Second, items under the dimensions D6 and D7 were assigned similar weights: (i) no plan for implementation = 0 points; (ii) planning to implement = 1 point; (iii) began deployment = 3 points; (iv) implementation completed with 1–3 ERP modules or 1–4 systems integrated with the EMR = 4 points; and (v) implementation completed with >4 ERP modules or >5 systems integrated with the EMR = 5 points. The resulting score (over 100) for the first seven dimensions equals the sum of points for all items under a specific dimension, divided by the total number of items in that dimension multiplied by 5 (the maximum points for an item), times 100. The functional (D1–D4) and technological (D5) IT sophistication scores in the sample were 66.3 and 30.1, respectively.
Third, questions under D8 were assigned the following weights: (i) no external integration (1 on the Likert scale) = 0 points; (ii) minimal external integration ([2–3] on the Likert scale) = 1 point; (iii) moderate level of integration (4 on the Likert scale) = 3 points; (iv) high level of external integration ([5–6] on the Likert scale) = 4 points; and (v) very high level of external integration (7 on the Likert scale) = 5 points. Five items measuring external information sharing with drug stores, payers, laboratories, government agencies, and patients, which showed consistently low scores, were excluded. The score for D8 equals the sum of points for the four items measuring external integration, divided by the total number of items (four) multiplied by the maximum points for an item (i.e., five), times 100. The integration score (D6–D8) in the sample was 50.9. Finally, the overall IT score in the sample was 56.3 (Sum(D1–D8)/8).
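The scoring approach for dimensions D1–D5 can be sketched as follows. This is a minimal illustration using hypothetical item responses; the function and variable names are ours and do not come from the original instrument.

```python
# Sketch of the D1-D5 scoring approach (hypothetical data; names are
# illustrative, not from the original instrument).

def item_points(status, extent_of_use=None):
    """Map a survey answer to points: 'status' is the implementation stage,
    'extent_of_use' is the 1-7 Likert rating (only when implemented)."""
    if status == "no plan":
        return 0
    if status == "planning":
        return 1
    if status == "began":
        return 3
    if status == "implemented":
        # Weak utilization (1-4 on the Likert scale) earns 4 points;
        # strong utilization (5-7) earns 5 points.
        return 4 if extent_of_use <= 4 else 5
    raise ValueError(f"unknown status: {status}")

def dimension_score(answers):
    """Score over 100: sum of item points / (n_items * 5) * 100."""
    points = [item_points(status, use) for status, use in answers]
    return sum(points) / (len(answers) * 5) * 100

# Example: a hypothetical four-item dimension.
answers = [("implemented", 6), ("implemented", 3), ("began", None), ("no plan", None)]
print(dimension_score(answers))  # (5 + 4 + 3 + 0) / 20 * 100 = 60.0
```

A hospital implementing every item in a dimension with strong utilization would thus score 100, while one with no implementation plans would score 0.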

Data Analysis

Descriptive data analysis was conducted to provide an overview of the respondents and surveyed organizations. To assess the psychometric properties of the IT assessment tool, a threefold analysis was performed. First, Cronbach alpha coefficients were computed to examine the reliability of the IT measures in each of the eight subsections. Second, construct validity was examined by calculating correlations that reflect the convergent and discriminant validity between the measures in each of the eight subsections. Third, concurrent validity was examined by computing the correlations between the level of IT capacities in each of the eight subsections and four organizational variables.

RESULTS

Profile of Respondents and Hospitals

Table 2 provides an overview of the respondents and their healthcare organizations. Overall, most of the respondents had either an undergraduate or a master's level education. The managerial and IT tenure of the IT directors/CIOs was high: they averaged 17 years of experience in IT, more than 10 years in their current organizations, and 8 years in their current position.

Table 2. Overview of Respondents and Healthcare Organizations Characteristics

A close examination of the profile of the surveyed healthcare organizations demonstrates that the majority were characterized as rural hospitals (62 percent) and were not affiliated with a teaching university (52 percent). Nevertheless, these hospitals were neither small in size (average of 354 beds) nor limited in human resources (e.g., the average numbers of physicians and nurses were 216 and 853, respectively).

Reliability

To assess the reliability of the instrument, Cronbach alpha coefficients, which are indicators of the internal consistency of the items (12), were computed. They varied in magnitude from 0.65 to 0.85. With the exception of the clinical support applications subsection, which was associated with a coefficient of 0.65, all coefficients were above 0.70, which indicates a good level of reliability of the measures developed and their ability to capture the underlying constructs.
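Cronbach's alpha is computed from the item variances and the variance of the summed scale score. The sketch below uses hypothetical ratings from five respondents on three items; it is an illustration of the standard formula, not the authors' analysis code.

```python
# Minimal Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(total score))
# Hypothetical data for illustration only.

def cronbach_alpha(items):
    """items: list of equal-length lists, one per item (entries = respondents)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (divides by n - 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total scale score for each respondent.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Example: three items rated by five respondents on a 1-7 scale.
items = [
    [5, 6, 4, 7, 5],
    [4, 6, 5, 7, 4],
    [5, 7, 4, 6, 5],
]
print(round(cronbach_alpha(items), 2))  # → 0.89
```

Values above roughly 0.70 are conventionally taken to indicate acceptable internal consistency, which is the threshold applied to the subsections above.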

Construct Validity

Construct validity refers to the ability of an instrument to measure specific constructs and traits (12). It is usually determined by examining whether measures behave as expected, and by evaluating their correlation with other measures designed to capture the same construct (12). Table 3 presents the correlations used to assess the construct validity of the measures. As the results show, the correlations on the leading diagonal, which represent the square root of the variance shared by the constructs and their measures, are larger than the off-diagonal correlations among constructs. This is reflective of convergent and discriminant validity (12): the correlations between different measures of the same construct are high and larger than those among constructs.

Table 3. Construct and Concurrent Validity of IT Measures in the Eight Subsections Assessing the Three Dimensions of IT Capacities in the Proposed Instrument

*** p < .001; ** p < .01; * p < .05; ns, nonsignificant.

D1, Administrative applications; D2, Patient management applications; D3, Clinical support applications; D4, Clinical applications; D5, Contemporary technologies; D6, Internal integration – administrative; D7, Internal integration – clinical; D8, External integration.
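The diagonal-versus-off-diagonal comparison described above can be sketched programmatically. The matrix below is hypothetical (the values are not from Table 3); the function simply checks that each diagonal entry exceeds every off-diagonal correlation in its row and column, the pattern expected when discriminant validity holds.

```python
# Sketch of the convergent/discriminant validity check: diagonal entries
# (square root of variance shared by a construct and its measures) should
# exceed the off-diagonal correlations. Hypothetical values for illustration.

def discriminant_ok(matrix):
    """Return True if every diagonal entry exceeds all off-diagonal
    correlations in its row (the matrix is assumed symmetric)."""
    n = len(matrix)
    for i in range(n):
        for j in range(n):
            if i != j and matrix[i][j] >= matrix[i][i]:
                return False
    return True

# 3x3 example: diagonals 0.84, 0.79, 0.81 vs modest inter-construct correlations.
m = [
    [0.84, 0.42, 0.37],
    [0.42, 0.79, 0.51],
    [0.37, 0.51, 0.81],
]
print(discriminant_ok(m))  # → True
```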

Concurrent Validity

Concurrent validity refers to the ability of measures to correlate with other variables and to differentiate between organizations based on these variables. Concurrent validity was assessed by examining the correlations between the items in the eight subsections and four variables: bed size, annual organizational budget, annual IT budget, and number of IT staff (Table 3). As the results show, with the exception of the measures of clinical support applications, which significantly correlated with only the annual organizational budget and annual IT budget, all measures of computerized processes and technologies presented significant correlations with the four indicators considered. The measures of integration, however, only correlated significantly with the annual IT budget, which might be due to the small number of measures used to assess internal and external integration in the developed instrument.

DISCUSSION

Although progress has been made over the years in relation to assessing IT capacities in hospitals (e.g., 24;26;27), prior measures had been constrained by the limited scope of the IT areas covered, the lack of proper validation of the measures, the characteristics of the instruments used, and the lack of a comprehensive IT score that reflects IT capacities in these settings. This research unifies the IT literature in this area and addresses these issues by proposing and validating an IT capacities assessment tool. The psychometric properties demonstrated a good level of validity and reliability. When assessing concurrent validity, the absence of significant correlations between the measures of internal and external integration and the criteria examined might be attributed to the small number of variables measuring the integration dimension. In addition, the absence of significant correlations between the measures of clinical support applications and the number of beds and the number of IT staff might be explained by the nature of these applications, which support ancillary services/departments in hospitals and might not be directly related to these variables.

Despite the contribution of this research in developing a validated IT capacities assessment tool and scoring approach, it is important to note that these scores do not indicate the extent to which IT capacities are adequately applied in a hospital setting. The proposed instrument aims at capturing IT capacities through a comprehensive IT score. It does not, however, reflect the degree of success in applying IT components, nor does it indicate problems in their implementation, which is beyond the scope of this study. It is also important to note that data on healthcare organizations were only available from respondents, which precluded any comparison between responding and nonresponding hospitals on these characteristics. Finally, given that the instrument was validated among Canadian healthcare organizations, replicating the survey outside Canada or in other provinces would further support the generalizability of the proposed IT assessment tool.

This study presents a contribution to this field by unifying existing literature on IT in hospitals and presenting a validated IT capacities assessment tool. This is an important step toward better understanding the environment of hospitals in relation to IT. The developed instrument can be used by hospitals to exercise benchmarking and assess their position in the market in relation to IT capacities. It can also assist researchers in addressing important questions investigating the relationship between IT and organizational/contextual variables. By overcoming prior limitations in this area and developing scores for capturing IT capacities, it is possible to further analyze the determinants of IT in hospitals, and evaluate the relationship between IT sophistication and organizational outcomes.

CONTACT INFORMATION

Mirou Jaana, PhD (), Assistant Professor, Telfer School of Management, Health Administration, University of Ottawa, 55 Laurier Avenue East, Ottawa, Ontario K1N 6N5, Canada

Guy Paré, PhD (), Canada Research Chair in Information Technology in Health Care, HEC Montreal, 3000, Cote-Ste-Catherine, Montréal, Québec H3T 2A7, Canada

Claude Sicotte, PhD (), Professor, Department of Health Administration, University of Montréal, P.O. Box 6128, Station Downtown, Montréal, Québec H3C 3J7, Canada

APPENDIX

Overview of the Measures Included in the IT Capacities Assessment Tool

REFERENCES

1. Amarasingham R, Diener-West M, Weiner M, et al. Clinical information technology capabilities in four U.S. hospitals: Testing a new structural performance measure. Med Care. 2006;44:216-224.
2. American Hospital Association. Continued progress: Hospital use of information technology. Chicago: American Hospital Association. http://www.aha.org/aha/content/2007/pdf/070227-continuedprogress.pdf. Accessed April 14, 2008.
3. Ball MJ. Hospital information systems: Perspectives on problems and prospects, 1979 and 2002. Int J Med Inform. 2003;69:83-89.
4. Bates DW, Gawande AA. Improving safety with information technology. N Engl J Med. 2003;348:2526-2534.
5. Brooks RG, Menachemi N, Burke D, Clawson A. Patient safety-related information technology utilization in urban and rural hospitals. J Med Syst. 2005;29:103-109.
6. Brown SH, Lincoln MJ, Groen PJ, Kolodner RM. VistA – U.S. Department of Veterans Affairs national-scale HIS. Int J Med Inform. 2003;69:135-156.
7. Burke DE, Menachemi N. Opening the black box: Measuring hospital information technology capability. Health Care Manage Rev. 2004;29:207-217.
8. Burke D, Menachemi N, Brooks RG. Diffusion of information technology supporting the Institute of Medicine's quality chasm care aims. J Healthc Qual. 2005;27:24-32.
9. Burke DE, Wang BBL, Wan TTH, Diana ML. Exploring hospitals’ adoption of information technology. J Med Syst. 2002;26:349-355.
10. Canada Health Infoway. Coming together: The first electronic health record systems are beginning to transform Canadian health care. 2003–2004 Annual Report. Montreal/Toronto: Canada Health Infoway; 2004. http://www2.infoway-inforoute.ca/Documents/Annual%20Report%2003-04%20EN.pdf. Accessed December 15, 2008.
11. Chaudhry B, Wang J, Wu S, et al. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. 2006;144:E12-E22.
12. Churchill GA Jr. A paradigm for developing better measures of marketing constructs. J Mark Res. 1979;16:64-73.
13. Culler, SD, Atherly, A, Walczak, S, et al. Urban-rural differences in the availability of hospital information technology applications: A survey of Georgia hospitals. J Rural Health. 2006;22:242247.CrossRefGoogle ScholarPubMed
14. Culler, SD, Hawley, JN, Naylor, V, Rask, KJ. Is the availability of hospital IT applications associated with a hospital's risk adjusted incidence rate for patient safety indicators: Results from 66 Georgia hospitals. J Med Syst. 2007;31:319327.CrossRefGoogle ScholarPubMed
15. Goldberger, D, Kremsdorf, R. Clinical information systems – Developing a systematic planning process. J Ambul Care Manage. 2001;24:6783.CrossRefGoogle ScholarPubMed
16. Haruki, Y, Ogushi, Y, Okada, Y, et al. Status and perspective of hospital information systems in Japan. Methods Inf Med. 1999;38:200206.Google ScholarPubMed
17. Healthcare Information and Management Systems Society. 18th Annual HIMSS leadership survey: CIO results final report. Chicago: Healthcare Information and Management Systems Society; 2007. http://www.himss.org/2007survey/DOCS/18thAnnualLeadershipSurvey.pdf. Accessed February 11, 2008.Google Scholar
18. Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academy Press; 2001.Google Scholar
19. Institute of Medicine. To err is human: Building a safer health system. Washington, DC: National Academy Press; 2000.Google Scholar
20. Jaana, M, Ward, MM, Paré, G, Sicotte, C. Antecedents of clinical information technology sophistication in hospitals. Health Care Manage Rev. 2006;31:289299.CrossRefGoogle ScholarPubMed
21. Jaana, M, Ward, M, Paré, G, Wakefield, D. Clinical information technology in hospitals: A comparison between the State of Iowa and two provinces in Canada. Int J Med Inform. 2005;74:719731.CrossRefGoogle ScholarPubMed
22. Menachemi, N, Burke, D, Brooks, RG. Adoption factors associated with patient safety-related information technology. J Healthc Qual. 2004;26:3944.CrossRefGoogle ScholarPubMed
23. Menachemi, N, Burke, D, Clawson, A, Brooks, RG. Information technologies in Florida's rural hospitals: Does system affiliation matter? J Rural Health. 2005;21:263268.CrossRefGoogle ScholarPubMed
24. Menachemi, N, Burkhardt, J, Shewchuk, R, Burke, D, Brooks, RG. Hospital information technology and positive financial performance: A different approach to finding an ROI. J Healthc Manag. 2006;51:4058.Google ScholarPubMed
25. Ontario Hospital Association. Ontario hospital e-health adoption survey: 2007 Survey top line report. Toronto: Ontario Hospital Association; 2007. http://www.oha.com/Client/OHA/OHA_LP4W_LND_WebStation.nsf/resources/E-Health/$file/2007+Ontario+Hospital+e-Health+Adoption+Survey+Top+Line+Report.pdf. Accessed April 14, 2008.Google Scholar
26. Paré, G, Sicotte, C. Information technology sophistication in health care: An instrument validation study among Canadian hospitals. Int J Med Inform. 2001;63:205223.CrossRefGoogle ScholarPubMed
27. Poon, EG, Jha, AK, Christino, M, et al. Assessing the level of healthcare information technology adoption in the United States: A snapshot. BMC Med Inform Decis Mak. 2006;6:19.CrossRefGoogle ScholarPubMed
28. Thompson, TG, Brailer, DJ. The decade of health information technology: Delivering consumer-centric and information-rich health care, framework for strategic action. Washington, DC: US Department of Health and Human Services; 2004.Google Scholar
29. Ward, MM, Jaana, M, Bahensky, JA, Vartak, S, Wakefield, DS. Clinical information system availability and use in urban and rural hospitals. J Med Syst. 2006;30:429438.CrossRefGoogle Scholar
Table 1. Mapping of Computerized Processes and Contemporary Technologies Identified Based on the Literature Review

Figure 1. Conceptual model representing IT capacities in hospitals.

Table 2. Overview of Respondents and Healthcare Organizations Characteristics

Table 3. Construct and Concurrent Validity of IT Measures in the Eight Subsections Assessing the Three Dimensions of IT Capacities in the Proposed Instrument