Rollin J Fairbanks1,2,3
A Zachary Hettinger1,2
Natalie C Benda1,3

1National Center for Human Factors in Healthcare, MedStar Health, Washington, DC, USA
2Department of Emergency Medicine, Georgetown University School of Medicine, Washington, DC, USA
3Department of Industrial and Systems Engineering, State University of New York at Buffalo, Buffalo, New York, USA
The usability of electronic health records (EHRs) continues to be a point of dissatisfaction for providers, despite certification requirements from the Office of the National Coordinator that require EHR vendors to employ a user-centered design (UCD) process. To better understand factors that contribute to poor usability, a research team visited 11 different EHR vendors in order to analyze their UCD processes and discover the specific challenges that vendors faced as they sought to integrate UCD with their EHR development. Our analysis demonstrates a diverse range of vendors’ UCD practices that fall into 3 categories: well-developed UCD, basic UCD, and misconceptions of UCD. Specific challenges to practicing UCD include conducting contextually rich studies of clinical workflow, recruiting participants for usability studies, and having support from leadership within the vendor organization. The results of the study provide novel insights for how to improve usability practices of EHR vendors.
Keywords: electronic health records, health information technology
As the adoption of electronic health records (EHRs) by hospitals and ambulatory health care facilities continues to increase rapidly, so does dissatisfaction with EHRs, driven primarily by usability and safety challenges.1–3 A survey of clinicians conducted by the American College of Physicians suggests that dissatisfaction with the EHR’s ease of use increased from 23% in 2010 to 37% in 2012.4 Providers often perceive EHRs as difficult to use, and usability analysts have cited difficult-to-read interfaces, confusing displays, and iconography that lacks consistency and intuitive meaning.5,6 In addition, providers often feel that opportunities are missed: the systems do not fully support their cognitive workflow or information needs and often display data in ways that inhibit clinicians from easily drawing insights.7,8 Poor usability not only increases clinician frustration; it can also lead to errors, posing serious threats to patient safety.9–11
Recognizing that usability encourages adoption, enhances safety, and is a critical element in the design and development of EHRs, the Office of the National Coordinator of Health Information Technology (ONC) has included “Safety Enhanced Design” certification requirements in the 2014 Standards and Certification Criteria, which stipulate both usability process requirements and testing requirements.12 A vendor that seeks to develop a certified EHR must attest to employing a user-centered design (UCD) process and must conduct and report the results of summative usability testing on 8 specified EHR functions. UCD is a process in which the needs of the user are taken into consideration during each stage of design and development (see International Organization for Standardization 9241 for a more extensive description); this is the definition of UCD used for EHRs by the ONC and the National Institute of Standards and Technology.12,13
Few studies have focused on EHR vendors’ UCD processes; rather, the majority of research on EHR usability has focused on describing the impact of the EHR on clinician work processes and optimal EHR design principles.5,14,15 Since the quality of the UCD process is thought to be a major factor in determining the usability of the EHR, our research goal was to better understand the current range of UCD processes being employed by vendors and the specific challenges that vendors face as they practice UCD.13,16 Our research is unique in that it is a vendor-centric project conducted within the ONC’s larger Strategic Health Information Technology (IT) Advanced Research Projects (SHARPC) program, which focuses on improving the usability of health IT systems.
To gain a better understanding of vendors’ UCD processes, a team composed of a human factors scientist, a clinician/human factors expert, a clinician/informatics expert, and an industrial engineer traveled to a diverse set of EHR vendors to conduct onsite interviews with vendors’ staff. Because no previous studies have sought to understand current UCD processes in the EHR vendor community, qualitative methods were determined to be the most appropriate approach.
The research team defined UCD as any approach to design that integrates extensive information about the people who will use the product (end users) into each stage of the design process, including information about the cognitive needs, abilities, and limitations of the end users, and about the factors and constraints introduced by the work environment and tasks.17
Following the ONC certification requirements, vendors could develop their own UCD process or follow published processes such as the International Organization for Standardization guidelines or guidance from the National Institute of Standards and Technology.13,18
Given the diversity of EHR vendors in terms of company size and product types, a representative sample was sought. A purposive sampling method was used to recruit vendor participants; this qualitative recruitment method facilitated the selection of vendors based on their position in the marketplace and the type of knowledge that might be elicited. Eleven vendors that varied in total revenue, total number of employees, and size of the usability staff (if present) were recruited for the interviews, as shown in table 1. The vendors were not compensated for their participation and were assured anonymity.
Semi-structured interviews were conducted on-site with each vendor; 5 vendor visits were full-day and 6 were half-day. Interviews were conducted with staff members identified by the vendor as being most familiar with their software development process and UCD activity; the specific staff members selected for the interviews were prearranged. Staff members included business analysts, product managers, developers, and dedicated usability experts (if employed by the vendor). Generally, the interviews were held with 1 to 3 vendor staff members at a time. Two to 4 members of the research team traveled to each vendor, and the researchers conducted the interviews together to ensure redundancy in the documentation of interviewees’ comments. To ensure continuity across the visits, 1 member of the research team (the human factors expert and principal investigator) was present at each of the 11 vendor visits and was always accompanied by at least 1 clinician researcher. This design ensured that both human factors and clinical expertise were present for each vendor interview.
During each interview, the research team asked open-ended questions about the vendor’s current software development and UCD processes, challenges to practicing UCD, and potential avenues to facilitate a more robust UCD process. Follow-up probing questions were used extensively to pursue relevant topics and to obtain more specific information.
Data Collection and Analysis
Each of the researchers documented the responses during the semi-structured interviews. The field notes were transcribed and combined into a single data file by the research assistant immediately after the interviews. Once all of the interviews were complete, the research team conducted a qualitative analysis and emerging themes were extracted, with a focus on the characterization of the UCD processes being employed by the vendors and challenges faced by the vendors. This study was approved by the MedStar Health Research Institute human subjects review board.
Characterizing Vendor UCD Practices
From the interview data, 3 distinct groups of vendors emerged based on the characterizations of their UCD process. The 3 groups are described below with quotes from select interviews as support for the categorizations.
Well-developed UCD: Vendors have a well-developed UCD process, including the infrastructure and expertise necessary to understand user requirements in context, utilize an iterative design process with early usability input, and conduct formative and summative testing on several different aspects of their product. Importantly, these vendors have developed efficient processes that allow them to integrate UCD with the rigorous software development timelines that they operate under. These efficient processes often include methods to recruit participants using “existing networks of clients and recruiters that [they] have established” and methods to easily solicit feedback using “remote testing capabilities and testing at [their] user conferences.” These vendors typically have an extensive usability staff, and usability appears to be valued by their senior leadership.
Basic UCD: Vendors understand the importance of UCD but do not have a complete UCD process fully integrated with their EHR design and development. Some vendors in this group have already identified the specific process barriers they face and are working to overcome these barriers, while other vendors realize they have an incomplete UCD process but are still working to identify the specific barriers that are preventing a more complete UCD practice. For example, one vendor stated, “We know that conducting observations in the clinical environment is important and we aim to do it once per quarter, but it ends up happening far less than that.” Vendors in this group typically employ a few usability experts, but the usability personnel face specific challenges that will need to be overcome in order for a more rigorous UCD process to be employed.
Misconceptions of UCD: Vendors do not have any UCD process in place although they believe they do through mechanisms like “web forums that allow [their] users to post suggested features and vote on these features.” Vendors in this group generally have a misconception about what constitutes UCD. One prominent misconception is the belief that having an infrastructure for responding to users’ feature requests and complaints is what constitutes UCD. Extensive methods to collect and respond to user feedback have generally been developed, and the vendors highlight these methods as evidence of their UCD process. These vendors generally do not have usability experts on staff, and the leadership of these vendors often lacks an appreciation for UCD.
Challenges to Practicing UCD
Vendors in each of the categories identified specific challenges they face as they strive to practice UCD. These challenges have different themes within each vendor UCD type and are presented below for each category with supporting quotes.
Well-developed UCD: Vendors in this category overwhelmingly described the difficulties of conducting detailed, contextually rich studies of workflow in different sub-specialty clinical disciplines. The investments required to conduct these studies are extensive; consequently, vendors focus on specialties that have the largest market audience. Vendors also described challenges with providers sharing health IT hazards that may be associated with the EHR product. For example, one vendor stated, “With our ambulatory product, people are willing to say there is a design issue but will not share any adverse events because there is no protection.” This information is critical for the vendor to improve the usability and safety of their products.
Basic UCD: Vendors in this category require additional knowledge and resources on how to effectively and efficiently employ the UCD processes that they seek to implement, given their specific development cycles and resource constraints. Knowledge gaps include the number of participants needed for summative testing and appropriate testing procedures (eg, training, experimental setup). Specific resource challenges include difficulties recruiting participants (“it is a real challenge to get participants”) and developing detailed use case scenarios for their usability studies.
Misconceptions of UCD: Vendors in this category require education on the importance of UCD for the usability and safety of the product being developed. Importantly, the leadership needs to understand the business case for UCD to encourage investments in usability resources. One vendor stated, “Our product is used by thousands of people every day. So if it was that bad, it would already be out of the market.”
Table 2 shows which of the vendors fit into each of the UCD practice categories and provides a summary of the strengths and challenges associated with the vendors in each category. Nearly all vendors identified the rigorous development timelines that they operate under as a significant challenge to practicing UCD. In particular, meeting the summative testing requirements for EHR certification was consistently described as challenging and resource intensive.
Summary of vendor strengths and challenges by UCD category

Well-developed UCD
Strengths: well-developed infrastructure of participants; remote usability testing; efficient integration of UCD with the development timeline.
Challenges: resources for research of sub-specialty workflow; information on safety hazards associated with EHRs.

Basic UCD
Strengths: understand the importance of UCD; motivated to improve the UCD process.
Challenges: knowledge of participant training and the number of participants for testing; development of use cases; efficient participant recruitment.

Misconception of UCD
Strengths: soliciting user recommendations after product deployment.
Challenges: leadership support of UCD; investment in UCD resources and knowledgeable staff.
There are increasing pressures on health IT vendors to improve the usability of EHRs and other health IT systems, and the ONC has UCD certification requirements in place to promote improved usability. Although health IT vendors themselves are the end users of these regulations, no data are available describing their current usability processes.
Our results reveal variability in the UCD practices of EHR vendors, despite the ONC’s certification requirements that all EHR vendors attest to employing a UCD process in order to certify their EHR product. Given that UCD is an important factor that contributes to the usability and safety of the EHR, the variability in UCD practices may partially account for the poor usability of some vendors’ EHR products.
With an improved understanding of EHR vendors’ UCD processes, there are several opportunities to better integrate UCD into vendor EHR development. Vendors with a well-developed UCD process have discovered how to integrate UCD with the rigorous software development timelines under which nearly all vendors operate. The dissemination of this information to vendors in the basic UCD and misconceptions of UCD categories would help those vendors optimize their UCD practices.
Vendors in the basic UCD category have specific needs that, if addressed, could lead to dramatic improvements in their UCD practices. For example, recognizing that some vendors are challenged with recruiting participants with relevant clinical knowledge to participate in usability studies should prompt health IT stakeholders, such as the ONC, and organizations that represent end users to encourage participation. In addition, strategies should be developed to alleviate the burden of each vendor having to develop detailed use case scenarios for testing. Targeted investments in developing use cases and studying clinical workflows that can then be shared with all vendors have the potential to facilitate UCD practices without stifling innovation.
At the policy level, the variability in UCD practices and the fact that some vendors have a misconception of UCD yet have certified EHR products in the marketplace suggest that certification requirements may need to be adjusted. The challenge will be to implement certification requirements that lead to improved UCD practices for vendors in the misconceptions category without hindering the UCD practices of vendors in the well-developed category. The challenges faced by vendors in the basic UCD and misconceptions of UCD categories suggest that more specific summative testing education on the number of participants and the demographic of participants may need to be provided.
While the vendor-centric research approach provides unique and valuable insights, particularly given the combination of human factors and clinical knowledge from the research team, there were limitations to the study. Although requests were made to the vendors to arrange interviews with the most knowledgeable staff, it is possible that these staff members were unavailable during the visit. There was also variability in the source of information from each of the vendors, given that some vendors had usability experts while others did not. In addition, the research team was not able to directly observe the UCD process, and there may have been variability in what was meant by UCD. Importantly, the usability of the EHR itself was not examined; rather, our focus was on the integration of usability methods into the development process. An important next step will be to examine the usability of the EHR product to determine whether there is a relationship with the rigor of the UCD process being employed by the vendor.
This research was supported by the Office of the National Coordinator of Health Information Technology through the SHARPC research program, with funding administered by UT Health.
US Department of Health and Human Services. Health information technology: standards, implementation specifications, and certification criteria for electronic health record technology, 2014 edition; revisions to the permanent certification program for health information technology. 45 CFR 170. https://federalregister.gov/a/2012-20982. Published September 4, 2012. Effective October 4, 2012. Accessed February 5, 2013.
National Institute of Standards and Technology. (NISTIR 7952) Toward a shared approach for ensuring patient safety with enhanced workflow design for electronic health records—summary of the workshop. http://www.nist.gov/customcf/get_pdf.cfm?pub_id=914226. Published August 7, 2013. Accessed August 20, 2013.