
The anatomy of electronic patient record ethics: a framework to guide design, development, implementation, and use



This manuscript presents a framework to guide the identification and assessment of ethical opportunities and challenges associated with electronic patient records (EPR). The framework is intended to support designers, software engineers, health service managers, and end-users to realise a responsible, robust and reliable EPR-enabled healthcare system that delivers safe, quality assured, value conscious care.


Development of the EPR applied ethics framework was preceded by a scoping review which mapped the literature related to the ethics of EPR technology. The underlying assumption behind the framework presented in this manuscript is that ethical values can inform all stages of the EPR-lifecycle from design, through development, implementation, and practical application.


The framework is divided into two parts: context and core functions. The first part, ‘context’, entails clarifying: the purpose(s) for which the EPR exists or will exist; the interested parties and their relationships; and the frame of reference comprising regulations, codes of professional conduct and organisational policy. Understanding the context is required before addressing the second part of the framework, which focuses on the EPR ‘core functions’ of data collection, data access, and digitally-enabled healthcare.


The primary objective of the EPR Applied Ethics Framework is to help identify and create value and benefits rather than to merely prevent risks. It should therefore be used to steer an EPR project to success rather than be seen as a set of inhibitory rules. The framework is adaptable to a wide range of EPR categories and can cater for new and evolving EPR-enabled healthcare priorities. It is therefore an iterative tool that should be revisited as new EPR-related states of affairs, capabilities or activities emerge.



The increasing digitalisation of healthcare raises a range of ethical opportunities and challenges [1]. Digital healthcare can simultaneously advance and pose risks to ethical values such as patient autonomy, privacy and confidentiality and individual well-being. Understanding how it can promote or be at odds with ethical values is fundamental to accomplishing responsible digital healthcare. Applied Ethics is a practical approach to identifying and examining ethical concerns related to real world actions and practices [2]. This manuscript presents a framework to guide the discovery and assessment of ethical concerns associated with electronic patient records (EPR), which are a key component of healthcare digitalisation.

An EPR is a digital repository used to collect, store and display information regarding an individual’s medical history. EPRs support clinical care and health service administration and can often be used for secondary purposes such as research or billing in countries with market-based medical reimbursement systems. Compared to the traditional paper-based healthcare record, an EPR presents increased technical capability. Personal health information can be duplicated, shared, and queried with unprecedented speed and scale and therefore used in novel ways to benefit patient care, patient empowerment, efficiency of healthcare processes, and healthcare personnel (HCP) work satisfaction [3,4,5]. Clinical information is rendered more accessible when and where needed, allowing better integration of healthcare services as the same healthcare record can be available to authorised HCPs at any location. Efficient interrogation of large volumes of individual or population data made possible with EPRs can support health service monitoring, evaluation, planning, public health and research [6,7,8]. These increased technological capabilities affect a broad range of direct and indirect stakeholders including patients, their families and carers, HCPs, and third parties such as researchers or policy makers.

With the increased capabilities arising from EPRs come new or alternative risks and potential for harm. Examples of negative impacts include clinician burnout related to EPR usage [9]; software defects resulting in incorrect drug prescriptions or instructions [10]; threats to privacy associated with poor database security [11]; copying and pasting other clinicians’ findings without clarifying their provenance [12]; template designs that lead to inaccurate information [13, 14]; promoted prescriptions that deviate from accepted medical standards [15]; and discriminatory algorithms stemming from biased EPR datasets [16]. In short, where EPR systems mediate healthcare delivery, regard for ethical values during all stages of the technology’s life-cycle, from design, development and implementation through usage, is paramount.

The framework described in this paper is designed to aid the identification of ethical challenges and opportunities associated with EPR technology. It is intended to support designers, software engineers, health service managers, and end-users to realise a responsible, robust and reliable EPR-enabled healthcare system that delivers safe, quality assured, value conscious care. The framework can be understood as an ‘ethical tool’ that guides “debates and deliberative structures for a systematic engagement with ethical issues” related to EPR technology in practice [17]. Development of the framework is based on a consolidation of EPR-related ethical challenges and opportunities debated in the literature that can inform decisions at every stage (design, development, implementation and use) of the EPR life-cycle [18]. While the framework offers an ethical tool, it should not be regarded as ending all philosophical deliberation on the implications of EPRs. For example, the varied perspectives of different users of the tool may lead to mixed interpretations of ethical issues. In this regard, multiple ethical tools can be used in parallel, as ethical tools can complement each other and form a “toolbox” [17].


Development of the EPR applied ethics framework was preceded by a scoping review which mapped the literature related to the ethics of EPR technology [18]. That review identified a range of ethical values which were clustered into: privacy, autonomy, beneficence, human relationships and responsibility. In addition, it attributed responsibilities and duties to stakeholders, including patients’ obligation to provide accurate information and clinicians’ personal behaviours regarding correct documentation of patient records. To develop the framework, the same body of literature was re-examined to determine ethical challenges and opportunities debated in the literature in relation to characteristics across the EPR lifecycle from design, through development, implementation and subsequent use of the technology. In a series of repeat sessions, a multidisciplinary team consisting of an ethicist (TJ), a senior hospital-based physician (CD), and an eHealth expert (MF) came together to discuss and reach consensus on interpretation of the literature. Discussions were guided by the assumption that ethical values shape the key components of the design, development, implementation, and use of EPRs.

The concept that decisions made at every stage of a system’s life-cycle can have ethically relevant implications is widely accepted within technology ethics [19,20,21]. Ethical opportunities or challenges occur when the technology supports or conflicts with ethical values. With EPR technology, these can be addressed through concrete and specific instructions, including EPR design requirements, codes of conduct regarding safe use of the system, and standard operating procedures. To illustrate, patient autonomy is an ethical value worth protecting. A related opportunity is to allow patients to obtain a greater understanding of the personal health information held about them in an EPR. Specific instructions detail how this opportunity can be operationalised safely and in a manner that is understandable to the patient. The elements of the applied ethics framework described below emerged from considering how EPR technology either upholds or is incompatible with ethical values.


The resultant applied ethics framework aims to support identification and management of EPR-related ethical challenges and opportunities. It has two main sections, Context and Core functions, each of which has been divided into three categories which in turn have a number of attributes (Fig. 1). To apply the framework, an EPR of interest is assessed against each of its elements (sections, categories, and attributes) in order to identify any ethical considerations, determine the associated benefits and/or risks, and put measures in place to appropriately address these issues. The framework should be applied at all phases of the EPR lifecycle to ensure robust requirements engineering and design, solid software development, reliable implementation, and safe use and evolution of the system.

Fig. 1

Electronic patient record (EPR) applied ethics framework

While the framework user should start by examining the categories and attributes associated with the EPR context and then explore the elements related to its core functions, it must be noted that this is not a strict linear process as there is a strong interrelatedness between elements of the framework. For example, the “Format and Content” of “Data Collection” within the “Core Functions” section will have implications for the effectiveness of its “Secondary Uses” in the “Context” (Purpose(s) of the technology) section (e.g. analysis of unstructured data is more challenging than that of structured data). Use of the framework is therefore an iterative process. The outcome of each ethical assessment element can feed back to a previous element or feed forward to a subsequent element. Furthermore, the framework can be considered as a continuous quality improvement tool with, for example, repeated “Quality control” (Digitally enabled healthcare in Core Functions) or regular assessment of “Security” (Data access in Core Functions) rather than assessment at a single point in time.

In the following, each element of the framework and its relevance is explained.


The first step in applying the framework is to clarify the context within which the EPR in question exists or will exist (Fig. 1). Understanding the context provides insight into the associated ethical challenges and opportunities. The process of establishing context should begin at concept stage (when the idea of introducing an EPR is being considered) and be reviewed as the EPR progresses along its life-cycle from requirements engineering and design specification through development and on to implementation and practical application. This will help ensure that the context remains constant or that any necessary changes to it are approved and appropriately addressed. Context comprises the purpose(s) for which the technology is used (e.g. clinical care, administration, secondary use), the interested parties involved in its design, development, implementation and use (e.g. IT developers, researchers, patients, health service administrators and managers and so forth), and the frame of reference (e.g. existing regulations, codes of conduct, and policies and procedures).

Purpose(s) of the technology

As it impacts almost all other elements of the framework, it is essential to begin by identifying the purpose(s) of the technology. For example, the content and format of data collected and stored in the EPR will be a function of its purpose. Ill-defined EPR purpose invites uncontrolled function and scope creep, creating complexity and ethical ambiguity [22, 23]. Without clarity of purpose(s), the likelihood of EPR user and use error increases, which can result in perceived failure of the technology. Opportunity costs may also arise, as money spent on suboptimal EPR projects cannot be spent on other goods or services that can benefit patient care. Especially with public funding, ethics mandates a responsible use of finite resources [24]. Finally, clear determination of EPR purpose(s) informs the next step in the framework application, namely ascertainment of the interested parties (“Interested parties and their relationships” section).

In the framework, EPR purposes are sorted into clinical care, administration and secondary use (Fig. 1). The ethical considerations and their level of importance will vary according to which of these EPR purposes are in play. An EPR user may use the system for different purposes simultaneously. Equally, different users can share an EPR purpose (e.g. a doctor and a nurse will both have a clinical care purpose) or have different purposes (e.g. health service manager may have an administrative support purpose while a researcher may be interested in the secondary use of data from the EPR).

In general, the use of EPRs to support clinical care is uncontroversial. By facilitating timely access to, and sharing of, information EPRs can become enablers of improved quality, safety and efficiency of healthcare [25]. Nevertheless, clarity regarding the scope of the clinical care purpose is essential to understanding who will be impacted by the EPR system and how it should be used in practice.

EPRs can also support healthcare management and administration functions such as billing, service performance reporting, patient administration systems (PAS), computerized physician order entry (CPOE) and so forth. While such utilities are fundamental to health service delivery, their integration into the EPR system should not negatively impact patient care. For example, administrative or managerial informational needs should not take precedence over the clinician’s primary objective of safe patient care nor overburden clinicians with additional workload [26,27,28].

Secondary use of data from EPRs for purposes such as research, marketing, insurance, data brokering, and education imposes additional ethical concerns and requires greater justification. Such use can sometimes be justified by balancing the projected societal benefits against potential harms. While research based on EPR data may not directly benefit the data subjects, with minimal risk to them the research may lead to healthcare improvements for others [29]. Similarly, if the use of personal data for marketing or other commercial purposes fails to yield sufficient personal or societal benefits, secondary use may be considered undesirable [15, 30] (see also “Data collection” and “Data access” sections).

Interested parties and their relationships

In order to appropriately attribute rights, duties and responsibilities across its life-cycle, those individuals or groups who can either be affected by or affect the safe and ethical design, development, implementation, and use of the EPR must be identified. Furthermore, how these interested parties or stakeholders relate to each other and exert influence on the design, development, implementation, and use should also be considered. The framework (Fig. 1) suggests that EPR stakeholders range from individuals to groups and organisations and include patients [31], their families and/or informal care-partners [32,33,34,35,36], clinical personnel [37], health service managers [38], EPR suppliers [39], society [29] and government bodies [40, 41].

To guide identification of interested parties, a number of questions should be asked. Examples include: whose personal data will be captured and stored in the EPR? Who can benefit from, or be harmed by the system? Who has or requires access to the EPR or (part of) its data? Who provides and maintains the EPR system? Who influences requirements engineering, design specification, implementation, and operational decisions about the uses and functionalities of the system or its data?

While the patient is the apparent principal EPR data subject, personal data interests of others can also be affected and therefore require careful consideration. In recording and storing personal health data about a specific patient, EPRs may also capture information about others in close proximity to the patient (e.g. genomic data, family history). Similarly, through audit trails, EPRs capture data about healthcare providers who use the system and can therefore reveal information about the HCP’s productivity which may be used to evaluate their performance.

Realising EPR benefits, such as the facilitation of information exchanges between different healthcare providers both within and between healthcare facilities, requires the ability to navigate the complexities of diverse informational needs and varied roles of different HCPs in delivering healthcare services [37]. Detailed understanding of these roles, their inter-relationships, and when and where they are executed is key to informing the design, development and implementation of the EPR and consequently achieving an ethically robust EPR-enabled health service delivery.

In its duty of care to patients and accountability to the funder, health service management has an essential stake in the EPR domain. Policies and standard operating procedures regarding, for example, rules for using the EPR and technical security measures to prevent data breaches must be operationalised by personnel responsible for day-to-day management of the system [42, 43].

The role of the EPR provider/vendor is critical. Confidence in the supplied product and its on-going maintenance requires a collaborative relationship between the healthcare organisation and provider. Without this, the EPR provider/vendor may exert undue influence on the shape and use of the technology. For example, vendor lock-in may ensue where switching costs are prohibitive, the transfer of data is too difficult, or when there are no available alternatives to the particular EPR system [1]. Decisions about proprietary rights, including ownership of data stored in the EPR [44, 45], are therefore important, as well as agreements around system support. A vendor who stops support for an application can leave the client with an unsafe system.

As noted previously, a variety of third parties may have an interest in the data held in an EPR, e.g. researchers, auditors, and marketers. As their interests will present particular ethical challenges and opportunities, it is important to consider these stakeholders early in the establishment of the EPR context so that they can be appropriately addressed across all stages of the EPR life-cycle. For example, commercially biased EPR-based clinical decision support tools, and perverse incentives to employ them, may influence clinicians’ prescribing behaviours in ways that may be detrimental to safe patient care [15].

Government bodies play a role in creation of a trustworthy environment for the use of personal health data and have a stake in how EPRs can enable improved quality and efficiency of healthcare [46]. They also have a responsibility to ensure prudent use of tax-payers money to fund EPR procurement as well as providing the necessary legislative basis for technology-enabled healthcare. Similarly, wider society has a stake in the ethical impact of EPRs. For example, the large volumes of data contained within EPRs can enable vital public health research [29]. Citizens have an interest in and expectation that such use of their health data is safe and ethical.

Frame of reference

While the ‘purpose’ and ‘interested parties’ elements address “what” and “who” contextual considerations, the frame of reference deals with “how” EPR-enabled healthcare is embedded within existing regulations, professional codes of conduct and organisational policies.

Regulations can support the establishment of a trustworthy environment for patients and healthcare organisations to capture and share personal health data. For example, principles of privacy by design, data minimisation, transparency and so forth will guide the EPR requirements engineering and design specification, software development, and standard operating procedures for EPR use that are required to safeguard the rights and freedoms of data subjects [47]. However, regulation regarding the management and use of healthcare data is not standardised across all jurisdictions, with some offering less protection of data subjects (patients) than others [48,49,50]. In some instances, legal requirements can actually jeopardise efforts to protect confidentiality, as seen in China where medical information used to combat COVID-19 is now being used by local governments for different purposes [51,52,53]. Codes of professional conduct, however, oblige healthcare professionals to maintain reliable records of engagement with their patients and to do so in a manner that respects confidentiality and guards against any unauthorised or accidental disclosures of patient information [51, 54, 55].

Healthcare organisations, both at the level of the wider system and of discrete healthcare facilities, adopt policies and procedures to ensure legal and regulatory compliance and to guide best practice for their day-to-day operations [56, 57]. In establishing EPR context, relevant policies and procedures, such as inter alia recommendations for healthcare records management, should be considered. Where such policies and procedures were formulated to guide the use of traditional paper-based medical records, they may need updating to appropriately address the extended capabilities offered by EPR systems [40, 41].

Core functions

Despite their many permutations, EPR systems share three core functions (Fig. 1). Firstly, they facilitate Data Collection (capture and storage) primarily about patients but also about family members and healthcare providers (see “Interested parties and their relationships” section): even more so than paper records, the design of an EPR determines which information can be captured and stored in a medical record. Secondly, they provide Data Access to a range of interested parties. EPRs facilitate access to health information for HCPs, administrative personnel and other parties. Finally, EPRs make possible digitally enabled healthcare. These core functions present a range of ethical challenges and opportunities that necessitate careful consideration of the interaction between an ensemble of people, processes and the technology.

Data collection

Key attributes of data collection have ethical implications and relate to: justification for its capture and storage; its content and format; and matters associated with populating the EPR (Fig. 1). In turn each of these attributes will be influenced by the EPR context and will determine who can or should be approved to use the EPR.

To justify the capture and storage of personal data in an EPR, clinical benefits for patients should be the guiding aim. Clinical benefits must significantly outweigh the risks to patient care both in probability and magnitude [24]. In terms of clinical or administrative purposes, so long as the appropriate safeguards are in place, capture and storage of data in an EPR may be considered justifiable in so far as they enable health service providers to fulfil their contractual responsibility to service consumers. Where substantial EPR-related benefits and minimal harms exist, unless it is mandatory, waiving patient consent to capture and store their information may be pragmatic as consenting procedures can be resource intensive, the HCP-patient power balance may interfere with the process, and, in emergency situations, patients may (temporarily) lack capacity to provide consent [58,59,60]. However, where the inclusion of particular data types in the EPR poses a potential for harm [59,60,61,62], consent may be necessary to ensure the benefit-harm ratio is acceptable to the patient or their carer. An example can be seen in dermatological photography of the genital area or the entire body, where increased privacy risks may arise [61].

Meanwhile, with collection of data in the EPR for secondary purposes such as research, the benefit-harm ratio often becomes more speculative, so that respect for patient autonomy becomes more important [62, 63]. In these cases, the patient’s informed and voluntary decision about becoming the subject of an EPR system is indicated. Whatever the circumstances (clinical or secondary), the collection of their personal health data should be transparent to data subjects so that they can exercise their rights in relation to its use if and when needed.

Healthcare involves a wide range of data types, including alpha-numeric data, images, bioelectric signals and so forth [64]. The purpose of the EPR will determine its data content in terms of the type(s) of data and the number of data fields or data tables captured and stored in the system. Data may be formatted as structured fields or unstructured free-text. Design of data collection and display interfaces must ensure that an EPR has no harmful effects on the quality and efficiency of patient care, nor on the administrative burden or work satisfaction of the end-user of the EPR [26,27,28]. As a combination of both standardised data and patient nuances is essential for safe clinical care, EPR functionality should facilitate an appropriate balance between structured data and free-text [31, 65,66,67,68]. Meanwhile, structured data may be preferred by administrators and managers as it is more amenable to analysis and can consequently inform health service performance monitoring and evaluation. However, the format and content of data desired by administrators should not override the informational needs of healthcare teams or patients [26, 27, 31].
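The distinction between structured fields and unstructured free-text can be sketched in code. The record below is a hypothetical illustration; the field names and the diagnosis-code scheme are assumptions, not drawn from any particular EPR product:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Encounter:
    """One clinical encounter mixing structured and free-text data (illustrative)."""
    patient_id: str                       # structured: supports matching and querying
    encounter_date: date                  # structured: supports time-based analysis
    diagnosis_code: Optional[str] = None  # structured: e.g. an ICD-style code (assumed scheme)
    clinical_narrative: str = ""          # unstructured free-text: captures patient nuance

encounters = [
    Encounter("p1", date(2024, 1, 5), "J45", "Asthma review; inhaler technique discussed."),
    Encounter("p1", date(2024, 2, 9), None, "Follow-up call, no new symptoms reported."),
]

# Structured fields are straightforward to aggregate for service monitoring;
# the free-text narrative would need additional processing to analyse.
coded = sum(1 for e in encounters if e.diagnosis_code is not None)
print(f"{coded} of {len(encounters)} encounters carry a structured diagnosis code")
```

The second encounter illustrates the analytical gap the text describes: its clinical meaning lives only in free-text, so it is invisible to a simple structured query.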

Populating the EPR relates to how and where data is entered into the EPR and by whom. This is an important step, as the quality of record keeping can affect the quality of care and the well-being of patients. The process must lead to high-quality EPR data, as mistakes or omissions can lead to medical errors [69,70,71,72,73,74,75,76] that negatively affect the well-being of patients. Clinician personal behaviours, such as honesty, accuracy, and conscientiousness in completing patient records and entering data into the EPR, are therefore fundamental to the quality and safety of care. As populating the EPR can often be seen as time-consuming, shortcuts or workarounds which can increase risk are sometimes adopted to reduce the burden. For example, copy-and-paste functionality used to accelerate data entry may result in incorrect data and an abundance of redundant material in an individual’s EPR [28, 77,78,79], thereby negatively affecting the utility of the medical record and possibly harming care. In some settings, professional scribes have been hired to assist with populating the EPR. However, this practice gives rise to confidentiality and privacy concerns, as it introduces a third party [80]. Additionally, capturing data for subsequent secondary usage can have other unintended negative consequences when it results in pressure on HCPs to populate EPR data fields that are not directly related to the patient’s clinical needs. Likewise, use of EPRs to support administrative purposes or health service financial management may lead to upcoding if the HCP is influenced to overstate diagnoses for monetary gain or commercial reasons [15, 81, 82]. To promote best practice, only authorised and trained users should enter data into the EPR and perverse incentives should be avoided.

Data access

The ease and promptness of data sharing and exchange made possible by EPRs enables access to and distribution of health data for a variety of clinical, administrative and secondary uses. For example, it can facilitate delivery of integrated health services and improved continuity of patient care by providing HCPs with timely access to the accurate information required to deliver healthcare services [67, 83, 84]. However, in the same way that capturing and recording data in the EPR must be justified, subsequent access to it must also be validated and limited to those who have ethically sound grounds for it. Such grounds will depend on EPR context (purpose, stakeholders and frame of reference) and must be based on balancing the autonomy of the data subject (patient) against the benefit-risk ratio associated with sharing and exchanging their data.

In terms of clinical care, decisions regarding access to data held in an EPR may be guided by asking questions such as: which healthcare professionals/providers are part of the patient’s circle of clinical care? What patient data access do they require? What are the potential harms resulting from the distribution of this information and how can these be mitigated? When determining rules regarding access to EPR data for clinical purposes, a tension must be addressed between the data needs of the HCP to fulfil their healthcare responsibilities and the patient’s ability to decide whether sharing their record (or parts of their record) is in their best interests [31, 74, 85,86,87,88,89,90,91,92,93,94,95,96,97]. However, as previously noted, the process of capturing patient consent can be challenging. In addition, placing the onus on patients to make decisions may diminish the value or completeness of the EPR data if they refuse to give access to certain HCPs or to include or share specific pieces of information. For those adopting EPR technology, this tension may be reduced by ensuring transparency regarding the access to and distribution of patient data, thus allowing the patient some form of control.

Unless there is a reasonable expectation that access will lead to serious harm to their physical or mental health, patients should be able to obtain access to their personal information. Patient portals are a digital solution for such access that can facilitate a degree of patient control over the content and sharing of their personal healthcare information. However, some ethical challenges must be taken into account when giving patients access to their own record. Examples include: ensuring the patient receives information in an understandable format [70, 85, 98, 99] and is supported in interpreting EPR content such as results of clinical investigations [94]; and avoiding HCPs purposely not documenting information in the EPR for fear of evoking a negative reaction from the patient [28, 99]. Another example relates to carers having access to the EPR on the patient’s behalf. For example, when parents have access to their child’s record there may be privacy implications [32,33,34,35,36]. Similarly, confidentiality may be impacted when an estranged parent reads information about themselves in their child’s EPR [35].

Authorising access to data in the EPR for secondary purposes, such as public health or the advancement of scientific knowledge [29, 36, 41, 71, 73, 75, 100,101,102,103,104,105,106,107], requires specific rules. With minimal risks to the patients and significant societal benefits, such efforts may take place without requesting consent, for example if records-based research is considered to pose minimal risk and/or where consent is impractical to obtain [29]. However, careful scrutiny of benefits and harms associated with secondary EPR data use is essential. For example, a vendor may provide an EPR free of charge with an understanding that they may capitalise on access to patient data [108]. As a result, a healthcare organisation may experience an unacceptable loss of control over the functioning of the EPR and the full realisation of its benefits [87, 109].

Risks associated with the EPR relate to unauthorised access to the data stored in the system, whether through intrusion by hacking or login misuse, or through data sharing without appropriate agreements. Protecting personal information from such unauthorised access requires a set of policies, procedures, staff training and technical infrastructure [110, 111]. Codes of conduct oblige healthcare professionals to behave in a manner that does not facilitate data breaches. For example, users must not share passwords to the EPR system nor leave the EPR screen open [42, 43, 112, 113]. HCP training in the safe and secure use of the EPR system is imperative [110, 111].

Safety and security in terms of user authentication, data access, data storage and backup, and acceptable usage should be incorporated into the design of the EPR [36, 73, 114,115,116,117,118,119,120]. Features that strengthen security include audit trails and role-based access controls (RBAC). The former are a chronological record of who has accessed the EPR, what they have accessed and any updates they have made to the record. With RBAC, EPR users have certain permissions related to their function in the healthcare team. For example, clinical or administrative personnel will have access limited to the records of patients for whom they have responsibility, or to those elements of an individual patient’s record that are relevant to their role. Where information might carry stigma, such as in the case of mental illness, substance abuse, and sexual health [117, 121, 122], the importance of defining appropriate access restrictions to parts of a patient’s record increases.
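For illustration, the interplay between RBAC and an audit trail can be sketched in a few lines of code. This is a deliberately simplified, hypothetical model: the roles, record sections and log fields shown here are assumptions, and production EPR systems implement these controls within the platform and its database rather than in application scripts.

```python
from datetime import datetime, timezone

# Hypothetical role-to-permission mapping: each role may view
# only certain sections of a patient's record.
ROLE_PERMISSIONS = {
    "treating_clinician": {"demographics", "notes", "results", "medications"},
    "ward_clerk": {"demographics", "appointments"},
    "pharmacist": {"demographics", "medications"},
}

audit_trail = []  # chronological record of every access attempt

def check_access(user, role, patient_id, section):
    """Return True if the role permits viewing this record section,
    and append the attempt to the audit trail either way."""
    allowed = section in ROLE_PERMISSIONS.get(role, set())
    audit_trail.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "patient": patient_id,
        "section": section,
        "granted": allowed,
    })
    return allowed

print(check_access("jdoe", "ward_clerk", "P-1001", "appointments"))  # True
print(check_access("jdoe", "ward_clerk", "P-1001", "notes"))         # False
```

The point of the sketch is that every access attempt, whether granted or refused, leaves a chronological trace that can later be audited.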

When the EPR is used for secondary purposes, such as teaching medical students or data analytics research, security can be promoted through de-identification of the dataset of interest (information that directly or indirectly identifies patients is altered or removed). However, even with de-identification, achieving a sufficient level of anonymisation can be challenging: for example, linkage of two or more datasets can lead to re-identification of data subjects [32, 84, 90, 94, 95].
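The linkage risk noted above can be made concrete with a small, entirely hypothetical example: even after names and record numbers are removed, remaining quasi-identifiers can be joined against another dataset.

```python
# Hypothetical illustration of re-identification by linkage: a
# "de-identified" research extract still carries quasi-identifiers
# (postcode, birth year) that can be joined against another dataset.

research_extract = [  # names and record numbers removed
    {"postcode": "D04", "birth_year": 1952, "diagnosis": "asthma"},
    {"postcode": "T12", "birth_year": 1989, "diagnosis": "diabetes"},
]

public_register = [  # e.g. an electoral or membership roll
    {"name": "A. Murphy", "postcode": "D04", "birth_year": 1952},
    {"name": "B. Walsh", "postcode": "T12", "birth_year": 1989},
]

def link(extract, register):
    """Join the two datasets on shared quasi-identifiers."""
    matches = []
    for row in extract:
        for person in register:
            if (row["postcode"], row["birth_year"]) == (
                person["postcode"], person["birth_year"]
            ):
                matches.append((person["name"], row["diagnosis"]))
    return matches

print(link(research_extract, public_register))
# Where a (postcode, birth_year) pair is unique, the diagnosis is
# re-attached to a named individual despite "de-identification".
```

Where a combination of quasi-identifiers is unique in both datasets, de-identification alone does not guarantee anonymity, which is why additional techniques such as generalisation of quasi-identifiers are often considered.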

Interoperability is the ability of different EPR systems to share patient information within and across organisational boundaries. It enables improved integration of healthcare services and continuity of patient care [123]. Achieving interoperability is not simple as it requires processes and data to be harmonised across different healthcare services and facilities [61, 124,125,126]. For example, individual unique identifiers are required so that records about one patient held in different EPRs can be safely and accurately matched [32]. Poor interoperability poses ethical concerns as inadequate data co-ordination may lead to error and impact patient safety or impair the full realisation of EPR benefits [71, 72].
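The role of a unique identifier in safe record matching can be sketched as follows. The records and identifier scheme are hypothetical; real deployments rely on national health identifiers and, where these are absent, probabilistic matching.

```python
# Two hypothetical EPR systems hold records for the same patients.
# Matching on a shared unique identifier is unambiguous, whereas
# matching on name alone risks merging records of different people.

hospital_epr = [
    {"uid": "ID-001", "name": "Mary Byrne", "allergies": ["penicillin"]},
    {"uid": "ID-002", "name": "Mary Byrne", "allergies": []},
]
gp_epr = [
    {"uid": "ID-001", "name": "Mary Byrne", "medications": ["salbutamol"]},
]

def merge_by_uid(a, b):
    """Combine records that share the same unique identifier."""
    index = {rec["uid"]: dict(rec) for rec in a}
    for rec in b:
        index.setdefault(rec["uid"], {}).update(rec)
    return index

merged = merge_by_uid(hospital_epr, gp_epr)
assert merged["ID-001"]["medications"] == ["salbutamol"]
assert "medications" not in merged["ID-002"]  # no false merge on name
```

Matching on the identifier keeps the two patients named "Mary Byrne" separate, whereas matching on name alone could merge their records in error.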

Digitally-enabled healthcare

The manner in which EPR technology is used in practice requires careful consideration to ensure optimal digital healthcare business processes, acceptable usage of the EPR, and continuous quality control of the system, and to understand the consequences of EPR-related automation (Fig. 1).

EPR implementation is not simply about the technical artefact. The interaction between the people who will use the EPR, the processes it is expected to support, and the technology must be analysed so that improvement opportunities are identified and acted upon. In clinical care, where an EPR is a poor fit for clinical practices, the result may be decreased efficiency, lower quality and safety of patient care, reduced HCP job satisfaction, and diminished integration across healthcare organisations [70, 114, 118, 124, 127,128,129]. Similarly, because the EPR is a key enabler of remote or virtual care, requirements engineering must ensure that the technology is carefully matched to the associated workflows and patients’ needs [130], as well as to mechanisms that establish and maintain the patient-clinician relationship [28, 131].

During a clinical encounter, EPRs can lead to the HCP being preoccupied with the computer screen rather than interacting with the patient [28, 76]. This challenge is indicative of the importance of ergonomics and the design of the physical healthcare spaces where the EPR will be used [132]. For example, the layout of clinic rooms should facilitate ease of use of the EPR in a manner that promotes inclusivity, e.g. positioning the computer monitor so that it can be viewed by the patient as well as the clinician.

An organisation adopting an EPR system must put in place a set of rules, or acceptable usage policy (AUP), that guides how the system should be used. The AUP should advise all EPR users of their responsibility to protect patient confidentiality and to record accurate and complete information, their obligation to comply with the policy, the right of the organisation to monitor compliance [44, 103, 116, 133,134,135,136], and transparency requirements related to utilising patient data for secondary purposes [137]. AUPs should be formulated to help educate and support all EPR users, including, for example, researchers and students [116, 138, 139]. Likewise, to ensure the AUP works in practice and does not disrupt workflows [140], representative EPR users with relevant expertise should be involved in devising it [134, 141, 142]. As previously noted, codes of professional conduct and employment contracts also inform acceptable usage of EPRs in practice and, where necessary, these should be amended to address the new and emerging capabilities made possible with EPR technology [54].

To ensure that the expected benefits are realised and continuously improved, and that no unintended consequences arise, a process of EPR quality control should be implemented [143, 144]. Regular evaluation of benefits and harms can be used to monitor how well EPR-enabled activities are working, to identify opportunities for improvement and, where necessary, to realign priorities. For instance, if administrative uses of the EPR negatively impact patient care, processes or priorities can be adjusted [26, 145]. Continuous monitoring can also highlight any unfair distribution of EPR-related benefits across patient populations. Such inequities might arise, for example, due to a digital divide [74], unequal access [146], cultural diversity [36], characteristics or socio-economic status of patients [28, 74, 87], or their clinical condition [147]. Furthermore, as the cost of EPRs should not negatively impact patient services [73, 93, 148], regular review should allow any disproportionate outlay to be identified.

Third parties integrating health data from EPRs into their processes need to be aware of the limitations of the information captured. For example, biases in datasets can affect the outcome of research [72, 73, 75, 149, 150]. Biases can occur when the EPR system has been configured to suit a particular patient population, health concerns that are more prevalent in a particular region, administrative needs or regional medical guidelines [151] or when EPR-based clinical decision support results in drug prescribing behaviours that are influenced by commercial interests rather than clinical needs [15]. In addition, EPRs should not be considered the only relevant source of information for clinical care. Patients, carers or other HCPs may contribute information outside the EPR and their expertise should be considered [28, 131, 152].

The fusion of artificial intelligence (AI), such as machine learning, with EPRs has the potential to automate certain parts of data capture, data distribution and the communication of diagnoses to patients [153]. For example, chatbots that simulate human conversation may allow users to access medical information [154]. Such automation may amplify existing ethical challenges and trigger new ethical questions. In their development, AI tools must be trained to conduct a desired task. Such training is based on large datasets, sourced from EPRs, containing the personal health information of many patients. Sharing such datasets with AI algorithm developers/vendors is not without significant privacy implications. Furthermore, if biases exist in the training dataset then partiality may occur in the subsequent use of the AI tool [151, 155]. Additionally, automation can muddle responsibilities, as clinicians who use AI tools to support clinical decision-making may need to weigh their own judgements against those of an algorithm. If digital avatars [156] are introduced to replace certain HCP tasks, face-to-face patient-clinician encounters are impacted, and critical data-entry errors may not be readily identified. Finally, AI algorithms may lack transparency, so that the factors involved in their performance are not understandable to the people who use, regulate, or are affected by the EPR system. This lack of transparency is a particular concern if algorithms are designed to promote commercial interests rather than to optimise clinical care [15].
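How bias in a training dataset propagates into an AI tool can be illustrated with a deliberately trivial sketch. The data and the "model" here are hypothetical; real clinical models are far more complex, but the mechanism is the same: whatever the training data under-represents, the tool under-serves.

```python
from collections import Counter

# Hypothetical sketch: a trivial "model" that predicts the most common
# outcome seen in its training data. If one patient group is
# under-represented, predictions for that group inherit the bias.

training = [  # (patient_group, outcome) pairs drawn mostly from group A
    ("A", "low_risk"), ("A", "low_risk"), ("A", "low_risk"),
    ("A", "low_risk"), ("B", "high_risk"),
]

def fit_majority(data):
    """Return the single most frequent outcome in the training data."""
    return Counter(outcome for _, outcome in data).most_common(1)[0][0]

print(fit_majority(training))  # "low_risk" - group B's risk profile is drowned out
```

A tool "trained" this way would systematically mis-assess patients from group B, despite performing well on the majority group, which is why the representativeness of EPR-derived training data matters.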

Discussion

Timely and efficient information sharing and exchange made possible through digitalisation promises to link healthcare services to healthcare constituencies (patients and healthcare providers at any location) thereby facilitating connected health and patient-centred care. EPRs are a foundational element of digital healthcare. However, safely and ethically embedding EPRs in the healthcare pathway involves an ensemble of factors that warrant careful consideration. This paper presents an applied ethics framework that may be used to guide decisions across all stages of the EPR life-cycle from requirements engineering and design specification, through development, implementation and on to practical application. The ultimate aim of the framework is to promote EPR-related practices that reap ethical opportunities while also addressing ethical challenges. Its development was based on a prior review of an extensive body of literature debating EPR-related ethical considerations and their determinants [14].

EPRs offer new capabilities that are unachievable with the traditional paper-based medical record. With EPRs, the same patient information can be available to all authorised healthcare providers regardless of their geographical location, multiple users can have simultaneous access, and large volumes of data are readily interrogated and analysed. As such, EPRs are widely acknowledged as central to the aspirations of health service modernisation, which aim to: deal with burgeoning demands being placed on healthcare systems; move away from simply treating illness to promoting health and well-being; and develop new models of integrated care delivered in the most appropriate setting for the patient [3,4,5]. Consequently, responsible design, development, implementation, and use of EPRs necessitates consideration of relevant moral norms to guide the digital transformation of health services. For example, a re-appraisal of the actors involved in the care process is called for when intra- and inter-organisational multidisciplinary patient care is enabled with EPR technology. Similarly, EPR-enabled data analytics may require a greater understanding of third-party interests, such as those of public health researchers or IT developers, as well as their input as appropriate across various stages of the technology life-cycle.

The presented EPR applied ethics framework was developed by considering a broad range of issues of ethical interest that can inform all phases of the EPR life-cycle in order to achieve desirable EPR-based outcomes and minimise or eliminate any negative impacts. In this regard, the framework differs from, but may be complemented by, legal or regulatory obligations relevant to EPR adoption such as the European Union’s (EU) General Data Protection Regulation (GDPR) [47] or the Health Insurance Portability and Accountability Act (HIPAA) [157] in the US. For example, the data protection impact assessment (DPIA), a requirement under the GDPR, supports the concept of privacy by design together with the identification and minimisation of data protection risks. Hence, while the legally mandated DPIA focuses on data privacy and protection, the framework illustrated in this paper consolidates a broader range of ethical issues of interest in the EPR domain. Nevertheless, similar to how an EPR DPIA is documented and regularly reviewed and updated to reflect any data processing changes, the EPR ethics framework should be a living document, updated as necessary to reflect new state-of-affairs, capabilities or activities.

The primary objective of the EPR Applied Ethics Framework is to identify and create value and other benefits rather than to merely prevent risks. It should therefore be used to steer an EPR project to success rather than be seen as a set of inhibitory rules. The framework has some similarities to the risk management employed by organisations to identify and mitigate threats to their function. It can augment a healthcare organisation’s risk management by, for example, preventing investments in sub-optimal EPR technology, and by guiding the identification and control of risks arising from unethical EPR-related behaviour. Therefore, early incorporation of an applied ethics approach can help deliver patient safety and healthcare quality as well as economic benefits [19].

Although it has no legal standing nor associated direct financial penalties, the EPR applied ethics framework can influence various actors in bringing about value and benefits. Besides its normative value and people’s intrinsic desire to behave ethically, the framework has instrumental value. Failing to attend to ethical considerations can be socially costly, resulting in, for example, clinicians being demotivated, healthcare personnel becoming less effective, and a decrease in the quality [9] of, and trust in the integrity of, patient care [15]. There may also be financial implications: funding being directed towards technologies that lack the support needed to make the desired impact [158]; vendor bargaining power that leads to a decrease in service quality and an increase in price; and clinical practices that are more costly than necessary. Designers, software engineers, vendors, end-users and other relevant actors can be guided by the framework to embed these and other ethical considerations into every stage of the EPR life-cycle.

Use of the framework may be motivated by different stakeholders’ desire to act responsibly and to maintain or enhance their reputation. Designers and developers want to ensure that their EPR system is robust, reliable and does not cause harm. Vendors want a product that meets the needs of their customers and develops a positive EPR market identity for their organisation. End-users want an EPR that can facilitate safe and effective healthcare service delivery. Even when there is a perceived or actual gain for one particular stakeholder through their engagement in, for example, “perverse incentives” or “vendor lock-in” practices, other stakeholders can use the framework to assess and address matters. For example, purchasers of an off-the-shelf EPR may require vendors to attest that their technologies are not influenced improperly by commercial interests. Similarly, concerns around EPR functionality or “vendor lock-in” may be mitigated by demanding that interoperability and data portability be designed and developed into the system.

The framework has the flexibility to deal with new and emerging EPR-related conditions. For example, the current COVID-19 pandemic has heightened interest in the role of EPRs in supporting public health and epidemiological research and the delivery of remote/virtual healthcare. Managing a pandemic requires accurate and quick access to relevant health information. In the UK, for example, a unified dataset allowed rapid interrogation of the health information of 17 million people to determine risk factors associated with death from COVID-19 [6]. In many other countries there is neither a comparable single, unfragmented dataset nor the technical infrastructure to query health information in an efficient and responsible way [159]. By showing what is at stake when timely data analytics capabilities are lacking, the pandemic may further guide how priorities for EPR data capture and sharing are established. However, health data analytics aspirations should not devalue ethical values such as patient privacy or patient autonomy. Rather, the potential to use EPR-based data for public health purposes should be considered from the outset, while primarily aiming to realise clinical benefit from the technology together with safeguarding patient autonomy, confidentiality and so forth.

Limitations

Although the framework has been developed by a multidisciplinary team (TJ, CD, MF), its practical application has not yet been tested. However, a study is currently underway to examine the usability and utility of the framework from which guidelines for its operationalisation will emerge. The outcome of this study will be reported in due course. Additionally, as the framework is an ethical tool, it may fail to address legal concerns around EPRs. Nevertheless, it can complement relevant regulatory and legal considerations. Finally, the framework provides a broad and expansive overview of ethical challenges and opportunities associated with EPR technology across its life-cycle. The framework can therefore supplement other more specialised frameworks that discuss discrete challenges and opportunities in more detail thereby contributing to an EPR ethical toolbox.

Conclusions

Responsible and ethical adoption of EPRs into the healthcare pathway involves a complex and interrelated ensemble of people, processes and technology. To support the management of this complexity, a framework based on literature regarding EPR-related ethical issues of interest has been developed. The framework presents a taxonomy of context and core function considerations that can help guide identification of EPR-related ethical challenges and opportunities. It should be applied across all stages of the EPR life-cycle from concept through to practical use in order to ensure the required measures are in place to achieve high-quality, safe and ethical EPR-enabled healthcare delivery.

Availability of data and materials

Not applicable.



Abbreviations

AI: Artificial intelligence

AUP: Acceptable usage policy

DPIA: Data protection impact assessment

EPR: Electronic patient records

GDPR: General Data Protection Regulation

HCP: Healthcare personnel

HIPAA: Health Insurance Portability and Accountability Act

RBAC: Role-based access controls


  1. 1.

    Royakkers L, Timmer J, Kool L, van Est R. Societal and ethical issues of digitization. Ethics Inf Technol. 2018;20(2):127–42.

    Article  Google Scholar 

  2. 2.

    Dittmer J. Applied ethics. The internet encyclopedia of philosophy.

  3. 3.

    Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Family Med. 2014;12(6):573–6.

    Article  Google Scholar 

  4. 4.

    Pannunzio V, Kleinsmann M, Snelders D. Design research, eHealth, and the convergence revolution. arXiv e-prints [Internet]. 2019 September 01, 2019.

  5. 5.

    Sikka R, Morath JM, Leape L. The Quadruple Aim: care, health, cost and meaning in work. BMJ Quality Saf. 2015;24(10):608–10.

    Article  Google Scholar 

  6. 6.

    Williamson E, Walker AJ, Bhaskaran KJ, Bacon S, Bates C, Morton CE, et al. OpenSAFELY: factors associated with COVID-19-related hospital death in the linked electronic health records of 17 million adult NHS patients. medRxiv. 2020:2020.05.06.20092999.

  7. 7.

    Campillo-Artero C. When health technologies do not reach their effectiveness potential: a health service research perspective. Health Policy (Amsterdam, Netherlands). 2012;104(1):92–8.

    Article  Google Scholar 

  8. 8.

    Friedman DJ, Parrish RG, Ross DA. Electronic health records and US public health: current realities and future promise. Am J Public Health. 2013;103(9):1560–7.

    Article  Google Scholar 

  9. 9.

    Gawande A. Why doctors hate their computers The New Yorker. 2018.

  10. 10.

    Schulte F, Fry E. Electronic health records creating a ‘new era’ of health care fraud, officials say fortune: fortune media IP limited; 2019 [updated December 23, 2019.

  11. 11.

    Gillum J, Kao J, Larson J. Millions of Americans’ Medical Images and data are available on the internet. Anyone can take a peek. ProPublica: ProPublica inc.; 2019.

  12. 12.

    Tsou AY, Lehmann CU, Michel J, Solomon R, Possanza L, Gandhi T. Safe Practices for copy and paste in the EHR systematic review, recommendations, and novel model for health IT collaboration. Appl Clin Inf. 2017;8(1):12–34.

    Google Scholar 

  13. 13.

    Bowman S. Impact of electronic health record systems on information integrity: quality and safety implications. Perspectives in health information management. 2013;10(Fall):1c-c.

  14. 14.

    Simborg DW. Promoting electronic health record adoption. Is it the correct focus. JAMIA. 2008;15(2):127–9.

    Google Scholar 

  15. 15.

    Taitsman JK, VanLandingham A, Grimm CA. Commercial influences on electronic health records and adverse effects on clinical decision-making. JAMA Int Med. 2020;180(7):925–6.

    Article  Google Scholar 

  16. 16.

    Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366(6464):447–53.

    Article  Google Scholar 

  17. 17.

    Beekman V, Brom FWA. Ethical tools to support systematic public deliberations about the ethical aspects of agricultural biotechnologies. J Agric Environ Ethics. 2007;20(1):3–12.

    Article  Google Scholar 

  18. 18.

    Jacquemard T, Doherty CP, Fitzsimons MB. Examination and diagnosis of electronic patient records and their associated ethics: a scoping literature review. BMC Med Ethics. 2020 (in press).

  19. 19.

    Van den Hoven J. Value sensitive design and responsible innovation. Responsible innovation: managing the responsible emergence of science and innovation in society; 2013. p. 75–83.

  20. 20.

    Vallor S. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. New York: Oxford University Press; 2016.

    Book  Google Scholar 

  21. 21.

    Reijers W, Wright D, Brey P, Weber K, Rodrigues R, O’Sullivan D, et al. Methods for practising ethics in research and innovation: a literature review, critical analysis and recommendations. Sci Eng Ethics. 2018;24(5):1437–81.

    Article  Google Scholar 

  22. 22.

    Greenhalgh T. How to improve success of technology projects in health and social care. Public Health Res Pract. 2018;28(3):4.

    Article  Google Scholar 

  23. 23.

    Spencer M. Brittleness and bureaucracy: software as a material for science. Perspect Sci. 2015;23(4):466–84.

    Article  Google Scholar 

  24. 24.

    Emanuel EJ, Wendler D, Grady C. What makes clinical research ethical? JAMA. 2000;283(20):2701–11.

    Article  Google Scholar 

  25. 25.

    Manca DP. Do electronic medical records improve quality of care? Yes Can Fam Physician. 2015;61(10):846–51.

    Google Scholar 

  26. 26.

    de Ruiter HP, Liaschenko J, Angus J. Problems with the electronic health record. Nurs Philos. 2016;17(1):49–58.

    Article  Google Scholar 

  27. 27.

    Haig SV. Ethical choice in the medical applications of information theory. Clin Orthop Relat Res. 2010;468(10):2672–7.

    Article  Google Scholar 

  28. 28.

    Sulmasy LS, Lopez AM, Horwitch CA, Amer Coll Ethics P. Ethical Implications of the Electronic Health Record: In the Service of the Patient. J Gen Int Med. 2017;32(8):935–9.

    Article  Google Scholar 

  29. 29.

    Mann SP, Savulescu J, Sahakian BJ. Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue. Philos Trans R Soc A Math Phys Eng Sci. 2016;374(2083).

  30. 30.

    Ipsos MORI, Social Research Institute. The One Way Mirror: Public attitudes to commercial access to health data. The Wellcome Trust; 2016 March 2016.

  31. 31.

    Van der Ploeg I. Positioning the Patient: Normative Analysis of Electronic Patient Records. Methods Inf Med. 2003;42(4):477–81.

    Article  Google Scholar 

  32. 32.

    Sittig DF, Singh H. Legal, ethical, and financial dilemmas in electronic health record adoption and use. Pediatrics. 2011;127(4):e1042–7.

    Article  Google Scholar 

  33. 33.

    Slabbert MN. Parental access to minors’ health records in the South African health care context: Concerns and recommendations. Med Law. 2005;24(4):743–59.

    Google Scholar 

  34. 34.

    Smolyansky BH, Stark LJ, Pendley JS, Robins PM, Price K. Confidentiality and electronic medical records for behavioral health records: the experience of pediatric psychologists at four children’s hospitals. Clin Pract Pediatr Psychol. 2013;1(1):18–27.

    Article  Google Scholar 

  35. 35.

    Nielsen BA. Confidentiality and electronic health records: keeping up with advances in technology and expectations for access. Clin Pract Pediatr Psychol. 2015;3(2):175–8.

    Article  Google Scholar 

  36. 36.

    Fry CL, Spriggs M, Arnold M, Pearce C. Unresolved ethical challenges for the Australian personally controlled electronic health record (pcehr) system: key informant interview findings. AJOB Empir Bioeth. 2014;5(4):30–6.

    Article  Google Scholar 

  37. 37.

    Hardey M, Payne S, Coleman P. ‘Scraps’: hidden nursing information and its influence on the delivery of care. J Adv Nurs. 2000;32(1):208–14.

    Article  Google Scholar 

  38. 38.

    Wallace IM. Is patient confidentiality compromised with the electronic health record? A Position Paper. CIN Comput Inform Nurs. 2015;33(2):58–62.

    Article  Google Scholar 

  39. 39.

    Vincent J. Google is absorbing DeepMind’s health care unit to create an ‘AI assistant for nurses and doctors’: Vox Media; 2018.

  40. 40.

    Caenazzo L, Tozzo P, Borovecki A. Ethical governance in biobanks linked to electronic health records. Eur Re Med Pharmacol Sci. 2015;19(21):4182–6.

    Google Scholar 

  41. 41.

    Francis LP. The physician-patient relationship and a national health information network. J Law Med Ethics. 2010;38(1):36–49.

    Article  Google Scholar 

  42. 42.

    Sade RM. Breaches of health information: are electronic records different from paper records? J Clin Ethics. 2010;21(1):39–41.

    Google Scholar 

  43. 43.

    Kim D, Schleiter K, Crigger BJ, McMahon JW, Benjamin RM, Douglas SP, et al. A physician’s role following a breach of electronic health information. J Clin Ethics. 2010;21(1):30–5.

    Google Scholar 

  44. 44.

    Kluge EHW. Medical narratives and patient analogs: The ethical implications of electronic patient records. Methods Inf Med. 1999;38(4–5):253–9.

    Google Scholar 

  45. 45.

    Kluge EHW. Health information, privacy, confidentiality and ethics. Int J Bio-Med Comput. 1994;35(Suppl.):23–7.

    Google Scholar 

  46. 46.

    Tang N, Eisenberg JM, Meyer GS. The roles of government in improving health care quality and safety. Jt Commun J Qual Saf. 2004;30(1):47–55.

    Google Scholar 

  47. 47.

    Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance).

  48. 48.

    Hunter P. The big health data sale. EMBO Rep. 2016;17(8):3.

    Article  Google Scholar 

  49. 49.

    DesRoches CM, Walker J, Delbanco T. Care partners and patient portals—faulty access, threats to privacy, and ample opportunity. JAMA Int Med. 2020.

  50. 50.

    Evans M. Hospitals give tech giants access to detailed medical records. Wall Street J. 2020.

  51. 51.

    Cowin LS, Riley TK, Heiler J, Gregory LR. The relevance of nurses and midwives code of conduct in Australia. Int Nurs Rev. 2019;66(3):320–8.

    Article  Google Scholar 

  52. 52.

    Allen-Ebrahimian B, Dorfman Z. Chinese coronavirus test maker agreed to build a Xinjiang gene bank Axios: Axios Media; 2020 [updated June 3rd, 2020.

  53. 53.

    Yang Y, Liu N, Wong S-L, Liu Q. China, coronavirus and surveillance: the messy reality of personal data. Financial Times. 2020 April 2nd, 2020.

  54. 54.

    Medical Council. Guide to professional conduct and ethics for registered medical professionals. 2016.

  55. 55.

    Nursing and Midwifery Council. The code: professional standards of practice and behaviour for nurses, midwives and nursing associates. London: The nursing and midwifery regulator for England, Wales, Scotland and Northern Ireland; 2015 October 10, 2018.

  56. 56.

    HSE, National Healthcare Records Management Advisory Group HSE Standards & Recommended Practices for Healthcare Records Management. 2011.

  57. 57.

    Health and Social Care Information Centre. Code of practice on confidential information. Exeter: Health and Social Care Information Centre; 2014 December, 2014. Contract No.: Version 1.0.

  58. 58.

    Beauchamp TL, Childress JF. Principles of biomedical ethics. 7th ed. Oxford: Oxford University Press; 2012.

    Google Scholar 

  59. 59.

    Opinion 3/2019 concerning the Questions and Answers on the interplay between the Clinical Trials Regulation (CTR) and the General Data Protection regulation (GDPR) (art. 70.1.b)), Opinion 3/2019 (2019).

  60. 60.

    Heywood R, Macaskill A, Williams K. Informed consent in hospital practice: health professionals’ perspectives and legal reflections. Med Law Rev. 2010;18(2):152–84.

    Article  Google Scholar 

  61. 61.

    Lakdawala N, Fontanella D, Grant-Kels JM. Ethical considerations in dermatologic photography. Clin Dermatol. 2012;30(5):486–91.

    Article  Google Scholar 

  62. 62.

    Kluge EHW. Informed consent and the security of the electronic health record (EHR): some policy considerations. Int J Med Inform. 2004;73(3):229–34.

    Article  Google Scholar 

  63. 63.

    Fairweather NB, Rogerson S. A moral approach to electronic patient records. Inform Health Soc Care. 2001;26(3):219–34.

    Google Scholar 

  64. 64.

    Fennelly O. Clinical information capture in the electronic health record: literature review and key considerations. Dublin: eHealth Ireland; 2019.

    Google Scholar 

  65. 65.

    Roberts A. Language, structure, and reuse in the electronic health record. AMA J Ethics. 2017;19(3):281–8.

    Article  Google Scholar 

  66. 66.

    Moros DA. The electronic medical record and the loss of narrative. Camb Q Healthc Ethics. 2017;26(2):328–31.

    Article  Google Scholar 

  67. 67.

    Berg M, Langenberg C, vd Berg I, Kwakkernaat J. Considerations for sociotechnical design: experiences with an electronic patient record in a clinical context. Int J Med Inform. 1998;52(1–3):243–51.

    Article  Google Scholar 

  68. 68.

    Franz B, Murphy JW. Electronic medical records and the technological imperative: the retrieval of dialogue in community-based primary care. Perspect Biol Med. 2015;58(4):480–92.

    Article  Google Scholar 

  69. 69.

    Accordino R, Kopple-Perry N, Gligorov N, Krieger S. The medical record as legal document: when can the patient dictate the content? An ethics case from the department of Neurology. Clin Ethics. 2014;9(1):53–6.

    Article  Google Scholar 

  70. 70.

    Spencer A, Low D. The challenge of the information culture for the paediatrician. Arch Dis Child. 2011;96(12):1167–72.

    Article  Google Scholar 

  71. 71.

    Gummadi S, Housri N, Zimmers TA, Koniaris LG. Electronic medical record: a balancing act of patient safety, privacy and health care delivery. Am J Med Sci. 2014;348(3):238–43.

    Article  Google Scholar 

  72. 72.

    Hoffman S, Podgurski A. Big bad data: law, public health, and biomedical databases. J Law Med Ethics. 2013;41(Suppl. 1):56–60.

    Article  Google Scholar 

  73. 73.

    Kopala B, Mitchell ME. Use of digital health records raises ethics concerns. JONA’s Healthc Law Ethics Regul. 2011;13(3):84–9.

    Article  Google Scholar 

  74. 74.

    Layman EJ. Ethical issues and the electronic health record. Health Care Manag. 2008;27(2):165–76.

    Article  Google Scholar 

  75. 75.

    Lee LM. Ethics and subsequent use of electronic health record data. J Biomed Inform. 2017;71:143–6.

  76.

    Phillips W, Fleming D. Ethical concerns in the use of electronic medical records. Mo Med. 2009;106(5):328–33.

  77.

    Bernat JL. Ethical and quality pitfalls in electronic health records. Neurology. 2013;80(11):1057–61.

  78.

    Weis JM, Levy PC. Copy, paste, and cloned notes in electronic health records: prevalence, benefits, risks, and best practice recommendations. Chest. 2014;145(3):632–8.

  79.

    Ben-Assuli O. Electronic health records, adoption, quality of care, legal and privacy issues and their implementation in emergency departments. Health Policy. 2015;119(3):287–97.

  80.

    Wangenheim PM. Scribes, electronic health records, and the expectation of confidentiality. J Clin Ethics. 2018;29(3):240–3.

  81.

    Phenomenal H2020. Challenges to the use of health data—Dipak Kalra, EuroRec Institute and UCL London. 2016.

  82.

    Schulte F, Fry E. Death by 1,000 clicks: where electronic health records went wrong. Kaiser Health News. 2019 Mar 18.

  83.

    Wainer J, Campos CJ, Salinas MD, Sigulem D. Security requirements for a lifelong electronic health record system: an opinion. Open Med Inform J. 2008;2:160–5.

  84.

    Veronesi JF. Ethical issues in computerized medical records. Crit Care Nurs Q. 1999;22(3):75–80.

  85.

    Quantin C, Coatrieux G, Allaert FA, Fassa M, Bourquard K, Boire JY, et al. New advanced technologies to provide decentralised and secure access to medical records: case studies in oncology. Cancer Inform. 2009;7:217–29.

  86.

    Bhuyan SS, Bailey-DeLeeuw S, Wyant DK, Chang CF. Too much or too little? How much control should patients have over EHR data? J Med Syst. 2016;40(7).

  87.

    Wynia MK, Torres GW, Lemieux J. Many physicians are willing to use patients’ electronic personal health records, but doctors differ by location, gender, and practice. Health Affairs (Project Hope). 2011;30(2):266–73.

  88.

    Meslin EM, Schwartz PH. How bioethics principles can aid design of electronic health records to accommodate patient granular control. J Gen Intern Med. 2015;30(1):3–6.

  89.

    Neame RLB. Privacy protection for personal health information and shared care records. Inform Prim Care. 2014;21(2):84–91.

  90.

    McSherry B. Ethical issues in HealthConnect’s shared electronic health record system. J Law Med. 2004;12(1):60–8.

  91.

    Fairweather NB, Rogerson S. A moral approach to electronic patient records. Med Inform Int Med. 2001;26(3):219–34.

  92.

    Garrety K, McLoughlin I, Wilson R, Zelle G, Martin M. National electronic health records and the digital disruption of moral orders. Soc Sci Med. 2014;101:70–7.

  93.

    Spriggs M, Arnold MV, Pearce CM, Fry C. Ethical questions must be considered for electronic health records. J Med Ethics. 2012;38(9):535–9.

  94.

    Nishimura AA, Tarczy-Hornoch P, Shirts BH. Pragmatic and ethical challenges of incorporating the genome into the electronic medical record. Curr Genet Med Rep. 2014;2(4):201–11.

  95.

    Anuradha C, Babu PBR. Securing privacy for confidential databases using anonymization. Middle East J Sci Res. 2012;12(12):1792–5.

  96.

    Williams H, Spencer K, Sanders C, Lund D, Whitley EA, Kaye J, Dixon WG. Dynamic consent: a possible solution to improve patient confidence and trust in how electronic patient records are used in medical research. JMIR Med Inform. 2015;3(1):e3.

  97.

    Kluge EHW. Informed consent to the secondary use of EHRs: informatic rights and their limitations. Stud Health Technol Inform. 2004. p. 635–8.

  98.

    Davis KA, Smith LB. Ethical considerations about EHR-mediated results disclosure and pathology information presented via patient portals. AMA J Ethics. 2016;18(8):826–32.

  99.

    McCarthy MW, Real de Asua D, Gabbay E, Fins JJ. Off the charts: Medical documentation and selective redaction in the age of transparency. Perspect Biol Med. 2018;61(1):118–29.

  100.

    Angst CM. Protect my privacy or support the common-good? Ethical questions about electronic health information exchanges. J Bus Ethics. 2009;90(SUPPL. 2):169–78.

  101.

    Cato KD, Bockting W, Larson E. Did I tell you that? Ethical issues related to using computational methods to discover non-disclosed patient characteristics. J Empir Res Hum Res Ethics. 2016;11(3):214–9.

  102.

    Lowrance WW. Learning from experience: privacy and the secondary use of data in health research. J Biolaw Bus. 2003;6(4):30–60.

  103.

    McLaughlin K, Coderre S. Finding the middle path in tracking former patients in the electronic health record for the purpose of learning. Acad Med. 2015;90(8):1007–9.

  104.

    Brisson GE, Barnard C, Tyler PD, Liebovitz DM, Neely KJ. A framework for tracking former patients in the electronic health record using an educational registry. J Gen Intern Med. 2018;33(4):563–6.

  105.

    Friedman C, Rigby M. Conceptualising and creating a global learning health system. Int J Med Inform. 2013;82(4):e63–71.

  106.

    Goodman KW. Ethics, information technology, and public health: new challenges for the clinician-patient relationship. J Law Med Ethics. 2010;38(1):58–63.

  107.

    Kaplan B. How should health data be used? Privacy, secondary use, and big data sales. Camb Q Healthc Ethics. 2016;25(2):312–29.

  108.

    Kool L, Timmer J, Royakkers L, Van Est R. Urgent upgrade: protect public values in our digitized society. The Hague: Rathenau Instituut; 2017.

  109.

    Wilburn A. Nursing informatics: ethical considerations for adopting electronic records. NASN Sch Nurse (Print). 2018;33(3):150–3.

  110.

    Okada M, Yamamoto K, Watanabe K. Conceptual model of health information ethics as a basis for computer-based instructions for electronic patient record systems. Stud Health Technol Inform. 2007;129(Pt 2):1442–6.

  111.

    Cederberg RA, Valenza JA. Ethics and the electronic health record in dental school clinics. J Dent Educ. 2012;76(5):584–9.

  112.

    Satkoske VB, Parker LS. Practicing preventive ethics, protecting patients: challenges of the electronic health record. J Clin Ethics. 2010;21(1):36–8.

  113.

    Nielsen BA, Baum RA, Soares NS. Navigating ethical issues with electronic health records in developmental-behavioral pediatric practice. J Dev Behav Pediatr. 2013;34(1):45–51.

  114.

    Ozair FF, Jamshed N, Sharma A, Aggarwal P. Ethical issues in electronic health records: a general overview. Perspect Clin Res. 2015;6(2):73–6.

  115.

    Tehrani N. How digital health technology aids physicians. Int J Biomed. 2015;5(2):104.

  116.

    Lo B. Professionalism in the age of computerised medical records. Singapore Med J. 2006;47(12):1018–22.

  117.

    Shenoy A, Appel JM. Safeguarding confidentiality in electronic health records. Camb Q Healthc Ethics. 2017;26(2):337–41.

  118.

    Entzeridou E, Markopoulou E, Mollaki V. Public and physician’s expectations and ethical concerns about electronic health record: Benefits outweigh risks except for information security. Int J Med Inform. 2018;110:98–107.

  119.

    Furano RF, Kushniruk A, Barnett J. Deriving a set of privacy specific heuristics for the assessment of PHRs (personal health records). Stud Health Technol Inform. 2017;234:125–30.

  120.

    Iacovino L, Reed B. Recordkeeping research tools in a multi-disciplinary context for cross-jurisdictional health records systems. Arch Sci. 2008;8(1):37–68.

  121.

    Clemens NA. Privacy, consent, and the electronic mental health record: the person vs. the System. J Psychiatr Pract. 2012;18(1):46–50.

  122.

    Ashton K, Sullivan A. Ethics and confidentiality for psychologists in academic health centers. J Clin Psychol Med Settings. 2018;25(3):240–9.

  123.

    ASTM. ASTM E2369 - 12: Standard Specification for Continuity of Care Record (CCR). ASTM International; 2012.

  124.

    Ow Yong LM, Tan AWL, Loo CLK, Lim ELP. Risk mitigation of shared electronic records system in campus institutions: medical social work practice in Singapore. Soc Work Health Care. 2014;53(9):834–44.

  125.

    Pirnejad H, Bal R, Stoop AP, Berg M. Inter-organisational communication networks in healthcare: centralised versus decentralised approaches. Int J Integr Care. 2007;7:e14.

  126.

    Shoenbill K, Fost N, Tachinardi U, Mendonca EA. Genetic data and electronic health records: a discussion of ethical, logistical and technological considerations. J Am Med Inform Assoc. 2014;21(1):171–80.

  127.

    Balka E, Tolar M. Everyday ethical dilemmas arising with electronic record use in primary care. Stud Health Technol Inform. 2011;169:285–9.

  128.

    Phillips W, Fleming DA. Moral and prudential considerations in adopting electronic medical records. Mo Med. 2010;107(4):234–9.

  129.

    Strain J, Botin L. A phenomenological perspective on clinical communication and interaction: the case of electronic health records. J Inf Commun Ethics Soc. 2007;5(1):20–32.

  130.

    Greenhalgh T. Video consultations: a guide for practice. BJGP Life: Royal College of General Practitioners, IRIHS research group at the University of Oxford; 2020 Mar 8.

  131.

    Stein HF. Interfaces between electronic medical record (EMR/EHR) technology and people in American medicine: insight imagination, and relationships in clinical practice. J Okla State Med Assoc. 2012;105(8):316–9.

  132.

    Luo J. Current technologies for behavioral healthcare clinical practice. 2011. p. 11–26.

  133.

    Samsuri S, Ismail Z, Ahmad R. Adopting a knowledge management concept in securing the privacy of electronic medical record systems. Advances in intelligent systems and computing. 2013. p. 547–58.

  134.

    Wallace IM. Is patient confidentiality compromised with the electronic health record? A position paper. Comput Inform Nurs. 2015;33(2):58–62 (quiz E1).

  135.

    Williams RL, Taylor JF. Four steps to preserving adolescent confidentiality in an electronic health environment. Curr Opin Obstet Gynecol. 2016;28(5):393–8.

  136.

    Kluge EHW. Professional ethics as basis for legal control of health care information. Int J Biomed Comput. 1996;43(1–2):33–7.

  137.

    Spector-Bagdady K, Shuman AG. Reg-ent within the Learning Health System. Otolaryngol Head Neck Surg. 2018;158(3):405–6.

  138.

    McBride S, Tietze M, Robichaux C, Stokes L, Weber E. Identifying and addressing ethical issues with use of electronic health records. Online J Issues Nurs. 2018;23(1).

  139.

    McCarthy S, Meredith J, Bryant L, Hemsley B. Legal and ethical issues surrounding advance care directives in Australia: implications for the advance care planning document in the Australian my health record. J Law Med. 2017;25(1):136–49.

  140.

    Garrety K, McLoughlin I, Wilson R, Zelle G, Martin M. National electronic health records and the digital disruption of moral orders. Soc Sci Med. 2014;101:70–7.

  141.

    Milton CL. Information sharing: transparency, nursing ethics, and practice implications with electronic medical records. Nurs Sci Q. 2009;22(3):214–9.

  142.

    Tussey CM, Marcopulos BA, Bush SS. Evolving roles, innovative practice, and rapid technology growth: remaining ethical in modern clinical neuropsychology. Psychol Injury Law. 2015;8(4):281–8.

  143.

    Liyanage H, Liaw ST, Di Iorio CT, Kuziemsky C, Schreiber R, Terry AL, et al. Building a privacy, ethics, and data access framework for real world computerised medical record system data: a Delphi Study. Contribution of the Primary Health Care Informatics Working Group. Yearb Med Inform. 2016;1:138–45.

  144.

    Barber A. Computers for physicians: Never do harm. Care Manag J. 2012;13(4):194–9.

  145.

    Stahl BC, Doherty NF, Shaw M, Janicke H. Critical theory as an approach to the ethics of information security. Sci Eng Ethics. 2014;20(3):675–99.

  146.

    Whitehouse D, Duquenoy P. eHealth and ethics: theory, teaching, and practice. In: Information and Communication Technologies, Society and Human Beings: Theory and Framework. 2010. p. 454–65.

  147.

    Robertson MD, Kerridge IH. “Through a glass, darkly”: the clinical and ethical implications of Munchausen syndrome. Med J Aust. 2009;191(4):217–9.

  148.

    Klumpp TR. Electronic medical records and quality of cancer care. Curr Oncol Rep. 2013;15(6):588–94.

  149.

    Hollister B, Bonham VL. Should electronic health record-derived social and behavioral data be used in precision medicine research? AMA J Ethics. 2018;20(9):E873–80.

  150.

    Eggleston EM, Weitzman ER. Innovative uses of electronic health records and social media for public health surveillance. Curr Diabetes Rep. 2014;14(3).

  151.

    Ross C, Swetlitz I. IBM pitched its Watson supercomputer as a revolution in cancer care. It’s nowhere close. Stat News. 2017 Sep 5.

  152.

    Casanovas P, Mendelson D, Poblet M. A linked democracy approach for regulating public health data. Health Technol. 2017;7(4):519–37.

  153.

    Dinh-Le C, Chuang R, Chokshi S, Mann D. Wearable health technology and electronic health record integration: scoping review and future directions. JMIR Mhealth Uhealth. 2019;7(9):e12861.

  154.

    Palanica A, Flaschner P, Thommandram A, Li M, Fossat Y. Physicians’ perceptions of chatbots in health care: cross-sectional web-based survey. J Med Internet Res. 2019;21(4):e12887.

  155.

    Ledford H. Millions of black people affected by racial bias in health-care algorithms. Nature. 2019.

  156.

    O’Connor S. Virtual reality and avatars in health care. Clin Nurs Res. 2019;28(5):523–8.

  157.

    The Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104–191, 110 Stat. 1936 (1996).

  158.

    Carter P, Laurie GT, Dixon-Woods M. The social licence for research: why care.data ran into trouble. J Med Ethics. 2015;41(5):404.

  159.

    The Economist. The pandemic has spawned a new way to study medical records. The Economist. 2020 May 14.



Acknowledgements

Not applicable.


Funding

This publication has emanated from research supported in part by a research grant from Science Foundation Ireland (SFI) under Grant Number 16/RC/3948, co-funded under the European Regional Development Fund and by FutureNeuro industry partners. The funding enabled this research project, but the funders had no other role in the research.

Author information




Contributions

All authors made substantial contributions to this research. TJ contributed to the study concept and design, data analyses, and the drafting of the manuscript. MF contributed to the study concept, data analyses, and the drafting and critical revision of the manuscript. CD contributed to the study concept, data analyses, and critical revision of the manuscript. All authors approved the final version.

Corresponding author

Correspondence to Tim Jacquemard.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Jacquemard, T., Doherty, C.P. & Fitzsimons, M.B. The anatomy of electronic patient record ethics: a framework to guide design, development, implementation, and use. BMC Med Ethics 22, 9 (2021).



Keywords

  • Electronic patient records
  • Electronic health records
  • Framework
  • Ethics
  • Electronic medical records
  • eHealth
  • Digitalisation