
Encompassing trust in medical AI from the perspective of medical students: a quantitative comparative study

Abstract

Background

In the years to come, artificial intelligence will become an indispensable tool in medical practice. The digital transformation will undoubtedly affect today’s medical students. This study focuses on trust from the perspective of three groups of medical students: students from Croatia, students from Slovakia, and international students studying in Slovakia.

Methods

A paper-and-pencil survey was conducted using a non-probabilistic convenience sample. In the second half of 2022, 1715 students were surveyed at five medical faculties in Croatia and three in Slovakia.

Results

Specifically, 38.2% of students indicated familiarity with the concept of AI, while 44.8% believed they would use AI in the future. Patient readiness for the implementation of these technologies was mostly assessed as low. More than half of the students (59.1%) believe that the implementation of digital technology (AI) will negatively impact the patient-physician relationship, and 51.3% believe that patients will trust physicians less. Regarding trust in the healthcare system, 40.9% of Croatian students believe that users do not trust the healthcare system, 56.9% of Slovak students agree with this view, while only 17.3% of international students share this opinion. The ability to explain to patients how AI works, if asked, differed statistically significantly between the student groups: international students expressed the lowest agreement, while the Slovak and Croatian students showed higher agreement.

Conclusion

This study provides insight into the attitudes of medical students from Croatia and Slovakia, and of international students, regarding the role of artificial intelligence (AI) in the future healthcare system, with a particular emphasis on the concept of trust. A notable difference was observed between the three groups of students, with international students differing from their Croatian and Slovak colleagues. This study also highlights the importance of integrating AI topics into the medical curriculum, taking into account national social and cultural specificities that could negatively impact AI implementation if not carefully addressed.


Introduction

Technological advancements and artificial intelligence (AI) have transformed healthcare over the past few years. There has been a broad range of applications for AI in medicine, ranging from appointment scheduling and digitising health records to using algorithms to determine drug dosage [1]. The enthusiasm for the application of AI has extended to various medical specialties, such as radiology [2, 3], oncology [4], neurology [5], and nephrology [6]. Changes in the field have also prompted many studies to focus on the attitudes of students and their choice of specialisation. Interesting results that have emerged from this research include a shift in interest toward this specialisation, anticipated changes in daily work, and the consideration of fears and expectations [7–9]. Students represent an interesting group when researching the future of healthcare and perceptions regarding the use of AI. Research has shown that, in most cases, medical students agree with statements indicating that they understand what AI is [10, 11]. However, when asked to define it themselves, the majority are unable to do so [12]. The existing literature recognises the necessity of incorporating education on the use of AI into medical curricula, highlighting that the current education in this area is neither sufficient nor satisfactory [11–14]. Although medical students expect AI to transform and revolutionise healthcare, they note that the current education on this topic is inadequate [15]. In Croatia, most medical faculties include medical informatics as a mandatory course in their curriculum (in the 2nd or 5th year of study), while no course directly focused on AI has been found. However, several elective courses, such as “Robotics in Medicine” and “Digital Technologies in the Healthcare System and E-Health,” introduce students to AI through practical applications.
Although there are no specific subjects on AI in the medical curricula in Slovakia, medical faculties organize lectures and workshops on AI for medical students. At the largest Slovak medical faculty in Bratislava, the topic of AI has been addressed for the last four years in the first-year medical ethics course. The medical students’ readiness for AI, which they should develop during their studies, has received more attention in the form of the Medical Artificial Intelligence Readiness Scale for Medical Students (MAIRS-MS) [16]. While some studies suggest what medical students should know about artificial intelligence in medicine [17], others highlight the need for health AI ethics in medical school education [18]. Students believe that AI will make medicine more exciting in the future and that AI should be a partner rather than a competitor [19]. They also think that receiving education in AI will greatly benefit their careers [20]. While significant progress has been observed in implementing AI across various applications, these are still early stages that require validation and identifying solutions for emerging ethical and social challenges [21]. Students have expressed fear about the reduced interaction with patients due to the integration of AI [14], decreased job opportunities, and the emergence of new ethical and social challenges [10]. They are also concerned that AI will increase patient risks, reduce physicians’ skills, and harm patients [22].

Implementing AI brings about changes that will impact the patient and physician relationship [23]. Adopting AI involves a patient-centred approach that promotes informed choices [24]. The relationship between physicians and patients has been evolving under the influence of social circumstances and technological progress. The information and digital age have provided patients with tools empowering them to take on an active role as co-decision-makers, unlike when a paternalistic model prevailed and only physicians had exclusive access to medical information [25, 26].

Trust is a crucial factor in the current model of the patient-physician relationship. As a complex concept from the perspective of both physicians and patients, trust is the foundation for successful health outcomes and a quality relationship between them [27]. Trust is deeply embedded in the physician-patient relationship, making it a fiduciary relationship. Inserting a new actor will bring disruption and potentially even the creation of new dyadic or triadic trusting relationships between physicians and AI, patients and AI, or even between patients, the physician and AI [35]. Due to technological advancements, trust relationships in healthcare will become even more of an issue, necessitating active reflection and action [28].

One of the most critical ethical values in the design, development, and deployment of medical AI is transparency. It is not merely a recommendation but a necessity, tied to the informed consent of the user (physician) who may or may not be fully aware of the underlying processes in the algorithmic decision-making. Thus, one of the most pressing issues, alongside transparency, is explainability [29]. Explainability and transparency are closely linked with the level of trust and trustworthiness; trust mainly refers to the belief that we can depend on someone or something, hence a gradual increase in reliability may lead to trust [30]. From a phenomenological perspective, trust in medical AI is an affective-cognitive state of the entities involved in these relationships, namely the trustor (the person who trusts) and the trustee (the entity to be trusted) [31]. In this instance, the trustor is a physician, and the trustee would be the medical AI system. As for the current ongoing discussion on whether medical AI can be trusted or only relied on [32,33,34], an interesting research question has emerged, specifically the need to examine whether future physicians perceive that this trust is possible or will be disruptive.

Methods

Research aims

In our study, we aimed to examine medical students’ attitudes towards the role of AI in the future of healthcare, with a particular focus on the concept of trust.

This study aims to explore:

1. How students perceive the phenomenon of trust in the physician-patient relationship.

2. The perception of their own medical expertise in the context of AI use.

3. Students’ estimation of patient preparedness to embrace AI as part of everyday healthcare provision.

Additionally, the study investigated whether trust is a prerequisite for the physician-patient relationship in the context of AI implementation.

Participants and data collection

This study involved medical students from Croatia and Slovakia, two Eastern European countries with many similarities in their history, state development, social circumstances, and healthcare challenges. International students from different societal backgrounds were also included in the study and were treated in the analysis as a third group. The study was conducted between May 2022 and November 2022 at five medical schools in Croatia and three in Slovakia (Table 1), using a non-probabilistic convenience sample. The inclusion criteria were being a medical student at one of the medical schools in Croatia or Slovakia and being physically present at the lectures where the researchers conducted the research. The study included students from all years of study, as was the practice in other studies on this topic [15, 20, 33, 36, 37, 39]. The survey was conducted using the paper-and-pencil method, except at one university in Slovakia, where the students, after signing an informed consent form, received a URL link to the survey on the LimeSurvey platform. In agreement with the lecturers, the researchers arrived at the beginning of lectures, introduced the research, and asked for the students’ voluntary participation. Students who were interested in the study were asked to sign the informed consent form. In total, 1715 medical students participated. Fourteen were excluded from the statistical analysis due to insufficient survey completion, leaving a final sample of 1701 medical students.

Design of the questionnaire

The research team developed a questionnaire; the English version is available in the supplementary files (Additional file 1). The survey questions were based on a prior qualitative study conducted in 2021 in Croatia [35], as well as a literature review of previous surveys involving medical students, patients, and physicians [23, 36–41]. As in our qualitative study [35], the anticipatory ethics approach [42] was followed with the same scenario. To preserve continuity between the qualitative and quantitative studies, we deliberately decided to focus primarily on the ethical, legal, and social issues rather than use the existing MAIRS-MS [16]. The survey focused on six broad topics and explored the following regarding the participants: (1) their motivation for enrolling in medical studies and their self-reported knowledge of medical ethics and/or bioethics; (2) their attitudes related to the impact of AI on the patient-physician relationship; (3) their self-reported perception of their understanding of artificial intelligence; (4) their propensity to use AI and digital technologies in future medical practice; (5) the perceived utility of AI in the future, and societal readiness and preparedness for implementation; and (6) their demographic characteristics. The questions included multiple-choice answers on a 5-point Likert scale (the participants were instructed to read the statements and express their agreement or disagreement). At the beginning of the survey, a short scenario (Additional file 2), based on the anticipatory ethics approach [42], was presented to the medical students, followed by the survey questions. This short scenario focused on an AI-based virtual assistant used in a hospital context in 2030. The survey was pilot-tested with a small sample of first-year students from the researchers’ university to check questionnaire comprehension, clarity, and the time needed to complete it.
The survey was available in Croatian, Slovak, and English, the latter particularly for the international students studying Medicine in the English program. The part of the questionnaire related to the perception of patient readiness, which was used for further analysis, consisted of four questions with a high level of internal consistency (Cronbach’s alpha = 0.810).
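For readers who want to reproduce the reliability statistic outside SPSS, Cronbach’s alpha can be computed from item-level data in a few lines. The sketch below uses a small synthetic matrix of 5-point Likert responses for illustration, not the study data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 respondents answering 4 Likert items (1-5)
demo = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
    [4, 3, 4, 4],
])
print(round(cronbach_alpha(demo), 3))
```

Applied to the full four-item response matrix of this study, the same function would yield the reported 0.810.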

Data analysis

All statistical analyses were conducted using SPSS version 25 (IBM Corp., Armonk, NY, USA). Descriptive statistics are presented as percentages. Independent t-tests and one-way ANOVA were conducted to examine group differences based on demographic determinants. Principal axis factoring was run on the questions about attitudes towards using AI technology in future work.
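The same tests are not tied to SPSS. As an illustration, an independent t-test with Welch’s correction (used when group variances are unequal, as in the gender comparison reported below) and a one-way ANOVA can be run with SciPy; the group sizes and responses here are synthetic, not the study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic 5-point Likert responses for three illustrative groups
group_a = rng.integers(1, 6, size=200)
group_b = rng.integers(2, 6, size=150)
group_c = rng.integers(1, 5, size=120)

# Welch's independent t-test (does not assume equal variances)
t, p_t = stats.ttest_ind(group_a, group_b, equal_var=False)

# One-way ANOVA across the three groups
f, p_f = stats.f_oneway(group_a, group_b, group_c)

print(f"t = {t:.3f}, p = {p_t:.4f}")
print(f"F = {f:.3f}, p = {p_f:.4f}")
```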

Results

Demographics

A total of 1701 responses were collected from eight Schools of Medicine (Table 1). Among these, 771 students (45.3%) were from Croatia, and 930 (54.7%) were from Slovakia, comprising 587 (34.5%) Slovak students and 343 (20.2%) international students, mainly from Western European and Scandinavian countries. Overall, 63.7% (1084) were female and 34.5% (587) were male, while 30 (1.8%) participants’ answers for gender were missing. In this study, female students were more represented than male students, which is in line with gender structure trends in medical studies. The Eurostudent VI survey for Croatia (2019) shows that 77.6% of students in medicine and social care are female compared to 22.1% male [43]. Other studies on medical students in Croatia have observed ratios between male and female students similar to this research [44, 45]. Recent studies in Slovakia on the population of medical students also have a higher proportion of women than men in their samples [46, 47]. The most represented group consisted of first-year students, followed by fourth-year and fifth-year students. The lowest representation was among sixth-year students, which is attributed to the sampling approach that included only students attending lectures at the Faculty of Medicine. Given the specificities of medical education, this group was often located in hospital centres and clinics, making them less accessible to researchers.

Table 1 Medical students’ demographic characteristics (N = 1701)

General attitudes on AI and trust within the patient-physician relationship

Regarding their acquaintance with the concept of artificial intelligence, a significant portion of students (38.6%) remained neutral, indicating neither agreement nor disagreement with the statement (Fig. 1). Additionally, 38.2% of students agreed with the assertion, while 23.2% negatively assessed their familiarity with the concept of AI. There was a statistically significant difference in the mean acquaintance score between males and females, t(1162.09) = 7.928, P < .001, with males scoring higher (M = 3.45, SD = 1.014) than females (M = 3.05, SD = 0.977). Similar results were seen for the statement, “I expect to actively use artificial intelligence in my medical practice.” In this context, 39% of students remained neutral, 44.8% expressed an expectation to actively utilise artificial intelligence in their future medical practice, while 16.2% disagreed.

Fig. 1
figure 1

Students’ attitudes toward AI

Regarding trust within the patient-physician relationship, the medical students exhibit pronounced affirmative attitudes (Fig. 2). In response to the statement, “The patient and the physician should trust each other,” 80% of students strongly agreed, 16.8% agreed, 2.1% were neutral, and only 1.1% disagreed. For the statement, “The patient should trust the physician upon consulting him/her,” only 0.8% of students disagreed, 3% were neutral, while 96.2% of students agreed. Among the medical students who participated in this study, 2.9% disagreed with the assertion that “The physician is required to clarify to the patient how he or she came to a certain conclusion.” Here, 8.9% were neutral, and 89.2% agreed.

Fig. 2
figure 2

Students’ attitudes toward different aspects of the patient-physician relationship

Based on the provided statements, a statistically significant difference was found among the Croatian, Slovak, and international students, as illustrated in Table 2. The international students were less likely to agree with the statements asserting that patients should trust the physician during consultations and must rely entirely on the physician’s opinion compared to Croatian and Slovak students. Conversely, they are more inclined to agree that patients respect the physicians’ time, unlike their Croatian and Slovak counterparts, who agreed with this to a lesser extent.

Table 2 Multiple comparisons

Trust in the healthcare system

Table 3 presents the distribution of responses to the question, “To what extent do you think users trust the healthcare system in the country you study in?” Here, 40.9% of Croatian students believe that users do not trust the healthcare system, 56.9% of Slovak students agree with this view, while only 17.3% of international students share this opinion. A one-way ANOVA was conducted to determine whether the student groups’ perceptions of patient trust differed. The perception of patient trust in the healthcare system differed statistically significantly between the student groups, Welch’s F(2, 106.211) = 901.153, P < .001. There was a statistically significant difference (P < .001) in means between the Slovak students (M = 2.51, SD = 0.737), Croatian students (M = 2.75, SD = 0.847), and international students (M = 3.28, SD = 0.798). Interestingly, the international students believe that users trust the Slovak healthcare system more than the Slovak students themselves do, with a mean difference of 0.77, 95% CI [0.64, 0.90].
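SPSS reports Welch’s robust F when group variances differ, and the statistic is straightforward to reproduce from scratch. The sketch below implements Welch’s (1951) one-way ANOVA and applies it to synthetic normally distributed groups whose sizes, means, and spreads are loosely modelled on the figures above; it is an illustration, not the study data:

```python
import numpy as np
from scipy.stats import f as f_dist

def welch_anova(*groups):
    """Welch's one-way ANOVA for groups with unequal variances.
    Returns (F, df1, df2, p)."""
    k = len(groups)
    n = np.array([len(g) for g in groups])
    m = np.array([np.mean(g) for g in groups])
    v = np.array([np.var(g, ddof=1) for g in groups])
    w = n / v                                    # precision weights
    grand = np.sum(w * m) / np.sum(w)            # weighted grand mean
    num = np.sum(w * (m - grand) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
    den = 1 + 2 * (k - 2) / (k ** 2 - 1) * tmp
    F = num / den
    df1 = k - 1
    df2 = (k ** 2 - 1) / (3 * tmp)               # Welch-Satterthwaite df
    return F, df1, df2, f_dist.sf(F, df1, df2)

rng = np.random.default_rng(1)
# Three synthetic groups with deliberately separated means
slovak = rng.normal(2.5, 0.74, 587)
croatian = rng.normal(2.8, 0.85, 771)
international = rng.normal(3.3, 0.80, 343)

F, df1, df2, p = welch_anova(slovak, croatian, international)
print(f"Welch's F({df1}, {df2:.1f}) = {F:.2f}, p = {p:.3g}")
```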

Table 3 Student perception of trust in the healthcare system among patients

Patient readiness to use AI

The construct of patient readiness consisted of the students’ perception of patient trust in technology, adaptability, digital literacy, and medical literacy. These aspects have been recognised as necessary for patients to be ready to use the technology. Scores ranged from a minimum of 4 (responding “strongly disagree” to all statements) to a maximum of 20 (responding “strongly agree” to all statements). A statistically significant difference (P < .001) in the perception of patient readiness was observed among Croatian, Slovak, and international students. The Croatian students gave, on average, the lowest scores for patient readiness (M = 8.40, SD = 2.814), followed by the Slovak students (M = 8.79, SD = 2.689), while the international students expressed the highest confidence in patient readiness to use AI technology in the future (M = 9.62, SD = 2.829).

Here, 59.1% of students agreed that implementing digital technologies will have a negative impact on the patient-physician relationship (M = 3.62, SD = 1.009). No statistically significant difference was found based on the students’ country of origin. On the other hand, there was a statistically significant difference (P < .001) among the students regarding the belief that patients will trust physicians less as more digital technologies are implemented; 51.3% of students believe that patients will trust physicians less. The least agreement with this statement was observed among international students (M = 3.09, SD = 1.006), while higher agreement was expressed by Slovak (M = 3.50, SD = 1.030) and Croatian students (M = 3.51, SD = 1.006).

The third aspect of trust focused on confidence in use. Here, 53.6% of students believe that, if asked by a patient, they would be able to explain how the technology works. The ability to explain to patients how AI works differed statistically significantly between the student groups, Welch’s F(2, 856.821) = 12.294, P < .001. International students expressed the lowest agreement with the statement (M = 3.09, SD = 1.215), while the Slovak (M = 3.41, SD = 1.048) and Croatian (M = 3.47, SD = 1.096) students showed higher agreement.

In the scenario (Annex I), AI was presented through the virtual assistant Cronko. The students were asked to assess how likely it was that they would react in a specific way if the diagnosis they provided significantly differed from that of the virtual assistant (AI) (Table 4). A statistically significant difference was found among the Slovak, Croatian, and international students. In this case, the international students expressed a lower likelihood of standing by their diagnostic conclusion and a higher mean score for rejecting their conclusion, favouring the AI’s opinion.

Table 4 Multiple comparisons - reaction to the difference in diagnosis

The students were also required to decide how patients should react if the diagnosis of the physician and AI significantly differed (Table 5). Here, 49.4% of students believe that patients should seek a third (expert) opinion, 42.1% thought that they should trust the physician, and 7.4% believe that they should consider both diagnoses and decide for themselves. Only a small number thought that they should trust the AI (0.7%) or seek a third opinion from another artificial intelligence system (0.4%).

Table 5 Crosstabulation of whom to trust and the country from which the students come

The crosstabulation analysis revealed that a lower percentage of international students believe that patients should trust the physician compared to Croatian and Slovak students. Based on Pearson’s chi-square test (χ² = 43.731, df = 8, P < .001), it was concluded that the students’ country of origin and their opinion on whom the patient should trust are associated. The measure of association (Cramér’s V = 0.114, P < .001) indicates a statistically significant but weak association between the variables.
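The chi-square statistic and Cramér’s V can be reproduced with SciPy from the underlying contingency table. The 3×5 table below is invented for illustration (rows: the three student groups; columns: the five “whom to trust” options), not the study counts; note that a 3×5 table yields the same df = 8 as reported above:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative contingency table: rows = student group, columns = response option
table = np.array([
    [160, 330, 25, 3, 2],   # Croatian
    [130, 250, 20, 2, 1],   # Slovak
    [ 90, 120, 30, 5, 2],   # international
])

chi2, p, dof, expected = chi2_contingency(table)

# Cramér's V: chi-square normalised by sample size and table dimensions
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4g}, Cramér's V = {v:.3f}")
```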

Discussion

As far as the authors are aware, this is the first study providing an Eastern European perspective on medical students’ attitudes towards the use of AI in medical practice. Previous studies have focused on Western countries such as Germany [48–50], Switzerland [37], the United Kingdom [39, 40], and Canada [7, 10, 12], as well as Asian countries [11, 13, 51–58]. Although many expect that the implementation of AI in healthcare will occur in the coming years, only 44.8% of students believe they will use AI in the future. Here, 53.6% of students believed they would be able to explain to patients how AI technology works, and only 38.2% stated that they were (currently) familiar with the concept of AI. These results align with a study in Germany, where 64.3% of students expressed that they did not feel well-informed about AI in medicine [48]. It is important to note that previous research has observed a discrepancy between the perceived understanding of AI and the actual knowledge among medical students [9]. In the current era, medical education should aim to develop the skills that enable students to acquire knowledge about AI and successfully apply it in patient interactions, allowing them to convey information to patients in an understandable manner [59].

The prevailing view among Croatian and Slovak students was that users do not trust the healthcare system. This perception of a lack of trust aligns with research conducted on the general population. The EVS survey indicated that only 43% of Croatian citizens trust the healthcare system [60]. Studies have shown that a quarter of the population considers the healthcare system to be completely ineffective and that the majority believes fundamental changes are needed, with the lowest levels of trust expressed by social groups with the lowest levels of education [61]. The general level of satisfaction with the healthcare system in Slovakia recently reached 44%. When asked, “To what extent do you trust conventional medicine in doctors and hospitals?”, Slovakia ranked at the bottom, with 55% of the population trusting conventional medicine, below the European average. The main reasons cited for Slovak dissatisfaction are the inability to get an appointment with a doctor (57%) and a negative personal or mediated experience with the care provided (51%) [62]. As previously highlighted, most international students come from Norway and other Scandinavian countries, where many studies show that trust in healthcare is exceptionally high [63–65]. Therefore, international students may be expected to project the same perception of trust onto the healthcare system of the country in which they study.

In Croatia and Slovakia, where trust in the healthcare system is relatively low and students perceive that patients do not have much trust in the system, students are more likely to believe that patients must fully trust their physicians during consultations and that patients are not respectful of the physician’s time. The implementation of AI requires close cooperation between the patient and the physician, which necessitates mutual trust and understanding [66]. Trust has been defined as “individuals’ calculated exposure to the risk of harm from the actions of an influential other” [31, 67], where harm signifies the extent of physical and/or psychological damage that can result from incorrectly calibrated trust decisions [31]. In the physician’s use of medical AI, however, the damage primarily manifests as harm to the patient and directly affects the physician-patient relationship [35, 68]. This also affects the reliability aspect and the physician’s trust in medical AI, as well as its acceptability and future use, which are directly related to trustworthiness.

The views of international students on AI and medical trust may also differ because these individuals mostly come from Western and Northern European countries, where the shared decision-making model of the patient-physician relationship is widely used in medical practice. The shared decision-making model avoids the trap of the two extremes, in which either the physician has a dominant role as the decision-maker or the patient has an absolute position and makes the decision on his or her own. Modern medicine has moved from a paternalistic approach to a physician-patient partnership based on mutual discussion. It is very likely that international students from Western Europe are more accustomed to a system in which emphasis on patient autonomy and ethical communication is important. The persistence of a paternalistic mentality in the healthcare system is noticeable in some post-communist or transitional countries [69, 70]. Although these countries are transforming and increasingly involving patients in decision-making, remnants of the old mentality still exist. The Slovak and Croatian students expressed more negative attitudes than international students regarding patients respecting physicians’ time. Similarly, they are more inclined to believe that patients should fully trust the physicians’ opinions. The attitudes of both Croatian and Slovak students towards trust between the patient and physician in the context of AI can be partly explained by the paternalistic model of the patient-physician relationship, which is still present to some extent in these countries. Transitional countries, including Croatia and Slovakia, have specific cultural patterns in patient-physician communication, such as a lack of information sharing and a paternalistic approach to the patient [71]. In the region of Central and South-Eastern Europe, these issues have not been studied systematically [71].
However, Croatian researchers, following the Slovak research team [72], have carried out a study of patient rights focusing on patient-physician communication and the informed consent process [71]. The results showed that communication during the process of obtaining informed consent in selected Croatian hospitals was based on the shared decision-making model, although a paternalistic relationship was still present. We assume that, given the similar cultural and political background, the situation in Slovakia is probably analogous, although to the best of our knowledge, such research has not been conducted there recently. A case of persisting medical paternalism in Slovakia that sparked public debate was the involuntary sterilisation of Roma women, which began in communist Czechoslovakia and continued into the 2000s. This case has contributed to ongoing mistrust of the national health system among Roma, impacting vaccine uptake and highlighting the need for improved communication and informed consent practices [73, 74].

In cases of conflict between the judgements of the physician and the AI, our results show that most medical students consider that patients should seek a third (expert) opinion (49.4%) or trust the physician (42.1%). These results are comparable to a German study [48] in which the majority (82.5%) stated that the physician’s decision should be followed. In such a disagreement, the international students were more inclined than the Croatian and Slovak students to reject their own conclusion in favour of the AI, despite attending the same program as their Slovak colleagues. The new insights from our study represent a valuable contribution to the ongoing discussion [32–34] on the possibility of trusting medical AI from the perspective of future physicians who will probably use AI in their everyday work.

In cases of differing diagnoses, Croatian and Slovak students were more likely to believe that patients should rely on the physician’s opinion. Almost 90% of students think the physician must explain to the patient how they reached a conclusion. However, only 53.6% of students believe they could explain to a patient how AI technology works. This gap may pose a problem in healthcare: inadequate explanations could undermine patients’ and future physicians’ understanding and acceptance of AI diagnostic conclusions, especially when these differ. Future physicians must know how to use AI, understand and interpret its results, be aware of the risks, and explain them to patients in an understandable way [75].

Strengths and limitations

To our knowledge, no similar research has been conducted focusing on Eastern Europe, specifically Croatia and Slovakia, and emphasising the various aspects of trust that are crucial to consider in the context of medical AI. This study highlights the differences in medical students’ perceptions of trust and the patient-physician relationship. The main limitation of this research is the sample selection, which cannot be generalised due to its non-probabilistic nature. Due to technical and organisational difficulties, a convenience sample was the only available option. It is also essential to consider that the research was conducted at the end of 2022, during the ongoing COVID-19 pandemic, which could have influenced the students’ attitudes toward the healthcare system. Finally, international students filled out the questionnaire in English (not their first language), which could lead to misinterpretation or misunderstanding of specific questions.

Conclusions

This study provides insight into the attitudes of Croatian, Slovak, and international medical students regarding the role of artificial intelligence (AI) in the future healthcare system, with a particular emphasis on the concept of trust. The insights from our study represent a valuable contribution to the ongoing debate on the possibility of trust in medical AI from the perspective of future physicians. Students agree that physicians and patients must trust each other; however, they also believe that implementing digital technologies will negatively impact the patient-physician relationship. A notable difference was observed between the three groups of students, with international students differing from their Croatian and Slovak colleagues. Croatian and Slovak students are more inclined to believe that patients will trust them less once AI is implemented, and they also express certain paternalistic views. Additionally, Croatian and Slovak students exhibit higher confidence in their own abilities (accuracy of diagnosis, ability to explain how AI functions) than international students. This study also highlights the importance of integrating AI topics into the medical curriculum, taking into account national specificities that could negatively impact AI implementation if not carefully addressed. Increasing explainability and trust through education about AI will contribute to better acceptance in the future, as well as to a stronger patient-physician relationship.

Data availability

The dataset generated by the survey research is available at the link: https://osf.io/2pyv9/files/osfstorage/6606a02b58fa490843e4f06b.

References

  1. Amisha F, Malik P, Pathania M, Rathaur VK. Overview of artificial intelligence in medicine. J Family Med Prim Care. 2019;8(7):2328.

  2. Reyes M, Meier R, Pereira S, et al. On the Interpretability of Artificial Intelligence in Radiology: challenges and opportunities. Radiol Artif Intell. 2020;2(3):e190043.

  3. Mehrizi MHR, Van Ooijen PMA, Homan M. Applications of artificial intelligence (AI) in diagnostic radiology: a technography study. Eur Radiol. 2020;31(4):1805–11.

  4. Dlamini Z, Francies FZ, Hull R, Marima R. Artificial intelligence (AI) and big data in cancer and precision oncology. Comput Struct Biotechnol J. 2020;18:2300–11.

  5. Kalani M, Anjankar A. Revolutionizing neurology: the role of artificial intelligence in advancing diagnosis and treatment. Cureus. 2024.

  6. Bajaj T, Koyner JL. Artificial intelligence in acute kidney injury prediction. Adv Chronic Kidney Dis. 2022;29(5):450–60.

  7. Gong B, Nugent J, Guest W, et al. Influence of artificial intelligence on Canadian medical students’ preference for radiology specialty: a National Survey study. Acad Radiol. 2019;26(4):566–77.

  8. Capparos Galán G, Portero FS. Medical students’ perceptions of the impact of artificial intelligence in radiology. Radiología. 2022;64(6):516–24.

  9. Bin Dahmash A, Alabdulkareem M, Alfutais A, Kamel AM, Alkholaiwi F, Alshehri S et al. Artificial intelligence in radiology: does it impact medical students preference for radiology as their future career? BJR|Open. 2020;2(1):20200037.

  10. Mehta N, Harish V, Bilimoria K et al. Knowledge and attitudes on Artificial intelligence in Healthcare: a provincial survey study of medical students. MedEdPublish. 2021;10(1).

  11. Al Hadithy ZA, Al Lawati A, Al-Zadjali R et al. Knowledge, attitudes, and perceptions of Artificial Intelligence in Healthcare among Medical students at Sultan Qaboos University. Cureus. 2023;15(9).

  12. Teng M, Singla R, Yau O, Lamoureux D, Gupta A, Hu Z, et al. Health Care Students’ perspectives on Artificial Intelligence: Countrywide Survey in Canada. JMIR Med Educ. 2022;8(1):e33390.

  13. Abid S, Awan B, Ismail T, Sarwar N, Sarwar G, Tariq M. Artificial Intelligence: medical students attitude in District Peshawar Pakistan. Pakistan J Public Health. 2019;9(1):19–21.

  14. Bisdas S, Topriceanu C, Zakrzewska Z et al. Artificial Intelligence in Medicine: a multinational Multi-center survey on the medical and dental students’ perception. Front Public Health. 2021;9.

  15. Jebreen K, Radwan E, Kammoun-Rebai W, Alattar E, Radwan A, Safi W et al. Perceptions of undergraduate medical students on artificial intelligence in medicine: mixed-methods survey study from Palestine. BMC Med Educ. 2024;24(1).

  16. Karaca O, Çalişkan S, Demir K. Medical artificial intelligence readiness scale for medical students (MAIRS-MS) – development, validity and reliability study. BMC Med Educ. 2021;21(1).

  17. Park SH, Hyun K, Kim S, Park JH, Lim YS. What should medical students know about artificial intelligence in medicine? J Educ Eval Health Prof. 2019;16:18.

  18. Katznelson G, Gerke S. The need for health AI ethics in medical school education. Adv Health Sci Educ. 2021;26(4):1447–58.

  19. Bisdas S, Topriceanu CC, Zakrzewska Z, Irimia AV, Shakallis L, Subhash J et al. Artificial Intelligence in Medicine: a multinational Multi-center survey on the medical and dental students’ perception. Front Public Health. 2021;9.

  20. Tung AYZ, Dong LW. Malaysian medical students’ attitudes and readiness toward AI (Artificial Intelligence): a cross-sectional study. J Med Educ Curric Dev. 2023;10.

  21. Rajpurkar P, Chen E, Banerjee O, Topol EJ. AI in health and medicine. Nat Med. 2022;28(1):31–8.

  22. Boillat T, Nawaz FA, Rivas H. Readiness to Embrace Artificial intelligence among medical doctors and students: questionnaire-based study. JMIR Med Educ. 2022;8(2):e34973.

  23. Ongena Y, Haan M, Yakar D, Kwee TC. Patients’ views on the implementation of artificial intelligence in radiology: development and validation of a standardized questionnaire. Eur Radiol. 2019;30(2):1033–40.

  24. Quinn TP, Senadeera M, Jacobs S, Coghlan S, Le V. Trust and medical AI: the challenges we face and the expertise needed to overcome them. J Am Med Inform Assoc. 2020;28(4):890–4.

  25. Gerber BS, Eiser AR. The patient-physician relationship in the internet age: future prospects and the research agenda. J Med Internet Res. 2001;3(2):e15.

  26. Agarwal AK, Murinson BB. New dimensions in patient–physician interaction: values, autonomy, and medical information in the patient-centered clinical encounter. Rambam Maimonides Med J. 2012;3(3):e0017.

  27. Chandra S, Mohammadnezhad M, Ward P. Trust and Communication in a doctor- patient relationship: a literature review. J Healthc Commun. 2018;03(03).

  28. Cado V. Trust as a factor for higher performance in healthcare: COVID 19, digitalization, and positive patient experiences. IJQHC Commun. 2022;2(2).

  29. Gerdes A. The role of explainability in AI-supported medical decision-making. Discover Artif Intell. 2024;4(1).

  30. De Fine Licht K, Brülde B. On defining Reliance and Trust: purposes, conditions of adequacy, and new definitions. Philosophia. 2021;49(5):1981–2001.

  31. Hancock PA, Kessler TT, Kaplan AD, Stowers K, Brill JC, Billings DR et al. How and why humans trust: a meta-analysis and elaborated model. Front Psychol. 2023;14.

  32. Hatherley J. Limits of trust in medical AI. J Med Ethics. 2020;46(7):478–81.

  33. Kerasidou C, Kerasidou A, Büscher M, Wilkinson S. Before and beyond trust: reliance in medical AI. J Med Ethics. 2021;48(11):852–6.

  34. Ferrario A, Loi M, Viganò E. Trust does not need to be human: it is possible to trust medical AI. J Med Ethics. 2020;47(6):437–8.

  35. Čartolovni A, Malešević A, Poslon L. Critical analysis of the AI impact on the patient–physician relationship: a multi-stakeholder qualitative study. Digit Health. 2023;9.

  36. Coppola F, Faggioni L, Regge D, et al. Artificial intelligence: radiologists’ expectations and opinions gleaned from a nationwide online survey. Radiol Med. 2020;126(1):63–71.

  37. Van Der Hoek J, Huber AT, Leichtle AB, et al. A survey on the future of radiology among radiologists, medical students and surgeons: students and surgeons tend to be more skeptical about artificial intelligence and radiologists may fear that other disciplines take over. Eur J Radiol. 2019;121:108742.

  38. Abdullah R, Fakieh B. Health care employees' perceptions of the use of artificial intelligence applications: survey study. J Med Internet Res. 2020;22(5):e17620.

  39. Blease C, Bernstein MH, Gaab J, et al. Computerization and the future of primary care: a survey of general practitioners in the UK. PLoS ONE. 2018;13(12):e0207418.

  40. Sit C, Srinivasan R, Amlani A et al. Attitudes and perceptions of UK medical students towards artificial intelligence and radiology: a multicentre survey. Insights into Imaging. 2020;11(1).

  41. Oh S, Kim JH, Choi SK, Lee HJ, Hong J, Kwon SH. Physician confidence in artificial intelligence: an online mobile survey. J Med Internet Res. 2019;21(3):e12422.

  42. York E, Conley SN. Creative anticipatory ethical reasoning with scenario analysis and design fiction. Sci Eng Ethics. 2020;26(6):2985–3016.

  43. Rimac I, Bovan K, Ogresta J. Nacionalno izvješće istraživanja EUROSTUDENT VI za Hrvatsku [National report of the EUROSTUDENT VI survey for Croatia]. Ministarstvo znanosti i obrazovanja; 2019.

  44. Dragun R, Veček NN, Marendić M, Pribisalić A, Đivić G, Cena H, et al. Have Lifestyle habits and Psychological Well-being changed among adolescents and medical students due to COVID-19 Lockdown in Croatia? Nutrients. 2020;13(1):97.

  45. Đogaš V, Jerončić A, Marušić M, Marušić A. Who would students ask for help in academic cheating? Cross-sectional study of medical students in Croatia. BMC Med Educ. 2014;14(1).

  46. Sovicova M, Zibolenova J, Svihrova V, Hudeckova H. Odds ratio estimation of medical students' attitudes towards COVID-19 vaccination. Int J Environ Res Public Health. 2021;18(13):6815.

  47. Faixová D, Jurinová Z, Faixová Z, Kyselovič J, Gažová A. Dietary changes during the examination period in medical students. EAS J Pharm Pharmacol. 2023;5(03):78–86.

  48. McLennan S, Meyer A, Schreyer K, Buyx A. German medical students´ views regarding artificial intelligence in medicine: a cross-sectional survey. PLOS Digit Health. 2022;1(10):e0000114.

  49. Gillissen A, Kochanek T, Zupanic M, Ehlers JP. Medical students’ perceptions towards digitization and Artificial Intelligence: a mixed-methods study. Healthcare. 2022;10(4):723.

  50. Moldt JA, Loda T, Mamlouk AM, Nieselt K, Fuhl W, Herrmann-Werner A. Chatbots for future docs: exploring medical students’ attitudes and knowledge towards artificial intelligence and medical chatbots. Med Educ Online. 2023;28(1).

  51. Syed W, Basil A, Al-Rawi M. Assessment of awareness, perceptions, and opinions towards Artificial Intelligence among Healthcare students in Riyadh, Saudi Arabia. Medicina. 2023;59(5):828.

  52. Komasawa N, Nakano T, Terasaki F, Kawata R. Attitude survey toward artificial intelligence in medicine among Japanese medical students. Bull Osaka Med Pharm Univ. 2021;67(1–2):9–16.

  53. Jha N, Shankar PR, Al-Betar MA, Mukhia R, Hada K, Palaian S. Undergraduate medical students’ and interns’ knowledge and perception of artificial intelligence in medicine. Adv Med Educ Pract. 2022;13:927–37.

  54. Swed S, Alibrahim H, Elkalagi NKH et al. Knowledge, attitude, and practice of artificial intelligence among doctors and medical students in Syria: a cross-sectional online survey. Front Artif Intell. 2022;5.

  55. Doumat G, Daher D, Ghanem NN, Khater B. Knowledge and attitudes of medical students in Lebanon toward artificial intelligence: a national survey study. Front Artif Intell. 2022;5.

  56. Buabbas AJ, Miskin B, Alnaqi A, et al. Investigating students’ perceptions towards Artificial Intelligence in Medical Education. Healthcare. 2023;11(9):1298.

  57. Kansal R, Bawa A, Bansal A et al. Differences in knowledge and perspectives on the usage of artificial intelligence among doctors and medical students of a developing country: a cross-sectional study. Cureus. Published online January 19, 2022.

  58. AlZaabi A, AlMaskari S, AalAbdulsalam A. Are physicians and medical students ready for artificial intelligence applications in healthcare? Digit Health. 2023;9:205520762311521.

  59. Pupic N, Ghaffari-Zadeh A, Hu R, et al. An evidence-based approach to artificial intelligence education for medical students: a systematic review. PLOS Digit Health. 2023;2(11):e0000255.

  60. Baloban J, Črpić G, Ježovita J. Vrednote u Hrvatskoj od 1999. do 2018. prema European Values Study [Values in Croatia from 1999 to 2018 according to the European Values Study]. Kršćanska sadašnjost; 2019.

  61. Popović S. Determinants of citizen’s attitudes and satisfaction with the Croatian health care system. Medicina. 2017;53(1):85–100.

  62. STADA Health Report 2024: satisfaction with healthcare system continues to decline. 2024.

  63. Price D, Bonsaksen T, Leung J, McClure-Thomas C, Ruffolo M, Lamph G et al. Factors Associated with Trust in Public Authorities among adults in Norway, United Kingdom, United States, and Australia two years after the COVID-19 outbreak. Int J Public Health. 2023;68.

  64. Skirbekk H, Magelssen M, Conradsen S. Trust in healthcare before and during the COVID-19 pandemic. BMC Public Health. 2023;23(1).

  65. Baroudi M, Goicolea I, Hurtig AK, San-Sebastian M. Social factors associated with trust in the health system in northern Sweden: a cross-sectional study. BMC Public Health. 2022;22(1).

  66. Chin JJ. Doctor-patient relationship: from medical paternalism to enhanced autonomy. Singapore Med J. 2002;43(3):152–5.

  67. Hancock PA, Billings DR, Schaefer KE, Chen JYC, De Visser EJ, Parasuraman R. A Meta-analysis of factors affecting Trust in Human-Robot Interaction. Hum Factors. 2011;53(5):517–27.

  68. Čartolovni A, Tomičić A, Mosler EL. Ethical, legal, and social considerations of AI-based medical decision-support tools: a scoping review. Int J Med Informatics. 2022;161:104738.

  69. Vyshka G, Kruja J. Inapplicability of advance directives in a paternalistic setting: the case of a post-communist health system. BMC Med Ethics. 2011;12(1).

  70. Murgic L, Hébert PC, Sovic S, Pavlekovic G. Paternalism and autonomy: views of patients and providers in a transitional (post-communist) country. BMC Med Ethics. 2015;16(1).

  71. Vučemilo L, Ćurković M, Milošević M, Mustajbegović J, Borovečki A. Are physician-patient communication practices slowly changing in Croatia? – a cross-sectional questionnaire study. Croatian Med J. 2013;54(2):185–91.

  72. Nemcekova M, Ziakova K, Mistuna D, Kudlicka J. Respecting patients’ rights. Bull Med Ethics. 1998;140:13–8.

  73. Report by Thomas Hammarberg, Commissioner for Human Rights of the Council of Europe. 2011. Online: https://rm.coe.int/16806db7c5

  74. Advisory Committee on the Framework Convention for the Protection of National Minorities. Fifth Opinion on the Slovak Republic. 2022.

  75. McCoy LG, Nagaraj S, Morgado F, Harish V, Das S, Celi LA. What do medical students actually need to know about artificial intelligence? Npj Digit Med. 2020;3(1).

Funding

This work was supported by the Hrvatska zaklada za znanost (Croatian Science Foundation, CSF) [grant number UIP-2019-04-3212] “(New) Ethical and Social Challenges of Digital Technologies in the Healthcare Domain”. The funder had no role in the design of this study and its execution, analyses, interpretation of the data, or decision to submit results.

Author information

Authors and Affiliations

Authors

Contributions

AČ and AM planned the study. MK assisted in the research implementation process. AM analysed the data, with contributions from MK and AČ. All authors contributed to the data interpretation and writing of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Anamaria Malešević.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Catholic University of Croatia’s Ethics Committee on 21 January 2022 (Classification number: 641-03/21-03/03; registration number: 498-16/2-22-06). Participation in the research was anonymous and voluntary. Before completing the survey, participants were informed about the research objectives, data processing, and storage procedures and signed an informed consent form.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Malešević, A., Kolesárová, M. & Čartolovni, A. Encompassing trust in medical AI from the perspective of medical students: a quantitative comparative study. BMC Med Ethics 25, 94 (2024). https://doi.org/10.1186/s12910-024-01092-2
