Ethical Implications of AI in Healthcare

Finally, the use of AI in healthcare raises a variety of ethical issues. Healthcare decisions have historically been made almost exclusively by humans, and the use of smart machines to make or assist with them raises issues of accountability, transparency, consent, and privacy.

Perhaps the most difficult issue to address given today’s technologies is transparency. I mentioned above that AI algorithms—particularly deep learning algorithms used for image analysis—are virtually impossible to interpret or explain. If a patient is informed that an image has led to a diagnosis of cancer, he or she will likely want to know why. Deep learning algorithms, and even physicians who are generally familiar with their operation, may be unable to provide an explanation.
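Post-hoc explanation techniques offer only partial relief here. As a rough illustration (not any specific clinical tool), the sketch below applies permutation importance to a hypothetical black-box risk model: each input is shuffled in turn, and the resulting shift in the model's output estimates how much that input matters. The model, feature names, and weights are all invented for the example.

```python
import random

# Illustrative stand-in for an opaque model. The probe below treats it
# strictly as a black box: it only queries the scoring function.
def black_box_score(features):
    # features = [lesion_size, patient_age, irrelevant_marker]
    z = 0.9 * features[0] + 0.1 * features[1] + 0.0 * features[2]
    return 1.0 / (1.0 + 2.718281828459045 ** (-z))

def permutation_importance(model, rows, n_features):
    """Mean absolute change in output when one feature column is shuffled."""
    base = [model(r) for r in rows]
    importances = []
    for j in range(n_features):
        column = [r[j] for r in rows]
        random.shuffle(column)
        perturbed = [model(r[:j] + [column[i]] + r[j + 1:])
                     for i, r in enumerate(rows)]
        importances.append(
            sum(abs(a - b) for a, b in zip(base, perturbed)) / len(rows))
    return importances

random.seed(0)
rows = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
imp = permutation_importance(black_box_score, rows, 3)
# The feature the model ignores (weight 0.0) scores exactly zero;
# the dominant feature scores highest.
```

Even so, this kind of probe only ranks inputs by influence; it does not produce the causal, case-specific explanation a patient is actually asking for, which is why transparency remains such a hard problem.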

Mistakes will undoubtedly be made by AI systems in patient diagnosis and treatment, and it may be difficult to establish accountability for them. There are also likely to be incidents in which patients receive medical information from AI systems that they would prefer to receive from an empathetic clinician. Machine learning systems in healthcare may also be subject to algorithmic bias, perhaps predicting greater likelihood of disease on the basis of gender or race when those are not actually causal factors.44
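A minimal bias audit along these lines simply compares a model's positive prediction rate across demographic groups. The records, group labels, scores, and threshold below are all hypothetical:

```python
# Hypothetical audit data: model risk scores tagged with a demographic group.
records = (
    [{"group": "A", "score": s} for s in [0.2, 0.4, 0.7, 0.9, 0.3]]
    + [{"group": "B", "score": s} for s in [0.6, 0.8, 0.7, 0.9, 0.5]]
)

def positive_rate(records, group, threshold=0.5):
    """Fraction of a group's scores at or above the decision threshold."""
    scores = [r["score"] for r in records if r["group"] == group]
    return sum(s >= threshold for s in scores) / len(scores)

rate_a = positive_rate(records, "A")  # 2 of 5 scores >= 0.5 -> 0.4
rate_b = positive_rate(records, "B")  # 5 of 5 scores >= 0.5 -> 1.0
gap = abs(rate_a - rate_b)
# A large gap flags the model for human review; it does not by itself
# prove bias, since base rates may genuinely differ between groups.
```

Demographic-rate gaps are only one of several fairness criteria (error-rate parity and calibration are others), and such audits are a monitoring tool rather than a remedy.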

We are likely to encounter many ethical, medical, occupational, and technological changes as AI spreads through healthcare. It is important that healthcare institutions, along with governmental and regulatory bodies, establish governance structures to monitor key issues, react in a responsible manner, and limit negative consequences. AI is one of the more powerful and consequential technologies to affect human societies, and it will require continuous attention and thoughtful policy for many years.

The Future of AI in Healthcare

There can be little doubt that AI has an important role to play in the healthcare offerings of the future. In the form of machine learning, it is the primary capability behind the development of precision medicine—widely agreed to be a sorely needed advance in care. AI has already proven valuable for administrative applications, and shows promise for engagement and adherence purposes. Although early efforts at providing diagnosis and treatment recommendations have proven challenging, it seems likely that AI will ultimately make significant contributions to that domain. Given the rapid advances in AI for imaging analysis, it seems likely that most radiology and pathology images will be examined at some point by a machine. Speech and text recognition are already employed for tasks such as patient communication and capture of clinical notes, but their usage will increase.

The greatest challenge to AI in these healthcare domains is not whether the technologies will be capable enough to be useful, but rather ensuring their adoption in daily clinical practice. For widespread adoption to take place, AI systems must be approved by regulators, integrated with EHR systems, standardized to a sufficient degree that similar products work in a similar fashion, taught to clinicians, paid for by public or private payer organizations, and updated over time in the field. These challenges will ultimately be overcome, but overcoming them will take much longer than it will take for the technologies themselves to mature. As a result, we expect to see limited use of AI in clinical practice within five years, and more extensive use within ten.

It also seems increasingly clear that AI systems will not replace human clinicians on a large scale, but rather will augment their efforts to care for patients. Over time, human clinicians may move toward tasks and job designs that draw on uniquely human skills such as empathy, persuasion, and big-picture care integration. Perhaps the only healthcare providers who will lose their jobs over time will be those who refuse to work alongside AI.

This chapter is a revised and extended version of Davenport, T.H. and Kalakota, R., “The Potential for Artificial Intelligence in Healthcare,” Future Healthcare Journal, June 2019, DOI: 10.7861/futurehosp.6-2-94


  • 1. Shortliffe, E.H. Strategic action in health information technology: why the obvious has taken so long. Health Affairs, Sept./Oct. 2005. https://doi.org/10.1377/hlthaff.24.5.1222
  • 2. Deloitte. State of AI in the enterprise. 2018. insights/us/articles/4780_State-of-AI-in-the-enterprise/AICognitiveSurvey2018_Infographic.pdf
  • 3. Lee, Su-in, Celik, S., Logsdon, B., et al. A machine learning approach to integrate big data for precision medicine in acute myeloid leukemia. Nature Communications, 2018, 42.
  • 4. Heaven, W.D. AI could help with the next pandemic—but not with this one. MIT Technology Review, March 12, 2020. 2020/03/12/905352/ai-could-help-with-the-next-pandemicbut-not-with-this-one/
  • 5. Sordo, M. Introduction to neural networks in healthcare. Open Clinical, Oct. 2002.
  • 6. Fakoor, R., Ladhak, F., Nazi, A., Huber, M. Using deep learning to enhance cancer diagnosis and classification. Proceedings of the 30th International Conference on Machine Learning, 2013, Atlanta, GA.
  • 7. Vial, A. et al. The role of deep learning and radiomic feature extraction in cancer-specific predictive modelling: a review. Translational Cancer Research, 7:3, June 2018.
  • 8. Hao, K. We can’t trust AI systems built on deep learning alone. MIT Technology Review, Sept. 27, 2019. cant-trust-ai-systems-built-on-deep-learning-alone/
  • 9. Hussain, A., Malik, A., Halim, M.U., Ali, A.M. The use of robotics in surgery: a review. The International Journal of Clinical Practice, 2014.
  • 10. Bush, J. How AI is taking the scut work out of healthcare. Harvard Business Review, 2018.
  • 11. Buchanan, B.G., Shortliffe, E.H. Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project. Reading, MA: Addison-Wesley, 1984.
  • 12. NHS (National Health Service) and Capita. Measuring shared decision making: a review of research evidence. Shared Decision Making Programme, 2012. https://www.
  • 13. Ross, C., Swetlitz, I. IBM pitched its Watson supercomputer as a revolution in cancer care. It’s nowhere close. Stat, 2017. watson-ibm-cancer/
  • 14. Strickland, E. How IBM overpromised and underdelivered on AI healthcare. IEEE Spectrum, April 2, 2019. how-ibm-watson-overpromised-and-underdelivered-on-ai-health-care
  • 15. Loria, K. Putting the AI in radiology. Radiology Today, 19:10, 2018. https://www.
  • 16. American College of Radiology Data Science Institute. FDA cleared AI algorithms, accessed 6/2/20.
  • 17. Aushev, A. et al. Feature selection for the accurate prediction of septic and cardiogenic shock ICU mortality in the acute phase. PLoS One, 2018. journal.pone.0199089
  • 18. Schmidt-Erfurth, U. et al. Machine learning to analyze the prognostic value of current imaging biomarkers in neovascular age-related macular degeneration. Ophthalmology Retina, 2018.
  • 19. Aronson, S., Rehm, H. Building the foundation for genomic-based precision medicine. Nature, 526:336-342, 2015.
  • 20. Schinkel, M. et al. Clinical applications of artificial intelligence in sepsis: a narrative review. Computers in Biology and Medicine, 115, Dec. 2019. https://doi.org/10.1016/j.compbiomed.2019.103488
  • 21. Rysavy, M. Evidence-based medicine: a science of uncertainty and an art of probability. AMA Journal of Ethics, 2013. evidence-based-medicine-science-uncertainty-and-art-probability/2013-01/
  • 22. Rajkomar, A. Scalable and accurate deep learning with electronic health records. NPJ Digital Medicine, 2018.
  • 23. Shimabukuro, D. et al. Effect of a machine learning-based severe sepsis prediction algorithm on patient survival and hospital length of stay: a randomised clinical trial. BMJ Open Respiratory Research, 2017.
  • 24. Aicha, A.N., Englebienne, G., van Schooten, K.S., Pijnappels, M., Krose, B. Deep learning to predict falls in older adults based on daily-life trunk accelerometry. Sensors, 18:1654, 2018.
  • 25. Low, L.L. et al. Predicting 30-day readmissions: performance of the LACE index compared with a regression model among general medicine patients in Singapore. Biomed Research International, 2015.
  • 26. Davenport, T.H., Hongsermeier, T., McCord, K.A. Using AI to improve electronic health records. Harvard Business Review, 2018. using-ai-to-improve-electronic-health-records
  • 27. Commins, J. Nurses say distractions cut bedside time by 25%. Health Leaders, 2010. bedside-time-25
  • 28. Cutler, D.M., Wikler, E., Basch, P. Reducing administrative costs and improving the healthcare system. New England Journal of Medicine, 367(20):1875-1878, 2012.
  • 29. Utermohlen, K. Four robotic process automation (RPA) applications in the healthcare industry. Medium, 2018. https://medium.com/@karl.utermohlen/4-robotic-process-automation-rpa-applications-in-the-healthcare-industry-4d449b24b613
  • 30. Davenport, T.H. New York Presbyterian, Workfusion, and the intelligent automation of healthcare. Forbes, Sept. 15, 2019. tomdavenport/2019/09/15/new-york-presbyterian-workfusion-and-the-intelligent-automation-of-health-care-administration/#78e44aa57e77
  • 31. UserTesting. Healthcare chatbot apps are on the rise, but the overall customer experience falls short. UserTesting press release 2019.
  • 32. Kuan, B. Transforming healthcare with real world evidence integration solution, Tamr Inc blog post, June 27, 2019,


  • 33. Davenport, T.H., Miller, S. The future of work now: medical coder. Forbes, Jan. 3, 2020. https://www.forbes.com/sites/tomdavenport/2020/01/03/the-future-of-work-now-medical-coding-with-ai/#2df63fc5282c
  • 34. Jamei, M. et al. Predicting all-cause risk of 30-day hospital readmission using artificial neural networks. PLoS One, 2017. article?id=10.1371/journal.pone.0197793
  • 35. Brown, M.T., Bussell, J.K. Medication adherence: WHO cares? Mayo Clinic Proceedings, April 2011.
  • 36. Berg, S. Nudge theory explored to boost medication adherence. American Medical Association website, 2018. advocacy/nudge-theory-explored-boost-medication-adherence
  • 37. Deloitte. From brawn to brains: the impact of technology on jobs in the U.K. 2015. uk-insights-from-brawns-to-brain.pdf
  • 38. McKinsey Global Institute. A future that works: automation, employment, and productivity. 2017. automation-and-the-future-of-work/a-future-that-works-automation-employment- and-productivity/
  • 39. Frey, C.B., Osborne, M. The future of employment: how susceptible are jobs to computerisation? Oxford Martin, Sept. 1, 2013. publications/the-future-of-employment/
  • 40. McKinsey Global Institute. A future that works: automation, employment, and productivity. 2017. mation-and-the-future-of-work/a-future-that-works-automation-employment-and- productivity
  • 41. Davenport, T.H., Dreyer, K. AI will change radiology, but it won’t replace radiologists. Harvard Business Review, 2018. change-radiology-but-it-wont-replace-radiologists
  • 42. Arndt, R. The slow upgrade to artificial intelligence. Modern Healthcare, 48:10, Mar. 2018. healthcare-makes-slow-impact/
  • 43. KPMG. Living in an AI world: achievements and challenges across five industries. 2020.
  • 44. Char, D.S., Shah, N.H., Magnus, D. Implementing machine learning in Healthcare— addressing ethical challenges. New England Journal of Medicine, 378:981-983, 2018.