CC BY-NC-ND 4.0 · Yearb Med Inform 2023; 32(01): 138-145
DOI: 10.1055/s-0043-1768720
Section 4: Clinical Research Informatics
Survey

Health Equity in Clinical Research Informatics

Sigurd Maurud, Silje H. Henni, Anne Moen
University of Oslo, Oslo, Norway

Summary

Objectives: Through a scoping review, this survey examines the ways in which health equity has been promoted in clinical research informatics with implications for patients, focusing on papers published in 2021 (and some in 2022).

Method: A scoping review was conducted, guided by the methods described in the Joanna Briggs Institute Manual. The review process consisted of five stages: 1) development of the aim and research question, 2) literature search, 3) literature screening and selection, 4) data extraction, and 5) accumulation and reporting of results.

Results: Of the 478 papers identified from 2021 on clinical research informatics with a focus on health equity as a patient implication, eight met our inclusion criteria. All included papers focused on artificial intelligence (AI) technology. The papers addressed health equity in clinical research informatics either by exposing inequity in AI-based solutions or by using AI as a tool to promote health equity in the delivery of healthcare services. While algorithmic bias poses a risk to health equity within AI-based solutions, AI has also uncovered inequity in traditional treatment and demonstrated effective complements and alternatives that promote health equity.

Conclusions: Clinical research informatics with implications for patients still faces ethical challenges and questions of clinical value. However, used prudently, for the right purpose in the right context, clinical research informatics can provide powerful tools for advancing health equity in patient care.
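The algorithmic bias discussed in the results is typically exposed by comparing a model's error rates across demographic groups. The sketch below is a minimal, hypothetical illustration of such a subgroup check, assuming binary outcome labels, binary predictions, and a group label per patient; the function name, the toy data, and the choice of false negative rate as the disparity measure are illustrative assumptions, not taken from any of the reviewed studies.

```python
# Illustrative sketch (hypothetical data, not from any reviewed study):
# compare a classifier's false negative rate across demographic groups,
# a basic descriptive check for algorithmic bias.
from collections import defaultdict


def false_negative_rate_by_group(y_true, y_pred, groups):
    """Return {group: FNR}, where FNR = missed positives / actual positives."""
    missed = defaultdict(int)     # actual positives predicted as negative, per group
    positives = defaultdict(int)  # actual positives, per group
    for truth, pred, group in zip(y_true, y_pred, groups):
        if truth == 1:
            positives[group] += 1
            if pred == 0:
                missed[group] += 1
    return {g: missed[g] / positives[g] for g in positives if positives[g] > 0}


if __name__ == "__main__":
    # Toy example: outcome labels, model predictions, and group membership.
    y_true = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]
    y_pred = [1, 0, 0, 1, 0, 0, 1, 1, 0, 0]
    groups = ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]
    print(false_negative_rate_by_group(y_true, y_pred, groups))
    # -> {'A': 0.333..., 'B': 0.5}: group B's positives are missed more often
```

A persistent gap between groups in such a measure is the kind of inequity the included papers set out to expose.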



Publication History

Article published online:
06 July 2023

© 2023. IMIA and Thieme. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND 4.0) License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 