For example, knowing that coronary heart disease has traditionally been under-researched, underdiagnosed, and undertreated in women, could we develop algorithms that are specifically geared toward detecting or predicting female manifestations of the disease?
In an era of precision medicine, where we are increasingly moving from a one-size-fits-all to a personalized approach to healthcare, it’s an interesting avenue for further research – which should of course fully respect ethical, legal, and regulatory boundaries around the use of sensitive personal data.
In the future, I can imagine it will be possible to tune an algorithm to a specific target patient population for optimal and bias-free performance. This could be another step towards a fairer and more inclusive future for healthcare – supported by AI that not only acknowledges variation between different patient groups, but is designed to capture it.
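Before an algorithm can be tuned to a specific patient population, its performance first has to be measured separately for each subgroup, since an aggregate metric can hide exactly the kind of disparity described above. As a minimal illustration, the sketch below computes a classifier's sensitivity (recall) per subgroup; the groups, labels, and predictions are made-up placeholders, not real clinical data.

```python
# Sketch: auditing a model's sensitivity (recall) separately per patient
# subgroup. All data below is illustrative, not from a real study.

from collections import defaultdict

def recall_by_group(groups, y_true, y_pred):
    """Return recall (true-positive rate) computed separately per subgroup."""
    tp = defaultdict(int)  # true positives per group
    fn = defaultdict(int)  # false negatives (missed cases) per group
    for g, t, p in zip(groups, y_true, y_pred):
        if t == 1:
            if p == 1:
                tp[g] += 1
            else:
                fn[g] += 1
    return {g: tp[g] / (tp[g] + fn[g]) for g in tp}

# Hypothetical audit: the model misses more true cases in group "B",
# a gap that would be invisible in the overall recall figure.
groups = ["A", "A", "A", "B", "B", "B"]
y_true = [1, 1, 0, 1, 1, 1]
y_pred = [1, 1, 0, 1, 0, 0]
print(recall_by_group(groups, y_true, y_pred))
```

A gap like this between groups is the kind of signal that could then motivate retraining or recalibrating the model on better-represented data for the underperforming population.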
Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447–453. https://doi.org/10.1126/science.aax2342
HIMSS (2019), commissioned by Philips. Artificial intelligence and machine learning in healthcare. Survey conducted among 234 respondents in US healthcare organizations.
Goddard, K., Roudsari, A., & Wyatt, J. (2012). Automation bias: a systematic review of frequency, effect mediators, and mitigators. JAMIA, 19(1), 121–127. https://doi.org/10.1136/amiajnl-2011-000089
Feldman, S., Ammar, W., Lo, K., et al. (2019). Quantifying sex bias in clinical studies at scale with automated data extraction. JAMA Network Open, 2(7), e196700. https://doi.org/10.1001/jamanetworkopen.2019.6700
Redwood, S., & Gill, P.S. (2013). Under-representation of minority ethnic groups in research: call for action. Br J Gen Pract, 63(612), 342–343. https://doi.org/10.3399/bjgp13X668456
Popejoy, A., & Fullerton, S. (2016). Genomics is failing on diversity. Nature, 538(7624), 161–164. https://doi.org/10.1038/538161a
Adamson, A., & Smith, A. (2018). Machine learning and health care disparities in dermatology. JAMA Dermatology, 154(11), 1247–1248. https://doi.org/10.1001/jamadermatol.2018.2348
Röösli, E., Rice, B., & Hernandez-Boussard, T. (2020). Bias at warp speed: how AI may contribute to the disparities gap in the time of COVID-19. JAMIA, ocaa210. https://doi.org/10.1093/jamia/ocaa210
Cirillo, D., Catuara-Solarz, S., Morey, C., et al. (2020). Sex and gender differences and biases in artificial intelligence for biomedicine and healthcare. npj Digit. Med., 3, 81. https://doi.org/10.1038/s41746-020-0288-5
Goulart, B., Silgard, E., Baik, C., et al. (2019). Validity of natural language processing for ascertainment of EGFR and ALK test results in SEER cases of Stage IV Non-Small-Cell Lung Cancer. JCO Clinical Cancer Informatics, 3, 1–15. https://doi.org/10.1200/CCI.18.00098
Mikhail, G.W. (2005). Coronary heart disease in women. BMJ, 331(7515), 467–468. https://doi.org/10.1136/bmj.331.7515.467
[i] Based on their analysis, the researchers later worked with the algorithm's developers to reduce the racial bias.