What if a physician assumed her patient was healthy, just because he seldom came to the clinic? Researchers [uncovered serious flaws](https://www.science.org/doi/10.1126/science.aax2342) in an artificial intelligence (AI) tool used by a UnitedHealthcare unit, which consistently ranked Black patients as healthier than white patients with the same conditions, not because they were healthier, but because they incurred lower healthcare costs. The algorithm failed to recognize that lower spending was driven by barriers to healthcare access.
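The mechanism is easy to reproduce. The sketch below is a toy simulation (all names and numbers are hypothetical, not from the study): two groups have identical underlying health need, but one group's observed spending is suppressed by access barriers. Any model trained to rank patients by predicted *cost* will then score the underserved group as "healthier," exactly the proxy problem the researchers described.

```python
import random

random.seed(0)

def simulate_patient(access_barrier):
    """Synthetic patient: true health need is drawn from the same
    distribution for everyone, but observed cost is suppressed when
    access to care is limited (hypothetical 0.6 multiplier)."""
    need = random.gauss(50, 10)              # true underlying health need
    access = 0.6 if access_barrier else 1.0  # assumed access-to-care factor
    cost = need * access * 100               # observed spending: the proxy label
    return need, cost

group_a = [simulate_patient(access_barrier=False) for _ in range(1000)]
group_b = [simulate_patient(access_barrier=True) for _ in range(1000)]

def avg(values):
    values = list(values)
    return sum(values) / len(values)

# Both groups have essentially the same average true need...
print("avg need:", avg(n for n, _ in group_a), avg(n for n, _ in group_b))

# ...but a cost-trained model sees group B spending far less, and so
# would rank its members as "healthier" than equally sick group A members.
print("avg cost:", avg(c for _, c in group_a), avg(c for _, c in group_b))
```

The flaw is not in the statistics; the model predicts cost accurately. The flaw is choosing cost as a stand-in for health in a system where spending tracks access, not need.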
This is not an isolated finding; it is a warning about a larger problem affecting AI in healthcare. Unless designed with meaningful patient and community input from the start, AI risks excluding the most vulnerable people and replicating existing biases like the one above.
## The Urgent Need for AI in Healthcare
Never has AI been more needed in healthcare, as [Medicaid and other health programs are slashed](https://www.npr.org/2025/07/16/1255755387/1a-07-16-2025), jeopardizing health coverage for more than 10 million of the most vulnerable Americans. Recently, the Trump Administration unveiled an AI Action Plan, but, [per the Brookings Institution](https://www.brookings.edu/articles/what-to-make-of-the-trump-administrations-ai-action-plan/), it failed to include “mechanisms such as co-creation [and] participatory design…” to “serve citizens and humanity in fair, transparent, and accountable ways.”
I’ve spent more than three decades designing and testing global public health interventions and conducting research funded extensively by the National Institutes of Health. My expertise lies in working in close partnership with communities, including people with lived experience, throughout intervention analysis, design, implementation, publication, and presentation of the findings.
## Why the Lack of Patient Input?
When I see how AI is developing without patient input, I’m concerned. Unfortunately, when it comes to AI, those most impacted are rarely invited to help shape the technologies deciding their futures. A [2024 scoping review](https://journals.plos.org/digitalhealth/article?id=10.1371%2Fjournal.pdig.0000561) of 10,880 articles describing AI or machine learning healthcare applications found that fewer than 0.2% included any form of community engagement. Over 99% of so-called health “innovations” were created without consulting the people most affected by them.
In contrast, traditional health technologies like medical devices involve [patients in the process close to half of the time](https://mdic.org/resources/patient-engagement-in-clinical-trials-survey-report/). Devices like this [insulin pump](https://www.medtronicdiabetes.com/products/minimed-780g-system) and this [cardiac monitor](https://www.cardiovascular.abbott/us/en/hcp/products/cardiac-rhythm-management/insertable-cardiac-monitors.html) must undergo rigorous [FDA review](https://www.fda.gov/medical-devices/510k-clearances/medical-device-safety-and-510k-clearance-process), including [clinical validation and user testing](https://www.fda.gov/regulatory-information/reviews-and-research/fda-research-funding-frequently-asked-questions-faqs).