Mitigating Bias for Medical Devices

Apr 14, 2021 | Compliance, Design Controls, Medical Devices, Regulatory

Human Factors Engineering is a major component of the Design Controls process for Medical Devices. Understanding how the users of your medical device will interpret the training and instructions for your product is just as important as making sure your device actually works. In addition, developers need to understand whether there are any inherent biases in their devices that could disproportionately affect one demographic over another. Last month, I wrote a blog about how the FDA was investigating pulse oximeters due to reported inaccurate readings in patients with darker skin pigmentation (you can read that blog here), but there are several more real-world examples.

Just this month, an assistant professor at the University of California, Los Angeles, published a perspective in the journal Science titled “Achieving fairness in medical devices”. Achuta Kadambi, of UCLA’s Department of Electrical and Computer Engineering, wrote that medical devices are subject to physical, computational, and interpretation biases that can disadvantage certain groups.1 Kadambi cites the pulse oximeter issue as one example of how patients with darker skin pigmentation can be affected; he also points out that hip implants designed with only male bone structure in mind may lead to adverse outcomes in women.2

In addition to physical biases, medical devices are also subject to computational and interpretation biases. These are especially important for devices embedded with Artificial Intelligence or Machine Learning (AI/ML). Kadambi illustrates this with an example of a chest X-ray algorithm trained on a database with a 60:40 split between images from men and women. Because the algorithm has more male chest images to learn from, the Software as a Medical Device (SaMD) could be biased and could potentially misdiagnose female patients. This concern with AI/ML data is not new to the FDA; just last year, the Agency warned that a lack of representative data on broader patient populations could lead to significant bias in AI/ML devices.3
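To make that concern concrete, below is a minimal, hypothetical Python sketch of the kind of check a developer might run during verification: tally the demographic composition of a training set and compare the model's sensitivity across subgroups. It is not taken from Kadambi's paper or from FDA guidance; the record structure, the sensitivity function, and the 10-point disparity threshold are all illustrative assumptions.

```python
from collections import Counter

# Hypothetical records: (patient_sex, has_finding, model_predicted_finding).
# In practice these would come from the device's curated validation dataset.
records = [
    ("male", True, True), ("male", True, True), ("male", False, False),
    ("male", True, True), ("male", False, False), ("male", True, False),
    ("female", True, False), ("female", True, True), ("female", False, False),
    ("female", True, False),
]

# 1. Dataset composition: a 60:40 (or worse) split is an early warning sign.
composition = Counter(sex for sex, _, _ in records)
total = sum(composition.values())
for sex, count in composition.items():
    print(f"{sex}: {count}/{total} ({count / total:.0%} of training data)")

# 2. Subgroup sensitivity: how often the model catches true findings per group.
def sensitivity(sex: str) -> float:
    positives = [(truth, pred) for s, truth, pred in records if s == sex and truth]
    if not positives:
        return float("nan")
    return sum(pred for _, pred in positives) / len(positives)

sens = {sex: sensitivity(sex) for sex in composition}
print("Sensitivity by subgroup:", sens)

# 3. Flag a disparity larger than an illustrative threshold (10 percentage points).
DISPARITY_THRESHOLD = 0.10
gap = max(sens.values()) - min(sens.values())
if gap > DISPARITY_THRESHOLD:
    print(f"WARNING: {gap:.0%} sensitivity gap across subgroups -- investigate dataset bias")
```

Real premarket evidence would involve far larger, representative datasets and formal statistical analysis, but the underlying idea is the same one the FDA's warning points toward: measure performance separately for each demographic subgroup rather than only in aggregate.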

As technology advances and medical devices become more sophisticated, manufacturers can expect to see increased regulatory focus on Human Factors Engineering, specifically on bias toward certain demographics. This is a realm of Design Controls that until now has not been heavily explored. If you need help with your Design & Development process, EMMA International’s Design Controls experts are ready to assist! Give us a call at 248-987-4497 or email info@emmainternational.com to learn more today.


1Kadambi (April 2021) Achieving fairness in medical devices, retrieved on 04/11/2021 from: https://science.sciencemag.org/content/372/6537/30

2Hartman, MD, et al. (2009) Gender Issues in Total Hip Arthroplasty: Length, Offset, and Osteoporosis, Seminars in Arthroplasty, 20(1) 62-65.

3Slabodkin (Oct 2020) FDA, Philips warn of data bias in AI, Machine Learning devices, retrieved on 04/11/2021 from: https://www.medtechdive.com/news/fda-philips-warn-of-data-bias-in-ai-machine-learning-devices/587734/

Madison Green

Director of Technical Operations - Mrs. Green serves as EMMA International’s Director of Technical Operations. She has experience in technical writing, nonconforming product management, issue evaluations, and implementing corrective and preventive actions in the pharmaceutical and medical device industries. She has worked cross-functionally between R&D, lean manufacturing operations, and RA compliance. Mrs. Green also has academic and work experience with human health-risk engineering controls, physiological biophysics, and clinical research. Mrs. Green holds a Bachelor of Science in Biosystems Engineering with a concentration in Biomedical Engineering from Michigan State University. She is also a Certified Quality Auditor (CQA) and is currently pursuing her M.S. in Quality Management.
