Who Does Your Medical AI Miss? Age Bias in Tumor Segmentation

If your AI segments tumors, who is it serving—and who might it be failing? A new audit of the MAMA-MIA breast cancer segmentation dataset examines fairness not in classification, but in segmentation—the step that often feeds every downstream decision.

  • Researchers evaluated automated segmentation quality by age, ethnicity, and data source.
  • They found a persistent age bias: younger patients received lower-quality segmentations, even after controlling for site.
  • They hypothesize physiological differences may make younger cases harder—challenging for both radiologists and algorithms.
  • Combining data from multiple sites can reshape or hide site-specific ethnic biases, so granular analysis matters.
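A subgroup audit like the one described can be sketched in a few lines: compute an overlap score (Dice is standard for segmentation) per case, then aggregate by demographic group or site. This is a minimal illustration, not the paper's code; the function names and the age-bin labels are hypothetical.

```python
def dice(pred, truth):
    """Dice overlap between two flat binary masks (lists of 0/1)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    denom = sum(pred) + sum(truth)
    return 2.0 * inter / denom if denom else 1.0  # empty masks count as perfect agreement

def dice_by_group(scores, groups):
    """Mean Dice per subgroup label (e.g. age bin or site ID)."""
    out = {}
    for g in set(groups):
        vals = [s for s, gg in zip(scores, groups) if gg == g]
        out[g] = sum(vals) / len(vals)
    return out
```

Running `dice_by_group` twice, once keyed on age bin and once on site, is what lets you separate a genuine age effect from a site artifact, which is the control the authors describe.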

Why it matters: segmentation bias can compound across clinical workflows and even amplify during model iteration. The takeaway is simple: audit labels and models by subgroup, report performance by site and demographics, and don’t assume fairness transfers across datasets.

Paper: http://arxiv.org/abs/2510.27421v1 • Authors: Aditya Parikh, Sneha Das, Aasa Feragen

Register: https://www.AiFeta.com

#AI #Healthcare #MedicalImaging #BreastCancer #Fairness #Bias #Radiology #Segmentation #MachineLearning #Ethics
