Bias and Diversity Working Group
Last updated August 12, 2024
Diverse data collection and curation strategies, as well as the mitigation of bias in data analysis within the MIDRC commons, are critically important to yield ethical AI algorithms that produce trustworthy results for all groups. MIDRC strives to mitigate bias in its study population, data collection, curation, and analysis.
Check out the Bias and Diversity Working Group’s resources page. A bias awareness tool is available to help researchers identify and mitigate biases that may arise in the AI/ML development pipeline, and an open-source diversity calculator is available on GitHub.
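As an illustration of the kind of measurement a diversity calculator can perform, the sketch below compares a study cohort's demographic proportions against a reference population using the Jensen-Shannon distance. This is a generic example, not the actual MIDRC tool's API; the function name, the attribute categories, and all proportions shown are hypothetical.

```python
import math

def jensen_shannon_distance(p, q):
    """Jensen-Shannon distance between two discrete distributions
    (base-2 logs: 0 = identical, 1 = maximally different)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms.
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

# Illustrative (made-up) proportions for one demographic attribute,
# e.g. four self-reported race/ethnicity categories.
cohort = [0.70, 0.15, 0.10, 0.05]
population = [0.60, 0.18, 0.13, 0.09]

print(f"JS distance: {jensen_shannon_distance(cohort, population):.3f}")
```

A distance near 0 suggests the cohort mirrors the reference population on that attribute; larger values flag under- or over-representation worth investigating before training or evaluating a model.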
Members:
Karen Drukker, PhD (lead), University of Chicago
Judy Gichoya, PhD (lead), Emory University
Weijie Chen, PhD, US Food and Drug Administration
Maryellen Giger, PhD, University of Chicago
Nick Gruszauskas, PhD, University of Chicago
Jayashree Kalpathy-Cramer, PhD, University of Colorado
Hui Li, PhD, University of Chicago
Erin Mueller, University of Chicago
Kyle Myers, PhD, Puente Solutions
Rui Carlos Pereira De Sá, PhD, NIH
Robert Tomek, University of Chicago
Heather Whitney, PhD, University of Chicago
Zi Jill Zhang, MD, University of Pennsylvania