Session: Jurity: State of the Art Open Source Software for AI Fairness Evaluation

There are many open source software packages for AI fairness testing. All popular packages, however, require protected group membership for each person in the data. Protected group membership is often based on information most people consider private, e.g. race or gender identity. Companies often do not collect this data, and many individuals are reluctant to share it, even with companies they trust. This lack of data prevents many practitioners and researchers from testing their AI models for unwanted bias.

In this session, we’ll introduce a state-of-the-art method for calculating fairness tests without collecting personal data on individuals, and demonstrate its implementation in Jurity, an open source fairness testing package maintained by Fidelity Investments. First presented at the 2023 Learning and Intelligent Optimization (LION) conference in Lyon, France, this technique has been validated on both simulated and real-world data. Along the way, we’ll discuss how the open source community supports fairness evaluation, Jurity’s unique contributions to that community, and where opportunities still exist for open source developers to support AI fairness evaluation.
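To make the idea concrete, here is a minimal sketch of one way fairness can be estimated without individual-level protected attributes: instead of knowing each person's group, we know only a surrogate class for each person (e.g. a geographic area) and an aggregate likelihood of protected-group membership for that class. This is an illustrative toy, not Jurity's actual API; the function name, inputs, and weighting scheme are all assumptions made for this example.

```python
# Illustrative sketch only -- NOT Jurity's API. Assumes we have, for each
# surrogate class (e.g. a zip code), an aggregate probability that one of
# its members belongs to the protected group.
import numpy as np

def estimated_statistical_parity(predictions, surrogates, membership_probs):
    """Estimate the statistical parity difference (positive-prediction rate
    for the protected group minus the rate for everyone else) using only
    surrogate-class membership likelihoods, never individual labels.

    predictions      : iterable of 0/1 model outputs, one per individual.
    surrogates       : iterable of surrogate-class ids, one per individual.
    membership_probs : dict mapping surrogate id -> P(protected | surrogate).
    """
    preds = np.asarray(list(predictions), dtype=float)
    probs = np.array([membership_probs[s] for s in surrogates], dtype=float)

    # Weight each prediction by the likelihood its author belongs to each
    # group, then normalize to get expected per-group positive rates.
    protected_rate = (preds * probs).sum() / probs.sum()
    other_rate = (preds * (1.0 - probs)).sum() / (1.0 - probs).sum()
    return protected_rate - other_rate
```

As a sanity check, when the membership probabilities are all 0 or 1 the estimate collapses to the ordinary statistical parity difference computed from known group labels; the interesting case is intermediate probabilities, where the estimate blends surrogate-level aggregates.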

Presenters: