Aias: Fighting Systemic Bias in Artificial Intelligence Models

Aias

Tools for developers that help reduce systemic biases in Artificial Intelligence models during development.

HMW: How might we give developers the tools to reduce the perpetuation of systemic biases (racial and otherwise) while developing Artificial Intelligence products?

Solution: Aias uses natural language processing to flag at-risk columns in datasets. Using standard cross-validation procedures, Aias isolates the at-risk data and measures the level of bias in the predictions made by the artificial intelligence model. The results are published online, so developers take responsibility for biased code, fostering a culture of transparency and awareness.
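The two-step idea above can be sketched in a few lines of Python. This is a minimal illustration, not the actual Aias implementation: the function names, the keyword list, and the demographic-parity metric are all assumptions. A real system would use NLP (e.g. embeddings or synonym matching) rather than keyword lookup, and would measure bias across cross-validation folds rather than on a single split.

```python
# Hypothetical sketch of an Aias-style bias check (illustrative only).

# Assumed list of terms that mark a column as a protected attribute.
SENSITIVE_TERMS = {"race", "ethnicity", "gender", "sex", "religion", "age", "nationality"}

def flag_at_risk_columns(columns):
    """Flag columns whose names suggest protected attributes.
    Approximates the NLP step with a simple keyword lookup."""
    return [c for c in columns if any(t in c.lower() for t in SENSITIVE_TERMS)]

def demographic_parity_gap(groups, predictions):
    """Measure bias as the gap in positive-prediction rates between groups."""
    by_group = {}
    for g, p in zip(groups, predictions):
        by_group.setdefault(g, []).append(p)
    rates = {g: sum(ps) / len(ps) for g, ps in by_group.items()}
    return max(rates.values()) - min(rates.values())

# Example: one column name hints at a protected attribute.
columns = ["income", "zip_code", "self_reported_gender", "loan_approved"]
print(flag_at_risk_columns(columns))  # ['self_reported_gender']

# Example: group A receives positive predictions twice as often as group B.
groups = ["A", "A", "B", "B"]
preds = [1, 1, 1, 0]
print(demographic_parity_gap(groups, preds))  # 0.5
```

In practice the gap would be averaged over cross-validation folds and compared against a tolerance before the result is published.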