Tools for developers that help reduce systemic biases in Artificial Intelligence models during development.
HMW: How might we give developers the tools to reduce the perpetuation of systemic biases (racial and otherwise) while developing Artificial Intelligence products?
Solution: Aias uses Natural Language Processing to check for at-risk columns in datasets. Using standard cross-validation procedures, Aias isolates the at-risk data and measures the level of bias in the predictions made by the artificial intelligence model. These results are published online so that developers are held accountable for biased code, creating a culture of transparency and awareness.
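
A minimal sketch of the workflow described above, assuming tabular data in a pandas DataFrame. The function names (`flag_at_risk_columns`, `bias_report`), the keyword list, and the choice of demographic parity gap as the bias measure are illustrative assumptions, not Aias's actual implementation:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical stand-in for the NLP step: flag columns whose names
# suggest they encode a protected attribute.
AT_RISK_KEYWORDS = {"race", "ethnicity", "gender", "sex", "age", "religion"}

def flag_at_risk_columns(df: pd.DataFrame) -> list[str]:
    """Return column names that match a protected-attribute keyword."""
    return [c for c in df.columns
            if any(k in c.lower() for k in AT_RISK_KEYWORDS)]

def bias_report(df: pd.DataFrame, target: str, model=None) -> dict[str, float]:
    """Cross-validate a model trained without the at-risk columns, then
    compare positive-prediction rates across the groups each at-risk
    column defines (demographic parity gap)."""
    model = model or LogisticRegression(max_iter=1000)
    at_risk = flag_at_risk_columns(df)
    # Isolate at-risk data: train only on the remaining features.
    X = pd.get_dummies(df.drop(columns=[target] + at_risk))
    y = df[target]
    preds = cross_val_predict(model, X, y, cv=5)  # out-of-fold predictions
    report = {}
    for col in at_risk:
        # Positive-prediction rate per group; the gap is our bias score.
        rates = pd.Series(preds).groupby(df[col].values).mean()
        report[col] = float(rates.max() - rates.min())
    return report
```

A gap near 0 for a column would suggest the model treats its groups similarly; a large gap flags predictions worth publishing for review.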