Equal treatment of individuals by AI models is becoming increasingly important given the demands of modern society. Potential discrimination against, or favoritism of, specific groups of individuals is a common perspective for evaluating model behavior. However, most available fairness tools require human intervention to select the subgroups of interest, and therefore expert knowledge. We propose a new tool, the ASDF-Dashboard, which automates the process of subgroup fairness assessment. It automates subgroup detection with a method based on unsupervised clustering and pattern extraction, making the tool usable for non-expert users as well.
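The general idea can be illustrated with a short sketch: cluster the feature space without supervision, then compare a fairness metric per cluster against the overall population to flag candidate subgroups. The Python snippet below is a minimal illustration under assumed choices (scikit-learn's KMeans, a statistical-parity-style positive-rate comparison, and a hypothetical `flag_threshold`); it is not the dashboard's actual pipeline, which additionally extracts describing patterns for the detected clusters.

```python
# Minimal sketch of clustering-based subgroup fairness screening.
# Assumptions (not the ASDF-Dashboard's actual implementation): KMeans as the
# clustering algorithm, the positive-prediction rate as the fairness metric,
# and an illustrative flag_threshold for a "suspicious" deviation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler


def screen_subgroups(X, y_pred, n_clusters=5, flag_threshold=0.1, seed=0):
    """Cluster the feature space and compare each cluster's positive-
    prediction rate to the overall rate; large deviations mark candidate
    subgroups for a closer fairness assessment."""
    y_pred = np.asarray(y_pred)
    X_scaled = StandardScaler().fit_transform(X)  # clustering is scale-sensitive
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(X_scaled)
    overall_rate = float(np.mean(y_pred))

    report = []
    for c in range(n_clusters):
        mask = labels == c
        rate = float(np.mean(y_pred[mask]))
        report.append({
            "cluster": c,
            "size": int(mask.sum()),
            "positive_rate": rate,
            "disparity": rate - overall_rate,
            "flagged": abs(rate - overall_rate) > flag_threshold,
        })
    return report
```

In this sketch, a flagged cluster only indicates a candidate subgroup; deciding whether the disparity actually constitutes unfair treatment still requires inspecting the cluster's characteristics, which is what the pattern-extraction step supports.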
For the scientific details and experimental results, see our paper:

Schäfer, J. & Wiese, L. (2022). Clustering-based Subgroup Detection for Automated Fairness Analysis. In European Conference on Advances in Databases and Information Systems (pp. 45-55). Springer, Cham.
The ASDF-Dashboard is available open source on GitHub. You can also install and deploy the application on your own machine as a self-hosted instance by following the instructions in the repository. A self-hosted instance is useful, for example, if you want to keep sensitive data on your own machine or network, or if you want to share the application with your team. This way you can restrict access to the web service and, e.g., configure a different or unlimited disk quota.