Source: wikibot/fairness-measure

= Fairness measure
{wiki=Fairness_measure}

Fairness measures are metrics and methodologies used to assess and promote fairness in algorithms, machine learning models, and decision-making processes. They evaluate whether a system behaves impartially and equitably across different groups or individuals, particularly in scenarios involving sensitive attributes such as race, gender, age, and socioeconomic status.
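To make this concrete, one widely used group fairness measure is the statistical parity difference: the gap in positive-decision rates between two groups, where zero indicates parity. The sketch below uses entirely synthetic decision data and hypothetical group labels "A" and "B"; it is an illustration of the idea, not a reference implementation.

```python
def selection_rate(outcomes):
    """Fraction of positive (1) decisions in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes)

def statistical_parity_difference(outcomes_by_group, group_a, group_b):
    """Difference in selection rates between two groups; 0.0 means parity."""
    return (selection_rate(outcomes_by_group[group_a])
            - selection_rate(outcomes_by_group[group_b]))

# Synthetic example: binary decisions (1 = favorable outcome) for two
# hypothetical demographic groups.
decisions = {
    "A": [1, 1, 0, 1, 0, 1, 1, 0],  # selection rate 5/8 = 0.625
    "B": [1, 0, 0, 1, 0, 0, 1, 0],  # selection rate 3/8 = 0.375
}

spd = statistical_parity_difference(decisions, "A", "B")
print(spd)  # prints 0.25
```

A positive value here means group "A" receives favorable outcomes at a higher rate than group "B"; other measures (e.g., equalized odds) instead compare error rates conditioned on the true outcome.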