Equal Rights Trust launches call for evidence of AI and algorithmic discrimination

The use of artificial intelligence (AI) and algorithmic decision-making is spreading and rapidly transforming our daily lives. These technologies are now deployed in a wide range of sectors and in many areas of public and private decision-making. As their use continues to proliferate, our understanding of their impact on human rights is beginning to emerge and cohere. This is particularly true of their impact on the enjoyment of the rights to equality and non-discrimination.

AI and algorithmic technologies can cause different forms of discrimination at different points in their development, implementation and operation. Discrimination can arise from biases or omissions in the data used to create or train an algorithm, from the way in which an algorithm is coded, from the operation of an algorithm when decisions are made, or from other impacts resulting from the implementation or operation of these technologies.

These systems can create, perpetuate and replicate discrimination on grounds including but not limited to sex, gender, race, ethnicity, disability, age and nationality, and at the intersection of grounds. They can produce, and have produced, directly and indirectly discriminatory results in areas including law enforcement, criminal justice, employment, education, healthcare, banking and loans, and online content moderation. Lack of transparency means that it can be difficult or even impossible to uncover whether or how these technologies have caused or perpetuated discrimination.

In response to the clear and growing evidence that these technologies are having discriminatory impacts, we are launching a global call for evidence of discrimination arising from the use of AI and algorithmic systems.

We are seeking information and evidence of cases or patterns of both actual and emerging, anticipated or potential discrimination arising from the use of AI and algorithmic systems. Cases or patterns may be relevant to any or multiple countries, but we are particularly interested in evidence from those working outside the Global North and West. Further information and an Annex with guidance on the evidence sought are available in English and Spanish.

The evidence we gather will contribute to a new initiative developed by the Equal Rights Trust in collaboration with Mary Kay Inc. The initiative aims to shine a light on the discriminatory impacts of AI and algorithmic technologies, and to develop recommendations to ensure that safeguards for the rights to equality and non-discrimination are integrated into the global processes governing their development, deployment and use.

If you would like to receive any further guidance or discuss the examples you have in mind, please contact Ellie McDonald, Advocacy Officer at the Equal Rights Trust: ellie.mcdonald@equalrightstrust.org.