Are algorithms sexist? • (English interpretation) • FIFDH 2021

Artificial intelligence (AI) algorithms do exactly what they are programmed to do. In doing so, they mirror the society that built them and suffer from its biases, rooted in gender stereotypes and racism.

Facial recognition programmes struggle when applied to racialised people and women. Why? Because the algorithms share the biases of their programmers. The algorithms underlying countless everyday applications do not possess the neutrality we attribute to them: they reproduce the prejudices of their creators and, more broadly, of society. With the Covid-19 pandemic, the collection of citizens' personal data has intensified. This state surveillance, coupled with algorithmic bias, poses serious threats to democracy and human rights.
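One way to make such bias visible is to audit a system's error rates separately for each demographic group. The sketch below is purely illustrative and not drawn from any system discussed in the debate: the predictions, labels, and group names are invented to show how a per-group error-rate comparison could be computed.

```python
# Hypothetical sketch: auditing a classifier's error rate by demographic group.
# All data below (labels, predictions, group tags) is invented for illustration.
from collections import defaultdict

# (true_label, predicted_label, group) triples from an imaginary face-matching test set
results = [
    (1, 1, "group_a"), (1, 0, "group_a"), (1, 1, "group_a"), (1, 1, "group_a"),
    (1, 0, "group_b"), (1, 0, "group_b"), (1, 1, "group_b"), (1, 0, "group_b"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for true, pred, group in results:
    totals[group] += 1
    if true != pred:
        errors[group] += 1

for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group}: error rate {rate:.0%} ({errors[group]}/{totals[group]})")
```

A large gap between the per-group error rates, as in this toy example, is the kind of disparity researchers have documented in commercial facial recognition systems.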

Co-presented with Service Agenda 21 – Ville durable de la Ville de Genève, in the context of the Semaine de l'égalité 2021 titled "Elle-x-s sont dans la place !", and the Bureau de promotion de l'égalité et de prévention des violences (BPEV).

The debate will be held in French and English with simultaneous English interpretation.

Moderator: Mehdi Atmani, freelance investigative journalist and founder of the editorial production agency Flypaper

Speakers
• Isabelle Collet, Professor of Educational Sciences at the University of Geneva
• Julia Kloiber, Co-founder of Superrr Lab, a research and advocacy organization dedicated to promoting the ethical use of technology and the empowerment of people
• Teresa Scantamburlo, Post-doctoral researcher at the European Centre for Living Technology (ECLT) and co-founder of the AI4EU Observatory on Society and AI