DISMANTLING DISCRIMINATION

Dismantling the engineered bias and discrimination embedded in Artificial Intelligence (AI) algorithms, and avoiding the same errors in the future, can only be achieved by diversifying the AI workforce and the make-up of the research community (3 min read).


Machine technologies come with embedded discrimination, whether intended or not, because the majority of those who created the underpinning algorithms are male; and of those men, many are white and American- or European-educated.

Dismantling the bias already embedded within existing code and engineering, particularly within legacy equipment, will be one of the biggest challenges of the coming generation. It is important to stress that although some AI-enabled technologies have a life cycle of only a few months before the next iteration, the code they are built upon can stretch back years, occasionally decades.

Unconscious Bias

Unconscious bias stems from the biases that people internalise simply by being part of a society, group or family. Although individuals may try hard to challenge their own biases, in reality it is difficult. In tech, these biases feed through into code, and can have a devastating effect. The tech industry has been slow to deal with the issue of bias, and the education systems in the USA, Europe and Asia have also failed to train a more diverse workforce.

Bias Concerns

There have been a number of high-profile ‘embarrassments’ around bias that have made it into the public domain, including:

  • sentencing algorithms piloted in US courts that were statistically more inclined to discriminate against people of color;
  • Amazon’s experimental hiring tool for ranking job candidates, which began to downgrade applicants who had attended all-women colleges, or resumes with the word ‘women’s’ in them.
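The Amazon example shows how bias can be learned without gender ever being an explicit input. A minimal, hypothetical sketch (not Amazon’s actual system, and with invented toy data): a naive resume scorer trained on a skewed history of hiring decisions will assign negative weight to words that merely correlate with women’s resumes.

```python
# Hypothetical sketch: a crude resume-word scorer trained on historical
# hiring decisions. The data below is invented purely for illustration;
# it reflects a past in which women's resumes were disproportionately
# rejected, so proxy words inherit that bias.

from collections import Counter

# Toy "historical" data: (set of resume words, hired?) pairs.
history = [
    ({"captain", "chess", "club"}, True),
    ({"lead", "engineer"}, True),
    ({"captain", "women's", "chess", "club"}, False),
    ({"women's", "coding", "society"}, False),
    ({"coding", "society"}, True),
]

def word_weights(history):
    """Weight = P(word | hired) - P(word | rejected): a crude learned signal."""
    hired = [w for words, ok in history if ok for w in words]
    rejected = [w for words, ok in history if not ok for w in words]
    h, r = Counter(hired), Counter(rejected)
    vocab = set(h) | set(r)
    return {w: h[w] / max(len(hired), 1) - r[w] / max(len(rejected), 1)
            for w in vocab}

weights = word_weights(history)

# Gender was never an input, yet the proxy term carries the historical bias:
assert weights["women's"] < 0
```

The point of the sketch is that removing the protected attribute is not enough: any correlated word or feature quietly re-encodes the skew in the training labels.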

Such examples point to the potential for wide-scale damage if the question of bias is not tackled head-on at this still-early point in the AI revolution:

“As AI systems are embedded in more social domains, they are playing a powerful role in the most intimate aspects of our lives: our health, our safety, our education, and our opportunities… It’s essential that we are able to see and assess the ways that these systems treat some people differently than others, because they already influence the lives of millions.”

Tackling bias in an AI sector that’s already too ‘pale, male and stale’

Personal Experience

A personal experience of blatant bias came when I was shown the back office of a major UK bank in the late 1990s. The hosts took great delight in informing my colleagues and me about the algorithm that charged higher interest rates and made access to credit more difficult for people in the most deprived postcodes. The example they chose was SE15, Peckham: my own postcode! The sniggering stopped abruptly when I mentioned where I lived; the bank did not win the business.
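The kind of rule described above is trivially easy to write down, which is part of the problem. A hedged reconstruction, with the postcodes, rates and penalty table all invented for illustration (this is not the bank’s actual system):

```python
# Hypothetical sketch of postcode-based credit pricing: extra interest is
# loaded onto a hard-coded list of "deprived" postcode districts. All
# figures and districts here are invented for illustration.

BASE_RATE = 0.08  # assumed 8% baseline APR

# Invented penalty table keyed on the outward code (e.g. "SE15" for Peckham).
POSTCODE_PENALTY = {
    "SE15": 0.04,
    "E14": 0.03,
}

def quoted_rate(postcode: str) -> float:
    """Return the quoted APR: the applicant's address alone moves the price."""
    district = postcode.split()[0].upper()
    return BASE_RATE + POSTCODE_PENALTY.get(district, 0.0)

# Two otherwise identical applicants are priced differently purely by address:
assert quoted_rate("SE15 4QL") > quoted_rate("SW1A 1AA")
```

Once such a rule is buried in legacy back-office code, it can keep discriminating for years after the people who wrote it have moved on, which is exactly the dismantling problem this article describes.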

Innovation Hub

Raising awareness, and encouraging events aimed specifically at the issue of bias, will be one of the aims of the Innovation Hub. It is worth noting that the EU classifies ‘consumers’ as ‘vulnerable users’; in other words, everyone will be affected if this issue is not addressed.

“This also requires adequate respect for potentially vulnerable persons and groups, such as workers, women, persons with disabilities, ethnic minorities, children, consumers or others at risk of exclusion.” (Ethics Guidelines for Trustworthy AI, EU)

Further Reading

Tackling bias in an AI sector that’s already too ‘pale, male and stale’, Stuart Lauchlan, Diginomica, April 22, 2019

Tech Still Doesn’t Get Diversity. Here’s How To Fix It, Michael Connor, WIRED, 02.08.17

Gender Diversity in AI Research, Konstantinos Stathoulopoulos, Juan Mateos-Garcia and Hannah Owen, Nesta, July 17, 2019

Regulating Ethical Artificial Intelligence, Clayton Rice, Q.C., June 28, 2019

Ada Lovelace | Music by Numbers, Prof. David de Roure, BBC Music Magazine, January 2019

John M

Image: Breaking Away Berlin | John McKiernan

Related Articles: AI Hub, Creating A Successful Hub and Elizabeth Garrett Anderson
