What Are the Risks of Discrimination by Algorithm?

Not always fair: When humans are evaluated by algorithms, care must be taken. (Image: via Patrick Langer, KIT)

Not only companies but also state institutions increasingly rely on automated decisions made by algorithm-based systems. Such systems save time and money, but they also carry many risks of individuals or population groups being discriminated against.

This is the result of a study conducted by the Institute for Technology Assessment and Systems Analysis (ITAS) at Karlsruhe Institute of Technology (KIT) on behalf of the Federal Anti-Discrimination Agency. Whether granting a loan, selecting new staff members, or making legal decisions: in an increasing number of sectors, algorithms are applied to prepare human decisions or to make these decisions for humans.

In an increasing number of sectors, algorithms are applied to prepare human decisions or to make these decisions for humans. (Image: via Pixabay)

Carsten Orwat, of the Institute for Technology Assessment and Systems Analysis (ITAS) of KIT, warns that such systems can treat people unequally on the basis of legally protected characteristics. These criteria include, in particular, age, gender, ethnic origin, religion, sexual orientation, and disability. On behalf of the Federal Anti-Discrimination Agency, Orwat studied in detail the causes of discrimination, its impact on society, and future options for reducing discrimination risks.

His study, entitled Diskriminierungsrisiken durch Verwendung von Algorithmen (Risks of Discrimination through the Use of Algorithms), lists 47 examples illustrating how algorithms can discriminate against people in various ways, and how this can be detected and proved.

Real estate, loans, judicial matters, and more: Various examples of discrimination risks

For example, Orwat describes situations in the real estate and loan markets and in the court system. In the U.S., several documented cases show that algorithms within social media platforms allowed targeted housing advertisements to be made invisible to groups protected by the “Fair Housing Act,” such as migrants, people with disabilities, or people of color, the author says.

In Finland, a bank was ordered to pay a fine because its algorithm for the automatic granting of online loans disadvantaged men relative to women and native Finnish speakers relative to native Swedish speakers. This unequal treatment is forbidden by Finnish anti-discrimination law.

When deciding on early releases from prison, U.S. judges use a much-disputed system that calculates risk scores. Journalists and human rights associations criticize the fact that this system systematically overestimates Black people’s risk of re-offending.

Algorithms based on machine learning pose a further risk, Carsten Orwat explained: if the data they are trained on reflect discriminatory assessments of people, the algorithms will adopt those assessments. This happened in the U.S. in a system for food and health inspections that was based on discriminatory ratings of restaurants.
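How this happens can be made concrete with a small sketch. The following Python example is purely hypothetical: the groups, the “skill” feature, and the size of the historical bias are invented for illustration. A classifier trained on labels that systematically downgraded one group learns to reproduce exactly that downgrade.

```python
# Hypothetical sketch: a classifier trained on biased historical labels
# reproduces the bias. Groups, feature, and bias term are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)    # 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, n)  # the genuinely relevant attribute

# Historical labels: skill is distributed identically in both groups,
# but past human assessors systematically rated group B down.
label = (skill - 0.8 * group > 0).astype(int)

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, label)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: positive decision rate = {pred[group == g].mean():.2f}")
# The model "learns" the historical downgrade of group B (roughly 0.50
# vs. 0.21 positive decisions), although both groups are equally skilled.
```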

Recommendations for countermeasures

However, society does not have to accept these unequal treatments. The study lists several options for counteracting discrimination by algorithms. “Preventative measures appear to be most reasonable,” Carsten Orwat says. Companies may ask anti-discrimination agencies to train their staff and IT experts and to raise their awareness, so that these people work with datasets that do not reflect discriminatory practices or unequal treatment.

According to Orwat, the goal is to make future algorithms “discrimination-free by design.” This means that programs have to be checked during their initial development.
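One way such a development-time check might look is sketched below. This is an illustrative assumption rather than a method from the study: the two fairness metrics, the function names, and the 0.05 threshold are all choices made here for the example, and a real audit would be more extensive.

```python
# Hypothetical sketch of a "discrimination-free by design" check that could
# run in a development test suite. Metrics, names, and the 0.05 threshold
# are illustrative assumptions, not part of the study.
import numpy as np

def demographic_parity_gap(pred: np.ndarray, group: np.ndarray) -> float:
    """Largest difference in positive-decision rates across groups."""
    rates = [pred[group == g].mean() for g in np.unique(group)]
    return float(max(rates) - min(rates))

def false_positive_gap(pred: np.ndarray, label: np.ndarray,
                       group: np.ndarray) -> float:
    """Largest difference in false positive rates across groups.
    Assumes every group contains at least one negative example."""
    fprs = [pred[(group == g) & (label == 0)].mean() for g in np.unique(group)]
    return float(max(fprs) - min(fprs))

def check_fairness(pred, label, group, max_gap=0.05):
    """Fail the build if the model treats groups too unequally."""
    dp = demographic_parity_gap(pred, group)
    fp = false_positive_gap(pred, label, group)
    assert dp <= max_gap, f"demographic parity gap {dp:.2f} exceeds {max_gap}"
    assert fp <= max_gap, f"false positive rate gap {fp:.2f} exceeds {max_gap}"

# Tiny deterministic usage: identical rates in both groups, so the check passes.
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
label = np.array([1, 1, 0, 0, 1, 1, 0, 0])
pred  = np.array([1, 1, 0, 0, 1, 1, 0, 0])
check_fairness(pred, label, group)
```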

In the end, it is all about protecting society’s values, such as equality and the free development of the personality. To guarantee this despite the very rapid development of “big data” and AI, Orwat points out, it will be necessary to improve anti-discrimination and data protection legislation in several respects.

Provided by: Karlsruhe Institute of Technology [Note: Materials may be edited for content and length.]
