Ethical Concerns

The promise of AI/ML technologies also raises important ethical and policy concerns. These concerns are clearly outlined in the work of Mittelstadt et al. (2016). Below, we build on their framework with some practical examples.

Inconclusive Evidence

Let us take the example of an algorithm used to predict the risk of heart failure. Algorithms are never 100% accurate: if the algorithm has an accuracy of 85%, then on average 15 patients out of every 100 will be incorrectly classified.
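The arithmetic above scales uncomfortably: the same error rate that sounds tolerable on 100 patients becomes very large at population scale. A minimal sketch, using the hypothetical figures from the example:

```python
def misclassified(n_patients: int, accuracy: float) -> int:
    """Expected number of incorrectly classified patients."""
    return round(n_patients * (1 - accuracy))

# 85% accuracy on 100 patients: 15 expected misclassifications.
print(misclassified(100, 0.85))        # 15

# The same model screening a population of one million patients:
print(misclassified(1_000_000, 0.85))  # 150000
```

Note too that the headline accuracy figure says nothing about how those errors split between false positives (healthy patients flagged) and false negatives (sick patients missed), which carry very different clinical costs.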

Inscrutable Evidence

An example here is that of “black boxes”, where the criteria that lead to a prediction are inscrutable and humans are left guessing. In healthcare, an AI system could learn to make predictions based on factors unrelated to the disease itself, such as the brand of MRI machine used, the time at which a blood test is taken, or whether a patient was visited by a chaplain.
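A toy illustration of how such spurious shortcuts arise (entirely synthetic data; the scanner brands and numbers are invented for the sketch): a model can score well by latching onto an incidental feature that merely correlates with the outcome in the training data.

```python
from collections import Counter

# Hypothetical records of (scanner_brand, diagnosis). Suppose sicker
# patients happen to be referred to the hospital that owns the "BrandA"
# scanner, so brand correlates with diagnosis for non-clinical reasons.
records = [("BrandA", "disease")] * 80 + [("BrandA", "healthy")] * 20 \
        + [("BrandB", "disease")] * 20 + [("BrandB", "healthy")] * 80

def predict(brand: str) -> str:
    """Predict the majority diagnosis seen with this scanner brand."""
    counts = Counter(d for b, d in records if b == brand)
    return counts.most_common(1)[0][0]

correct = sum(predict(b) == d for b, d in records)
print(correct / len(records))  # 0.8 -- with no clinical signal at all
```

A "model" this crude reaches 80% accuracy on its training data while using nothing medically meaningful, and such a shortcut would fail silently on data from a hospital with different scanners.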

Misguided Evidence

An example is the COMPAS algorithm, used in the US criminal justice system to calculate the likelihood of recidivism. An investigation conducted by ProPublica revealed that, because the data on which the algorithm is based are biased against Black people, it generally outputs higher risk scores for Black defendants than for White defendants.

Unfair Outcomes

An example is predictive policing. Because it often determines individuals’ threat levels by reference to commercial and social data, it can improperly link dark skin to higher threat levels. This can lead to more arrests for crimes in areas inhabited by people of color.

Transformative Effects

An example is profiling: the extensive collection of data that several companies conduct on their users. This practice is transforming the way we understand privacy, or the lack thereof, as our personal data become the fuel behind a company’s product.

Traceability

An example is that of self-driving cars, where the vehicle aims to replace the human operator by driving autonomously. Self-driving cars raise important questions of responsibility in the case of accidents where no human driver was behind the wheel.
