Remedies Against Bias In Analytics Systems

John S Edwards, Eduardo Rodriguez

Research output: Contribution to journal › Article

Abstract

Advances in IT offer the possibility of developing ever more complex predictive and prescriptive systems based on analytics. Organizations are beginning to rely on the outputs of these systems without inspecting them, especially when they are embedded in the organization’s operational systems. This reliance could be misplaced if the systems contain bias, which would be at best unethical and at worst illegal. Data, algorithms and machine learning methods are all potentially subject to bias.
In this article we explain the ways in which bias might arise in analytics systems, present some examples where this has happened, and give some suggestions as to how to prevent or reduce it.
We use a framework inspired by the work of Hammond, Keeney and Raiffa (1998, reprinted 2006) on psychological traps in human decision-making. Each of these traps “translates” into a potential type of bias for an analytics-based system.
Fortunately, this means that remedies to reduce bias in human decision-making also translate into potential remedies for algorithmic systems.
Original language: English
Journal: Journal of Business Analytics
Early online date: 29 Jun 2019
DOI: 10.1080/2573234X.2019.1633890
Publication status: E-pub ahead of print - 29 Jun 2019

Bibliographical note

This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Business Analytics on 29 June 2019, available online at: http://www.tandfonline.com/10.1080/2573234X.2019.1633890

Keywords

  • Algorithms
  • Artificial intelligence
  • Bias
  • Decision making
  • Psychological traps