Remedies Against Bias In Analytics Systems

John S Edwards, Eduardo Rodriguez

Research output: Contribution to journal › Article › peer-review


Advances in IT offer the possibility to develop ever more complex predictive and prescriptive systems based on analytics. Organizations are beginning to rely on the outputs from these systems without inspecting them, especially if they are embedded in the organization’s operational systems. This reliance could be misplaced if the systems contain bias, which could be at best unethical and at worst illegal. Data, algorithms and machine learning methods are all potentially subject to bias.
In this article we explain the ways in which bias might arise in analytics systems, present some examples where this has happened, and give some suggestions as to how to prevent or reduce it.
We use a framework inspired by the work of Hammond, Keeney and Raiffa (1998, reprinted 2006) on psychological traps in human decision-making. Each of these traps “translates” into a potential type of bias for an analytics-based system.
Fortunately, this means that remedies to reduce bias in human decision-making also translate into potential remedies for algorithmic systems.
Original language: English
Pages (from-to): 74-87
Journal: Journal of Business Analytics
Issue number: 1
Early online date: 29 Jun 2019
Publication status: Published - 2019

Bibliographical note

This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Business Analytics on 29 June 2019, available online at:


Keywords

  • Algorithms
  • Artificial intelligence
  • Bias
  • Decision making
  • Psychological traps

