This paper introduces a modelling framework that combines Data Envelopment Analysis (DEA) and Markov chains into an integrated decision aid. Markov chains are typically used in contexts where a system (e.g. the staff profile of a large organisation) is in a given state at the start of the planning horizon, and the aim is to transform the system into a new state by the end of the horizon. The planning horizon can involve several steps, and the system transits to a new state after each step. The transition probabilities from one step to the next are influenced by both organisational and external (non-organisational) factors. We develop our generic methodology using the homogeneous Markov manpower planning system as a vehicle. The paper identifies a gap in existing Markovian manpower planning methods in handling stochasticity and optimisation tractably, and puts forward an approach that harnesses the power of DEA to fill this gap. In this context, the decision maker (DM) can specify potential anticipated future outcomes (e.g. personnel flows) and then use DEA to identify, through convexity, additional feasible courses of action. These feasible strategies can be evaluated according to the DM's judgement over potential future states of nature and then used to guide the organisation in making interventions that affect transition probabilities, improving the probability of attaining the desired ultimate state of the system. The paper includes a numerical illustration of the suggested approach, using data from a manpower planning model previously addressed with classical Markov modelling.
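The two building blocks described above can be sketched briefly. The snippet below, a minimal illustration with entirely hypothetical grade names, stocks, and transition probabilities (none are taken from the paper's data), shows a homogeneous Markov projection of staff stocks over a multi-step horizon, and how a convex combination of two anticipated transition matrices yields a further feasible strategy, which is the convexity idea DEA exploits.

```python
import numpy as np

# Hypothetical three-grade manpower system (junior, senior, manager).
# Each row gives the probabilities of moving to each grade in one step;
# row sums below 1 leave room for wastage (staff leaving the system).
P1 = np.array([
    [0.70, 0.20, 0.00],   # junior -> junior / senior / manager
    [0.00, 0.75, 0.15],   # senior
    [0.00, 0.00, 0.85],   # manager
])

n = np.array([100.0, 60.0, 20.0])  # illustrative initial stocks per grade

# Homogeneous Markov projection: the same P applies at every step.
for step in range(1, 4):
    n = n @ P1
    print(f"step {step}: stocks = {np.round(n, 1)}")

# A second anticipated transition matrix specified by the DM.
P2 = np.array([
    [0.65, 0.30, 0.00],
    [0.00, 0.70, 0.20],
    [0.00, 0.00, 0.85],
])

# Convexity: any convex combination of anticipated matrices is itself
# a feasible strategy (row sums remain at most 1).
lam = 0.5
P_mix = lam * P1 + (1 - lam) * P2
```

Evaluating many such combinations against the DM's judgement over future states of nature is, in essence, how the proposed DEA layer enlarges the set of candidate interventions.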
Bibliographical note: © 2021, Elsevier. Licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence (http://creativecommons.org/licenses/by-nc-nd/4.0/).
- Data Envelopment Analysis
- Markov Manpower Planning
- Markov processes
- Goal Programming