A risk-aware maintenance model based on a constrained Markov decision process

Jianyu Xu, Xiujie Zhao, Bin Liu

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)
10 Downloads (Pure)


The Markov decision process (MDP) model has been widely studied and used in sequential decision-making problems. In particular, it has proved effective in maintenance policy optimization problems, where the system state is assumed to evolve continuously under sequential maintenance policies. In traditional MDP models for maintenance, the long-run expected total discounted cost is taken as the objective function, and the maintenance manager's target is to determine, through the corresponding MDP model, an optimal policy that incurs the minimum expected total discounted cost. A significant drawback of these existing MDP-based maintenance strategies, however, is that they fail to incorporate and characterize the safety of the system during the maintenance process. In applications that are sensitive to functional risks, such strategies therefore cannot accommodate the requirement of risk awareness. In this study, we apply the concept of risk aversion in the MDP maintenance model to develop risk-aware maintenance policies. Specifically, we use risk functions to measure indices of the system that reflect its safety level and formulate a safety constraint. We then cast the problem as a constrained MDP model and use the linear programming approach to evaluate the proposed risk-aware optimal maintenance policy.
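The constrained-MDP-via-linear-programming idea in the abstract can be sketched with the standard occupancy-measure LP: minimize expected discounted cost over occupancy variables x(s, a), subject to flow-balance equalities and an inequality capping the expected discounted risk. The two-state maintenance model below (healthy/degraded states, do-nothing/imperfect-repair actions) and all of its transition probabilities, costs, and the risk budget `d` are hypothetical illustration values, not the authors' model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 2-state, 2-action maintenance CMDP.
# States: 0 = healthy, 1 = degraded. Actions: 0 = do nothing, 1 = repair.
gamma = 0.95
mu = np.array([1.0, 0.0])          # initial distribution: start healthy

# P[a][s, s']: transition probabilities (illustrative numbers)
P = np.array([
    [[0.8, 0.2],                   # do nothing: healthy may degrade
     [0.0, 1.0]],                  # do nothing: degraded stays degraded
    [[0.9, 0.1],                   # imperfect repair: 90% restores health
     [0.9, 0.1]],
])
cost = np.array([[0.0, 5.0],       # cost[s, a]: each repair costs 5;
                 [2.0, 5.0]])      # operating while degraded costs 2
risk = np.array([[0.0, 0.0],       # risk[s, a]: one unit of risk per
                 [1.0, 1.0]])      # period spent in the degraded state
d = 2.0                            # safety budget on expected discounted risk

nS, nA = 2, 2
idx = lambda s, a: s * nA + a      # flatten (s, a) into an LP variable index

# Flow balance for each next state s':
#   sum_a x(s', a) - gamma * sum_{s,a} P(s'|s,a) x(s, a) = mu(s')
A_eq = np.zeros((nS, nS * nA))
for sp in range(nS):
    for s in range(nS):
        for a in range(nA):
            A_eq[sp, idx(s, a)] = float(s == sp) - gamma * P[a][s, sp]

res = linprog(
    c=cost.flatten(),                    # minimize expected discounted cost
    A_ub=[risk.flatten()], b_ub=[d],     # subject to the safety constraint
    A_eq=A_eq, b_eq=mu,
    bounds=[(0, None)] * (nS * nA),      # occupancies are nonnegative
)

x = res.x.reshape(nS, nA)
# Recover the (possibly randomized) policy: pi(a|s) = x(s,a) / sum_a x(s,a)
pi = x / x.sum(axis=1, keepdims=True)
print("risk-aware policy (rows = states, cols = actions):\n", pi)
```

A notable feature of this LP formulation is that the optimal constrained policy may be randomized in at most as many states as there are risk constraints, which is why the policy is recovered as a conditional distribution rather than a deterministic action map.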

Original language: English
Journal: IISE Transactions
Early online date: 15 Oct 2021
Publication status: E-pub ahead of print - 15 Oct 2021


  • risk aversion
  • condition-based maintenance
  • safety constraint
  • Markov decision process
  • imperfect repair


