MARKOV CHAIN MODELS IN PRACTICE: A REVIEW OF LOW COST SOFTWARE OPTIONS

Authors

  • Jiaru Bai, Paul Merage School of Business, University of California, Irvine, 92697-3125
  • Cristina del Campo Corresponding author, Facultad de Ciencias Económicas y Empresariales, Campus de Somosaguas, Universidad Complutense de Madrid, 28223 Pozuelo de Alarcón (Madrid)
  • L. Robin Keller, Paul Merage School of Business, University of California, Irvine, 92697-3125

Keywords:

cervical cancer treatments, cost-effectiveness analysis, Markov decision trees, stationary transition probabilities

Abstract

Markov processes (or Markov chains) model phenomena in which a random variable changes over time through a sequence of states, each of which depends only on the immediately preceding state and not on any earlier states. A Markov process (MP) is completely characterized by specifying the finite set S of possible states and the stationary (i.e., time-invariant) probabilities of transition between those states. The software most widely used in medical applications is produced by TreeAge, since it offers many advantages to the user; however, the cost of the TreeAge software is relatively high. This article therefore presents two software alternatives: StoTree and the zero-cost package "markovchain" implemented in R. An example of a cost-effectiveness analysis of two possible treatments for advanced cervical cancer, previously conducted with the TreeAge software, is re-analyzed with these two low-cost software packages. A Spanish version of this paper is available at the following link: http://faculty.sites.uci.edu/lrkeller/publications
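To illustrate the characterization described above, the following is a minimal sketch in Python (not the paper's R or StoTree implementations) of a stationary Markov chain. The two-state health model and its transition probabilities are hypothetical, chosen only to show how a time-invariant transition matrix propagates a state distribution; they are not the paper's cervical cancer model.

```python
import numpy as np

# Hypothetical transition matrix: rows = current state, columns = next state.
# States: 0 = "responsive", 1 = "progressive". Each row sums to 1.
P = np.array([
    [0.8, 0.2],  # from "responsive"
    [0.1, 0.9],  # from "progressive"
])

# Initial distribution: everyone starts in the "responsive" state.
dist = np.array([1.0, 0.0])

# Because the transition probabilities are stationary (time-invariant),
# the distribution after n cycles is simply dist multiplied by P, n times.
for _ in range(3):
    dist = dist @ P

print(dist)  # -> [0.562 0.438], the state distribution after 3 cycles
```

In a cost-effectiveness analysis, each state would additionally carry a per-cycle cost and utility, and the quantities of interest are accumulated over the chain's cycles.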


Published

2023-04-28

How to Cite

Bai, J., del Campo, C., & Keller, L. R. (2023). MARKOV CHAIN MODELS IN PRACTICE: A REVIEW OF LOW COST SOFTWARE OPTIONS. Investigación Operacional, 38(1). Retrieved from https://revistas.uh.cu/invoperacional/article/view/4420
