DISCRETE MARKOV DECISION PROCESS – INVENTORY CONTROL IN A SERVICE FACILITY SYSTEM

Authors

  • C. Selvakumar Research Department of Mathematical Sciences, Cardamom Planters' Association College, Bodinayakanur - 625 513, Tamil Nadu, India
  • P. Maheswari Research Department of Mathematical Sciences, Cardamom Planters' Association College, Bodinayakanur - 625 513, Tamil Nadu, India
  • C. Elango Research Department of Mathematical Sciences, Cardamom Planters' Association College, Bodinayakanur - 625 513, Tamil Nadu, India

Keywords:

Discrete time service facility system, discrete inventory system, (s, S) ordering policy, Markov Decision Processes (MDP)

Abstract

This article addresses the problem of inventory control in a service facility system in which the ordering level is controlled by a Markov Decision Process (MDP). A discrete-time service facility system is studied in which the demand process follows a Bernoulli process. If an arriving customer finds the server free, he/she enters service immediately; otherwise, the customer joins a waiting space of finite capacity N. An inventory pool with maximum level S is maintained at the service station to satisfy customers. An (s, S) ordering policy is adopted for the inventory, and the lead times and service times are assumed to follow geometric distributions. An average-cost MDP is formulated to determine the optimal policy. The results are illustrated numerically.
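The average-cost MDP approach described in the abstract can be sketched for a stripped-down version of such an inventory system. The sketch below is illustrative only and is not the authors' model: all parameter values (S = 5, demand probability p, holding/ordering/shortage costs) are hypothetical, the service facility and waiting room are omitted, and orders are assumed to arrive instantly rather than after a geometric lead time. It applies relative value iteration, a standard algorithm for average-cost MDPs, and recovers an (s, S)-shaped ordering rule.

```python
# Illustrative sketch (not the authors' model): average-cost MDP for a
# discrete-time inventory system with Bernoulli demand and instant delivery.
# All parameter values below are hypothetical.

S = 5            # maximum inventory level
p = 0.6          # Bernoulli demand probability per period
h, K, b = 1.0, 5.0, 10.0   # holding cost, ordering cost, shortage penalty

states = range(S + 1)

def step_cost(i, order):
    """Expected one-period cost in state i under action order (0 or 1)."""
    level = S if order else i          # order raises stock to S instantly
    shortage = b * p if level == 0 else 0.0   # expected lost-demand penalty
    return h * level + (K if order else 0.0) + shortage

def transitions(i, order):
    """Return [(probability, next_state), ...] after one period."""
    level = S if order else i
    return [(p, max(level - 1, 0)), (1 - p, level)]

# Relative value iteration for the average-cost criterion:
# iterate v <- min_a [c(i,a) + sum_j P(j|i,a) v(j)], normalised at state 0.
v = [0.0] * (S + 1)
for _ in range(2000):
    q = []
    for i in states:
        q.append(min(step_cost(i, a)
                     + sum(pr * v[j] for pr, j in transitions(i, a))
                     for a in (0, 1)))
    ref = q[0]                         # offset at reference state 0
    v = [x - ref for x in q]           # keep values bounded

g = ref   # long-run average cost per period (the offset converges to g)

# Greedy policy with respect to the converged relative values
policy = []
for i in states:
    vals = [step_cost(i, a) + sum(pr * v[j] for pr, j in transitions(i, a))
            for a in (0, 1)]
    policy.append(int(vals[1] < vals[0]))   # 1 = place an order in state i

print("average cost per period ≈", round(g, 3))
print("order decision by inventory level 0..S:", policy)
```

Under these costs the computed policy orders only when the inventory is low and holds off when stock is high, which is the threshold structure an (s, S) policy formalizes; adding geometric lead times and the service queue, as in the paper, enlarges the state space but leaves the solution method unchanged.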


Published

2024-06-05

How to Cite

Selvakumar, C., Maheswari, P., & Elango, C. (2024). DISCRETE MARKOV DECISION PROCESS – INVENTORY CONTROL IN A SERVICE FACILITY SYSTEM. Investigación Operacional, 41(6). Retrieved from https://revistas.uh.cu/invoperacional/article/view/9439
