Prediction of waiting time for entities in independent and merged queues using Markov chain and machine learning techniques
For any organization, serving customers well is a key factor in generating profit and growing the business. Waiting time has become one of the most critical measures of system performance, and managers seek to predict and minimize it. Moreover, a system may serve different types of customers, so defining priorities can increase profitability and customer loyalty. Two models arise in these situations: independent queue lines and a merged queue line. For the first model, a non-linear programming formulation is presented that combines resource-planning concepts with queueing theory; it minimizes the weighted sum of average waiting times by assigning servers to the different priority groups. For the second model, we address an M/M/3 ("Memoryless/Memoryless/3 servers") queueing system with two priorities; that is, the system contains three servers, and both interarrival and service times are exponentially distributed. A dimensionality reduction method based on the phase-type distribution converts the transition matrix from a two-dimensionally infinite form to a one-dimensionally infinite one. After solving the resulting quasi-birth-death process, a simulation is run for comparison; the average gap between the dimensionality reduction method and the simulation results is about 7%. Moreover, two unsupervised learning methods, K-Means and the Gaussian mixture model, are combined with principal component analysis to cluster customers in an example from the banking industry. In the last step of the research, we investigate the M/M/3 system with three priorities. Because queueing systems with more than two priority groups are highly complex, few methods exist in the literature. Therefore, a novel simulation-driven machine learning approach is developed to solve the problem: several supervised learning algorithms are applied to predict the dynamic waiting time of customers.
Among them, a random forest regression with a mean absolute error of 6.95 minutes performs best. This novel method can analyze any merged queueing system with an arbitrary number of priorities and servers.
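To illustrate the kind of simulation that drives the machine learning step, the sketch below runs a discrete-event simulation of an M/M/c queue with two non-preemptive priority classes and estimates the average waiting time per class. It is a minimal sketch, not the thesis's implementation: the arrival rates, service rate, and number of servers are hypothetical placeholder values, and only two priority classes are modeled for brevity.

```python
import heapq
import random
from collections import deque

def simulate_priority_mmc(lam=(1.5, 2.5), mu=2.0, c=3,
                          n_customers=20000, seed=7):
    """Discrete-event simulation of an M/M/c queue with two
    non-preemptive priority classes (class 0 is served first).
    All rates are illustrative placeholders, not thesis data.
    Returns the average waiting time (time in queue) per class."""
    rng = random.Random(seed)
    total = sum(lam)
    # Pre-generate the merged Poisson arrival stream and tag each
    # arrival with a priority class by thinning.
    arrivals = deque()
    t = 0.0
    for _ in range(n_customers):
        t += rng.expovariate(total)
        arrivals.append((t, 0 if rng.random() < lam[0] / total else 1))

    completions = []            # min-heap of service completion times
    queue = (deque(), deque())  # FIFO line per priority class
    waits = ([], [])
    busy = 0                    # number of busy servers

    def start_service(now, arr_time, cls):
        nonlocal busy
        busy += 1
        waits[cls].append(now - arr_time)
        heapq.heappush(completions, now + rng.expovariate(mu))

    # Process arrival and completion events in time order.
    while arrivals or completions:
        if arrivals and (not completions or arrivals[0][0] <= completions[0]):
            now, cls = arrivals.popleft()
            if busy < c:
                start_service(now, now, cls)  # a server is free: no wait
            else:
                queue[cls].append(now)        # join the class's line
        else:
            now = heapq.heappop(completions)
            busy -= 1
            for cls in (0, 1):                # high priority first
                if queue[cls]:
                    start_service(now, queue[cls].popleft(), cls)
                    break

    return [sum(w) / len(w) for w in waits]
```

Runs like this generate the labeled waiting-time data on which supervised models can then be trained; for two non-preemptive classes the estimates can also be checked against the known closed-form priority-queue formulas.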
Queueing Systems, Markov Chain, Dimensionality Reduction, Quasi-Birth-Death, Machine Learning
Master of Science (M.Sc.)
Mathematics and Statistics