
Predictive Analytics for Stock and Demand Balance Using Deep Q-Learning Algorithm

DOI: 10.23977/datake.2023.010101

Author(s)

Selinay Kayali 1, Safiye Turgay 1

Affiliation(s)

1 Department of Industrial Engineering, Sakarya University, Sakarya, Turkey

Corresponding Author

Safiye Turgay

ABSTRACT

Predicting and managing the stock and demand balance is a critical aspect of supply chain management. In this paper, we propose a predictive analytics approach that leverages a deep Q-learning algorithm to optimize the stock and demand balance. Traditional forecasting methods often struggle to capture the complex dynamics and uncertainties associated with stock and demand fluctuations. The deep Q-learning algorithm, a reinforcement learning technique, offers a promising solution by learning optimal decision-making policies through interaction with the environment. The proposed approach uses historical stock and demand data to train a deep Q-learning model, enabling it to make accurate predictions about future stock levels and demand patterns. By considering factors such as seasonality, trends, and external variables, the model learns to adjust stock levels to meet demand while minimizing excess inventory and stockouts. To validate the effectiveness of the proposed approach, a case study was conducted using real-world data from a supply chain network. The results demonstrate that the deep Q-learning algorithm outperforms traditional forecasting methods, achieving higher accuracy in predicting the stock and demand balance. The implications of this research are significant for supply chain managers and decision-makers. By incorporating predictive analytics with the deep Q-learning algorithm, companies can enhance their inventory management strategies, reduce holding costs, minimize stockouts, and improve customer satisfaction.
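To make the general idea concrete, the sketch below shows a minimal Q-learning loop for a toy single-item inventory problem: the state is the current stock level and last observed demand, the action is the order quantity, and the reward penalizes both holding excess inventory and stockouts. All names, network sizes, cost weights, and the Poisson demand model are illustrative assumptions, not the authors' implementation or data.

```python
# Illustrative sketch only: a toy inventory environment and a small Q-network
# trained to balance stock against stochastic demand. Parameters are assumptions.
import random
import numpy as np
import torch
import torch.nn as nn


class InventoryEnv:
    """Toy environment: state = (stock, last demand); action = order quantity."""
    def __init__(self, max_stock=20, max_order=10, holding_cost=1.0, stockout_cost=5.0):
        self.max_stock, self.max_order = max_stock, max_order
        self.holding_cost, self.stockout_cost = holding_cost, stockout_cost
        self.reset()

    def reset(self):
        self.stock, self.demand = self.max_stock // 2, 5
        return self._obs()

    def _obs(self):
        return np.array([self.stock / self.max_stock, self.demand / self.max_order],
                        dtype=np.float32)

    def step(self, action):
        self.stock = min(self.stock + action, self.max_stock)  # receive the order
        self.demand = int(np.random.poisson(5))                # stochastic demand
        sold = min(self.stock, self.demand)
        unmet = self.demand - sold
        self.stock -= sold
        # Reward penalizes both excess inventory (holding cost) and stockouts.
        reward = -(self.holding_cost * self.stock + self.stockout_cost * unmet)
        return self._obs(), reward


def train(episodes=200, steps=50, gamma=0.95, eps=0.1, lr=1e-3):
    env = InventoryEnv()
    n_actions = env.max_order + 1
    qnet = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, n_actions))
    opt = torch.optim.Adam(qnet.parameters(), lr=lr)
    for _ in range(episodes):
        state = env.reset()
        for _ in range(steps):
            # Epsilon-greedy selection of an order quantity.
            if random.random() < eps:
                action = random.randrange(n_actions)
            else:
                with torch.no_grad():
                    action = int(qnet(torch.from_numpy(state)).argmax())
            next_state, reward = env.step(action)
            # One-step Q-learning target: r + gamma * max_a' Q(s', a').
            with torch.no_grad():
                target = reward + gamma * qnet(torch.from_numpy(next_state)).max()
            pred = qnet(torch.from_numpy(state))[action]
            loss = (pred - target).pow(2)
            opt.zero_grad()
            loss.backward()
            opt.step()
            state = next_state
    return qnet


if __name__ == "__main__":
    model = train()
```

A full deep Q-learning system of the kind the paper describes would additionally use experience replay, a target network, and richer state features (seasonality, trends, external variables); this sketch only conveys the core interaction loop and reward trade-off.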

KEYWORDS

Predictive analytics, stock and demand balance, deep Q-learning algorithm, supply chain management, inventory management, forecasting, reinforcement learning

CITE THIS PAPER

Selinay Kayali, Safiye Turgay, Predictive Analytics for Stock and Demand Balance Using Deep Q-Learning Algorithm. Data and Knowledge Engineering (2023) Vol. 1: 1-10. DOI: http://dx.doi.org/10.23977/datake.2023.010101.



All published work is licensed under a Creative Commons Attribution 4.0 International License.
