
Deep Q-Learning Network-Based Energy and Network-Aware Optimization Model for Resources in Mobile Cloud Computing


Affiliations
1 Department of Computer Science and Engineering, Amity University, Uttar Pradesh, India
2 Firstsoft Technologies Private Ltd., Chennai, Tamil Nadu, India
 

Abstract

Mobile Cloud Computing (MCC) enables computation offloading and has become popular for overcoming the resource limitations of mobile devices. To accomplish effective offloading in the mobile cloud, modeling the application execution environment with Quality of Service (QoS) guarantees is crucial. Hence, optimizing resource allocation and management plays a major role in ensuring the seamless execution of mobile applications. Recently, cloud computing research has adopted reinforcement learning models to optimize resource allocation and offloading. In addition, several optimization mechanisms have considered the network transmission rate while selecting network resources. However, minimizing response time remains challenging among dynamically varying mobile cloud resources. Thus, this paper proposes a joint optimization methodology for the processing and network resources in the integrated mobile-network-cloud environment. The proposed approach presents an Energy and Network-Aware Optimization solution assisted by a Deep Q-learning Network (ENAO-DQN). The energy- and network-aware resource optimization strategy identifies the quality factors that preserve device energy while allocating resources and executing compute-intensive mobile applications. Exploiting the decision-making strength of the Deep Q-learning model, the ENAO-DQN approach selects the network resources that maximize the expected reward. Initially, the optimization algorithm prefetches the quality factors based on the mobile and application characteristics, wireless network parameters, and cloud resource characteristics. Secondly, it generates the allocation plan for each application-network resource pair from the prefetched quality factors with the assistance of the enhanced deep reinforcement learning model. The experimental results demonstrate that the ENAO-DQN model outperforms the baseline mobile execution and cloud offloading models.
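The decision loop the abstract describes, an agent that learns when to pair an application with a network resource by maximizing a reward, can be illustrated with a toy tabular Q-learning sketch. Everything below (the two-level network-quality state, the local-vs-offload action set, and the latency/energy costs) is an illustrative assumption, not the paper's model; the actual ENAO-DQN replaces the Q-table with a deep neural network over the prefetched quality factors.

```python
import random

# Toy environment (illustrative assumptions, not the paper's model):
#   state  = perceived wireless quality: 0 = poor link, 1 = good link
#   action = 0: execute locally, 1: offload over the network
#   reward = -(latency + energy) for that choice, in arbitrary units
def reward(state, action):
    if action == 0:            # local execution: fixed cost, network-independent
        return -(5.0 + 4.0)    # latency 5 + energy 4
    if state == 1:             # offload on a good link: fast and cheap
        return -(2.0 + 1.0)
    return -(9.0 + 2.0)        # offload on a poor link: retransmissions dominate

# Tabular Q-values: Q[state][action]; a DQN would approximate this table.
Q = [[0.0, 0.0], [0.0, 0.0]]
alpha, epsilon = 0.1, 0.2      # learning rate, exploration probability
random.seed(0)

for episode in range(5000):
    s = random.randint(0, 1)                     # network quality observed this step
    if random.random() < epsilon:                # epsilon-greedy exploration
        a = random.randint(0, 1)
    else:                                        # exploit: pick the best-known action
        a = 0 if Q[s][0] >= Q[s][1] else 1
    # One-step (bandit-style) update; the full DQN uses a bootstrapped
    # next-state target, omitted here since this toy task is single-step.
    Q[s][a] += alpha * (reward(s, a) - Q[s][a])

# The learned allocation plan: one action per network state.
policy = [0 if Q[s][0] >= Q[s][1] else 1 for s in range(2)]
print(policy)  # → [0, 1]: keep jobs local on a poor link, offload on a good one
```

With the costs assumed above, the greedy policy converges to offloading only when the link is good, which is the qualitative behavior the abstract attributes to the reward-maximizing selection of network resources.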

Keywords

Mobile Cloud Computing, Resource Allocation, Optimization, Energy Consumption, QoS, Deep Reinforcement Learning, Q-learning, Wireless Network Resource.
References

  • Fernando, Niroshinie, Seng W. Loke, and Wenny Rahayu, “Mobile cloud computing: A survey”, Future Generation Computer Systems, Vol.29, No.1, pp.84-106, 2013.
  • Wu, Huaming, “Multi-objective decision-making for mobile cloud offloading: A survey”, IEEE Access, Vol.6, pp.3962-3976, 2018.
  • Zhou, Bowen, and Rajkumar Buyya, “Augmentation techniques for mobile cloud computing: A taxonomy, survey, and future directions”, ACM Computing Surveys (CSUR), Vol.51, No.1, pp.1-38, 2018.
  • Nayyer, M. Ziad, Imran Raza, and Syed Asad Hussain, “A survey of cloudlet-based mobile augmentation approaches for resource optimization”, ACM Computing Surveys (CSUR), Vol.51, No.5, pp.1-28, 2018.
  • Dinh Hoang T, Chonho Lee, Dusit Niyato, and Ping Wang, “A survey of mobile cloud computing: architecture, applications, and approaches”, Wireless communications and mobile computing, Vol.13, No.18, pp.1587-1611, 2013.
  • Noor, Talal H., Sherali Zeadally, Abdullah Alfazi, and Quan Z. Sheng, “Mobile cloud computing: Challenges and future research directions”, Journal of Network and Computer Applications, Vol.115, pp.70-85, 2018.
  • Ferrer, Ana Juan, Joan Manuel Marquès, and Josep Jorba, “Towards the decentralised cloud: Survey on approaches and challenges for mobile, ad hoc, and edge computing”, ACM Computing Surveys (CSUR), Vol.51, No.6, pp.1-36, 2019.
  • Parajuli, Nitesh, Abeer Alsadoon, P. W. C. Prasad, Rasha S. Ali, and Omar Hisham Alsadoon, “A recent review and a taxonomy for multimedia application in Mobile cloud computing based energy efficient transmission”, Multimedia Tools and Applications, Vol.79, No.41, pp.31567-31594, 2020.
  • Junior, Warley, Eduardo Oliveira, Albertinin Santos, and Kelvin Dias, “A context-sensitive offloading system using machine-learning classification algorithms for mobile cloud environment”, Future Generation Computer Systems, Vol.90, pp.503-520, 2019.
  • Duc, Thang Le, Rafael García Leiva, Paolo Casari, and Per-Olov Östberg, “Machine learning methods for reliable resource provisioning in edge-cloud computing: A survey”, ACM Computing Surveys (CSUR), Vol.52, No.5, pp.1-39, 2019.
  • Shakarami, Ali, Mostafa Ghobaei-Arani, and Ali Shahidinejad, “A survey on the computation offloading approaches in mobile edge computing: A machine learning-based perspective”, Computer Networks, p.107496, 2020.
  • Zhou, Guangyao, Wenhong Tian, and Rajkumar Buyya, “Deep Reinforcement Learning-based Methods for Resource Scheduling in Cloud Computing: A Review and Future Directions”, arXiv preprint arXiv:2105.04086, 2021.
  • Du, Wei, and Shifei Ding, “A survey on multi-agent deep reinforcement learning: from the perspective of challenges and applications”, Artificial Intelligence Review, Vol.54, No.5, pp.3215-3238, 2021.
  • Zhang, Weiwen, Yonggang Wen, Kyle Guan, Dan Kilper, Haiyun Luo, and D. Wu, “Energy-Optimal Mobile Cloud Computing under Stochastic Wireless Channel”, IEEE Transactions On Wireless Communications, Vol.12, No.9, pp.4569-4581, 2013.
  • Shu, Peng, Fangming Liu, Hai Jin, Min Chen, Feng Wen, Yupeng Qu, and Bo Li, “eTime: Energy-efficient transmission between cloud and mobile devices”, In Proceedings of IEEE INFOCOM, pp.195-199, 2013.
  • Mahinur, Fatema Tuz Zohra, and Amit Kumar Das, “Q-MAC: QoS and mobility aware optimal resource allocation for dynamic application offloading in mobile cloud computing”, In 2017 International Conference on Electrical, Computer and Communication Engineering (ECCE), pp.803-808, 2017.
  • Chen, Meng-Hsi, Ben Liang, and Min Dong, “Joint offloading and resource allocation for computation and communication in mobile cloud with computing access point”, In IEEE INFOCOM 2017-IEEE Conference on Computer Communications, pp.1-9, 2017.
  • Chunlin, Li, Zhou Min, and Luo Youlong, “Elastic resource provisioning in hybrid mobile cloud for computationally intensive mobile applications”, The Journal of Supercomputing, Vol.73, No.9, pp.3683-3714, 2017.
  • Chunlin, Li, Liu Yanpei, and Luo Youlong, “Energy-aware cross-layer resource allocation in mobile cloud”, International Journal of Communication Systems, Vol.30, No.12, p.e3258, 2017.
  • Zhang, Jing, Weiwei Xia, Feng Yan, and Lianfeng Shen, “Joint computation offloading and resource allocation in heterogeneous networks with mobile edge computing”, IEEE Access, Vol.6, pp.19324-19337, 2018.
  • Chen, Meng-Hsi, Ben Liang, and Min Dong, “Multi-user multi-task offloading and resource allocation in mobile cloud systems”, IEEE Transactions on Wireless Communications, Vol.17, No.10, pp.6790-6805, 2018.
  • Avgeris, Marios, Dimitrios Dechouniotis, Nikolaos Athanasopoulos, and Symeon Papavassiliou, “Adaptive resource allocation for computation offloading: A control-theoretic approach”, ACM Transactions on Internet Technology (TOIT), Vol.19, No.2, pp.1-20, 2019.
  • Alkhalaileh, Mohammad, Rodrigo N. Calheiros, Quang Vinh Nguyen, and Bahman Javadi, “Dynamic resource allocation in hybrid mobile cloud computing for data-intensive applications”, In International Conference on Green, Pervasive, and Cloud Computing, Springer, pp.176-191, 2019.
  • Malik, Saif UR, Hina Akram, Sukhpal Singh Gill, Haris Pervaiz, and Hassan Malik, “EFFORT: Energy efficient framework for offload communication in mobile cloud computing”, Software: Practice and Experience, Vol.51, No.9, pp.1896-1909, 2021.
  • Ali, Abid, Muhammad Munawar Iqbal, Harun Jamil, Faiza Qayyum, Sohail Jabbar, Omar Cheikhrouhou, Mohammed Baz, and Faisal Jamil, “An efficient dynamic-decision based task scheduler for task offloading optimization and energy management in mobile cloud computing”, Sensors, Vol.21, No.13, p.4527, 2021.
  • Mahesar, Abdul Rasheed, Abdullah Lakhan, Dileep Kumar Sajnani, and Irfan Ali Jamali, “Hybrid delay optimization and workload assignment in mobile edge cloud networks”, Open Access Library Journal, Vol.5, No.9, pp.1-12, 2018.
  • Li, Ji, Hui Gao, Tiejun Lv, and Yueming Lu, “Deep reinforcement learning based computation offloading and resource allocation for MEC”, In 2018 IEEE Wireless Communications and Networking Conference (WCNC), pp.1-6, 2018.
  • Nawrocki, Piotr, and Bartlomiej Sniezynski, “Adaptive service management in mobile cloud computing by means of supervised and reinforcement learning”, Journal of Network and Systems Management, Vol.26, No.1, pp.1-22, 2018.
  • Ali, Zaiwar, Lei Jiao, Thar Baker, Ghulam Abbas, Ziaul Haq Abbas, and Sadia Khaf, “A deep learning approach for energy efficient computational offloading in mobile edge computing”, IEEE Access, Vol.7, pp.149623-149633, 2019.
  • Wang, Jiadai, Lei Zhao, Jiajia Liu, and Nei Kato, “Smart resource allocation for mobile edge computing: A deep reinforcement learning approach”, IEEE Transactions on emerging topics in computing, 2019.
  • Eshratifar, Amir Erfan, Mohammad Saeed Abrishami, and Massoud Pedram, “JointDNN: An efficient training and inference engine for intelligent mobile cloud computing services”, IEEE Transactions on Mobile Computing, 2019.
  • Alfakih, Taha, Mohammad Mehedi Hassan, Abdu Gumaei, Claudio Savaglio, and Giancarlo Fortino, “Task offloading and resource allocation for mobile edge computing by deep reinforcement learning based on SARSA”, IEEE Access, Vol.8, pp.54074-54084, 2020.
  • Qu, Guanjin, Huaming Wu, Ruidong Li, and Pengfei Jiao, “DMRO: A deep meta reinforcement learning-based task offloading framework for edge-cloud computing”, IEEE Transactions on Network and Service Management, Vol.18, No.3, pp.3448-3459, 2021.
  • Shakarami, Ali, Ali Shahidinejad, and Mostafa Ghobaei-Arani, “An autonomous computation offloading strategy in Mobile Edge Computing: A deep learning-based hybrid approach”, Journal of Network and Computer Applications, Vol.178, p.102974, 2021.
  • Cuervo, Eduardo, Aruna Balasubramanian, Dae-ki Cho, Alec Wolman, Stefan Saroiu, Ranveer Chandra, and Paramvir Bahl, “MAUI: Making smartphones last longer with code offload”, In Proceedings of the 8th International Conference on Mobile Systems, Applications, and Services, ACM, pp.49-62, 2010.


Authors

Puneet Sharma
Department of Computer Science and Engineering, Amity University, Uttar Pradesh, India
T. Sakthivel
Firstsoft Technologies Private Ltd., Chennai, Tamil Nadu, India
Deepak Arora
Department of Computer Science and Engineering, Amity University, Uttar Pradesh, India

DOI: https://doi.org/10.22247/ijcna%2F2022%2F212561