
Deep Q-Learning Network-Based Energy and Network-Aware Optimization Model for Resources in Mobile Cloud Computing



Authors

Puneet Sharma
Department of Computer Science and Engineering, Amity University, Uttar Pradesh, India
T. Sakthivel
Firstsoft Technologies Private Ltd., Chennai, Tamil Nadu, India
Deepak Arora
Department of Computer Science and Engineering, Amity University, Uttar Pradesh, India

Abstract


Mobile Cloud Computing (MCC) enables computation offloading and has become a popular means of overcoming the resource limitations of mobile devices. To accomplish effective offloading in the mobile cloud, modeling the application execution environment with Quality of Service (QoS) guarantees is crucial; hence, optimizing resource allocation and management plays a major role in ensuring the seamless execution of mobile applications. Recently, cloud computing research has adopted reinforcement learning models to optimize resource allocation and offloading, and several optimization mechanisms consider the network transmission rate when selecting network resources. However, minimizing response time remains challenging when mobile cloud resources vary dynamically. This paper therefore proposes a joint optimization methodology for processing and network resources in the integrated mobile-network-cloud environment: an Energy and Network-Aware Optimization solution built on a Deep Q-learning Network (ENAO-DQN). The energy- and network-aware optimization strategy identifies the quality factors that preserve device energy while allocating resources and executing compute-intensive mobile applications. Leveraging the decision-making strength of deep Q-learning, ENAO-DQN selects the network resources that maximize the expected reward. The optimization algorithm first prefetches the quality factors from the mobile and application characteristics, wireless network parameters, and cloud resource characteristics. It then generates an allocation plan for each application-network resource pair from the prefetched quality factors using the enhanced deep reinforcement learning model. Experimental results demonstrate that the ENAO-DQN model outperforms the baseline mobile-execution and cloud-offloading models.
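
The abstract outlines a two-step procedure: prefetch quality factors (mobile, application, wireless network, and cloud characteristics), then let a deep Q-learning agent pick the application-network resource pair that maximizes the reward. The paper's actual state, action, and reward definitions are not given on this page, so the sketch below is only a minimal illustration in PyTorch: the six state features, four candidate actions, reward weights, and layer sizes are placeholder assumptions, not the authors' formulation.

```python
# Minimal DQN-style selector sketch for the mobile-network-cloud setting
# described in the abstract. All features, action choices, and reward
# weights are illustrative assumptions, not the paper's formulation.
import random
from collections import deque

import torch
import torch.nn as nn

N_FEATURES = 6   # assumed quality factors: battery, CPU load, app size, RTT, bandwidth, VM load
N_ACTIONS = 4    # assumed choices: local execution or one of three network/cloud pairs

class QNet(nn.Module):
    """Maps a quality-factor state vector to one Q-value per allocation choice."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FEATURES, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )
    def forward(self, x):
        return self.net(x)

qnet = QNet()
optimizer = torch.optim.Adam(qnet.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)   # stores (state, action, reward, next_state)
gamma, eps = 0.99, 0.1

def reward(energy_j, latency_s, w_e=0.5, w_t=0.5):
    # Assumed reward shaping: penalize estimated device energy and response
    # time of the chosen allocation (weights are placeholders).
    return -(w_e * energy_j + w_t * latency_s)

def select_action(state):
    """Epsilon-greedy selection over application-network resource pairs.
    `state` is a list of N_FEATURES floats."""
    if random.random() < eps:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(qnet(torch.tensor(state).float()).argmax())

def train_step(batch_size=32):
    """One gradient step on the temporal-difference error."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2 = map(torch.tensor, zip(*batch))
    q = qnet(s.float()).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r.float() + gamma * qnet(s2.float()).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In a full implementation the replay buffer would be filled by interacting with the mobile-network-cloud environment (observing energy and response-time outcomes of each allocation), and a separate target network is commonly added to stabilize training; whether ENAO-DQN uses one is not stated on this page.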

Keywords


Mobile Cloud Computing, Resource Allocation, Optimization, Energy Consumption, QoS, Deep Reinforcement Learning, Q-learning, Wireless Network Resource.


DOI: https://doi.org/10.22247/ijcna/2022/212561