A Price-based Energy-aware Offloading Mechanism Using Q-learning in Fog-Cloud Environments
In this research, we present a reinforcement-learning-based approach that uses pricing incentives to encourage users to offload their tasks rather than execute them locally. After formulating the problem with queuing theory, we propose an algorithm based on Q-learning. Evaluation against traditional and state-of-the-art methods shows a clear advantage: on average, the proposed method consumes less energy, reduces task execution time, and uses fewer network resources.
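The abstract does not include the algorithm itself, but the core idea (a tabular Q-learning agent choosing between local execution and offloading based on price and energy costs) can be sketched as follows. All states, actions, cost coefficients, and hyperparameters below are illustrative assumptions for a toy model, not the paper's actual formulation.

```python
import random

random.seed(0)

# Toy model: each arriving task has a size (the state); the agent chooses
# to run it locally (action 0) or offload it to fog/cloud (action 1).
SIZES = [1, 2, 3]    # hypothetical task sizes (states) -- assumed
ACTIONS = [0, 1]     # 0 = local execution, 1 = offload

def reward(size, action):
    """Negative cost. Local energy grows quickly with task size; offloading
    pays a fixed price plus a transmission cost. Coefficients are assumed."""
    if action == 0:
        return -2.0 * size          # local energy cost (assumed)
    return -(0.5 + 0.4 * size)      # offload price + transmission (assumed)

# Standard tabular Q-learning with epsilon-greedy exploration.
alpha, gamma, eps = 0.1, 0.9, 0.1
Q = {(s, a): 0.0 for s in SIZES for a in ACTIONS}

for episode in range(5000):
    s = random.choice(SIZES)
    # epsilon-greedy action selection
    if random.random() < eps:
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: Q[(s, x)])
    r = reward(s, a)
    s_next = random.choice(SIZES)   # tasks assumed to arrive independently
    best_next = max(Q[(s_next, x)] for x in ACTIONS)
    # Q-learning update rule
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

# Greedy policy per task size after training.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in SIZES}
print(policy)
```

With these assumed costs, offloading is cheaper for every task size, so the learned policy selects action 1 throughout; changing the price coefficients shifts the break-even point between local and remote execution.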
Steps to reproduce
Please see the ReadMe.pdf file.