In this paper, we study radio and energy resource management in renewable energy-powered wireless networks, where base stations (BSs) are powered by both on-grid and renewable energy sources and can share their harvested energy with one another. To manage these resources efficiently, we propose a hierarchical, distributed resource management framework based on deep reinforcement learning. The proposed framework minimizes on-grid energy consumption while satisfying the data rate requirement of each user. It comprises three policies operating in a distributed, hierarchical manner. An intercell interference coordination policy constrains the transmission power of each BS to coordinate intercell interference among the BSs. Under these power constraints, a distributed radio resource allocation policy at each BS determines its own user scheduling and power control. Lastly, an energy sharing policy manages the energy resources of the BSs by transferring harvested energy between them via power lines. Through simulations, we demonstrate that the proposed framework effectively reduces on-grid energy consumption while satisfying the data rate requirements.
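To make the three-tier structure concrete, the sketch below mocks the decision flow with simple heuristic rules standing in for the learned policies. All function names, numeric parameters (noise power, bandwidth, rate requirement, static circuit power), and the load-based power-cap rule are illustrative assumptions, not the paper's actual method; the point is only how the tiers compose: a network-level policy caps each BS's transmit power, each BS then allocates power to its users under that cap, and an energy sharing step moves surplus harvested energy to deficit BSs before drawing from the grid.

```python
# Hypothetical sketch of the hierarchical framework's decision flow.
# Heuristic rules stand in for the DRL policies described in the abstract.

NOISE = 1e-9       # receiver noise power in W (assumed)
BANDWIDTH = 1e6    # bandwidth per user in Hz (assumed)
RATE_REQ = 1e6     # required data rate per user in bit/s (assumed)
P_MAX = 10.0       # hardware transmit-power limit per BS in W (assumed)

def interference_coordination(loads):
    """Tier 1: cap each BS's transmit power based on its traffic load
    (a stand-in for the learned intercell interference coordination policy)."""
    return [min(P_MAX, 2.0 + 4.0 * load) for load in loads]

def allocate_power(gains, p_cap, interference):
    """Tier 2: per-BS power control — give each user the minimum power
    that meets its Shannon-rate requirement, then scale to the cap."""
    snr_req = 2 ** (RATE_REQ / BANDWIDTH) - 1  # SNR needed for RATE_REQ
    powers = [snr_req * (NOISE + interference) / g for g in gains]
    total = sum(powers)
    if total > p_cap:  # scale down proportionally if over the cap
        powers = [p * p_cap / total for p in powers]
    return powers

def energy_sharing(consumed, harvested):
    """Tier 3: pool surplus harvested energy and cover deficit BSs first;
    whatever deficit remains is drawn from the grid."""
    deficit = [max(0.0, c - h) for c, h in zip(consumed, harvested)]
    pool = sum(max(0.0, h - c) for c, h in zip(consumed, harvested))
    on_grid = 0.0
    for d in deficit:
        used = min(d, pool)
        pool -= used
        on_grid += d - used
    return on_grid
```

With sharing, a BS harvesting more than it consumes covers its neighbors' deficits, so the on-grid draw is never larger than without sharing; that ordering is what the framework's third tier exploits.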