Radio and Energy Resource Management in Renewable Energy-Powered Wireless Networks with Deep Reinforcement Learning

Hyun Suk Lee, Do Yup Kim, Jang Won Lee

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In this paper, we study radio and energy resource management in renewable energy-powered wireless networks, where base stations (BSs) are powered by both on-grid and renewable energy sources and can share their harvested energy with each other. To manage these resources efficiently, we propose a hierarchical and distributed resource management framework based on deep reinforcement learning. The proposed framework minimizes on-grid energy consumption while satisfying the data rate requirement of each user. It comprises three policies organized in a distributed, hierarchical manner. An intercell interference coordination policy constrains the transmission power at each BS to coordinate intercell interference among the BSs. Under these power constraints, a distributed radio resource allocation policy at each BS determines its own user scheduling and power control. Lastly, an energy sharing policy manages the energy resources of the BSs by sharing harvested energy between them via power lines. Through simulations, we demonstrate that the proposed framework can effectively reduce on-grid energy consumption while satisfying the data rate requirements.
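The abstract's three-level control structure can be sketched as a single decision epoch: a central policy caps each BS's transmit power, each BS schedules a user and picks a power within its cap, and an energy-sharing policy moves harvested energy between BSs before the grid is tapped. The sketch below uses simple greedy rules as stand-ins for the learned deep RL policies; all class names, numbers, and the uniform cap/transfer heuristics are illustrative assumptions, not the paper's algorithm.

```python
class BaseStation:
    def __init__(self, bs_id, harvested_energy):
        self.bs_id = bs_id
        self.battery = harvested_energy   # harvested (renewable) energy, J
        self.on_grid_used = 0.0           # on-grid energy drawn so far, J

    def allocate_radio_resources(self, power_cap, user_demands):
        """Distributed policy: serve the most demanding user at a power
        within the cap set by the interference coordination policy."""
        user = max(user_demands, key=user_demands.get)
        power = min(power_cap, user_demands[user])  # toy power control
        # Draw from harvested energy first; the shortfall comes from the grid.
        from_battery = min(self.battery, power)
        self.battery -= from_battery
        self.on_grid_used += power - from_battery
        return user, power

def interference_coordination(num_bs, total_power_budget):
    """Central policy: split a network power budget into per-BS caps
    (uniform split here, purely for illustration)."""
    return [total_power_budget / num_bs] * num_bs

def energy_sharing(bss, loss_factor=0.9):
    """Move surplus harvested energy from the richest BS to the poorest,
    with an assumed 10% power-line transfer loss."""
    donor = max(bss, key=lambda b: b.battery)
    recipient = min(bss, key=lambda b: b.battery)
    transfer = 0.5 * (donor.battery - recipient.battery)
    donor.battery -= transfer
    recipient.battery += loss_factor * transfer

# One decision epoch over two BSs with illustrative user-rate demands.
bss = [BaseStation(0, harvested_energy=5.0), BaseStation(1, harvested_energy=1.0)]
energy_sharing(bss)
caps = interference_coordination(len(bss), total_power_budget=8.0)
demands = [{"u1": 3.0, "u2": 2.0}, {"u3": 4.5}]
for bs, cap, d in zip(bss, caps, demands):
    user, power = bs.allocate_radio_resources(cap, d)
    print(f"BS{bs.bs_id}: serves {user} at {power:.1f} W, "
          f"on-grid energy used {bs.on_grid_used:.2f} J")
```

In the paper each of the three policies is learned with deep reinforcement learning rather than hard-coded; the sketch only shows how their decisions nest within one epoch, with on-grid consumption incurred only after harvested and shared energy is exhausted.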

Original language: English
Journal: IEEE Transactions on Wireless Communications
DOIs
Publication status: Accepted/In press - 2022

Bibliographical note

Publisher Copyright:
IEEE

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Electrical and Electronic Engineering
  • Applied Mathematics
