Dynamic pricing and energy consumption scheduling with reinforcement learning

Byung Gook Kim, Yu Zhang, Mihaela Van Der Schaar, Jang-Won Lee

Research output: Contribution to journal › Article

34 Citations (Scopus)

Abstract

In this paper, we study a dynamic pricing and energy consumption scheduling problem in the microgrid, where the service provider acts as a broker between the utility company and customers by purchasing electric energy from the utility company and selling it to the customers. For the service provider, even though dynamic pricing is an efficient tool to manage the microgrid, implementing dynamic pricing is highly challenging due to the lack of customer-side information and the various types of uncertainties in the microgrid. Similarly, the customers face challenges in scheduling their energy consumption due to the uncertainty of the retail electricity price. To overcome these challenges, we develop reinforcement learning algorithms that allow both the service provider and the customers to learn their strategies without a priori information about the microgrid. Through numerical results, we show that the proposed reinforcement learning-based dynamic pricing algorithm works effectively without a priori information about the system dynamics, and that the proposed energy consumption scheduling algorithm further reduces the system cost thanks to the learning capability of each customer.
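The abstract describes model-free learning of a pricing strategy without a priori information about system dynamics. As a rough illustration of that idea only (this is not the paper's actual algorithm; the environment, state/action spaces, and all parameters below are invented for the sketch), a minimal tabular Q-learning loop in which a service provider learns which price tier to post without a prior demand model might look like:

```python
import random

# Hypothetical toy setup, NOT the algorithm from the paper:
# states are coarse demand levels, actions are retail price tiers,
# and profit is highest when the posted price tracks demand.
random.seed(0)

STATES = range(3)            # demand level: 0 = low, 1 = medium, 2 = high
ACTIONS = range(3)           # price tier:   0 = low, 1 = medium, 2 = high
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3

def step(state, action):
    """Invented environment: reward falls off as the price tier moves
    away from the demand level; the posted price shapes next demand."""
    reward = 1.0 - abs(state - action)
    next_state = action
    return next_state, reward

# Tabular Q-function, learned purely from interaction (model-free).
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

state = 1
for _ in range(20000):
    if random.random() < EPS:                       # epsilon-greedy exploration
        action = random.choice(list(ACTIONS))
    else:                                           # exploit current estimate
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    next_state, reward = step(state, action)
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    # Standard Q-learning update: no model of demand is ever built.
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
    state = next_state

# Learned pricing policy: best price tier for each demand level.
greedy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(greedy)
```

In this toy world the learned policy matches the price tier to the demand level, illustrating the abstract's point that a useful pricing strategy can emerge from interaction alone; the paper's own formulation (customer populations, retail price signals, system cost) is considerably richer.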

Original language: English
Article number: 7321806
Pages (from-to): 2187-2198
Number of pages: 12
Journal: IEEE Transactions on Smart Grid
Volume: 7
Issue number: 5
DOI: 10.1109/TSG.2015.2495145
Publication status: Published - 2016 Sep 1

All Science Journal Classification (ASJC) codes

  • Computer Science (all)

Cite this

Kim, Byung Gook ; Zhang, Yu ; Van Der Schaar, Mihaela ; Lee, Jang-Won. / Dynamic pricing and energy consumption scheduling with reinforcement learning. In: IEEE Transactions on Smart Grid. 2016 ; Vol. 7, No. 5. pp. 2187-2198.
@article{ef5f71853e7441ba90dc2a2bc7831529,
title = "Dynamic pricing and energy consumption scheduling with reinforcement learning",
abstract = "In this paper, we study a dynamic pricing and energy consumption scheduling problem in the microgrid where the service provider acts as a broker between the utility company and customers by purchasing electric energy from the utility company and selling it to the customers. For the service provider, even though dynamic pricing is an efficient tool to manage the microgrid, the implementation of dynamic pricing is highly challenging due to the lack of the customer-side information and the various types of uncertainties in the microgrid. Similarly, the customers also face challenges in scheduling their energy consumption due to the uncertainty of the retail electricity price. In order to overcome the challenges of implementing dynamic pricing and energy consumption scheduling, we develop reinforcement learning algorithms that allow each of the service provider and the customers to learn its strategy without a priori information about the microgrid. Through numerical results, we show that the proposed reinforcement learning-based dynamic pricing algorithm can effectively work without a priori information about the system dynamics and the proposed energy consumption scheduling algorithm further reduces the system cost thanks to the learning capability of each customer.",
author = "Kim, {Byung Gook} and Yu Zhang and {Van Der Schaar}, Mihaela and Jang-Won Lee",
year = "2016",
month = "9",
day = "1",
doi = "10.1109/TSG.2015.2495145",
language = "English",
volume = "7",
pages = "2187--2198",
journal = "IEEE Transactions on Smart Grid",
issn = "1949-3053",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "5",

}

Dynamic pricing and energy consumption scheduling with reinforcement learning. / Kim, Byung Gook; Zhang, Yu; Van Der Schaar, Mihaela; Lee, Jang-Won.

In: IEEE Transactions on Smart Grid, Vol. 7, No. 5, 7321806, 01.09.2016, p. 2187-2198.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Dynamic pricing and energy consumption scheduling with reinforcement learning

AU - Kim, Byung Gook

AU - Zhang, Yu

AU - Van Der Schaar, Mihaela

AU - Lee, Jang-Won

PY - 2016/9/1

Y1 - 2016/9/1

AB - In this paper, we study a dynamic pricing and energy consumption scheduling problem in the microgrid where the service provider acts as a broker between the utility company and customers by purchasing electric energy from the utility company and selling it to the customers. For the service provider, even though dynamic pricing is an efficient tool to manage the microgrid, the implementation of dynamic pricing is highly challenging due to the lack of the customer-side information and the various types of uncertainties in the microgrid. Similarly, the customers also face challenges in scheduling their energy consumption due to the uncertainty of the retail electricity price. In order to overcome the challenges of implementing dynamic pricing and energy consumption scheduling, we develop reinforcement learning algorithms that allow each of the service provider and the customers to learn its strategy without a priori information about the microgrid. Through numerical results, we show that the proposed reinforcement learning-based dynamic pricing algorithm can effectively work without a priori information about the system dynamics and the proposed energy consumption scheduling algorithm further reduces the system cost thanks to the learning capability of each customer.

UR - http://www.scopus.com/inward/record.url?scp=84946762039&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84946762039&partnerID=8YFLogxK

U2 - 10.1109/TSG.2015.2495145

DO - 10.1109/TSG.2015.2495145

M3 - Article

VL - 7

SP - 2187

EP - 2198

JO - IEEE Transactions on Smart Grid

JF - IEEE Transactions on Smart Grid

SN - 1949-3053

IS - 5

M1 - 7321806

ER -