Energy efficient mobile computation offloading via online prefetching

Seung Woo Ko, Kaibin Huang, Seong-Lyun Kim, Hyukjin Chae

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Conventional mobile computation offloading relies on offline prefetching, which fetches user-specific data to the cloud before computing begins. For computation that depends on real-time inputs, this offline operation can fetch large volumes of redundant data over wireless channels and unnecessarily consume mobile-transmission energy. To address this issue, we propose the novel technique of online prefetching for a large-scale program with numerous tasks, which seamlessly integrates task-level computation prediction and real-time prefetching within the program runtime. The technique not only reduces mobile-energy consumption by avoiding excessive fetching but also shortens the program runtime through prediction-enabled parallel fetching and computing. By modeling the sequential task transitions in an offloaded program as a Markov chain, stochastic optimization is applied to design online-fetching policies that minimize the mobile-energy consumption for transmitting fetched data over fading channels under a deadline constraint. The optimal policies for slow and fast fading are shown to share a similar threshold-based structure: candidates for the next task are selected by applying a threshold to their likelihoods, which in turn control the corresponding sizes of prefetched data. In addition, computation prediction for online prefetching is shown theoretically to always achieve an energy reduction.
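The threshold-based structure described above can be sketched in a few lines. This is an illustrative sketch only, not the paper's derived policy: the transition matrix `P`, the likelihood-proportional sizing rule, and the function `prefetch_candidates` are hypothetical assumptions; the paper obtains the optimal thresholds and data sizes via stochastic optimization over the fading channel.

```python
# Hypothetical sketch of threshold-based online prefetching: task
# transitions form a Markov chain, and tasks whose transition likelihood
# exceeds a threshold become prefetch candidates, with prefetched-data
# sizes scaled by likelihood (a simplifying assumption, not the paper's
# optimized sizing).

def prefetch_candidates(P, current_task, threshold, full_size):
    """Select likely next tasks and size their prefetched data.

    P            -- row-stochastic matrix: P[i][j] = Pr[next = j | current = i]
    current_task -- index of the task now executing
    threshold    -- likelihood cutoff for admitting a prefetch candidate
    full_size    -- input-data size (bits) a task needs if fetched in full
    """
    candidates = {}
    for task, p in enumerate(P[current_task]):
        if p >= threshold:
            # Hedge against misprediction: prefetch only a likelihood-weighted
            # fraction of the task's input now; fetch the remainder on demand.
            candidates[task] = full_size * p
    return candidates

# Four-task program whose control flow forms a Markov chain.
P = [[0.0, 0.7, 0.2, 0.1],
     [0.0, 0.0, 0.6, 0.4],
     [0.0, 0.0, 0.0, 1.0],
     [1.0, 0.0, 0.0, 0.0]]

print(prefetch_candidates(P, current_task=0, threshold=0.15, full_size=1e6))
# -> {1: 700000.0, 2: 200000.0}
```

Unlikely successors (here task 3, with likelihood 0.1) are fetched only after the prediction resolves, which is how the scheme avoids the redundant transmissions of offline prefetching.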

Original language: English
Title of host publication: 2017 IEEE International Conference on Communications, ICC 2017
Editors: Merouane Debbah, David Gesbert, Abdelhamid Mellouk
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781467389990
DOI: 10.1109/ICC.2017.7997341
Publication status: Published - 2017 Jul 28
Event: 2017 IEEE International Conference on Communications, ICC 2017 - Paris, France
Duration: 2017 May 21 - 2017 May 25


Fingerprint

  • Energy utilization
  • Fading channels
  • Markov processes

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Electrical and Electronic Engineering

Cite this

Ko, S. W., Huang, K., Kim, S.-L., & Chae, H. (2017). Energy efficient mobile computation offloading via online prefetching. In M. Debbah, D. Gesbert, & A. Mellouk (Eds.), 2017 IEEE International Conference on Communications, ICC 2017 [7997341]. Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICC.2017.7997341