Power and Temperature-Aware Workflow Scheduling Considering Deadline Constraint in Cloud
RESEARCH ARTICLE-COMPUTER ENGINEERING AND COMPUTER SCIENCE
Rama Rani1 · Ritu Garg1
Rama Rani: [email protected] · Ritu Garg: [email protected]
Received: 1 April 2020 / Accepted: 13 August 2020
© King Fahd University of Petroleum & Minerals 2020
Abstract
Cloud data centers consume a substantial amount of energy to meet the growing demand for cloud resources. The recent increase in large-scale data processing and computing in data centers has resulted in heavy energy consumption, which leads to a large carbon footprint and has a harmful impact on the environment. The two main contributors to a data center's energy consumption are computing energy and cooling energy. In this paper, we consider the problem of scheduling parallel applications consisting of dependent tasks with precedence constraints in a heterogeneous cloud computing environment. Many recent heuristics optimize only the execution time, i.e., the makespan, without giving much consideration to the computing energy and the cooling energy required by the data center while executing the tasks. Reducing energy consumption leads to a reduction in operating costs. Therefore, we propose a power and temperature-aware workflow-scheduling (PATA-WSD) algorithm with a user-specified SLA constraint (deadline) to optimize both computing and cooling energy. The proposed algorithm minimizes the energy consumption (computing + cooling) of executing tasks while satisfying the deadline constraint through the use of the dynamic voltage and frequency scaling (DVFS) technique. The simulation results are obtained using random task graphs and real-time scientific workflows such as CyberShake and Montage. The results demonstrate that the proposed algorithm reduces both computing and cooling energy consumption.

Keywords Workflow scheduling · DVFS · Computing energy · Cooling energy · Cloud computing
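For intuition on the DVFS technique mentioned in the abstract, the standard CMOS dynamic-power relation can serve as an illustrative sketch (the exact power model used by PATA-WSD may differ): $P_{dyn} = A\,C_{eff}\,V^{2}f$, so a task of $N$ cycles costs roughly $E \approx A\,C_{eff}\,V^{2}N$, where $A$ is the switching activity, $C_{eff}$ the effective capacitance, $V$ the supply voltage, and $f$ the clock frequency. Lowering $V$ (and with it $f$) reduces task energy roughly quadratically while lengthening execution time, which is why a deadline constraint is needed to bound how far frequencies may be scaled down.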
1 Introduction

Cloud computing makes computer resources such as computing power and data storage available on demand, without direct intervention by the user. It is a group of networked elements providing services that do not need to be individually addressed by the users. It allows companies to reduce their upfront IT infrastructure costs. Although cloud computing enables organizations to realize great benefits by minimizing operational and administrative costs, it suffers from the problem of high energy consumption. Due to the increased demand for cloud computing resources, the energy consumed in cloud data centers (CDCs) by the computing and cooling equipment keeps growing. Nowadays, the green cloud computing concept [1, 2] has continued to gain momentum, as data centers consume a large amount of energy, which in turn leads to increased carbon emissions and energy costs. In the past few years, data centers around the world have used about 416 TWh (roughly 3% of the total electricity generated), which is 40% more than the total electricity consumption of the UK.
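To make the computing/cooling split concrete, temperature-aware scheduling studies commonly use a decomposition of the following form (an illustrative background model; the exact formulation adopted for PATA-WSD may differ): $E_{total} = E_{computing} + E_{cooling}$ with $E_{cooling} = E_{computing} / CoP(T_{sup})$, where $CoP(T_{sup})$ is the coefficient of performance of the cooling system at the supply (cold-aisle) temperature $T_{sup}$. Raising the supply temperature improves the CoP and hence lowers cooling energy, but it also pushes server temperatures closer to their thermal limits, which is what couples the scheduling decisions to the cooling cost.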