Published on Jun 05, 2023
Green computing is defined as "the study and practice of designing, manufacturing, using, and disposing of computers, servers, and associated subsystems (such as monitors, printers, storage devices, and networking and communications systems) efficiently and effectively with minimal or no impact on the environment." The goals of green computing are similar to those of green chemistry: reduce the use of hazardous materials, maximize energy efficiency during the product's lifetime, and promote the recyclability or biodegradability of defunct products and factory waste. Research continues into key areas such as making the use of computers as energy-efficient as possible and designing algorithms and systems for efficiency-related computer technologies.
There are several approaches to green computing, including:
• Product longevity
• Algorithmic efficiency
• Resource allocation
• Virtualisation
• Power management
Need for Green Computing in Clouds
Modern data centers, operating under the Cloud computing model, host a variety of applications ranging from those that run for a few seconds (e.g. serving requests of web applications such as e-commerce and social network portals with transient workloads) to those that run for longer periods of time (e.g. simulations or large data set processing) on shared hardware platforms. The need to manage multiple applications in a data center creates the challenge of on-demand resource provisioning and allocation in response to time-varying workloads. Normally, data center resources are statically allocated to applications based on peak load characteristics, in order to maintain isolation and provide performance guarantees.
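As a rough illustration of why peak-based static allocation is wasteful, the short Python sketch below compares the capacity provisioned for the peak with the capacity actually demanded over a day. The hourly figures are invented for the example, not taken from any real data center:

```python
# Hypothetical hourly demand (number of servers actually needed each hour).
hourly_demand = [20, 15, 10, 30, 80, 120, 90, 40]

# Static provisioning sizes the data center for the peak at all times.
static_allocation = max(hourly_demand)
static_server_hours = static_allocation * len(hourly_demand)

# Capacity the time-varying workload really consumed.
demand_server_hours = sum(hourly_demand)

utilisation = demand_server_hours / static_server_hours
print(f"static provisioning: {static_server_hours} server-hours, "
      f"average utilisation {utilisation:.0%}")
```

With these made-up numbers the statically provisioned servers sit at roughly 42% average utilisation, which is the gap that on-demand provisioning tries to close.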
Until recently, high performance has been the sole concern in data center deployments, and this demand has been fulfilled without paying much attention to energy consumption. The average data center consumes as much energy as 25,000 households [20]. As energy costs increase while availability dwindles, there is a need to shift the focus of data center resource management from optimising for pure performance to optimising for energy efficiency while maintaining high service-level performance. According to certain reports, the total estimated energy bill for data centers in 2010 was $11.5 billion, and energy costs in a typical data center double every five years.
Dynamic Voltage and Frequency Scaling (DVFS) technique
Every electronic circuit has an operating clock associated with it. In DVFS, the operating frequency of this clock is adjusted, allowing the supply voltage to be regulated accordingly and the power drawn by the circuit to be reduced. This method therefore depends heavily on the hardware and cannot be controlled flexibly according to varying needs. The power savings are also low compared to other approaches, as is the ratio of power savings to cost incurred.
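To make the frequency-voltage relationship concrete, the Python sketch below uses the common approximation that dynamic CMOS power is proportional to C·V²·f, with the supply voltage scaled roughly in proportion to the clock frequency. The capacitance, voltage, and frequency values are hypothetical and only illustrate the effect of a modest frequency reduction:

```python
def dynamic_power(capacitance_nf: float, voltage_v: float, frequency_ghz: float) -> float:
    """Approximate dynamic power (W) of a CMOS circuit: P = C * V^2 * f."""
    return capacitance_nf * 1e-9 * voltage_v ** 2 * frequency_ghz * 1e9

# Nominal operating point (hypothetical values).
nominal = dynamic_power(capacitance_nf=1.0, voltage_v=1.2, frequency_ghz=3.0)

# Scale frequency down by 20% and reduce the voltage proportionally.
scaled = dynamic_power(capacitance_nf=1.0, voltage_v=1.2 * 0.8, frequency_ghz=3.0 * 0.8)

print(f"nominal: {nominal:.2f} W, scaled: {scaled:.2f} W "
      f"({100 * (1 - scaled / nominal):.0f}% dynamic-power saving)")
```

Because power scales roughly with the cube of frequency when voltage tracks it, a 20% frequency reduction saves close to half the dynamic power in this model, at the cost of slower execution.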
Resource allocation or virtual machine migration techniques
In a cloud computing environment, every physical machine hosts a number of virtual machines (VMs) on which the applications run. These VMs can be transferred across hosts according to varying needs and available resources. The VM migration method focuses on transferring VMs in such a way that the resulting increase in power consumption is least: the most power-efficient nodes are selected and the VMs are migrated to them. This method is dealt with in detail later.
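The sketch below is a minimal, illustrative version of such a power-aware placement step, not the exact algorithm discussed later: the Host and VM classes, the linear power model, and all numbers are assumptions for the example. Each VM is placed on the host whose estimated power increase is smallest.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class VM:
    cpu_demand: float          # fraction of one host's CPU capacity, 0..1

@dataclass
class Host:
    idle_power: float          # watts drawn at 0% utilisation (assumed linear model)
    peak_power: float          # watts drawn at 100% utilisation
    vms: List[VM] = field(default_factory=list)

    def utilisation(self) -> float:
        return sum(vm.cpu_demand for vm in self.vms)

    def power(self, extra: float = 0.0) -> float:
        """Linear power model: idle power plus a utilisation-proportional share."""
        u = min(self.utilisation() + extra, 1.0)
        return self.idle_power + (self.peak_power - self.idle_power) * u

def place(vm: VM, hosts: List[Host]) -> Optional[Host]:
    """Pick the host with the smallest power increase that still has spare capacity."""
    best, best_delta = None, float("inf")
    for host in hosts:
        if host.utilisation() + vm.cpu_demand > 1.0:
            continue                       # not enough spare capacity
        delta = host.power(vm.cpu_demand) - host.power()
        if delta < best_delta:
            best, best_delta = host, delta
    if best is not None:
        best.vms.append(vm)
    return best

# Example: two heterogeneous hosts; the more power-efficient one attracts the VM.
hosts = [Host(idle_power=100, peak_power=250), Host(idle_power=70, peak_power=180)]
chosen = place(VM(cpu_demand=0.3), hosts)
print("placed on host with idle power", chosen.idle_power if chosen else None)
```

The same power-delta comparison can drive migration decisions: when a host becomes lightly loaded, its VMs are candidates for transfer to the hosts that would see the smallest power increase, so the emptied host can be switched to a low-power state.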