Green Software Engineering: A Study on Energy-Efficient Design and Deployment in Cloud Infrastructure

Abstract

In the face of growing concerns over environmental sustainability, green software engineering has emerged as a crucial discipline within cloud computing to reduce energy consumption and minimize environmental impact. Cloud data centers, which host a large portion of modern computing infrastructure, are significant contributors to global energy consumption. As cloud adoption increases, so does the need for energy-efficient systems. This paper reviews energy-efficient design and deployment strategies within cloud infrastructure, focusing on how green software engineering practices can optimize resource usage and reduce carbon footprints. The paper explores various energy-saving technologies, including virtualization, dynamic resource allocation, and energy-aware scheduling, and evaluates their effectiveness in reducing cloud infrastructure energy demands. The challenges and limitations of implementing green software engineering practices in cloud systems are also discussed, with insights into future research directions for more sustainable cloud computing.

Share and Cite:

Jin, J. N., Ji, E. K. and Zhang, Q. (2025) Green Software Engineering: A Study on Energy-Efficient Design and Deployment in Cloud Infrastructure. Journal of Data Analysis and Information Processing, 13, 241-254. doi: 10.4236/jdaip.2025.133014.

1. Introduction

As the world continues to embrace cloud computing, the environmental impact of cloud infrastructure has come under increasing scrutiny [1]. Cloud data centers, which store and process vast amounts of data, consume enormous amounts of energy to power their servers, cooling systems, and other infrastructure components [2]. According to recent studies, cloud data centers account for a significant percentage of global energy consumption, making the need for energy-efficient cloud systems more urgent than ever [3].

Figure 1 illustrates projected data center energy consumption in the U.S. and globally from 2014 to 2028. U.S. energy use held roughly steady at around 60 TWh from 2014 to 2016, then rose steadily with the adoption of GPU-accelerated servers, reaching 176 TWh by 2023. By 2028, U.S. consumption is projected to range between 325 and 580 TWh, depending on technological and operational scenarios. Global energy use follows a similar upward trend, reaching approximately 850 TWh by 2028. The chart highlights the accelerating demand for data center energy driven by AI and other emerging technologies.

Figure 1. Global data center energy consumption trends (2014-2028). Data sources: International Energy Agency (IEA) 2024 report; Lawrence Berkeley National Laboratory 2024 US Data Center Energy Usage Report.

Green software engineering is an emerging approach that seeks to reduce the environmental footprint of cloud systems by designing and deploying software that optimizes resource usage, minimizes energy consumption, and enhances sustainability [4]. Unlike traditional software engineering practices, green software engineering emphasizes not only functionality and performance, but also the environmental impact of the software systems being developed [5].

The growing awareness of climate change and the increasing demand for cloud services have prompted the industry to adopt more sustainable practices [6]. In this context, cloud providers are exploring ways to reduce energy consumption without compromising performance. Techniques such as dynamic resource allocation, energy-efficient virtualization, and automated workload management are becoming integral components of modern cloud infrastructure [7].

This paper explores energy-efficient design and deployment strategies for cloud infrastructures, with a focus on the role of green software engineering. By examining both design-time and deployment-time strategies, the paper aims to provide an overview of how cloud systems can be optimized for energy efficiency. Additionally, we discuss the various challenges and limitations faced by organizations in implementing these practices and provide insights into future research directions aimed at advancing sustainable cloud computing.

2. Methodology

To ensure a comprehensive and objective review, we followed a structured literature search strategy. We conducted database searches using IEEE Xplore, ScienceDirect, SpringerLink, and Google Scholar. Keywords included “green software engineering”, “energy-efficient cloud”, “virtualization energy optimization”, and “cloud sustainability”. The search spanned publications from 2015 to 2024. Inclusion criteria required that studies present empirical evidence, optimization models, or case analyses related to energy-efficient design and deployment in cloud infrastructure. Articles that focused solely on hardware-level measures or policy frameworks, without a software component, were excluded.

3. Energy-Efficient Design in Cloud Infrastructure

The design of cloud infrastructure plays a critical role in determining its overall energy consumption [8]. In traditional cloud systems, resources are often over-provisioned to ensure availability and performance. While this ensures that cloud applications perform optimally under various conditions, it can lead to significant energy wastage, especially during periods of low demand. The goal of energy-efficient design in cloud systems is to minimize resource consumption while maintaining or even improving performance.

One of the primary strategies for achieving energy efficiency in cloud infrastructure is virtualization. Virtualization allows multiple Virtual Machines (VMs) or containers to run on a single physical server, enabling more efficient use of hardware resources. By consolidating workloads onto fewer physical servers, cloud providers can reduce the number of active machines and minimize power consumption [9]. In addition, containerization technologies, such as Docker and Kubernetes, allow for lightweight and highly efficient resource utilization, enabling faster scaling and better energy management [10].
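
To illustrate the consolidation principle in concrete terms, the following sketch packs virtual machines onto as few physical hosts as possible using a first-fit-decreasing heuristic, so that unneeded hosts can be powered down. The host capacity, VM sizes, and idle-power figure are hypothetical values chosen for illustration, not measurements from any particular platform.

```python
# Minimal sketch of workload consolidation: pack VMs onto as few hosts as
# possible (first-fit decreasing), so unused hosts can be powered down.
# Host capacity, VM demands, and the idle-power figure are hypothetical.

HOST_CAPACITY_VCPUS = 32   # vCPUs per physical host (assumed)
IDLE_POWER_W = 120         # power drawn by a powered-on but idle host (assumed)

def consolidate(vm_demands):
    """Place each VM (by vCPU demand) on the first host with room; open new hosts only when needed."""
    hosts = []
    for demand in sorted(vm_demands, reverse=True):      # largest VMs first
        for host in hosts:
            if host["free"] >= demand:
                host["free"] -= demand
                host["vms"].append(demand)
                break
        else:                                            # nothing fits: power on another host
            hosts.append({"free": HOST_CAPACITY_VCPUS - demand, "vms": [demand]})
    return hosts

vm_demands = [8, 2, 4, 16, 1, 6, 12, 3]                  # vCPUs requested by each VM
hosts = consolidate(vm_demands)
print(f"{len(hosts)} hosts active for {len(vm_demands)} VMs")
print(f"idle power avoided vs. one host per VM: {(len(vm_demands) - len(hosts)) * IDLE_POWER_W} W")
```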

Dynamic resource allocation is another key component of energy-efficient cloud design. Cloud providers typically offer resources based on static configurations that do not adjust to real-time workload demands [11]. As a result, cloud systems may waste energy by keeping unused resources active. Dynamic allocation instead provisions computing power according to actual demand, ensuring that resources are brought online only when necessary. Technologies such as auto-scaling and elastic computing enable cloud systems to adjust the number of active instances or virtual machines dynamically based on workload requirements, reducing energy consumption during periods of low demand.
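
As a concrete, if simplified, illustration of demand-driven allocation, the sketch below implements a threshold-based auto-scaling decision: capacity is added when average utilization runs high and released when it runs low. The thresholds, step size, and instance limits are illustrative assumptions rather than the defaults of any specific platform, though managed services such as AWS Auto Scaling or the Kubernetes Horizontal Pod Autoscaler expose comparable parameters.

```python
# Minimal sketch of a threshold-based auto-scaler. Thresholds, limits, and the
# step size are illustrative assumptions, not platform defaults.

from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    scale_up_at: float = 0.75    # add capacity above 75% average utilization
    scale_down_at: float = 0.30  # remove capacity below 30% average utilization
    min_instances: int = 1
    max_instances: int = 20
    step: int = 1                # instances added or removed per decision

def decide(current_instances: int, avg_utilization: float, p: ScalingPolicy) -> int:
    """Return the new instance count for one scaling decision."""
    if avg_utilization > p.scale_up_at:
        return min(current_instances + p.step, p.max_instances)
    if avg_utilization < p.scale_down_at:
        return max(current_instances - p.step, p.min_instances)
    return current_instances  # within the target band: no change, no wasted energy

policy = ScalingPolicy()
print(decide(current_instances=4, avg_utilization=0.82, p=policy))  # -> 5
print(decide(current_instances=4, avg_utilization=0.18, p=policy))  # -> 3
```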

Furthermore, energy-aware scheduling is crucial for reducing energy consumption in distributed cloud systems [12]. Scheduling refers to how tasks or workloads are assigned to servers and virtual machines. By implementing energy-aware scheduling algorithms, cloud providers can optimize the placement of workloads in a way that minimizes power usage. For instance, placing tasks on servers with lower energy consumption or consolidating workloads onto fewer servers during off-peak hours can help reduce the overall energy footprint of the cloud infrastructure. In addition, green algorithms are being developed to take into account factors such as energy consumption, load balancing, and temperature when scheduling workloads in cloud data centers [13].
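
One way to make energy-aware placement concrete is a greedy rule that assigns each incoming task to the server whose estimated power draw would increase the least, under a linear idle-to-peak power model. The sketch below follows that rule; the power model and server figures are illustrative assumptions, and a production scheduler would also weigh load balancing, temperature, and service-level objectives, as noted above.

```python
# Minimal sketch of energy-aware scheduling: place each task on the server with
# the smallest marginal increase in power draw, using a linear idle-to-peak
# power model. Power figures and utilizations are illustrative assumptions.

def estimated_power(util, idle_w, peak_w):
    """Power at CPU utilization util (0..1) under a linear model."""
    return idle_w + (peak_w - idle_w) * util

def place(task_load, servers):
    """Return the index of the server whose power rises least if it takes the task."""
    best_idx, best_delta = None, float("inf")
    for i, s in enumerate(servers):
        new_util = s["util"] + task_load
        if new_util > 1.0:                                # would overload this server: skip
            continue
        delta = (estimated_power(new_util, s["idle_w"], s["peak_w"])
                 - estimated_power(s["util"], s["idle_w"], s["peak_w"]))
        if delta < best_delta:
            best_idx, best_delta = i, delta
    return best_idx

servers = [
    {"util": 0.50, "idle_w": 100, "peak_w": 300},   # efficient host, already busy
    {"util": 0.10, "idle_w": 150, "peak_w": 450},   # less efficient host, mostly idle
]
target = place(task_load=0.20, servers=servers)
print(f"task placed on server {target}")            # consolidates onto the efficient host
```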

Another critical aspect of energy-efficient cloud infrastructure design is the use of low-power hardware [14]. Cloud providers are increasingly investing in energy-efficient processors, storage devices, and networking components to reduce the energy consumption of their data centers. ARM-based processors, for example, are known for their lower power consumption compared to traditional x86 processors, making them an attractive option for energy-conscious cloud providers. Similarly, Solid-State Drives (SSDs) are often preferred over traditional Hard Disk Drives (HDDs) due to their lower power consumption and faster read/write speeds [15].

In summary, the design of energy-efficient cloud infrastructure involves optimizing the use of resources through virtualization, dynamic resource allocation, and energy-aware scheduling. By integrating these strategies into cloud system design, providers can reduce energy consumption and enhance the sustainability of their operations [16]. The use of low-power hardware and efficient scheduling algorithms further contributes to minimizing the environmental impact of cloud systems.

4. Deployment Strategies for Green Cloud Systems

The deployment phase of cloud systems is as crucial to energy efficiency as the design phase. Once cloud infrastructure is designed with energy-efficient principles, it is equally important to implement strategies that ensure these efficiencies are maintained during deployment. Traditional cloud deployment often involves over-provisioning resources to handle potential peaks in demand, leading to unnecessary energy consumption during off-peak hours [17]. In contrast, green cloud deployment strategies focus on the dynamic management of resources to ensure they are used optimally.

One of the primary strategies for energy-efficient cloud deployment is resource scaling. Cloud systems are often designed to scale dynamically in response to changes in workload demand. However, energy-aware scaling goes a step further by ensuring that the scaling process is efficient in terms of energy consumption. Elastic computing, a concept in which resources such as computing power, storage, and bandwidth are automatically allocated and de-allocated based on real-time demand, plays a significant role in ensuring that cloud systems use only the resources necessary for the workload [18]. By leveraging auto-scaling capabilities, cloud systems can expand or shrink the resources allocated to services in real time, minimizing waste during low-demand periods.

Figure 2 illustrates the improvement in Power Usage Effectiveness (PUE) across U.S. data centers from 2007 to 2023. The average PUE, which reflects the ratio of total facility energy to IT equipment energy, declined steadily from approximately 2.5 in 2007 to around 1.4 in 2023, indicating enhanced energy efficiency. Hyperscale data centers—those operated by major cloud providers—achieved even lower PUEs, improving from about 2.0 in 2007 to 1.1 by 2023. This trend highlights the industry’s continued focus on optimizing infrastructure efficiency, especially through advanced cooling and power management technologies.

Figure 2. Power Usage Effectiveness (PUE) improvement in data centers (2007-2023). Data source: 2024 US Data Center Energy Usage Report.

The effectiveness of green deployment strategies is evident in the improvement of Power Usage Effectiveness (PUE)—a widely used metric defined as the ratio of total facility energy to IT equipment energy. A lower PUE indicates higher energy efficiency, as it reflects less overhead for cooling and power conversion.
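
Since PUE is a simple ratio of metered quantities, it can be computed directly from facility and IT energy readings. The short helper below illustrates the arithmetic with hypothetical meter values chosen to match the fleet-average and hyperscale figures discussed above.

```python
# PUE = total facility energy / IT equipment energy. A value near 1.0 means
# nearly all energy reaches the IT equipment. The meter readings are hypothetical.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

print(pue(total_facility_kwh=14_000, it_equipment_kwh=10_000))   # 1.4, fleet-average level
print(pue(total_facility_kwh=11_000, it_equipment_kwh=10_000))   # 1.1, hyperscale-class level
```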

In addition to resource scaling, load balancing techniques are also critical for energy-efficient deployment [19]. Cloud service providers use load balancing to distribute workloads evenly across servers or virtual machines. In green cloud systems, load balancing strategies are enhanced by taking energy consumption into account. Energy-aware load balancing algorithms can ensure that workloads are directed to the most energy-efficient servers, balancing performance requirements with minimal power usage. This is particularly relevant in multi-tenant environments where cloud resources are shared between different users or applications. Ensuring fair resource distribution while also prioritizing energy efficiency can help reduce unnecessary energy consumption across the entire infrastructure [20].
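
A simple way to encode this preference in a load balancer is to weight request routing by each server's performance per watt instead of distributing traffic uniformly. The sketch below illustrates the idea with made-up throughput and power figures; a real energy-aware balancer would combine such weights with latency and capacity constraints.

```python
# Minimal sketch of energy-aware load balancing: route requests in proportion to
# each server's performance per watt rather than uniformly. Throughput and power
# figures are illustrative assumptions.

import random
from collections import Counter

servers = [
    {"name": "a", "req_per_s": 900, "power_w": 300},   # 3.0 requests/s per watt
    {"name": "b", "req_per_s": 500, "power_w": 250},   # 2.0 requests/s per watt
    {"name": "c", "req_per_s": 400, "power_w": 400},   # 1.0 requests/s per watt
]
weights = [s["req_per_s"] / s["power_w"] for s in servers]   # performance per watt

def route():
    """Pick a server, biased toward the most energy-efficient ones."""
    return random.choices(servers, weights=weights, k=1)[0]["name"]

print(Counter(route() for _ in range(10_000)))   # roughly a 50% / 33% / 17% split
```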

Another deployment strategy for energy-efficient cloud systems is energy-aware scheduling [21]. Scheduling tasks or workloads in cloud systems determines which servers or virtual machines will execute which tasks at a given time. In traditional systems, scheduling focuses primarily on performance and availability. However, in green cloud systems, energy-aware scheduling incorporates energy usage data and optimizes the placement of tasks across the cloud infrastructure [22]. For example, workloads can be consolidated onto fewer servers during off-peak hours, or tasks can be scheduled based on the energy consumption of the hardware hosting them. The aim is to run tasks on servers that are energy-efficient and avoid overloading certain systems, which could lead to higher power usage.

Additionally, green data centers are a key aspect of sustainable cloud deployment [23]. Modern cloud providers are increasingly moving toward energy-efficient data centers, which utilize renewable energy sources, advanced cooling systems, and low-power hardware. These data centers are designed to be as energy efficient as possible, reducing their environmental footprint. Cloud providers are also adopting advanced cooling technologies, such as liquid cooling and free-air cooling, which require significantly less energy than traditional air-conditioning systems. Moreover, by ensuring that data centers are built with energy efficiency in mind, organizations can further reduce the overall energy consumption of their cloud systems.

In conclusion, energy-efficient deployment strategies in cloud systems focus on the dynamic allocation of resources, load balancing, and energy-aware scheduling. These strategies, when combined with green data centers and elastic computing, can help minimize energy consumption, reduce waste, and improve the overall sustainability of cloud services. Through these practices, cloud providers can ensure that their operations remain efficient and eco-friendly while still meeting the performance demands of modern applications.

5. Impact of Green Software Engineering on Cloud Sustainability

Green software engineering plays a pivotal role in promoting sustainability in cloud systems [24]. By integrating energy-efficient practices into the software development lifecycle, organizations can reduce the environmental impact of cloud computing without sacrificing performance or functionality [25]. The environmental benefits of green software engineering are multifaceted, as they touch on both energy efficiency and resource optimization.

One of the key impacts of green software engineering is the reduction of energy consumption in cloud systems [26]. Traditional software engineering focuses primarily on functionality, performance, and scalability, often without considering the environmental impact. Green software engineering, however, integrates energy efficiency as a fundamental design principle. By optimizing algorithms, improving code efficiency, and minimizing resource usage, software can run more efficiently, requiring fewer computational resources and reducing energy consumption. For example, software can be designed to use less CPU power, leading to lower overall server energy usage [27]. The energy efficiency of the software directly affects the data center’s energy footprint, since fewer resources are needed to run the same applications and processes.
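
A small but representative example of such code-level efficiency is replacing a busy-wait polling loop with a blocking wait. The two functions in the sketch below return the same result, but the second leaves the CPU idle, and able to drop into low-power states, until work actually arrives; the example is generic Python and not tied to any particular cloud service.

```python
# Busy-waiting keeps a core near 100% utilization even when there is no work to do;
# a blocking wait lets the CPU idle (and enter low-power states) until work arrives.
# Generic sketch, not tied to any specific cloud API.

import queue

work_queue: "queue.Queue[str]" = queue.Queue()

def wait_busy():
    """Energy-hungry: spins on the queue, burning CPU cycles while it is empty."""
    while True:
        try:
            return work_queue.get_nowait()
        except queue.Empty:
            pass                         # loop again immediately: ~100% CPU while waiting

def wait_blocking():
    """Energy-friendly: the calling thread sleeps inside get() until an item arrives."""
    return work_queue.get()              # no CPU consumed while waiting

work_queue.put("job-1")
print(wait_blocking())                   # -> job-1, obtained without polling
```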

Green software engineering also contributes to longer hardware lifespans. By writing more efficient code that minimizes the demand for hardware resources, software reduces the load on servers and infrastructure [5]. This means that servers can operate at lower capacities and for longer periods without requiring frequent upgrades or replacements. In turn, this leads to a reduction in the amount of e-waste produced by data centers and cloud providers. The optimization of hardware usage through more efficient software also decreases the need for frequent hardware replacements, thus lowering the environmental impact of maintaining cloud infrastructure.

Another significant impact is the cost savings that result from energy-efficient cloud systems. Cloud providers typically pass on the costs of running data centers, including electricity and cooling expenses, to their customers. By adopting green software engineering practices, providers can reduce operational costs, which in turn can be passed on to customers in the form of more affordable cloud services. As businesses become more cost-conscious, they will increasingly value energy-efficient services that help reduce their overall expenditure while also promoting sustainability.

Green software engineering promotes hardware longevity by reducing the computational intensity of tasks. Efficient code execution generates less heat and draws less power, leading to lower thermal stress and reduced fan usage. This mitigates wear on hardware components such as CPUs, GPUs, and storage systems. For example, Google reported that optimizing background app behavior in Android systems extended SSD lifespan by reducing write cycles. Similarly, algorithms that minimize CPU spikes can prevent premature hardware degradation caused by thermal cycling.

Furthermore, green software engineering practices support corporate sustainability goals [28]. Organizations are under growing pressure from regulators, investors, and customers to reduce their carbon footprints and adopt more sustainable practices. By incorporating green software engineering into cloud development, companies can demonstrate their commitment to sustainability and environmental responsibility [29]. This not only improves their brand image but also helps them comply with environmental regulations and sustainability certifications, which are becoming more stringent in industries like finance, healthcare, and telecommunications.

In conclusion, green software engineering plays a critical role in enhancing cloud sustainability. By focusing on energy efficiency, resource optimization, and hardware longevity, green software engineering practices help reduce the environmental impact of cloud systems. Additionally, these practices lead to cost savings for cloud providers and their customers while also supporting corporate sustainability goals. The integration of energy-efficient design and deployment strategies into the cloud ecosystem is essential for the long-term viability of cloud computing as a sustainable service model.

6. Challenges and Limitations

While green software engineering offers significant benefits in terms of energy efficiency and sustainability, its widespread adoption faces several challenges. The integration of green practices into cloud computing systems and the software development lifecycle requires overcoming technical, organizational, and financial barriers.

One of the primary challenges in implementing green software engineering is the lack of awareness and expertise in energy-efficient software design. Many software developers and cloud engineers are not fully aware of the potential environmental impact of their work or the techniques available to improve energy efficiency [30]. Software design often prioritizes functionality, performance, and scalability over energy consumption, and integrating green design principles into these priorities can be difficult. There is a need for better education and training for developers to raise awareness of the benefits of green software engineering and to equip them with the knowledge of energy-efficient programming techniques.

Another significant challenge is the technical complexity of optimizing software for energy efficiency. Many cloud applications are highly complex, involving large-scale data processing, extensive computational resources, and dynamic workloads [31]. Writing energy-efficient software for such complex systems often requires specialized algorithms and optimization techniques, which can be time-consuming and difficult to implement. Additionally, green software engineering may conflict with other software design goals, such as maintaining high performance, scalability, or user experience. Striking the right balance between energy efficiency and other performance criteria is a difficult task, especially in cloud environments where workloads can vary dramatically.

From an organizational perspective, there are financial barriers to adopting green software engineering practices. For cloud service providers, the upfront costs of investing in energy-efficient hardware, implementing new software designs, and training staff can be significant [3]. While green software engineering ultimately leads to cost savings in the long run, many organizations may be reluctant to invest in these practices without immediate, tangible returns. Furthermore, energy efficiency is often not considered a top priority by business leaders, who are more focused on factors such as cost reduction, market competition, and service reliability. Overcoming this disconnect and demonstrating the long-term financial benefits of green software engineering is key to fostering greater adoption.

Another barrier to the widespread implementation of green software engineering is the lack of standardized metrics for measuring energy consumption in cloud systems [32]. The absence of industry-wide benchmarks for energy-efficient software design makes it difficult to assess the effectiveness of green practices and compare the energy performance of different cloud providers. Without standardized metrics, it becomes challenging for organizations to track improvements or identify areas where energy consumption can be reduced further. The development of common standards for measuring energy efficiency and the establishment of clear performance metrics will be essential for promoting the adoption of green practices in the cloud industry.

Finally, the complexity of cloud infrastructure itself presents a challenge to green software engineering. Modern cloud systems are built on complex virtualization technologies and multi-layered software stacks, which can make it difficult to optimize energy usage across the entire infrastructure [33]. For instance, resource scheduling and virtual machine placement algorithms often focus on maximizing performance and minimizing response time rather than on optimizing energy consumption. Green algorithms that prioritize energy efficiency, while still meeting performance goals, need to be integrated into existing cloud management frameworks, a task that requires extensive redesign and testing of cloud infrastructure components [34].

7. Future Directions and Research in Green Software Engineering

The future of green software engineering in cloud systems looks promising, with emerging technologies and research paving the way for more sustainable and energy-efficient cloud computing practices. As the demand for cloud services grows, the need for eco-friendly software solutions will become even more pressing, and the integration of energy efficiency into the software development lifecycle will become a critical factor for long-term sustainability.

Figure 3 presents a conceptual framework for how these emerging technologies can be integrated into a comprehensive green software engineering approach, showing the interconnections between different technological layers from hardware to applications.

Figure 3. Green software engineering technology stack.

One of the key future directions in green software engineering is the development of AI-driven optimization algorithms for energy-efficient software design [35]. Machine learning and artificial intelligence offer the potential to automate the process of optimizing software for energy efficiency. AI can be used to analyze large datasets from cloud environments, predict resource usage patterns, and adjust energy consumption strategies in real-time. For example, AI-based resource scheduling could dynamically allocate computing resources in a way that minimizes energy consumption without sacrificing performance. Additionally, AI could assist in optimizing the power consumption of data centers by adjusting cooling systems and server configurations based on real-time energy usage data [36].
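
As a deliberately simple stand-in for a learned predictor, the sketch below uses exponential smoothing to forecast the next interval's request rate and provisions just enough instances ahead of time. In a genuinely AI-driven system the forecaster would be a trained model, but the provisioning logic around it would look similar; the per-instance capacity and headroom factor are assumed values.

```python
# Sketch of forecast-driven (proactive) provisioning. Exponential smoothing stands
# in for a learned load-prediction model; per-instance capacity and the headroom
# factor are illustrative assumptions.

import math

CAPACITY_PER_INSTANCE = 100.0   # requests/s one instance can serve (assumed)
HEADROOM = 1.2                  # provision 20% above the forecast (assumed)

def forecast(history, alpha=0.5):
    """Exponentially smoothed estimate of the next interval's load."""
    level = history[0]
    for observed in history[1:]:
        level = alpha * observed + (1 - alpha) * level
    return level

def instances_needed(history):
    """Instances to pre-provision for the forecast load, never fewer than one."""
    predicted = forecast(history)
    return max(1, math.ceil(predicted * HEADROOM / CAPACITY_PER_INSTANCE))

recent_load = [220.0, 260.0, 310.0, 400.0]          # hypothetical request rates (req/s)
print(instances_needed(recent_load))                # provisions for the forecast, not the old peak
```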

Another promising area of research is the integration of green software engineering with edge computing. Edge computing, which brings computation closer to the data source, introduces new opportunities for energy savings. By reducing the need for data to travel to centralized cloud data centers, edge computing can decrease network traffic and reduce latency [37]. However, managing energy consumption at the edge, particularly in small devices with limited resources, requires innovative software designs. Research in this area could focus on developing low-power algorithms and energy-efficient data processing techniques for edge devices, ensuring that these systems can contribute to the overall sustainability of cloud infrastructure.
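
To give one flavor of such edge-side designs, the sketch below aggregates raw sensor readings locally and ships only a compact per-window summary upstream, reducing the number of energy-expensive radio or network wake-ups. The window length and summary fields are illustrative assumptions rather than a standard protocol.

```python
# Sketch of an edge-side energy saver: summarize readings locally and send one
# compact record per window instead of one message per sample. Window length and
# summary fields are illustrative assumptions.

from statistics import mean

def summarize(samples):
    """Reduce a window of raw readings to a small summary record."""
    return {"count": len(samples), "mean": mean(samples), "max": max(samples)}

def process_stream(readings, window=60):
    """Yield one summary per window of raw readings."""
    for start in range(0, len(readings), window):
        yield summarize(readings[start:start + window])

raw = [20.0 + 0.01 * i for i in range(180)]          # three minutes of 1 Hz sensor samples
summaries = list(process_stream(raw))
print(f"{len(raw)} raw samples reduced to {len(summaries)} uplink messages")
```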

The development of energy-efficient algorithms and sustainable programming practices is another critical area for future research. While some progress has been made in optimizing algorithms for energy efficiency, much of the existing research focuses on individual components of cloud systems. Future research will need to focus on optimizing entire systems and workflows for energy efficiency, considering factors such as data storage, processing power, and network usage [38]. Creating algorithms that can dynamically adjust their energy usage depending on workload demands will be key to reducing the overall energy footprint of cloud systems.

Moreover, there is a growing need for the standardization of energy-efficiency metrics and benchmarking frameworks for cloud systems. As the cloud computing industry grows, having standardized ways to measure and report energy consumption will help cloud providers compare their performance and track progress toward more sustainable practices [39]. These standards can also facilitate certification programs for green cloud providers, helping customers make informed decisions about their energy use. Developing metrics to measure energy efficiency across multiple cloud environments will allow for better tracking of environmental impact and provide a clearer picture of the success of green software engineering initiatives.
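
One candidate for such a standardized measure is simply energy consumed per unit of useful work, for example kilowatt-hours per million requests served. The snippet below shows the arithmetic with hypothetical meter and traffic figures; it is offered as an illustration, not as an established industry benchmark.

```python
# Hypothetical efficiency metric: kWh of energy per million requests served.
# The meter reading and request count are made-up figures for illustration;
# this is not an established industry benchmark.

def kwh_per_million_requests(energy_kwh: float, requests_served: int) -> float:
    return energy_kwh / (requests_served / 1_000_000)

print(kwh_per_million_requests(energy_kwh=1_250.0, requests_served=40_000_000))   # 31.25
```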

Finally, as cloud providers and enterprises look to improve sustainability, research into Sustainable Development Goals (SDGs) and their intersection with green software engineering will become more important. There is increasing pressure for businesses to align their operations with the UN SDGs, particularly goals related to affordable and clean energy and climate action. By adopting green software engineering practices, cloud providers can contribute to achieving these goals while also reaping the benefits of cost reduction and environmental responsibility.

8. Conclusions

In conclusion, green software engineering is a critical component of building sustainable and energy-efficient cloud infrastructures. As the demand for cloud services continues to grow, it is essential to adopt practices that optimize energy consumption without compromising performance or user experience. Through the optimization of algorithms, resource scaling, and the integration of low-power hardware, cloud providers can reduce their energy consumption and contribute to a more sustainable future.

However, the adoption of green software engineering faces several challenges, including the need for greater awareness, expertise, and the integration of energy-efficient practices into existing cloud infrastructures. Overcoming these challenges requires collaboration between software developers, cloud providers, and researchers to develop new technologies and strategies for energy-efficient cloud systems.

Looking ahead, the future of green software engineering is promising, with advancements in AI-driven optimization, edge computing, and the development of energy-efficient algorithms playing a key role in making cloud systems more sustainable. As organizations and providers continue to prioritize energy efficiency and eco-friendly software design, the cloud computing industry will play a significant role in achieving global sustainability goals and reducing the environmental impact of technology.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Achar, S. (2022) Cloud Computing: Toward Sustainable Processes and Better Environmental Impact. Journal of Computer Hardware Engineering (JCHE), 1, 1-9.
[2] Katal, A., Dahiya, S. and Choudhury, T. (2022) Energy Efficiency in Cloud Computing Data Centers: A Survey on Software Technologies. Cluster Computing, 26, 1845-1875.
https://doi.org/10.1007/s10586-022-03713-0
[3] Bharany, S., Sharma, S., Khalaf, O.I., Abdulsahib, G.M., Al Humaimeedy, A.S., Aldhyani, T.H.H., et al. (2022) A Systematic Survey on Energy-Efficient Techniques in Sustainable Cloud Computing. Sustainability, 14, Article 6256.
https://doi.org/10.3390/su14106256
[4] Matthew, U.O., Asuni, O. and Fatai, L.O. (2024) Green Software Engineering Development Paradigm: An Approach to a Sustainable Renewable Energy Future. In: Sharma, A.K., Chanderwal, N., Prajapati, A., Singh, P. and Kansal, M., Eds., Advancing Software Engineering Through AI, Federated Learning, and Large Language Models, IGI Global, 281-294.
https://doi.org/10.4018/979-8-3693-3502-4.ch018
[5] Atadoga, A., Umoga, U.J., Lottu, O.A. and Sodiy, E.O. (2024) Tools, Techniques, and Trends in Sustainable Software Engineering: A Critical Review of Current Practices and Future Directions. World Journal of Advanced Engineering Technology and Sciences, 11, 231-239.
https://doi.org/10.30574/wjaets.2024.11.1.0051
[6] Singh, M., Tuli, S., Butcher, R.J., Kaur, R. and Gill, S.S. (2021) Dynamic Shift from Cloud Computing to Industry 4.0: Eco-Friendly Choice or Climate Change Threat. In: Krause, P. and Xhafa, F., Eds., IoT-Based Intelligent Modelling for Environmental and Ecological Engineering, Springer International Publishing, 275-293.
https://doi.org/10.1007/978-3-030-71172-6_12
[7] Aldossary, M. (2021) A Review of Dynamic Resource Management in Cloud Computing Environments. Computer Systems Science and Engineering, 36, 461-476.
https://doi.org/10.32604/csse.2021.014975
[8] Ahvar, E., Orgerie, A. and Lebre, A. (2022) Estimating Energy Consumption of Cloud, Fog, and Edge Computing Infrastructures. IEEE Transactions on Sustainable Computing, 7, 277-288.
https://doi.org/10.1109/tsusc.2019.2905900
[9] Uddin, M., Hamdi, M., Alghamdi, A., Alrizq, M., Memon, M.S., Abdelhaq, M., et al. (2021) Server Consolidation: A Technique to Enhance Cloud Data Center Power Efficiency and Overall Cost of Ownership. International Journal of Distributed Sensor Networks, 17, 1-13.
https://doi.org/10.1177/1550147721997218
[10] Vasireddy, I., Ramya, G. and Kandi, P. (2023) Kubernetes and Docker Load Balancing: State-of-the-Art Techniques and Challenges. International Journal of Innovative Research in Engineering and Management, 10, 49-54.
https://doi.org/10.55524/ijirem.2023.10.6.7
[11] Qureshi, M.S., Qureshi, M.B., Fayaz, M., Zakarya, M., Aslam, S. and Shah, A. (2020) Time and Cost Efficient Cloud Resource Allocation for Real-Time Data-Intensive Smart Systems. Energies, 13, Article 5706.
https://doi.org/10.3390/en13215706
[12] Aslanpour, M.S., Toosi, A.N., Cheema, M.A. and Gaire, R. (2022) Energy-Aware Resource Scheduling for Serverless Edge Computing. 2022 22nd IEEE International Symposium on Cluster, Cloud and Internet Computing (CCGrid), Taormina, 16-19 May 2022, 190-199.
https://doi.org/10.1109/ccgrid54584.2022.00028
[13] Pei, P., Huo, Z., Martínez, O.S. and Crespo, R.G. (2020) Minimal Green Energy Consumption and Workload Management for Data Centers on Smart City Platforms. Sustainability, 12, Article 3140.
https://doi.org/10.3390/su12083140
[14] Katal, A., Dahiya, S. and Choudhury, T. (2021) Energy Efficiency in Cloud Computing Data Center: A Survey on Hardware Technologies. Cluster Computing, 25, 675-705.
https://doi.org/10.1007/s10586-021-03431-z
[15] Riggs, H., Tufail, S., Parvez, I. and Sarwat, A. (2020) Survey of Solid State Drives, Characteristics, Technology, and Applications. 2020 SoutheastCon, Raleigh, 28-29 March 2020, 1-6.
https://doi.org/10.1109/southeastcon44009.2020.9249760
[16] Buyya, R., Ilager, S. and Arroba, P. (2023) Energy-Efficiency and Sustainability in New Generation Cloud Computing: A Vision and Directions for Integrated Management of Data Centre Resources and Workloads. Software: Practice and Experience, 54, 24-38.
https://doi.org/10.1002/spe.3248
[17] Pasham, S.D. (2018) Dynamic Resource Provisioning in Cloud Environments Using Predictive Analytics. International Journal of Engineering and Computer Science, 7, 23517-23520.
https://doi.org/10.18535/ijecs/v7i1.09
[18] Gureya, D.D. (2021) Resource Allocation for Data-Intensive Services in the Cloud. Master’s Thesis, KTH Royal Institute of Technology.
[19] Singh, M., Bhardwaj, P., Bhardwaj, R. and Narayan, S. (2024) Advancing Scalability and Efficiency in Distributed Network Computing through Innovative Resource Allocation and Load Balancing Strategies. In: Kahraman, C., Cevik Onar, S., Cebi, S., Oztaysi, B., Tolga, A.C. and Ucal Sari, I., Eds., Intelligent and Fuzzy Systems, Springer Nature Switzerland, 722-740.
https://doi.org/10.1007/978-3-031-67195-1_80
[20] Aniekan, A.U., Peter, E.O., Onyinyechukwu, C., Bright, N., Adetomilola, V.F. and Kenneth, I.I. (2024) Incorporating Energy Efficiency in Urban Planning: A Review of Policies and Best Practices. Engineering Science & Technology Journal, 5, 83-98.
[21] Medara, R. and Singh, R.S. (2022) A Review on Energy-Aware Scheduling Techniques for Workflows in IaaS Clouds. Wireless Personal Communications, 125, 1545-1584.
https://doi.org/10.1007/s11277-022-09621-1
[22] Peng, Z., Barzegar, B., Yarahmadi, M., Motameni, H. and Pirouzmand, P. (2020) Energy-Aware Scheduling of Workflow Using a Heuristic Method on Green Cloud. Scientific Programming, 2020, Article ID: 8898059.
https://doi.org/10.1155/2020/8898059
[23] Saxena, D., Singh, A.K., Lee, C. and Buyya, R. (2023) A Sustainable and Secure Load Management Model for Green Cloud Data Centres. Scientific Reports, 13, Article No. 491.
https://doi.org/10.1038/s41598-023-27703-3
[24] Ganesan, M., Kor, A., Pattinson, C. and Rondeau, E. (2020) Green Cloud Software Engineering for Big Data Processing. Sustainability, 12, Article 9255.
https://doi.org/10.3390/su12219255
[25] Kumar, S. and Buyya, R. (2012) Green Cloud Computing and Environmental Sustainability. In: Murugesan, S. and Gangadharan, G.R., Eds., Harnessing Green IT: Principles and Practices, Wiley, 315-339.
https://doi.org/10.1002/9781118305393.ch16
[26] Jain, A., Mishra, M., Peddoju, S.K. and Jain, N. (2013) Energy Efficient Computing-Green Cloud Computing. 2013 International Conference on Energy Efficient Technologies for Sustainability, Nagercoil, 10-12 April 2013, 978-982.
https://doi.org/10.1109/iceets.2013.6533519
[27] Assefa, B.G. and Özkasap, Ö. (2019) A Survey of Energy Efficiency in SDN: Software-Based Methods and Optimization Models. Journal of Network and Computer Applications, 137, 127-143.
https://doi.org/10.1016/j.jnca.2019.04.001
[28] Calero, C. and Piattini, M. (2015) Introduction to Green in Software Engineering. In: Calero, C. and Piattini, M., Eds., Green in Software Engineering, Springer, 3-27.
https://doi.org/10.1007/978-3-319-08581-4_1
[29] Pazienza, A., Baselli, G., Vinci, D.C. and Trussoni, M.V. (2024) A Holistic Approach to Environmentally Sustainable Computing. Innovations in Systems and Software Engineering, 20, 347-371.
https://doi.org/10.1007/s11334-023-00548-9
[30] Ahmad Ibrahim, S.R., Yahaya, J. and Sallehudin, H. (2022) Green Software Process Factors: A Qualitative Study. Sustainability, 14, Article 11180.
https://doi.org/10.3390/su141811180
[31] Hamzaoui, I., Duthil, B., Courboulay, V. and Medromi, H. (2020) A Survey on the Current Challenges of Energy-Efficient Cloud Resources Management. SN Computer Science, 1, Article No. 73.
https://doi.org/10.1007/s42979-020-0078-9
[32] Ghahramani, M.H., Zhou, M. and Hon, C.T. (2017) Toward Cloud Computing QoS Architecture: Analysis of Cloud Systems and Cloud Services. IEEE/CAA Journal of Automatica Sinica, 4, 6-18.
https://doi.org/10.1109/jas.2017.7510313
[33] Brooks, N., Vance, C. and Ames, D. (2025) Cloud Computing: A Review of Evolution, Challenges, and Emerging Trends. Journal of Computer Science and Software Applications, 5, 1-17.
[34] Jing, S., Ali, S., She, K. and Zhong, Y. (2011) State-of-the-Art Research Study for Green Cloud Computing. The Journal of Supercomputing, 65, 445-468.
https://doi.org/10.1007/s11227-011-0722-1
[35] Babu, C.V.S., M., S.S. and Rufus, S. (2025) Green Software Development: Integrating AI for Energy Efficiency. In: Suresh Babu, C.V., Ed., Sustainable Information Security in the Age of AI and Green Computing, IGI Global, 157-174.
https://doi.org/10.4018/979-8-3693-8034-5.ch008
[36] Athavale, J., Yoda, M. and Joshi, Y. (2021) Genetic Algorithm Based Cooling Energy Optimization of Data Centers. International Journal of Numerical Methods for Heat & Fluid Flow, 31, 3148-3168.
https://doi.org/10.1108/hff-01-2020-0036
[37] Caiazza, C., Giordano, S., Luconi, V. and Vecchio, A. (2022) Edge Computing vs Centralized Cloud: Impact of Communication Latency on the Energy Consumption of LTE Terminal Nodes. Computer Communications, 194, 213-225.
https://doi.org/10.1016/j.comcom.2022.07.026
[38] Khattar, N., Sidhu, J. and Singh, J. (2019) Toward Energy-Efficient Cloud Computing: A Survey of Dynamic Power Management and Heuristics-Based Optimization Techniques. The Journal of Supercomputing, 75, 4750-4810.
https://doi.org/10.1007/s11227-019-02764-2
[39] Veselova, V. (2023) Data Center Sustainability Reporting: Advancement of Assessment Methodology for Energy Consumption by Virtualized Resources in Data Centers.
https://www.theseus.fi/handle/10024/802361

Copyright © 2025 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.