1
Khatibi SMH, Ali J. Harnessing the power of machine learning for crop improvement and sustainable production. Front Plant Sci 2024; 15:1417912. PMID: 39188546; PMCID: PMC11346375; DOI: 10.3389/fpls.2024.1417912. Received 04/15/2024; Accepted 07/15/2024; Indexed 08/28/2024.
Abstract
Crop improvement and production domains generate large and continually expanding data with multi-layer complexity, which forces researchers to use machine-learning approaches to establish predictive and informative models of the sophisticated mechanisms underlying these processes. All machine-learning approaches aim to fit models to target data; nevertheless, the wide range of specialized methods may initially appear confusing. The principal objective of this study is to offer researchers an explicit introduction to essential machine-learning approaches and their applications, comprising modern methods that have gained widespread adoption in crop improvement and similar domains. This article explains how different machine-learning methods can be applied to given agricultural data, highlights newly emerging techniques for machine-learning users, and lays out technical strategies for agricultural and crop research practitioners.
Affiliation(s)
- Jauhar Ali
- Rice Breeding Platform, International Rice Research Institute, Los Baños, Laguna, Philippines
2
Isinkaye FO, Olusanya MO, Singh PK. Deep learning and content-based filtering techniques for improving plant disease identification and treatment recommendations: A comprehensive review. Heliyon 2024; 10:e29583. PMID: 38737274; PMCID: PMC11088271; DOI: 10.1016/j.heliyon.2024.e29583. Received 09/14/2023; Revised 03/30/2024; Accepted 04/10/2024; Indexed 05/14/2024. Open access.
Abstract
The importance of identifying plant diseases has risen recently because of their adverse effect on agricultural production. Plant diseases are a major concern in agriculture: they reduce crop production and constitute a serious threat to global food security. In modern agriculture, effective plant disease management is vital to ensure healthy crop yields and sustainable practices. Traditional means of identifying plant disease face many challenges, and the need for better and more efficient detection methods cannot be overemphasized. Advanced technologies, particularly deep learning and content-based filtering techniques, could, if integrated, change the way plant diseases are identified and treated: speedy and correct identification of plant diseases and efficient treatment recommendations are key to sustainable food production. In this work, we investigate the current state of research, identify gaps and limitations in knowledge, and suggest future directions for researchers, experts, and farmers that could help provide better ways of mitigating plant disease problems.
Affiliation(s)
- Folasade Olubusola Isinkaye
- Department of Computer Science and Information Technology, Sol Plaatje University Kimberley, 8301, South Africa
- Michael Olusoji Olusanya
- Department of Computer Science and Information Technology, Sol Plaatje University Kimberley, 8301, South Africa
- Pramod Kumar Singh
- Department of Computer Science and Engineering, ABV-Indian Institute of Information Technology and Management Gwalior, Gwalior, 474015, MP, India
3
Zhao R, Wang X, Wei Y, He X, Xu H. Machine Learning Applied to Electron Beam Lithography to Accelerate Process Optimization of a Contact Hole Layer. ACS Appl Mater Interfaces 2024; 16:22465-22470. PMID: 38626412; DOI: 10.1021/acsami.3c18889. Indexed 04/18/2024.
Abstract
Determining lithographic process conditions that yield high-resolution patterning plays a crucial role in accelerating chip manufacturing. However, lithography imaging is an extremely complex nonlinear system, and obtaining suitable process conditions requires extensive experimental attempts, which creates a severe bottleneck in optimizing and controlling lithographic process conditions. Herein, we report a process optimization solution for a contact layer of metal oxide nanoparticle photoresists that combines electron beam lithography (EBL) experiments with machine learning. In this solution, a long short-term memory (LSTM) network and a support vector machine (SVM) model are used to establish the contact hole imaging model and the process condition classification model, respectively. By combining the SVM with the LSTM network, process conditions that simultaneously satisfy the requirements on contact hole width and local critical dimension uniformity tolerance can be screened. The verification results demonstrate that the horizontal and vertical contact widths predicted by the LSTM network are highly consistent with the EBL experimental results, and the classification model shows good accuracy, providing a reference for process optimization of a contact layer.
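The predict-then-screen pattern described in this abstract can be sketched in a few lines. The toy `predict_cd` function below stands in for the paper's LSTM imaging model, and the threshold screen stands in for its SVM classifier; all function names, the smooth toy CD formula, and the tolerance values are illustrative assumptions, not the paper's.

```python
# Sketch: a regressor predicts contact-hole widths for candidate process
# conditions; a classifier-style screen keeps only conditions within the
# width and uniformity tolerances. Values are illustrative.

def predict_cd(dose, focus):
    """Toy stand-in for the LSTM imaging model: returns (horizontal,
    vertical) contact widths in nm as a smooth function of dose/focus."""
    h = 30.0 + 0.5 * (dose - 20.0) - 2.0 * focus * focus
    v = 30.0 + 0.4 * (dose - 20.0) - 1.5 * focus * focus
    return h, v

def passes_screen(h, v, target=30.0, width_tol=0.6, lcdu_tol=0.5):
    """Stand-in for the SVM classifier: accept a condition only if both
    widths are close to target and their mismatch (a crude proxy for local
    CD uniformity) is small enough."""
    return (abs(h - target) <= width_tol
            and abs(v - target) <= width_tol
            and abs(h - v) <= lcdu_tol)

def screen_conditions(candidates):
    """Return the candidate (dose, focus) pairs satisfying both tolerances."""
    kept = []
    for dose, focus in candidates:
        h, v = predict_cd(dose, focus)
        if passes_screen(h, v):
            kept.append((dose, focus))
    return kept

candidates = [(dose, focus) for dose in (18.0, 20.0, 22.0)
              for focus in (-0.5, 0.0, 0.5)]
good = screen_conditions(candidates)
```

The value of the approach is that the expensive EBL experiments are replaced by cheap model evaluations inside the screening loop.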
Affiliation(s)
- Rongbo Zhao
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing 100084, China
- Xiaolin Wang
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing 100084, China
- Yayi Wei
- Institute of Microelectronics of Chinese Academy of Sciences, Beijing 100029, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Xiangming He
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing 100084, China
- Hong Xu
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing 100084, China
4
Alazemi T, Darwish M, Radi M. Renewable energy sources integration via machine learning modelling: A systematic literature review. Heliyon 2024; 10:e26088. PMID: 38404865; PMCID: PMC10884864; DOI: 10.1016/j.heliyon.2024.e26088. Received 04/30/2022; Revised 01/25/2024; Accepted 02/07/2024; Indexed 02/27/2024. Open access.
Abstract
The use of renewable energy sources (RESs) at the distribution level has become increasingly appealing in terms of costs and technology, with a massive diffusion expected in the near future that poses several challenges to the power grid. Since RESs depend on stochastic energy sources (solar radiation, temperature, and wind speed, among others), they introduce a high level of uncertainty to the grid, leading to power imbalance and deteriorating network stability. In this scenario, managing and forecasting RES uncertainty is vital to successfully integrating them into power grids. Traditionally, physical- and statistical-based models have been used to predict RES power outputs. Nevertheless, the former are computationally expensive, since they rely on solving complex mathematical models of atmospheric dynamics, whereas the latter usually assume linear models, preventing them from addressing challenging forecasting scenarios. In recent years, advances in machine learning techniques, which can learn from historical data and allow the analysis of large-scale datasets with non-uniform characteristics or noisy data, have provided researchers with powerful data-driven tools that can outperform traditional methods. In this paper, a systematic literature review is conducted to identify the most widely used machine learning-based approaches for forecasting RES power outputs. The results show that the best-suited approaches are deep artificial neural networks, especially long short-term memory networks, which can accurately model the autoregressive nature of RES power output, and ensemble strategies, which successfully handle large amounts of highly fluctuating data. In addition, the most promising results of integrating the forecasted output into decision-making problems, such as unit commitment, to address economic, operational, and managerial grid challenges are discussed, and solid directions for future research are provided.
Affiliation(s)
- Talal Alazemi
- Brunel University London Kingston Lane Uxbridge, Middlesex, UB8 3PH, United Kingdom
- Mohamed Darwish
- Brunel University London Kingston Lane Uxbridge, Middlesex, UB8 3PH, United Kingdom
- Mohammed Radi
- UK Power Networks, Pocock House, 237 Southwark Bridge Rd, London, SE1 6NP, United Kingdom
5
Zhao R, Wang X, Xu H, Wei Y, He X. Machine learning in electron beam lithography to boost photoresist formulation design for high-resolution patterning. Nanoscale 2024; 16:4212-4218. PMID: 38328883; DOI: 10.1039/d3nr04819e. Indexed 02/09/2024.
Abstract
The reduction of the critical dimension (CD) usually improves pattern resolution and chip performance. In chip manufacturing, electron beam lithography (EBL) is a promising technology for preparing sub-10 nm patterns, and its imaging resolution is primarily determined by the photoresist formulation. However, smaller CDs are mainly pursued by optimizing process conditions, and little attention has been paid to photoresist formulation optimization. Screening suitable photoresist formulations remains a significant challenge because of the considerable time and high cost involved. Herein, we report a formulation optimization technique for a metal oxide nanoparticle photoresist that combines EBL experiments with a machine learning long short-term memory (LSTM) network. Using the LSTM network, a CD photoresist evaluation model is established; leveraging this CD model, a photoresist formulation optimizer targeting a line width of 26 nm is developed. The verification results demonstrate that the CDs predicted by the LSTM network are basically consistent with the EBL experimental results, and photoresist formulations that meet the CD requirements can be screened. This work opens up a novel perspective on boosting photoresist formulation design for high-resolution patterning with artificial intelligence and provides guidance for EBL experiments.
Affiliation(s)
- Rongbo Zhao
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing, 100084, China.
- Xiaolin Wang
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing, 100084, China
- Hong Xu
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing, 100084, China
- Yayi Wei
- Institute of Microelectronics of Chinese Academy of Sciences, Beijing 100029, China
- University of Chinese Academy of Sciences, Beijing 100049, China
- Xiangming He
- Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing, 100084, China
6
Xu X, Wei A, Tang S, Liu Q, Shi H, Sun W. Prediction of nitrous oxide emission of a municipal wastewater treatment plant using LSTM-based deep learning models. Environ Sci Pollut Res Int 2024; 31:2167-2186. PMID: 38055175; DOI: 10.1007/s11356-023-31250-9. Received 10/04/2023; Accepted 11/22/2023; Indexed 12/07/2023.
Abstract
Accurate assessment of greenhouse gas emissions from wastewater treatment plants (WWTPs) is crucial for mitigating climate change. N2O is a potent greenhouse gas emitted from WWTPs during the biological denitrification process. In this study, we developed and evaluated deep learning models for predicting N2O emissions from a WWTP in Switzerland. Six key parameters were selected, and the optimal long short-term memory (LSTM) model was obtained by adjusting experimental parameter conditions. The best configuration used 150 neurons, the tanh activation function, the RMSprop optimization algorithm, a learning rate of 0.001, no dropout regularization, and a batch size of 128. Under the same conditions, we compared the performance of recurrent neural networks (RNNs) and LSTM networks and found that LSTM models outperformed RNN models in predicting N2O emissions. The optimal LSTM model achieved a 36% improvement in mean absolute error (MAE), a 19% improvement in root mean squared error (RMSE), and a 6.92% improvement in R2 score compared to the RNN model. Additionally, LSTM models demonstrated better resilience to sudden changes in the target sequence, explaining 9.54% more variance than the RNNs. These results highlight the potential of LSTM models for accurate and robust prediction of N2O emissions from wastewater treatment plants, contributing to effective greenhouse gas mitigation strategies.
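Sequence models like the LSTMs and RNNs compared here are typically trained on sliding windows of the sensor record: each sample is the previous `window` time steps, and the target is the next value. A minimal sketch of that preprocessing step follows; the window length and toy series are illustrative, not the paper's settings.

```python
# Sliding-window construction for sequence-model training: for a series
# s, produce pairs (s[i:i+window], s[i+window]).

def make_windows(series, window):
    """Split a univariate series into (inputs, target) pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])   # the `window` previous steps
        y.append(series[i + window])     # the value to predict
    return X, y

# Toy N2O-like series (arbitrary units) and a 4-step window.
series = [0.1, 0.2, 0.4, 0.3, 0.5, 0.7, 0.6]
X, y = make_windows(series, window=4)
```

In the multivariate case each window would stack the six selected input parameters per time step, but the indexing logic is the same.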
Affiliation(s)
- Xiaozhen Xu
- Shaanxi Key Laboratory of Earth Surface System and Environmental Carrying Capacity, College of Urban and Environmental Sciences, Northwest University, Xi'an, 710127, Shaanxi, China
- Anlei Wei
- Shaanxi Key Laboratory of Earth Surface System and Environmental Carrying Capacity, College of Urban and Environmental Sciences, Northwest University, Xi'an, 710127, Shaanxi, China
- Songjun Tang
- Shaanxi Key Laboratory of Earth Surface System and Environmental Carrying Capacity, College of Urban and Environmental Sciences, Northwest University, Xi'an, 710127, Shaanxi, China
- Qi Liu
- Shaanxi Key Laboratory of Earth Surface System and Environmental Carrying Capacity, College of Urban and Environmental Sciences, Northwest University, Xi'an, 710127, Shaanxi, China
- Hanxiao Shi
- Shaanxi Key Laboratory of Earth Surface System and Environmental Carrying Capacity, College of Urban and Environmental Sciences, Northwest University, Xi'an, 710127, Shaanxi, China
- Wei Sun
- School of Geography and Planning, Sun Yat-Sen University, Guangzhou, 510275, Guangdong, China
7
Ye W, Chen X, Li P, Tao Y, Wang Z, Gao C, Cheng J, Li F, Yi D, Wei Z, Yi D, Wu Y. OEDL: an optimized ensemble deep learning method for the prediction of acute ischemic stroke prognoses using union features. Front Neurol 2023; 14:1158555. PMID: 37416306; PMCID: PMC10321134; DOI: 10.3389/fneur.2023.1158555. Received 02/04/2023; Accepted 05/22/2023; Indexed 07/08/2023. Open access.
Abstract
Background: Early stroke prognosis assessments are critical for decision-making regarding therapeutic intervention. We introduced the concepts of data combination, method integration, and algorithm parallelization, aiming to build an integrated deep learning model based on a combination of clinical and radiomics features and to analyze its application value in prognosis prediction. Methods: The research steps include data sourcing and feature extraction, data processing and feature fusion, model building and optimization, and model training. Using data from 441 stroke patients, clinical and radiomics features were extracted and feature selection was performed. Clinical, radiomics, and combined features were included to construct predictive models. We applied the concept of deep integration to the joint analysis of multiple deep learning methods, used a metaheuristic algorithm to improve parameter search efficiency, and developed an acute ischemic stroke (AIS) prognosis prediction method, the optimized ensemble of deep learning (OEDL) method. Results: Among the clinical features, 17 passed the correlation check; among the radiomics features, 19 were selected. In the comparison of prediction performance across methods, the OEDL method, based on the concept of ensemble optimization, had the best classification performance. In the comparison across feature sets, including the combined features resulted in better classification performance than the clinical or radiomics features alone. In the comparison across balancing methods, SMOTEENN, which is based on hybrid sampling, achieved better classification performance than the unbalanced, oversampled, and undersampled alternatives. The OEDL method with combined features and mixed sampling achieved the best classification performance, with 97.89, 95.74, 94.75, 94.03, and 94.35% for Macro-AUC, ACC, Macro-R, Macro-P, and Macro-F1, respectively, surpassing methods in previous studies. Conclusion: The proposed OEDL approach effectively improves stroke prognosis prediction; modeling with combined data was significantly better than single clinical or radiomics feature models, and the method offers better intervention guidance value. Our approach is beneficial for optimizing the early clinical intervention process and providing the necessary clinical decision support for personalized treatment.
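SMOTEENN, the hybrid sampling method this abstract favors, combines SMOTE oversampling with Edited Nearest Neighbours cleaning. The core SMOTE step is interpolation between a minority sample and one of its minority-class neighbours. The toy version below picks the nearest neighbour deterministically and interpolates at a fixed fraction; real SMOTE randomizes both choices, so this is purely an illustration of the interpolation idea.

```python
# Simplified SMOTE step: one synthetic sample per minority point, placed
# on the segment toward its nearest minority-class neighbour.

def nearest_neighbour(idx, points):
    """Index of the closest other point by squared Euclidean distance."""
    px = points[idx]
    best, best_d = None, float("inf")
    for j, q in enumerate(points):
        if j == idx:
            continue
        d = sum((a - b) ** 2 for a, b in zip(px, q))
        if d < best_d:
            best, best_d = j, d
    return best

def smote_like(minority, frac=0.5):
    """Synthesize x + frac * (neighbour - x) for each minority point x."""
    synthetic = []
    for i, x in enumerate(minority):
        nb = minority[nearest_neighbour(i, minority)]
        synthetic.append(tuple(a + frac * (b - a) for a, b in zip(x, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new_samples = smote_like(minority)
```

The ENN half of SMOTEENN would then delete samples whose neighbourhood majority disagrees with their label, cleaning the boundary that oversampling blurred.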
Affiliation(s)
- Wei Ye
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Xicheng Chen
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Pengpeng Li
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Yongjun Tao
- Department of Neurology, Taizhou Municipal Hospital, Taizhou, Zhejiang, China
- Zhenyan Wang
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Chengcheng Gao
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Jian Cheng
- Department of Radiology, Taizhou Municipal Hospital, Taizhou, Zhejiang, China
- Fang Li
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Dali Yi
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Department of Health Education, College of Preventive Medicine, Army Medical University, Chongqing, China
- Zeliang Wei
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Dong Yi
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
- Yazhou Wu
- Department of Health Statistics, College of Preventive Medicine, Army Medical University, Chongqing, China
8
Pattusamy M, Kanth L. Classification of Tweets Into Facts and Opinions Using Recurrent Neural Networks. Int J Technol Hum Interact 2023. DOI: 10.4018/ijthi.319358. Indexed 03/12/2023.
Abstract
In the last few years, the number of people active on Twitter has grown consistently. In India, even government agencies have started using Twitter accounts, since they can reach a large number of people in a short span of time. Beyond the social media platforms, an enormous number of blogging applications have appeared, providing yet another platform for people to share their views. With all this, the authenticity of the content being generated suffers. The task at hand is therefore to assess the genuineness of that content. To this end, the authors examine various techniques for maximizing content authenticity and propose a long short-term memory (LSTM) model that classifies tweets posted on the Twitter platform into facts and opinions. In combination with manually engineered features and a bag-of-words model, the proposed model is able to classify the tweets efficiently.
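The bag-of-words representation mentioned here maps each tweet to a vector of token counts over a fixed vocabulary. A minimal vectorizer can be sketched as follows; the whitespace tokenization and example tweets are deliberately naive and only illustrative of the encoding that would be concatenated with the hand-crafted features.

```python
# Minimal bag-of-words: build a vocabulary from a corpus, then encode each
# tweet as a vector of per-token counts over that vocabulary.

def build_vocab(tweets):
    """Map each distinct lowercase token to a column index."""
    vocab = {}
    for tweet in tweets:
        for tok in tweet.lower().split():
            vocab.setdefault(tok, len(vocab))
    return vocab

def bag_of_words(tweet, vocab):
    """Count vector over the vocabulary; unknown tokens are ignored."""
    vec = [0] * len(vocab)
    for tok in tweet.lower().split():
        if tok in vocab:
            vec[vocab[tok]] += 1
    return vec

tweets = ["the sky is blue", "I think the sky looks great"]
vocab = build_vocab(tweets)
features = [bag_of_words(t, vocab) for t in tweets]
```

A production pipeline would add stop-word filtering, sub-word handling for hashtags and mentions, and a cap on vocabulary size.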
9
Zheng Y, Xu Z, Xiao A. Deep learning in economics: a systematic and critical review. Artif Intell Rev 2023; 56:1-43. PMID: 36777109; PMCID: PMC9898707; DOI: 10.1007/s10462-022-10272-8. Indexed 02/06/2023.
Abstract
Viewed historically, the methodology of economics has developed from qualitative to quantitative, and from small data samples to vast amounts of data. Because of their superiority in learning inherent laws and in representational power, deep learning models help realize intelligent decision-making in economics. After presenting some statistical results of relevant research, this paper systematically investigates deep learning in economics, including a survey of frequently used deep learning models and several of their applications in economics. Critical reviews of deep learning in economics are then provided, covering models and applications, why and how to implement deep learning in economics, research gaps, and future challenges. Several deep learning models and their variants have been widely applied in different subfields of economics, e.g., financial economics, macroeconomics and monetary economics, agricultural and natural resource economics, industrial organization, urban, rural, regional, real estate and transportation economics, health, education and welfare, business administration, and microeconomics. We are confident that decision-making in economics will become more intelligent with the development of deep learning, as deep learning in economics has recently become an important and active research topic.
Affiliation(s)
- Yuanhang Zheng
- College of Computer Science, Sichuan University, 610064 Chengdu, PR China
- Zeshui Xu
- Business School, Sichuan University, 610064 Chengdu, PR China
- Anran Xiao
- Business School, Sichuan University, 610064 Chengdu, PR China
10
de Andrade CHT, de Melo GCG, Vieira TF, de Araújo ÍBQ, de Medeiros Martins A, Torres IC, Brito DB, Santos AKX. How Does Neural Network Model Capacity Affect Photovoltaic Power Prediction? A Study Case. Sensors (Basel) 2023; 23:1357. PMID: 36772397; PMCID: PMC9920211; DOI: 10.3390/s23031357. Received 12/01/2022; Revised 01/07/2023; Accepted 01/10/2023; Indexed 06/18/2023.
Abstract
The use of models capable of forecasting photovoltaic (PV) energy production is essential to guarantee the best possible integration of this energy source into traditional distribution grids. Long short-term memory networks (LSTMs) are commonly used for this purpose, but they may not be the best option because of their great computational complexity and slower inference and training times. Thus, in this work, we evaluate multilayer perceptrons (MLPs), recurrent neural networks (RNNs), and LSTMs for forecasting the next 5 min of photovoltaic energy production. Each prediction uses the last 120 min of data collected from the PV system (power, irradiation, and PV cell temperature), measured from 2019 to mid-2022 in Maceió (Brazil). In addition, Bayesian hyperparameter optimization was used to obtain the best version of each model and compare them on an equal footing. Results showed that the MLP performs satisfactorily while requiring much less time to train and forecast, indicating that it can be the better option for very short-term forecasting in specific contexts, for example, in systems with little computational resources.
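The comparison protocol here reduces to: tune each architecture, then score all of them on the same held-out data with the same error metric. The skeleton of that protocol can be sketched with stand-in "models" (two trivial baseline forecasters); a real study would drop tuned MLP/RNN/LSTM predictors into the `models` dict. All names and the toy data are illustrative.

```python
# Equal-footing model comparison: same history, same held-out values,
# same metric (RMSE); the model with the lowest error wins.

def rmse(y_true, y_pred):
    return (sum((a - b) ** 2 for a, b in zip(y_true, y_pred))
            / len(y_true)) ** 0.5

def persistence_forecast(history, horizon):
    """Baseline: repeat the last observed value."""
    return [history[-1]] * horizon

def mean_forecast(history, horizon):
    """Baseline: repeat the mean of the history."""
    m = sum(history) / len(history)
    return [m] * horizon

def compare(models, history, actual):
    """Score every model on the same held-out values."""
    scores = {name: rmse(actual, f(history, len(actual)))
              for name, f in models.items()}
    return min(scores, key=scores.get), scores

history = [1.0, 1.2, 1.1, 1.3]   # toy PV power series
actual = [1.3, 1.3]              # held-out "future" values
models = {"persistence": persistence_forecast, "mean": mean_forecast}
best, scores = compare(models, history, actual)
```

Persistence is also the standard baseline any PV forecaster must beat at very short horizons, which is why fair, shared-split evaluation matters.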
Affiliation(s)
- Tiago Figueiredo Vieira
- Center of Agrarian Sciences, Engineering and Agricultural Sciences Campus, Federal University of Alagoas—UFAL, Rio Largo 57100-000, Brazil
- Allan de Medeiros Martins
- Electrical Engineering Department, Center of Technology, Federal University of Rio Grande do Norte—UFRN, Natal 59072-970, Brazil
- Igor Cavalcante Torres
- Center of Agrarian Sciences, Engineering and Agricultural Sciences Campus, Federal University of Alagoas—UFAL, Rio Largo 57100-000, Brazil
- Davi Bibiano Brito
- Computing Institute, A. C. Simões Campus, Federal University of Alagoas—UFAL, Maceió 57072-970, Brazil
- Alana Kelly Xavier Santos
- Center of Agrarian Sciences, Engineering and Agricultural Sciences Campus, Federal University of Alagoas—UFAL, Rio Largo 57100-000, Brazil
11
Zhang X, Kim T. A hybrid attention and time series network for enterprise sales forecasting under digital management and edge computing. J Cloud Comput 2023. DOI: 10.1186/s13677-023-00390-1. Indexed 01/23/2023.
Abstract
Enterprises face both new opportunities and new challenges as a result of the rapid advances in information technology that have accompanied economic globalization. With the growth of Internet of Things devices, data sizes have increased significantly, and the traditional cloud platform has been enriched with edge computing so that huge volumes of data can be processed where they are collected. Businesses must therefore adapt to new scale requirements and rising standards for technical content. Forecasting corporate sales has emerged as a hot topic in the field of digital management: time series forecasting is of great importance and value for directing the future production and survival of enterprises, because it makes use of existing data to obtain the best possible predictions. This work combines enterprise sales forecasting, viewed from the perspective of digital management, with neural networks, and proposes the HATT-MSCNN-IBiLSTM model for enterprise sales forecasting. First, it combines a multi-scale CNN (MSCNN) with an improved BiLSTM (IBiLSTM). The MSCNN extracts spatial features at different scales but cannot effectively capture the regularities of time series features, whereas processing time series data is the strength of the LSTM network; moreover, the IBiLSTM explores time series features in both directions, so more useful information can be obtained. The resulting MSCNN-IBiLSTM model plays to the strengths of both components in their respective domains. Second, this work proposes a hybrid attention mechanism that combines self-attention, channel attention, and spatial attention; it enhances the features extracted by MSCNN-IBiLSTM to build the HATT-MSCNN-IBiLSTM network, which extracts more discriminative features. Third, comprehensive and systematic experiments on HATT-MSCNN-IBiLSTM verify the feasibility of the proposed method. The model is implemented on an edge computing platform, which increases training speed and improves response time.
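The multi-scale CNN idea above amounts to running several convolution kernels of different sizes in parallel and concatenating the resulting feature maps. A toy 1-D version with fixed averaging kernels shows the shape of that computation; the kernel sizes and weights here are illustrative stand-ins for the learned filters, not the paper's.

```python
# Multi-scale 1-D convolution sketch: one kernel per scale, feature maps
# concatenated. Averaging kernels stand in for learned weights.

def conv1d_valid(x, kernel):
    """Valid (no padding) 1-D correlation of a sequence with a kernel."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def multi_scale_features(x, kernel_sizes=(2, 3)):
    """Run one averaging kernel per scale and concatenate the outputs."""
    feats = []
    for k in kernel_sizes:
        kernel = [1.0 / k] * k          # illustrative fixed weights
        feats.extend(conv1d_valid(x, kernel))
    return feats

x = [1.0, 2.0, 3.0, 4.0]               # toy sales-like sequence
feats = multi_scale_features(x)
```

In the full model these concatenated multi-scale features would feed the bidirectional LSTM rather than being used directly.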
12
Alharkan H, Habib S, Islam M. Solar Power Prediction Using Dual Stream CNN-LSTM Architecture. Sensors (Basel) 2023; 23:945. PMID: 36679739; PMCID: PMC9864442; DOI: 10.3390/s23020945. Received 11/28/2022; Revised 01/04/2023; Accepted 01/06/2023; Indexed 06/17/2023.
Abstract
The integration of solar energy into a power system brings great economic and environmental benefits. However, high penetration of solar power challenges the operation and planning of the existing power system owing to the intermittency and randomness of solar power generation. Accurate predictions of power generation are important to provide high-quality electric energy to end-users. Therefore, in this paper, we introduce a deep learning-based dual-stream network (DSCLANet) composed of a convolutional neural network (CNN) and a long short-term memory (LSTM) network followed by a self-attention mechanism. Here, the CNN is used to learn spatial patterns and the LSTM is incorporated for temporal feature extraction. The output spatial and temporal feature vectors are then fused, followed by a self-attention mechanism to select optimal features for further processing. Finally, fully connected layers are incorporated for short-term solar power prediction. The performance of DSCLANet is evaluated on the DKASC Alice Springs solar datasets, and it reduces the error rate to 0.0136 MSE, 0.0304 MAE, and 0.0458 RMSE compared to recent state-of-the-art methods.
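The self-attention step applied to the fused feature vectors can be sketched as scaled dot-product attention, here with the identity as a stand-in for the learned query/key/value projections (a real model would learn those projections, so this is only the attention arithmetic, not DSCLANet itself).

```python
# Scaled dot-product self-attention over a sequence of feature vectors:
# scores = Q K^T / sqrt(dim), weights = softmax(scores), out = weights V,
# with Q = K = V = x (identity projections as a simplifying assumption).
import math

def softmax(row):
    m = max(row)                        # subtract max for stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(x):
    """x: list of feature vectors (seq_len x dim). Returns attended
    vectors, each a softmax-weighted mix of all positions."""
    dim = len(x[0])
    scale = math.sqrt(dim)
    scores = [[sum(a * b for a, b in zip(q, k)) / scale for k in x]
              for q in x]
    weights = [softmax(row) for row in scores]
    return [[sum(w * v[d] for w, v in zip(row, x)) for d in range(dim)]
            for row in weights]

x = [[1.0, 0.0], [0.0, 1.0]]            # two fused feature vectors
out = self_attention(x)
```

Each output position thus re-weights the fused CNN and LSTM features by their pairwise similarity, which is how the mechanism "selects" informative features before the fully connected head.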
Affiliation(s)
- Hamad Alharkan
- Department of Electrical Engineering, Unaizah College of Engineering, Qassim University, Unaizah 56452, Saudi Arabia
- Shabana Habib
- Department of Information Technology, College of Computer, Qassim University, Buraydah 51452, Saudi Arabia
- Muhammad Islam
- Department of Electrical Engineering, College of Engineering and Information Technology, Onaizah Colleges, Onaizah 56447, Saudi Arabia
13
Wei X, Ouyang H, Liu M. Stock index trend prediction based on TabNet feature selection and long short-term memory. PLoS One 2022; 17:e0269195. PMID: 36512541; PMCID: PMC9746941; DOI: 10.1371/journal.pone.0269195. Received 10/15/2021; Accepted 05/17/2022; Indexed 12/15/2022. Open access.
Abstract
In this study, we propose TabLSTM, a predictive model that combines machine learning methods, namely TabNet and a long short-term memory neural network (LSTM), with a complete factor library for stock index trend prediction. Our motivation is the notion that there are numerous interrelated factors in the stock market and that the factors affecting each stock differ; therefore, a complete factor library and an efficient feature selection technique are necessary to predict a stock index. We first build a factor database that includes macro, micro, and technical indicators. We then calculate factor importance through TabNet and rank the factors; based on a prespecified threshold, the optimal factor set includes only the highest-ranked factors. Finally, using the optimal factor set as input, an LSTM is employed to predict the future trend of 4 stock indices. Empirical validation shows that the combination of TabNet for factor selection and LSTM outperforms existing methods, and that constructing a factor database is necessary for stock index prediction. Our method not only demonstrates the feasibility of predicting stock indices across different financial markets but also provides a complete factor database and a comprehensive architecture for stock index trend prediction, which may serve as a reference for stock forecasting and quantitative investment.
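The factor-selection step described here reduces to: score every factor, rank by importance, keep those at or above a prespecified threshold. A sketch with made-up importance scores follows; in the paper the scores come from TabNet, and the factor names and threshold below are purely illustrative.

```python
# Rank-and-threshold feature selection: keep factors whose importance
# meets the cutoff, in descending order of importance.

def select_factors(importance, threshold):
    """Return factor names with importance >= threshold, highest first."""
    ranked = sorted(importance.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, score in ranked if score >= threshold]

# Hypothetical importance scores (TabNet would produce these).
importance = {"pe_ratio": 0.30, "turnover": 0.25, "cpi": 0.05,
              "momentum_20d": 0.22, "rainfall": 0.01}
selected = select_factors(importance, threshold=0.10)
# `selected` would then feed the LSTM as its input feature set.
```

The threshold trades off noise reduction against information loss, which is why the paper treats it as a prespecified tuning choice.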
Affiliation(s)
- Xiaolu Wei, Business School, Hubei University, Wuhan, Hubei, China
- Hongbing Ouyang, Department of Economics, Huazhong University of Science and Technology, Wuhan, Hubei, China
- Muyan Liu, Business School, Sichuan University, Chengdu, Sichuan, China
14
Engel E, Engel N. A Review on Machine Learning Applications for Solar Plants. Sensors (Basel) 2022; 22:9060. PMID: 36501762; PMCID: PMC9738664. DOI: 10.3390/s22239060.
Abstract
A solar plant system has complex nonlinear dynamics with uncertainties due to variations in system parameters and insolation. It is therefore difficult to approximate these complex dynamics with conventional algorithms, whereas Machine Learning (ML) methods yield the essential performance required. ML models are key units in recent sensor systems for solar plant design, forecasting, maintenance, and control, providing better safety, reliability, robustness, and performance than the classical methods usually employed in the hardware and software of solar plants. Considering this, the goal of our paper is to explore and analyze ML technologies, and their advantages and shortcomings as compared to classical methods, for the design, forecasting, maintenance, and control of solar plants. In contrast with other review articles, our research briefly summarizes our intelligent, self-adaptive models for sizing, forecasting, maintenance, and control of a solar plant; sets benchmarks for performance comparison of the reviewed ML models for a solar plant's system; proposes a simple but effective integration scheme for an ML sensor solar plant system's implementation and outlines its future digital transformation into a smart solar plant based on the integrated cutting-edge technologies; and estimates the impact of ML technologies, based on the proposed scheme, on a solar plant value chain.
15
Rezaeenour J, Ahmadi M, Jelodar H, Shahrooei R. Systematic review of content analysis algorithms based on deep neural networks. Multimedia Tools and Applications 2022; 82:17879-17903. PMID: 36313481; PMCID: PMC9589819. DOI: 10.1007/s11042-022-14043-z.
Abstract
Today, social media, the internet, and similar sources produce data rapidly; this data occupies large amounts of space and has resulted in enormous data warehouses, while progress in information technology has significantly increased the speed and ease of data flow. Text mining is one of the most important methods for extracting useful models and knowledge from data sets: it seeks to extract useful information from unstructured textual data and is widely used today. Many studies have applied deep learning to text processing and text mining problems. This paper discusses deep learning and machine learning techniques for classification and text mining, studying neural networks of various kinds, namely ANN, RNN, CNN, and LSTM, to select the best technique. We conducted a Systematic Literature Review to extract and associate the algorithms and features that have been used in this area. Based on our search criteria, we retrieved 130 relevant studies from electronic databases published between 1997 and 2021, and selected 43 studies for further analysis using the inclusion and exclusion criteria in Section 3.2. According to this study, hybrid LSTM is the most widely used deep learning algorithm in these studies, and among the machine learning methods SVM showed high accuracy in the results.
Affiliation(s)
- Jalal Rezaeenour, Department of Industrial Engineering, University of Qom, Qom, Iran
- Mahnaz Ahmadi, Department of Industrial Engineering, University of Qom, Qom, Iran
- Hamed Jelodar, Faculty of Computer Science, Dalhousie University, 6050 University Ave, Halifax, NS B3H 1W5, Canada
- Roshan Shahrooei, Department of Industrial Engineering, University of Qom, Qom, Iran
16
Solar power time series forecasting utilising wavelet coefficients. Neurocomputing 2022. DOI: 10.1016/j.neucom.2022.08.016.
17
Transfer learning strategies for solar power forecasting under data scarcity. Sci Rep 2022; 12:14643. PMID: 36030346; PMCID: PMC9420121. DOI: 10.1038/s41598-022-18516-x.
Abstract
Accurately forecasting solar plant production is critical for balancing supply and demand and for scheduling distribution network operation in the context of inclusive smart cities and energy communities. However, the problem becomes more demanding when there is an insufficient amount of data to adequately train forecasting models, because plants have been recently installed or lack smart meters. Transfer learning (TL) offers the capability of transferring knowledge from a source domain to different target domains to resolve related problems. This study uses a stacked Long Short-Term Memory (LSTM) model with three TL strategies to provide accurate solar plant production forecasts. TL is exploited both for weight initialization of the LSTM model and for feature extraction, using different freezing approaches. The presented TL strategies are compared to the conventional non-TL model, as well as to the smart persistence model, at forecasting the hourly production of 6 solar plants. Results indicate that TL models significantly outperform the conventional one, achieving 12.6% accuracy improvement in terms of RMSE and 16.3% in terms of forecast skill index with 1 year of training data. The gap between the two approaches becomes even bigger when fewer training data are available (especially in the case of a 3-month training set), breaking new ground in power production forecasting of newly installed solar plants and rendering TL a reliable tool in the hands of self-producers towards the ultimate goal of energy balancing and demand response management from an early stage.
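The two TL strategies, source-weight initialization and frozen-layer feature extraction, can be sketched with a toy parameter update; the layer names and numbers below are hypothetical, and a real implementation would freeze LSTM layers in a deep-learning framework rather than scalar weights.

```python
# Toy sketch of the paper's two transfer-learning ingredients: weights learned
# on a data-rich source plant initialise the target model, and frozen layers
# are excluded from fine-tuning so they act as a fixed feature extractor.
# The two-layer "network" is a hypothetical stand-in for the stacked LSTM.

def finetune_step(weights, grads, frozen, lr=0.1):
    """One SGD update that skips layers marked as frozen."""
    return [(name, w if name in frozen else w - lr * g)
            for (name, w), g in zip(weights, grads)]

source_weights = [("lstm1", 0.8), ("head", -0.3)]   # transferred initialisation
grads = [0.5, 0.2]                                  # gradients on target-plant data
updated = finetune_step(source_weights, grads, frozen={"lstm1"})
# "lstm1" keeps its source value; only the head is adapted to the new plant
```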
18
Abstract
According to the World Energy Investment 2018 report, the global annual investment in renewable energy exceeded USD 200 billion for eight consecutive years until 2017. In this paper, a deep-learning-based time-series prediction method, namely a gated recurrent unit (GRU)-based prediction method, is proposed to predict energy generation in Taiwan. Data on thermal power (coal, oil, and gas power), renewable energy (conventional hydropower, solar power, and wind power), pumped hydropower, and nuclear power generation for 1991 to 2020 were obtained from the Bureau of Energy, Ministry of Economic Affairs, Taiwan, and the Taiwan Power Company. The proposed GRU-based method was compared with six common forecasting methods: autoregressive integrated moving average, exponential smoothing (ETS), Holt–Winters ETS, support vector regression (SVR), whale-optimization-algorithm-based SVR, and long short-term memory. Among the methods compared, the proposed method had the lowest mean absolute percentage error and root mean square error and thus the highest accuracy. Government agencies and power companies in Taiwan can use the predictions of accurate energy forecasting models as references to formulate energy policies and design plans for the development of alternative energy sources.
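The two error measures used to rank the methods, MAPE and RMSE, can be computed as follows; the generation and forecast values are invented example numbers, not the Taiwanese data.

```python
import math

def rmse(actual, predicted):
    """Root mean square error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

gen = [120.0, 135.0, 150.0]   # actual generation (illustrative values)
fc = [118.0, 140.0, 147.0]    # forecast generation
err_rmse, err_mape = rmse(gen, fc), mape(gen, fc)
```

Lower values of both metrics indicate a more accurate forecasting model.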
19
Application of Temporal Fusion Transformer for Day-Ahead PV Power Forecasting. Energies 2022. DOI: 10.3390/en15145232.
Abstract
The energy generated by a solar photovoltaic (PV) system depends on uncontrollable factors, including weather conditions and solar irradiation, which leads to uncertainty in the power output. Forecasting PV power generation is vital to improve grid stability and balance the energy supply and demand. This study aims to predict hourly day-ahead PV power generation by applying the Temporal Fusion Transformer (TFT), a new attention-based architecture that incorporates an interpretable explanation of temporal dynamics and high-performance forecasting over multiple horizons. The proposed forecasting model has been trained and tested using data from six different facilities located in Germany and Australia. The results have been compared with other algorithms, such as Auto Regressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM), Multi-Layer Perceptron (MLP), and Extreme Gradient Boosting (XGBoost), using statistical error indicators. TFT has been shown to be more accurate than the other algorithms at forecasting PV generation in the aforementioned facilities.
20
Ataee S, Brochet X, Peña-Reyes CA. Bacteriophage Genetic Edition Using LSTM. Frontiers in Bioinformatics 2022; 2:932319. PMID: 36353213; PMCID: PMC9639385. DOI: 10.3389/fbinf.2022.932319.
Abstract
Bacteriophages are gaining increasing interest as antimicrobial tools, largely due to the emergence of multi-antibiotic-resistant bacteria. Although their huge diversity and virulence make them particularly attractive for targeting a wide range of bacterial pathogens, it is difficult to select suitable phages because their high specificity limits their host range. Other challenges also remain, such as structural fragility under certain environmental conditions, immunogenicity of phage therapy, or development of bacterial resistance. The use of genetically engineered phages may reduce characteristics that hinder prophylactic and therapeutic applications of phages. To date, there is no systematic method to modify a given phage genome to confer sought characteristics. We explore the use of artificial intelligence for this purpose, as it has the potential to both guide and accelerate genome modification to generate phage variants with unique properties that overcome the limitations of natural phages. We propose an original architecture composed of two deep learning-driven components: a phage-bacterium interaction predictor and a phage genome-sequence generator. The former is a multi-branch 1-D convolutional neural network (1D-CNN) that analyses phage and bacterial genomes to predict interactions. The latter is a recurrent neural network, more particularly a long short-term memory (LSTM) network, that performs genomic modifications to a phage to offer substantial host range improvement. For this component, we developed two different architectures composed of one or two stacked LSTM layers with 256 neurons each. These generators are used to modify, more precisely to rewrite, the genome sequence of 42 selected phages, while the predictor is used to estimate the host range of the modified bacteriophages across 46 strains of Pseudomonas aeruginosa.
The proposed generators, trained with an average accuracy of 96.1%, are able to improve the host range for an average of 18 phages among the 42 under study, increasing both their average host range, by 73.0 and 103.7%, and the maximum host ranges from 21 to 24 and 29, respectively. These promising results showed that the use of deep learning methodologies allows genetic modification of phages to extend, for instance, their host range, confirming the potential of these approaches to guide bacteriophage engineering.
Affiliation(s)
- Shabnam Ataee, Xavier Brochet, and Carlos Andrés Peña-Reyes: Institute of Information and Communication Technology (IICT), School of Management and Engineering Vaud (HEIG-VD), Yverdon-les-Bains, Switzerland; HES-SO University of Applied Sciences and Arts Western Switzerland, Delémont, Switzerland; CI4CB—Computational Intelligence for Computational Biology, SIB—Swiss Institute of Bioinformatics, Lausanne, Switzerland
21
Management of Distributed Renewable Energy Resources with the Help of a Wireless Sensor Network. Applied Sciences (Basel) 2022. DOI: 10.3390/app12146908.
Abstract
Photovoltaic (PV) and wind energy are widely considered eco-friendly renewable energy resources. However, due to the unpredictable oscillations in solar and wind power production, efficient management to meet load demands is often hard to achieve. As a result, precise forecasting of PV and wind energy production is critical for grid managers to limit the impact of random fluctuations. In this study, the kernel recursive least-squares (KRLS) algorithm is proposed for the prediction of PV and wind energy. A wireless sensor network (WSN), typically adopted for data collection with a flexible configuration of sensor nodes, is used to transport PV and wind production data to the monitoring center. For efficient transmission of the production data, a link scheduling technique based on sensor node attributes is proposed. Different statistical and machine learning (ML) techniques are examined against the proposed KRLS algorithm for performance analysis. The comparison results show that the KRLS algorithm surpasses all other regression approaches. For both PV and wind power feed-in forecasts, the proposed KRLS algorithm demonstrates high forecasting accuracy. In addition, the link scheduling proposed for the transmission of data for the management of distributed renewable energy resources is compared with a reference technique and shows comparable performance. The efficacy of the proposed KRLS model exceeds that of the other regression models in all assessment events, with an RMSE of 0.0146, MAE of 0.00021, and R2 of 99.7% for PV power, and an RMSE of 0.0421, MAE of 0.0018, and R2 of 88.17% for wind power. In addition, the proposed link scheduling approach results in 22% lower latency and 38% higher resource utilization through the efficient scheduling of time slots.
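As a rough illustration of the kind of online kernel regression KRLS performs, here is a kernel least-mean-squares (KLMS) sketch in pure Python. Note this is a simpler gradient-based cousin of KRLS, without its recursive matrix update or sparsification, and the streamed signal is a toy sine wave rather than PV or wind data.

```python
import math

# Pure-Python sketch of online kernel regression in the spirit of KRLS.
# For brevity this implements the kernel least-mean-squares (KLMS) update,
# which shares the core idea: each new sample becomes a kernel center and
# the model is corrected online from the prediction error.

def rbf(x, c, gamma=500.0):
    return math.exp(-gamma * (x - c) ** 2)

class OnlineKernelRegressor:
    def __init__(self, eta=0.2):
        self.eta, self.centers, self.coefs = eta, [], []

    def predict(self, x):
        return sum(a * rbf(x, c) for a, c in zip(self.coefs, self.centers))

    def update(self, x, y):
        err = y - self.predict(x)   # innovation on the new sample
        self.centers.append(x)      # the sample becomes a kernel center
        self.coefs.append(self.eta * err)
        return err

# stream a toy "production" signal y = sin(2*pi*x), 30 passes over 20 points
model = OnlineKernelRegressor()
errs = []
for step in range(600):
    x = (step % 20) / 20
    errs.append(abs(model.update(x, math.sin(2 * math.pi * x))))
# online errors shrink as the model adapts to the signal
```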
22
Prediction for the Settlement of Concrete Face Rockfill Dams Using Optimized LSTM Model via Correlated Monitoring Data. Water 2022. DOI: 10.3390/w14142157.
Abstract
Settlement prediction is of great importance for safety control of concrete-face rockfill dams (CFRDs) during the operation stage. However, the prediction accuracy achieved by the commonly used hydrostatic–seasonal–time (HST) methods, which do not consider the previous conditions of influencing factors, is not competitive. Moreover, in most methods, settlement data at each monitoring point are modeled individually, and the correlation relationships between settlements are neglected. In this paper, a method based on an optimized long short-term memory (LSTM) model is proposed to predict the settlement of CFRDs, modeling multiple monitoring data series with strong correlation relationships simultaneously. In the method, settlement data series are first classified into several categories according to a global relevance measure. Then, the cuckoo search (CS) algorithm is applied to optimize the hyper-parameters in the neural network structure of the LSTM. Ultimately, the LSTM model is utilized to predict the multiple settlement data series classified in the same category. Results indicate that the proposed method has better prediction performance than the plain LSTM model, the back propagation neural network (BPNN) model, and the HST model with a single monitoring point.
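The first step of the method, grouping strongly correlated settlement series so they can be modelled together, can be sketched with plain Pearson correlation; the thresholded pairwise rule and the example series below are illustrative stand-ins for the paper's global relevance measure.

```python
# Sketch of grouping monitoring points whose settlement series move together,
# so one model can handle each group. The greedy correlation-threshold rule
# and the toy series are illustrative, not the paper's exact procedure.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def group_series(series: dict, threshold: float = 0.9):
    """Greedily place each series into the first group it correlates with."""
    groups = []
    for name, s in series.items():
        for g in groups:
            if all(pearson(s, series[other]) >= threshold for other in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

data = {
    "P1": [1, 2, 3, 4, 5],      # steadily settling point
    "P2": [2, 4, 6, 8, 11],     # rises in step with P1
    "P3": [5, 3, 4, 2, 1],      # opposite trend, gets its own group
}
```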
23
An Alternative to Index-Based Gas Sourcing Using Neural Networks. Energies 2022. DOI: 10.3390/en15134708.
Abstract
An index on the gas market commonly refers to the average price of a certain trading product, e.g., over the period of one month. Index-based sourcing is a widely used practice in modern gas business; averaging prices over the purchasing period reduces risk. Due to the significant volume, there have been many attempts to "beat the index", i.e., to design a strategy that, over time, offers cheaper prices than the index. Here, we use neural networks to identify n (n∈N) optimal purchase points. Both classification- and forecasting-based strategies are tested to decide on each trading day whether gas should be purchased or not. As an example, we use the Front Month index based on prices from the Dutch Title Transfer Facility. Regarding cumulative performance, all but a very simple myopic algorithm are able to outperform the index. However, each strategy has its flaws, and some positive results are due to the price increase during 2021. If one opts for an active sourcing strategy, a forecasting-based approach is the best choice.
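The benchmark itself is simple to state in code: a strategy beats the index when the average price over its chosen purchase days is below the period average. The prices and chosen days below are invented illustrative numbers.

```python
# Toy illustration of "beating the index": the index is the average price
# over the sourcing period, while a strategy picks n purchase days. Any rule
# whose chosen days average below the period mean beats the index.

def index_price(prices):
    return sum(prices) / len(prices)

def strategy_cost(prices, buy_days):
    return sum(prices[d] for d in buy_days) / len(buy_days)

prices = [24.0, 22.5, 26.0, 21.0, 25.5]          # EUR/MWh, one sourcing period
idx = index_price(prices)                        # period average
picked = strategy_cost(prices, buy_days=[1, 3])  # buys on the two cheap days
```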
24
Design of induction motor speed observer based on long short-term memory. Neural Comput Appl 2022. DOI: 10.1007/s00521-022-07458-0.
25
A novel transfer learning-based short-term solar forecasting approach for India. Neural Comput Appl 2022. DOI: 10.1007/s00521-022-07328-9.
26
Improvement in Solar-Radiation Forecasting Based on Evolutionary KNEA Method and Numerical Weather Prediction. Sustainability 2022. DOI: 10.3390/su14116824.
Abstract
Accurate forecasting of solar radiation (Rs) is significant for photovoltaic power generation and agricultural management. The National Centers for Environmental Prediction (NCEP) has released its latest Global Ensemble Forecast System version 12 (GEFSv12) prediction product; however, the capability of this numerical weather product for Rs forecasting has not been evaluated. This study establishes a coupling algorithm based on a bat algorithm (BA) and a kernel-based nonlinear extension of Arps decline (KNEA) for post-processing 1–3 d ahead Rs forecasts based on GEFSv12 in Xinjiang, China. The new model is also compared with two empirical statistical methods, quantile mapping (QM) and equiratio cumulative distribution function matching (EDCDFm), and with six machine-learning methods: long short-term memory (LSTM), support vector machine (SVM), XGBoost, KNEA, BA-SVM, and BA-XGBoost. The results show that the Rs forecasting accuracy of all models decreases as the forecast period extends. Compared with the raw GEFS Rs data over the four stations, the RMSE and MAE of the QM and EDCDFm models decreased by 20% and 15%, respectively. In addition, the BA-KNEA model was superior to the raw GEFSv12 Rs data and the other post-processing methods, with R2 = 0.782–0.829, RMSE = 3.240–3.685 MJ m−2 d−1, MAE = 2.465–2.799 MJ m−2 d−1, and NRMSE = 0.152–0.173.
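The QM baseline mentioned above can be illustrated with a minimal empirical quantile-mapping routine; the climatology samples are invented numbers, and real QM implementations interpolate between quantiles rather than snapping to ranks as this sketch does.

```python
# Minimal empirical quantile-mapping sketch, the idea behind the QM
# post-processing baseline: replace a raw forecast value with the observed
# value at the same empirical quantile. The sample climatologies below are
# invented illustrative numbers, not GEFSv12 data.

def quantile_map(value, forecast_clim, observed_clim):
    fc, ob = sorted(forecast_clim), sorted(observed_clim)
    rank = sum(f <= value for f in fc)             # empirical rank in forecasts
    i = max(rank * len(ob) // len(fc) - 1, 0)      # matching rank in observations
    return ob[min(i, len(ob) - 1)]

fc_hist = [10, 12, 14, 16, 18]    # raw NWP solar radiation, MJ m-2 d-1
obs_hist = [8, 11, 13, 15, 20]    # station observations
corrected = quantile_map(14, fc_hist, obs_hist)   # forecast median -> observed median
```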
27
Abstract
Accurate short-term solar forecasting is challenging due to weather uncertainties associated with cloud movements. Typically, a solar station comprises a single prediction model irrespective of time and cloud condition, which often results in suboptimal performance. In the proposed model, different categories of cloud movement are discovered using K-medoid clustering. To ensure broader variation in cloud movements, neighboring stations were also used that were selected using a dynamic time warping (DTW)-based similarity score. Next, cluster-specific models were constructed. At the prediction time, the current weather condition is first matched with the different weather groups found through clustering, and a cluster-specific model is subsequently chosen. As a result, multiple models are dynamically used for a particular day and solar station, which improves performance over a single site-specific model. The proposed model achieved 19.74% and 59% less normalized root mean square error (NRMSE) and mean rank compared to the benchmarks, respectively, and was validated for nine solar stations across two regions and three climatic zones of India.
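The DTW similarity score used to pick neighbouring stations can be sketched with the classic dynamic-programming recurrence:

```python
# Dynamic time warping (DTW) distance between two series: smaller values mean
# more similar movement patterns, even when one series is temporally shifted.

def dtw(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible alignment moves
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

A lagged copy of a series stays close under DTW even though pointwise comparison would penalize the shift, which is what makes DTW suitable for matching cloud-movement patterns across stations.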
28
Development of Charging/Discharging Scheduling Algorithm for Economical and Energy-Efficient Operation of Multi-EV Charging Station. Applied Sciences (Basel) 2022. DOI: 10.3390/app12094786.
Abstract
As the number of electric vehicles (EVs) significantly increases, the excessive charging demand of parked EVs in a charging station may cause instability in the electricity network during peak hours. For a charging station taking a microgrid (MG) structure, an economical and energy-efficient power management scheme is required to supply power to EVs while considering the local load demand of the MG. For these purposes, this study presents a power management scheme for interdependent MG and EV fleets aided by a novel EV charging/discharging scheduling algorithm. In this algorithm, the maximum amount of discharging power from parked EVs is determined by the difference between local load demand and photovoltaic (PV) power production, to alleviate imbalances between them. For the power management of the MG with charging/discharging scheduling of parked EVs in the PV-based charging station, multi-objective optimization is performed to minimize the operating cost and grid dependency. In addition, the proposed scheme maximizes the utilization of EV charging/discharging while satisfying the charging requirements of parked EVs. Moreover, a more economical and energy-efficient PV-based charging station is established using the future trends of local load demand and PV power production predicted by a gated recurrent unit (GRU) network. With the proposed EV charging/discharging scheduling algorithm, the operating cost of the PV-based charging station is decreased by 167.71% and 28.85% compared with the EV charging scheduling algorithm and the conventional EV charging/discharging scheduling algorithm, respectively. These results show that economical and energy-efficient operation of a PV-based charging station can be accomplished by applying the power management scheme with the proposed EV charging/discharging scheduling strategy.
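The discharge cap described above reduces to a one-line rule; the function name and kW figures are illustrative, not from the paper.

```python
# Sketch of the scheduling rule described in the abstract: the power parked
# EVs may discharge in a slot is capped by the gap between local load and PV
# output, so V2G discharge only fills the deficit rather than exporting.

def max_discharge(load_kw: float, pv_kw: float) -> float:
    return max(0.0, load_kw - pv_kw)

deficit = max_discharge(120.0, 80.0)   # PV shortfall: EVs may supply the gap
surplus = max_discharge(60.0, 90.0)    # PV surplus: no discharging needed
```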
29
Efficient deep learning-based semantic mapping approach using monocular vision for resource-limited mobile robots. Neural Comput Appl 2022. DOI: 10.1007/s00521-022-07273-7.
30
A Novel Approach to Generate Hourly Photovoltaic Power Scenarios. Sustainability 2022. DOI: 10.3390/su14084617.
Abstract
Photovoltaic power is playing an ever-increasing role in the energy mix of countries worldwide. It is a stochastic energy source, and simulation models are needed to establish reliable risk management. This paper presents a novel approach for simulating hourly solar irradiation and—as a consequence—photovoltaic power based on easily accessible data such as wind, temperature, and cloudiness. Solar simulations are generated via a multiplication factor that scales the maximum possible solar irradiation. Photovoltaic simulations are then derived using formulas that approximate the physical interdependencies. The resulting simulations are unbiased on an annual level and reasonably reflect historic irradiation movements. Interpreting our approach as a descriptive model, we find that error values vary over the year and with granularity. Errors are highest when considering hourly values in wintertime, especially in the morning or late afternoon.
31
Solar Irradiance Forecasting to Short-Term PV Power: Accuracy Comparison of ANN and LSTM Models. Energies 2022. DOI: 10.3390/en15072457.
Abstract
The use of renewable energies, such as photovoltaic (PV) solar power, is necessary to meet growing energy consumption. PV solar power generation has intrinsic characteristics related to climatic variables that cause intermittence during the generation process, promoting instability and insecurity in the electrical system. One of the solutions for this problem uses methods for the Prediction of Solar Photovoltaic Power Generation (PSPPG). In this context, the aim of this study is to develop and compare the solar irradiance prediction accuracy of Artificial Neural Network (ANN) and Long Short-Term Memory (LSTM) network models, through a comprehensive analysis that simultaneously considers two distinct sets of exogenous meteorological input variables and three short-term prediction horizons (1, 15 and 60 min) in a controlled experimental environment. The results indicate that there is a significant difference (p < 0.001) in prediction accuracy between the ANN and LSTM models, with better overall prediction accuracy for the LSTM models (MAPE = 19.5%), except for the 60 min prediction horizon. Furthermore, the accuracy difference between the ANN and LSTM models decreased as the prediction horizon increased, and no significant influence on prediction accuracy was observed for either set of evaluated meteorological input variables.
32
Short-Term Solar Power Predicting Model Based on Multi-Step CNN Stacked LSTM Technique. Energies 2022. DOI: 10.3390/en15062150.
Abstract
Variability in solar irradiance has an impact on the stability of solar systems and the grid's safety. With the decreasing cost of solar panels and recent advancements in energy conversion technology, precise solar energy forecasting is critical for energy system integration. Despite extensive research, there is still potential to improve solar irradiance prediction accuracy, especially for global horizontal irradiance. Global Horizontal Irradiance (GHI) (unit: kWh/m2) and the Plane Of Array (POA) irradiance (unit: W/m2) were used as the forecasting objectives in this research, and a hybrid short-term solar irradiance prediction model, a modified multi-step Convolutional Neural Network (CNN)-stacked Long Short-Term Memory (LSTM) network with drop-out, was proposed. The real solar data from the Sweihan Photovoltaic Independent Power Project in Abu Dhabi, UAE, were preprocessed, and features were extracted using modified CNN layers. The output from the CNN is used to predict the targets using a stacked LSTM network, and the model's efficiency is demonstrated by comparing statistical performance measures, in terms of Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE), and R2 scores, with other contemporary machine learning and deep-learning-based models. The proposed model offered the best RMSE and R2 values of 0.36 and 0.98 for solar irradiance prediction, and 61.24 with R2 of 0.96 for POA prediction, which also shows better performance compared to published works in the literature.
33
Wu Y, Sun L, Sun X, Wang B. A hybrid XGBoost-ISSA-LSTM model for accurate short-term and long-term dissolved oxygen prediction in ponds. Environmental Science and Pollution Research International 2022; 29:18142-18159. PMID: 34686955. DOI: 10.1007/s11356-021-17020-5.
Abstract
Dissolved oxygen (DO) is one of the most critical factors for measuring water quality in ponds, and it greatly impacts the healthy growth of aquatic organisms. To improve the prediction accuracy of DO and grasp its changing trends, a novel hybrid DO prediction model based on a long short-term memory (LSTM) network optimized by an improved sparrow search algorithm (ISSA) is proposed. Firstly, to discard redundant information and improve the calculation speed of the model, the key factors with greater correlation to DO are selected as input parameters by extreme gradient boosting (XGBoost). Secondly, to expand the searching range of the sparrows and balance global and local search, we introduce an adaptive-factor exponential declining strategy for producers, and an arcsine decreasing strategy for scouters that decreases nonlinearly as iterations increase. We also improve the position updating of scouters, making the sparrows gradually move to the best position. Finally, the LSTM is optimized by ISSA to obtain the best initial weights and thresholds, constructing an XGBoost-ISSA-LSTM DO prediction model. The model supports both short-term prediction (about 1 h and 2 h) and long-term prediction (about 12 h and 24 h) of DO. In 1-h prediction, the root mean square error (RMSE) of the model is 0.5571, the mean absolute error (MAE) is 0.2572, and the R2 is 0.9276. In 24-h prediction, the RMSE is 0.6310, the MAE is 0.4562, and the R2 is 0.9082. The experimental results show that the proposed model has better generalization performance and higher prediction accuracy than other common models. Therefore, the presented XGBoost-ISSA-LSTM model is more effective and can meet the actual demand for accurate prediction of DO.
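The exponentially declining adaptive factor for producers can be sketched as below; the exact functional form, decay constant, and bounds are illustrative assumptions, since the abstract does not give the paper's precise formula.

```python
import math

# Sketch of an exponentially declining adaptive factor that shifts the sparrow
# search from global exploration (large factor, early iterations) to local
# exploitation (small factor, late iterations). The concrete form and the
# constants below are illustrative choices, not the paper's exact formula.

def adaptive_factor(t: int, t_max: int, w_start: float = 0.9, w_end: float = 0.1) -> float:
    """Nonlinearly decreases from w_start toward w_end as iterations advance."""
    return w_end + (w_start - w_end) * math.exp(-5.0 * t / t_max)

early, mid, late = (adaptive_factor(t, 100) for t in (0, 50, 100))
# early iterations favour exploration, late ones exploitation
```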
Collapse
Affiliation(s)
- Yuhan Wu
- National Innovation Center for Digital Fishery, China Agricultural University, 17 Tsinghua East Road, P. O. Box 121, Beijing, 100083, People's Republic of China
- Precision Agricultural Technology Integration Research Base (Fishery), Ministry of Agriculture and Rural Affairs, Beijing, 100083, China
- College of Information and Electrical Engineering, China Agricultural University, Beijing, 100083, China
| | - Longqing Sun
- National Innovation Center for Digital Fishery, China Agricultural University, 17 Tsinghua East Road, P. O. Box 121, Beijing, 100083, People's Republic of China.
- Precision Agricultural Technology Integration Research Base (Fishery), Ministry of Agriculture and Rural Affairs, Beijing, 100083, China.
- College of Information and Electrical Engineering, China Agricultural University, Beijing, 100083, China.
| | - Xibei Sun
- National Innovation Center for Digital Fishery, China Agricultural University, 17 Tsinghua East Road, P. O. Box 121, Beijing, 100083, People's Republic of China
- Precision Agricultural Technology Integration Research Base (Fishery), Ministry of Agriculture and Rural Affairs, Beijing, 100083, China
- College of Information and Electrical Engineering, China Agricultural University, Beijing, 100083, China
| | - Boning Wang
- National Innovation Center for Digital Fishery, China Agricultural University, 17 Tsinghua East Road, P. O. Box 121, Beijing, 100083, People's Republic of China
- Precision Agricultural Technology Integration Research Base (Fishery), Ministry of Agriculture and Rural Affairs, Beijing, 100083, China
- College of Information and Electrical Engineering, China Agricultural University, Beijing, 100083, China
| |
Collapse
|
34
|
The Important Role of Global State for Multi-Agent Reinforcement Learning. FUTURE INTERNET 2021. [DOI: 10.3390/fi14010017] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022] Open
Abstract
Environmental information plays an important role in deep reinforcement learning (DRL), yet many algorithms pay little attention to it. In multi-agent reinforcement learning, agents must make decisions using information about the other agents in the environment, which makes environmental information even more important. To demonstrate this importance, we added environmental information to several algorithms and evaluated them on a challenging set of StarCraft II micromanagement tasks. Compared with the original algorithms, the standard deviation of our variants was smaller (except for the VDN algorithm), indicating better stability, and the average score was higher (except for VDN and COMA), showing that our approach significantly outperforms existing multi-agent RL methods.
Collapse
|
35
|
A Multi-RNN Research Topic Prediction Model Based on Spatial Attention and Semantic Consistency-Based Scientific Influence Modeling. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE 2021; 2021:1766743. [PMID: 34961813 PMCID: PMC8710157 DOI: 10.1155/2021/1766743] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/23/2021] [Revised: 10/28/2021] [Accepted: 11/24/2021] [Indexed: 11/30/2022]
Abstract
The computer science discipline includes many research fields, which mutually influence and promote each other's development. This poses two great challenges for predicting the research topics of each field. One is how to model a fine-grained topic representation of a research field. The other is how to model the research topics of different fields while keeping the semantic consistency of research topics when learning the scientific-influence context from other related fields. Existing research topic prediction approaches cannot handle these two challenges. To solve these problems, we employ multiple Recurrent Neural Network chains to model the research topics of different fields and propose a research topic prediction model based on spatial attention and semantic consistency-based scientific influence modeling. Spatial attention is employed in the field topic representation to selectively extract attributes from the field topics and distinguish their importance. Semantic consistency-based scientific influence modeling maps the research topics of different fields to a unified semantic space to obtain the scientific-influence context of other related fields. Extensive experimental results on five related research fields in the computer science (CS) discipline show that the proposed model is superior to the most advanced methods and achieves good topic prediction performance.
Collapse
|
36
|
A Novel Feature Representation for Prediction of Global Horizontal Irradiance Using a Bidirectional Model. MACHINE LEARNING AND KNOWLEDGE EXTRACTION 2021. [DOI: 10.3390/make3040047] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Complex weather conditions—in particular clouds—lead to uncertainty in photovoltaic (PV) systems, which makes solar energy prediction very difficult. Currently, in the renewable energy domain, deep-learning-based sequence models report better results than state-of-the-art machine-learning models. Among the many deep-learning architectures available, the Bidirectional Gated Recurrent Unit (BGRU) has apparently not been applied in the solar energy domain before. In this paper, a BGRU was used with a new augmented and bidirectional feature representation. The BGRU network used here is more generalized, as it can handle forward and backward contexts of unequal length. The proposed model produced 59.21%, 37.47%, and 76.80% better prediction accuracy than traditional sequence-based models, bidirectional models, and several established state-of-the-art models, respectively. The testbed used for evaluation is far more comprehensive and reliable, considering the variability of climatic zones and seasons, than those in some recent studies in India.
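A minimal sketch of the bidirectional recurrence idea, assuming scalar inputs and a single hidden unit for readability; the parameter values are hypothetical, and the unequal forward/backward context lengths are emulated by giving the backward pass a shorter window:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h, x, p):
    """One GRU step for scalar input and hidden state (toy-sized for clarity)."""
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])          # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h + p["br"])          # reset gate
    n = math.tanh(p["wn"] * x + p["un"] * (r * h) + p["bn"])  # candidate state
    return (1 - z) * h + z * n

def bgru_encode(seq, p, back_window=3):
    """Forward pass over the full history; backward pass over a shorter
    window, mimicking unequal forward/backward context lengths."""
    hf = 0.0
    for x in seq:
        hf = gru_step(hf, x, p)
    hb = 0.0
    for x in reversed(seq[-back_window:]):
        hb = gru_step(hb, x, p)
    return hf, hb  # the pair acts as a concatenated bidirectional feature

p = {"wz": 0.5, "uz": 0.3, "bz": 0.0,
     "wr": 0.5, "ur": 0.3, "br": 0.0,
     "wn": 0.8, "un": 0.5, "bn": 0.0}
print(bgru_encode([0.1, 0.4, 0.3, 0.9, 0.6], p))
```

In a real implementation the two directions would be vector-valued trained layers; the point here is only the asymmetric context windows.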
Collapse
|
37
|
Ozcan A, Catal C, Kasif A. Energy Load Forecasting Using a Dual-Stage Attention-Based Recurrent Neural Network. SENSORS 2021; 21:s21217115. [PMID: 34770422 PMCID: PMC8587894 DOI: 10.3390/s21217115] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 09/11/2021] [Revised: 10/22/2021] [Accepted: 10/22/2021] [Indexed: 11/24/2022]
Abstract
Providing a stable, low-price, and safe supply of energy to end-users is a challenging task. Energy service providers are affected by several events such as weather, volatility, and special events, so predicting these events and having a time window for taking preventive measures are crucial. Electrical load forecasting can be modeled as a time series prediction problem. One solution is to capture the spatial correlations, spatial-temporal relations, and time dependency of such temporal networks in the time series. Different machine learning methods have previously been used for time series prediction; however, there is still a need for research that improves the performance of short-term load forecasting models. In this article, we propose a novel deep learning model that predicts electric load consumption using a Dual-Stage Attention-Based Recurrent Neural Network, in which the attention mechanism is used in both the encoder and decoder stages. The encoder attention layer identifies important features in the input vector, whereas the decoder attention layer overcomes the limitations of a fixed context vector and provides a much longer memory capacity. The proposed model improves short-term load forecasting (STLF) performance in terms of the Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) scores. To evaluate the predictive performance of the proposed model, the UCI household electric power consumption (HEPC) dataset was used in the experiments. Experimental results demonstrate that the proposed approach outperforms previously adopted techniques.
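The two attention stages can be sketched as follows; this is a toy illustration with precomputed relevance scores rather than the trained attention networks of the paper:

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def input_attention(features, relevance):
    """Encoder stage: reweight the driving input series by relevance scores."""
    w = softmax(relevance)
    return [wi * fi for wi, fi in zip(w, features)]

def temporal_attention(encoder_states, alignment):
    """Decoder stage: a context built as a weighted sum over ALL encoder
    hidden states, instead of a single fixed context vector."""
    w = softmax(alignment)
    return sum(wi * hi for wi, hi in zip(w, encoder_states))

# Toy call: equal alignment scores reduce the context to a plain average.
print(temporal_attention([1.0, 2.0, 3.0], [0.0, 0.0, 0.0]))
```

In the full model, the relevance and alignment scores are themselves produced by small networks conditioned on the hidden states.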
Collapse
Affiliation(s)
- Alper Ozcan
- Department of Computer Engineering, Akdeniz University, Antalya 07070, Turkey;
| | - Cagatay Catal
- Department of Computer Science and Engineering, Qatar University, Doha 2713, Qatar
- Correspondence:
| | - Ahmet Kasif
- Department of Computer Engineering, Bursa Technical University, Bursa 16330, Turkey;
| |
Collapse
|
38
|
Chu Y, Li M, Coimbra CF, Feng D, Wang H. Intra-hour irradiance forecasting techniques for solar power integration: a review. iScience 2021; 24:103136. [PMID: 34723160 PMCID: PMC8531863 DOI: 10.1016/j.isci.2021.103136] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
The ever-growing installation of solar power systems imposes severe challenges on the operation of local and regional power grids due to the inherent intermittency and variability of ground-level solar irradiance. In recent decades, solar forecasting methodologies for the intra-hour, intra-day, and day-ahead energy markets have been extensively explored as cost-effective technologies to mitigate the negative effects of solar power instability on the grid. In this work, progress in intra-hour solar forecasting methodologies is comprehensively reviewed and concisely summarized. The theories behind the forecasting methodologies, and how these theories are applied in various forecasting models, are presented. The reviewed mathematical tools include regressive methods, stochastic learning methods, deep learning methods, and genetic algorithms. The reviewed forecasting methodologies include data-driven methods, local-sensing methods, hybrid forecasting methods, and application-oriented methods that generate probabilistic and spatial forecasts. Furthermore, suggestions to accelerate the development of future intra-hour forecasting methods are provided.
Collapse
Affiliation(s)
- Yinghao Chu
- College of Electronics and Information Engineering, Shenzhen Key Laboratory of Digital Creative Technology, and Guangdong Province Engineering Laboratory for Digital Creative Technology, Shenzhen 518060, China
| | - Mengying Li
- Department of Mechanical Engineering & Research Institute for Smart Energy, The Hong Kong Polytechnic University, Kowloon, Hong Kong SAR
- Corresponding author
| | - Carlos F.M. Coimbra
- Department of Mechanical and Aerospace Engineering, University of California San Diego, La Jolla, CA, USA
| | - Daquan Feng
- College of Electronics and Information Engineering, Shenzhen Key Laboratory of Digital Creative Technology, and Guangdong Province Engineering Laboratory for Digital Creative Technology, Shenzhen 518060, China
| | - Huaizhi Wang
- Guangdong Key Laboratory of Electromagnetic Control and Intelligent Robots, Department of Mechatronics and Control Engineering, Shenzhen University, Shenzhen 518060, China
| |
Collapse
|
39
|
Short-Term Load Forecasting Based on Deep Learning Bidirectional LSTM Neural Network. APPLIED SCIENCES-BASEL 2021. [DOI: 10.3390/app11178129] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Accurate load forecasting guarantees the stable and economic operation of power systems. With the increasing integration of distributed generation and electric vehicles, the variability and randomness of individual loads and distributed generation have increased the complexity of power loads, so accurate and robust load forecasting results are becoming increasingly important in modern power systems. This paper presents a multi-layer stacked bidirectional long short-term memory (LSTM)-based short-term load forecasting framework comprising the neural network architecture, model training, and bootstrapping. In the proposed method, reverse computing is combined with forward computing, and a feedback calculation mechanism is designed to resolve the coupling between past and future time-series information of the power load. To improve convergence, deep learning training is used to mine the correlation between historical loads, and a multi-layer stacked network is established to manage the power load information. Finally, the proposed method is tested on actual data; a comparison with other methods shows that it can extract dynamic features from the data and make accurate predictions, and its availability is verified with real operational data.
Collapse
|
40
|
Pan C, Tan J, Feng D. Prediction intervals estimation of solar generation based on gated recurrent unit and kernel density estimation. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.10.027] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
|
41
|
Moreno G, Santos C, Martín P, Rodríguez FJ, Peña R, Vuksanovic B. Intra-Day Solar Power Forecasting Strategy for Managing Virtual Power Plants. SENSORS 2021; 21:s21165648. [PMID: 34451090 PMCID: PMC8402480 DOI: 10.3390/s21165648] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/29/2021] [Revised: 08/18/2021] [Accepted: 08/20/2021] [Indexed: 12/04/2022]
Abstract
Solar energy penetration has been on the rise worldwide during the past decade, attracting growing interest in solar power forecasting over short time horizons. The increasing integration of these resources without accurate power forecasts hinders grid operation and discourages the use of this renewable resource. To overcome this problem, Virtual Power Plants (VPPs) centralize the management of several installations to minimize the forecasting error. This paper introduces a method to efficiently produce accurate intra-day Photovoltaic (PV) power forecasts at different locations using freely available information. Prediction intervals, based on the Mean Absolute Error (MAE), account for the forecast uncertainty and provide additional information about VPP node power generation. The performance of the forecasting strategy has been verified against the power generated by a real PV installation, and a set of ground-based meteorological stations in geographical proximity was used to emulate a VPP. The forecasting approach is based on a Long Short-Term Memory (LSTM) network and shows errors similar to those obtained with other deep learning methods published in the literature, with an MAE of 44.19 W/m2 across different lead times and launch times. Applying this technique to 8 VPP nodes reduces the global MAE by 12.37%, showing great potential in this environment.
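The MAE-based interval construction described above can be sketched as follows; the validation values are hypothetical, and the width multiplier k is an assumed knob rather than a parameter from the paper:

```python
def mae(y_true, y_pred):
    """Mean absolute error over paired observations and forecasts."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def prediction_interval(point_forecast, val_true, val_pred, k=1.0):
    """Bound a new PV power forecast by +/- k times the validation MAE."""
    m = mae(val_true, val_pred)
    return point_forecast - k * m, point_forecast + k * m

# Hypothetical validation irradiance values (W/m^2) and model outputs.
lo, hi = prediction_interval(105.0, [100.0, 120.0, 90.0], [110.0, 115.0, 95.0])
print(lo, hi)
```

Wider intervals (larger k) trade sharpness for coverage; the paper's point is that even this simple error-based band adds useful uncertainty information at each VPP node.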
Collapse
Affiliation(s)
- Guillermo Moreno
- Department of Electronics, University of Alcalá, Alcalá de Henares, 28805 Madrid, Spain; (G.M.); (P.M.)
| | - Carlos Santos
- Department of Signal Theory and Communications, University of Alcalá, Alcalá de Henares, 28805 Madrid, Spain; (C.S.); (R.P.)
| | - Pedro Martín
- Department of Electronics, University of Alcalá, Alcalá de Henares, 28805 Madrid, Spain; (G.M.); (P.M.)
| | - Francisco Javier Rodríguez
- Department of Electronics, University of Alcalá, Alcalá de Henares, 28805 Madrid, Spain; (G.M.); (P.M.)
- Correspondence: ; Tel.: +34-91-885-6561
| | - Rafael Peña
- Department of Signal Theory and Communications, University of Alcalá, Alcalá de Henares, 28805 Madrid, Spain; (C.S.); (R.P.)
| | - Branislav Vuksanovic
- School of Engineering, University of Portsmouth, Winston Churchill Ave., Portsmouth PO1 3HJ, UK;
| |
Collapse
|
42
|
A Comparison of the Performance of Supervised Learning Algorithms for Solar Power Prediction. ENERGIES 2021. [DOI: 10.3390/en14154424] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Science seeks strategies to mitigate global warming and reduce the negative impacts of the long-term use of fossil fuels for power generation. In this sense, implementing and promoting renewable energy in different ways is one of the most effective solutions. Inaccurate prediction of power generation from photovoltaic (PV) systems is a significant concern for the planning and operational stages of interconnected electric networks and for the promotion of large-scale PV installations. This study proposes the use of Machine Learning techniques to model the photovoltaic power production of a system in Medellín, Colombia. Four forecasting models were built: K-Nearest Neighbors (KNN), Linear Regression (LR), Artificial Neural Networks (ANN), and Support Vector Machines (SVM). The results indicate that all four methods produced adequate estimates of photovoltaic energy generation; however, the best estimate according to RMSE and MAE is the ANN forecasting model. The proposed Machine Learning-based models were demonstrated to be practical and effective solutions for forecasting PV power generation in Medellín.
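A minimal sketch of such a model comparison, using a hand-rolled 1-D k-nearest-neighbors regressor against a mean baseline on toy irradiance/power pairs; the data are hypothetical, and the study itself used full KNN/LR/ANN/SVM implementations:

```python
import math

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def knn_regress(train_x, train_y, query, k=3):
    """Average the targets of the k nearest training points (1-D feature)."""
    order = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - query))
    return sum(train_y[i] for i in order[:k]) / k

# Toy irradiance (W/m^2) -> PV power (kW) pairs, roughly linear.
irr = [100.0, 200.0, 300.0, 400.0, 500.0, 600.0]
pwr = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
test_irr, test_pwr = [250.0, 450.0], [25.0, 45.0]

knn_pred = [knn_regress(irr, pwr, q) for q in test_irr]
mean_pred = [sum(pwr) / len(pwr)] * len(test_irr)  # trivial baseline
print(rmse(test_pwr, knn_pred), rmse(test_pwr, mean_pred))
```

The same RMSE/MAE scoring loop extends directly to the four model families compared in the study.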
Collapse
|
43
|
Stergiou K, Karakasidis TE. Application of deep learning and chaos theory for load forecasting in Greece. Neural Comput Appl 2021. [DOI: 10.1007/s00521-021-06266-2] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/01/2022]
|
44
|
Enhanced Random Forest Model for Robust Short-Term Photovoltaic Power Forecasting Using Weather Measurements. ENERGIES 2021. [DOI: 10.3390/en14133992] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Short-term Photovoltaic (PV) Power Forecasting (STPF) is considered a topic of utmost importance in smart grids. The deployment of STPF techniques provides fast dispatching in the case of sudden variations due to stochastic weather conditions. This paper presents an efficient data-driven method based on an enhanced Random Forest (RF) model. The proposed method employs an ensemble of attribute selection techniques to manage the bias/variance trade-off for STPF and enhance the quality of the forecasting results. The overall architecture gathers the relevant information into a voted feature-weighting vector of weather inputs; the main emphasis in this paper is the knowledge expertise obtained from weather measurements. The feature selection techniques are based on Local Interpretable Model-Agnostic Explanations, an Extreme Boosting Model, and an Elastic Net. A comparative performance investigation using an actual database collected from weather sensors demonstrates the superiority of the proposed technique over several data-driven machine learning models when applied to a typical distributed PV system.
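The voted feature-weighting vector can be sketched as an average of per-technique normalized importances; the three score dictionaries below are hypothetical stand-ins for the LIME, boosting, and Elastic Net outputs:

```python
def vote_feature_weights(rankings):
    """Average per-technique importances (each normalized to sum to 1)
    into a single voted feature-weighting vector."""
    voted = {}
    for feature in rankings[0]:
        total = 0.0
        for scores in rankings:
            total += scores[feature] / sum(scores.values())
        voted[feature] = total / len(rankings)
    return voted

# Hypothetical importance scores from three selection techniques.
lime_like = {"irradiance": 0.6, "temperature": 0.3, "humidity": 0.1}
boost_like = {"irradiance": 0.5, "temperature": 0.4, "humidity": 0.1}
enet_like = {"irradiance": 0.7, "temperature": 0.2, "humidity": 0.1}
weights = vote_feature_weights([lime_like, boost_like, enet_like])
print(weights)
```

Averaging several selectors damps the bias of any single technique, which is the bias/variance management the abstract refers to.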
Collapse
|
45
|
Voltage Regulation For Residential Prosumers Using a Set of Scalable Power Storage. ENERGIES 2021. [DOI: 10.3390/en14113288] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Among the electrical problems caused by solar irradiation variability, power quality and the guarantee of energy dispatch stand out. Great advances in battery technologies have fostered their use alongside photovoltaic system (PVS) installations. This work proposes voltage regulation for residential prosumers using a set of scalable power batteries in passive mode, operating as a consumer device. The mitigation strategy acts directly on the demand of a storage bank: the power of the storage element is selected based on the results of a power flow calculation combined with a solar radiation prediction computed by a Long Short-Term Memory (LSTM) recurrent neural network. The solar radiation predictions are used to estimate the state of the power grid by solving the power flow and obtaining 1-min electrical voltage values that trigger the entry of the storage device. At this stage, the OpenDSS (Open Distribution System Simulator) software is used to model the power grid under study and to simulate the effect of the overvoltage mitigation system. On the clear-sky day, 9111 Wh/day of electricity was stored to mitigate overvoltages at the supply point; compared to other days, the clear-sky day needed to store the least electricity. On days of high variability, the energy stored to regulate overvoltages was 84% higher than on a clear day. To maintain a constant state of charge (SoC), the capacity of the battery bank must be increased to meet the condition of maximum accumulated energy. Regarding the total loading of the storage system, the days of low variability consumed approximately 12% of the available battery capacity, considering an SoC of 70% of the capacity at each power level.
Collapse
|
46
|
Performance Evaluation of Neural Network-Based Short-Term Solar Irradiation Forecasts. ENERGIES 2021. [DOI: 10.3390/en14113030] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Due to the globally increasing share of renewable energy sources like wind and solar power, precise forecasts of weather data are becoming more and more important. To compute such forecasts, numerous authors apply neural networks (NNs), and models have recently become ever more complex. Using solar irradiation as an example, we verify whether this additional complexity is required for forecasting precision. Different NN models, namely a long short-term memory (LSTM) network, a convolutional neural network (CNN), and combinations of both, are benchmarked against each other, with the naive forecast included as a baseline. Various locations across Europe are tested to analyze the models' performance under different climate conditions. Forecasts up to 24 h ahead are generated and compared using different goodness-of-fit (GoF) measures, and errors are also analyzed in the time domain. As expected, the error of all models increases with the forecasting horizon. Across all test stations, combining an LSTM network with a CNN yields the best performance; however, regarding the chosen GoF measures, the differences from the alternative approaches are fairly small. The hybrid model's advantage lies not in improved GoF but in its versatility: contrary to an LSTM or a CNN alone, it produces good results under all tested weather conditions.
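The naive (persistence) baseline and a relative skill measure can be sketched as below; the history and forecast values are toy numbers, and the skill-score definition is a common convention rather than one of the paper's exact GoF measures:

```python
def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def persistence_forecast(history, horizon):
    """Naive baseline: repeat the last observation over the whole horizon."""
    return [history[-1]] * horizon

def skill_score(model_err, naive_err):
    """1 is a perfect forecast; > 0 means the model beats the naive baseline."""
    return 1.0 - model_err / naive_err

# Toy irradiance series (W/m^2) and hypothetical model forecasts.
history = [520.0, 510.0, 500.0]
actual = [480.0, 450.0, 430.0]
model = [475.0, 455.0, 425.0]
naive = persistence_forecast(history, 3)
print(skill_score(mae(actual, model), mae(actual, naive)))
```

Benchmarking against persistence is what makes "fairly small differences" between architectures meaningful: any model must first clear this trivial bar.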
Collapse
|
47
|
A Fuzzy Seasonal Long Short-Term Memory Network for Wind Power Forecasting. MATHEMATICS 2021. [DOI: 10.3390/math9111178] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
To protect the environment and achieve the Sustainable Development Goals (SDGs), governments around the world have actively promoted the reduction of greenhouse gas emissions, and clean energy such as wind power has become a very important topic. However, accurately forecasting wind power output is not a straightforward task. The present study develops a fuzzy seasonal long short-term memory network (FSLSTM), which combines a fuzzy decomposition method with a long short-term memory (LSTM) network, to forecast a monthly wind power output dataset. LSTM technology has been successfully applied to forecasting problems, especially time series problems. This study first incorporates a fuzzy seasonal index into the fuzzy LSTM model, effectively extending traditional LSTM technology. The FSLSTM, LSTM, autoregressive integrated moving average (ARIMA), generalized regression neural network (GRNN), back propagation neural network (BPNN), least squares support vector regression (LSSVR), and seasonal autoregressive integrated moving average (SARIMA) models are then used to forecast monthly wind power output datasets in Taiwan. The empirical results indicate that FSLSTM obtains better forecasting accuracy than the other methods and can therefore efficiently provide credible predictions for Taiwan's wind power output.
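A crisp (non-fuzzy) multiplicative seasonal index, which the paper's fuzzy seasonal index generalizes, can be sketched as follows; the quarterly series is a hypothetical toy example:

```python
def seasonal_indices(series, period):
    """Crisp multiplicative seasonal index: the mean of each season
    divided by the overall mean."""
    overall = sum(series) / len(series)
    idx = []
    for s in range(period):
        vals = series[s::period]
        idx.append((sum(vals) / len(vals)) / overall)
    return idx

def deseasonalize(series, idx):
    """Divide each observation by its season's index before modeling."""
    return [v / idx[i % len(idx)] for i, v in enumerate(series)]

# Two "years" of a quarterly toy wind power series with a strong season.
series = [10.0, 20.0, 30.0, 40.0, 12.0, 22.0, 32.0, 42.0]
idx = seasonal_indices(series, period=4)
print(deseasonalize(series, idx))
```

The LSTM then forecasts the deseasonalized series, and predictions are multiplied by the index to restore the seasonal pattern; the fuzzy version replaces the crisp index with fuzzy memberships.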
Collapse
|
48
|
Abstract
Previous studies on commercial vacancy have mostly focused on the survival rate of commercial buildings over a certain time frame and the cause of their closure, due to a lack of appropriate data. Based on a time-series of 2,940,000 individual commercial facility data, the main purpose of this research is two-fold: (1) to examine long short-term memory (LSTM) as a feasible option for predicting trends in commercial districts and (2) to identify the influence of each variable on prediction results for establishing evidence-based decision-making on the primary influences of commercial vacancy. The results indicate that LSTM can be useful in simulating commercial vacancy dynamics. Furthermore, sales, floating population, and franchise rate were found to be the main determinants for commercial vacancy. The results suggest that it is imperative to control the cannibalization of commercial districts and develop their competitiveness to retain a consistent floating population.
Collapse
|
49
|
Guleryuz D. Forecasting outbreak of COVID-19 in Turkey; Comparison of Box-Jenkins, Brown's exponential smoothing and long short-term memory models. PROCESS SAFETY AND ENVIRONMENTAL PROTECTION : TRANSACTIONS OF THE INSTITUTION OF CHEMICAL ENGINEERS, PART B 2021; 149:927-935. [PMID: 33776248 PMCID: PMC7983456 DOI: 10.1016/j.psep.2021.03.032] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/03/2020] [Accepted: 03/15/2021] [Indexed: 05/27/2023]
Abstract
The new coronavirus disease (COVID-19), which first appeared in China in December 2019, has spread throughout the world. Because the epidemic started later in Turkey than in other European countries, Turkey has the fewest deaths according to current data. Outbreak management is of great importance for public safety and public health, and prediction models can inform precautionary measures to control the spread of the disease. This study therefore aims to develop a forecasting model for Turkey using statistical data. Box-Jenkins (ARIMA), Brown's exponential smoothing, and RNN-LSTM models are employed. ARIMA was selected as the best fit, with the lowest AIC values (12.0342, -2.51411, 12.0253, 3.67729, -4.24405, and 3.66077) for the number of total cases, the growth rate of total cases, the number of new cases, the number of total deaths, the growth rate of total deaths, and the number of new deaths, respectively. The forecast values of each indicator are stable over time, and the number of cases in Turkey will not show an increasing trend in the near future. In addition, the pandemic will reach a steady state, and no increase in mortality rates is expected between 17 and 31 May. ARIMA models can be used in fresh outbreak situations to ensure health and safety; it is vital to make quick and accurate decisions on precautions for epidemic preparedness and management, so corrective and preventive actions can be updated based on the obtained values.
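The AIC-based selection among candidate models can be illustrated with a least-squares AR(1) fit compared against a mean-only model on a toy series; the series values are hypothetical, and a full ARIMA fit would also estimate differencing and moving-average terms:

```python
import math

def aic(rss, n, k):
    """Gaussian AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

def fit_ar1(y):
    """Least-squares AR(1) without intercept: y[t] ~ phi * y[t-1]."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    phi = num / den
    rss = sum((y[t] - phi * y[t - 1]) ** 2 for t in range(1, len(y)))
    return phi, rss

# Toy case series with near-exponential growth (hypothetical values).
y = [1.0, 2.1, 3.9, 8.2, 15.8, 32.3]
phi, rss_ar = fit_ar1(y)

# Mean-only benchmark on the same five one-step targets.
targets = y[1:]
mu = sum(targets) / len(targets)
rss_mean = sum((t - mu) ** 2 for t in targets)

print(phi, aic(rss_ar, len(targets), 1), aic(rss_mean, len(targets), 1))
```

Lower AIC wins: the criterion rewards fit (small RSS) while penalizing extra parameters, which is how the study chose ARIMA orders per indicator.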
Collapse
Affiliation(s)
- Didem Guleryuz
- Department of Industrial Engineering, Bayburt University, Bayburt, Turkey
| |
Collapse
|
50
|
Methods for Integrating Extraterrestrial Radiation into Neural Network Models for Day-Ahead PV Generation Forecasting. ENERGIES 2021. [DOI: 10.3390/en14092601] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Variability, intermittency, and limited controllability are inherent characteristics of photovoltaic (PV) generation that result in inaccurate solutions to scheduling problems and instability of the power grid. As the penetration level of PV generation increases, it becomes more important to mitigate these problems by improving forecasting accuracy. One alternative for improving forecasting performance is to include a seasonal component. Thus, this study proposes using information on extraterrestrial radiation (ETR), the solar radiation outside the atmosphere, in neural network models for day-ahead PV generation forecasting. Specifically, five methods for integrating ETR into the neural network models are presented: (1) division preprocessing, (2) multiplication preprocessing, (3) replacement of an existing input, (4) inclusion as an additional input, and (5) inclusion as an intermediate target. The methods were tested on two datasets in Australia using four neural network models: a multilayer perceptron and three recurrent neural network (RNN)-based models, namely a vanilla RNN, long short-term memory, and a gated recurrent unit. Among the integration methods, including ETR as the intermediate target improved the mean squared error by 4.1% on average, and by up to 12.28% in the RNN-based models. These results verify that integrating ETR into neural-network-based PV forecasting models can improve forecasting performance.
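A sketch of ETR and the division-preprocessing method (1), assuming the simple normal-incidence eccentricity-correction form of extraterrestrial radiation and ignoring site-specific solar geometry:

```python
import math

SOLAR_CONSTANT = 1367.0  # W/m^2

def etr(day_of_year):
    """Normal-incidence extraterrestrial radiation via the simple
    eccentricity-correction formula (site geometry ignored)."""
    return SOLAR_CONSTANT * (1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365))

def division_preprocess(pv_power, day_of_year):
    """Method (1): scale the target by ETR so the network learns a
    season-free ratio; multiply by ETR again after forecasting."""
    return pv_power / etr(day_of_year)

print(etr(1), etr(172))  # perihelion-side day vs. aphelion-side day
```

Methods (2) through (5) reuse the same `etr` signal differently: as a multiplier, a substituted input, an extra input column, or an auxiliary training target.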
Collapse
|