Shi T, Li P, Yang W, Qi A, Qiao J. Application of TCN-biGRU neural network in PM2.5 concentration prediction.
Environmental Science and Pollution Research International 2023;30:119506-119517.
[PMID: 37930575] [DOI: 10.1007/s11356-023-30354-6]
[Received: 04/17/2023] [Accepted: 10/05/2023] [Indexed: 11/07/2023]
Abstract
Fine particulate matter (PM2.5) poses a significant threat to human life and health, so accurately predicting PM2.5 concentration is critical for controlling air pollution. Two improved variants of the recurrent neural network (RNN), the long short-term memory (LSTM) and the gated recurrent unit (GRU), are widely used for time series prediction because of their ability to capture temporal features. However, both degrade toward random guessing as the sequence length increases. To enhance the accuracy of PM2.5 concentration prediction and address this degradation in RNN models, this study introduces the TCN-biGRU neural network, a hybrid prediction model combining a temporal convolutional network (TCN) with bidirectional gated recurrent units (bi-GRU). The TCN extracts higher-level features from long PM2.5 concentration series, while the bi-GRU captures features from both past and future data to produce more accurate predictions. A case study using data from monitoring stations in Beijing in 2021 evaluates the model on PM2.5 prediction. The TCN-biGRU model achieves a mean absolute error of 4.20, a root mean square error of 7.71, and an R² of 0.961. Compared with standalone LSTM, GRU, and bi-GRU models, the TCN-biGRU model yields smaller errors and superior predictive performance.
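The long-history feature extraction the abstract attributes to the TCN rests on dilated causal convolutions, whose receptive field grows exponentially with depth. Below is a minimal NumPy sketch of a single dilated causal convolution layer; the function name, kernel weights, and toy input are illustrative assumptions, not details from the paper.

```python
import numpy as np

def dilated_causal_conv1d(x, w, dilation):
    """Causal 1-D convolution with dilation: the output at time t
    depends only on x[t], x[t-d], x[t-2d], ... (no future leakage),
    which is what makes the layer usable for forecasting."""
    k = len(w)
    # Left-pad with zeros so the output has the same length as the
    # input and never reads past the current time step.
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    y = np.zeros(len(x))
    for t in range(len(x)):
        # Taps at times t, t-d, ..., t-(k-1)*d in the padded signal.
        y[t] = sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
    return y

# With kernel [1, 1] and dilation 2, each output sums the current
# sample and the one two steps back: y[t] = x[t] + x[t-2].
x = np.arange(8, dtype=float)
y = dilated_causal_conv1d(x, w=np.array([1.0, 1.0]), dilation=2)
# → [0, 1, 2, 4, 6, 8, 10, 12]
```

Stacking such layers with dilations 1, 2, 4, ... lets a TCN cover long PM2.5 histories with few layers; in the paper's hybrid, the resulting feature sequence would then feed a bi-GRU that reads it in both directions.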