Mao W, Liu P, Huang J. SF-Transformer: A Mutual Information-Enhanced Transformer Model with Spot-Forward Parity for Forecasting Long-Term Chinese Stock Index Futures Prices. Entropy (Basel, Switzerland). 2024;26:478. [PMID: 38920487; PMCID: PMC11202502; DOI: 10.3390/e26060478]
[Received: 04/01/2024] [Revised: 05/23/2024] [Accepted: 05/27/2024] [Indexed: 06/27/2024]
Abstract
Stock index futures markets, shaped by the intricate interplay of human behavior, are characterized by nonlinearity and dynamism, which create significant uncertainty in long-term price forecasting. While machine learning models have demonstrated efficacy in stock price forecasting, they rely solely on historical price data; given the inherent volatility and dynamic nature of financial markets, the limited connection between historical and future prices makes such data insufficient to address the complexity and uncertainty of long-term forecasting. This paper introduces an approach that integrates financial theory with deep learning to enhance predictive accuracy and risk management in China's stock index futures market. The proposed SF-Transformer model combines spot-forward parity with the Transformer model to improve forecasting accuracy over both short- and long-term horizons. Derived from the arbitrage-free futures pricing model, spot-forward parity supplies forecasting variables such as the stock index price, the risk-free rate, and the stock index dividend yield. Our insight is that the mutual information contributed by these variables can substantially reduce uncertainty in long-term forecasting. A case study on predicting major Chinese stock index futures prices demonstrates the superiority of the SF-Transformer model over models based on LSTM, MLP, and the stock index futures arbitrage-free pricing model, covering both short- and long-term forecasting up to 28 days. Unlike existing machine learning models, the Transformer processes the entire time series concurrently, leveraging its attention mechanism to discern intricate dependencies and capture long-range relationships, thereby offering a holistic understanding of the time series data. An enhancement of mutual information is observed after introducing spot-forward parity into the forecasting.
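The spot-forward parity relation referenced above is the standard cost-of-carry, arbitrage-free pricing formula, F_t = S_t · e^((r − q)(T − t)). A minimal sketch of that formula follows; the function name and the example inputs are illustrative assumptions, not values or code from the paper:

```python
import math

def spot_forward_parity(spot, risk_free_rate, dividend_yield, tau):
    """Theoretical futures price under the cost-of-carry (arbitrage-free) model.

    spot            -- current stock index level S_t
    risk_free_rate  -- continuously compounded risk-free rate r
    dividend_yield  -- continuously compounded index dividend yield q
    tau             -- time to maturity in years, T - t
    """
    return spot * math.exp((risk_free_rate - dividend_yield) * tau)

# Illustrative values: index at 4000, r = 3%, q = 2%, 3 months to maturity.
fair_value = spot_forward_parity(4000.0, 0.03, 0.02, 0.25)
```

When the carry r − q is positive, the theoretical futures price sits above spot; these same inputs (index price, risk-free rate, dividend yield) are the extra forecasting variables the SF-Transformer consumes.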
The variation in mutual information and the ablation study results highlight the significant contribution of spot-forward parity, particularly to long-term forecasting. Overall, these findings demonstrate the SF-Transformer model's efficacy in leveraging spot-forward parity to reduce uncertainty and to advance robust, comprehensive approaches to long-term stock index futures price forecasting.
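The mutual-information analysis above quantifies how much uncertainty about futures prices the parity variables remove. For discretized (binned) series, empirical mutual information can be sketched as follows; this plug-in estimator is a generic illustration under assumed binning, not the authors' estimator:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in nats between two equal-length
    discrete (e.g. binned) sequences, via the plug-in estimator:
    I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) * p(y)) ).
    """
    n = len(xs)
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    pxy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi
```

Identical sequences recover the full entropy of X, while independent sequences yield zero, matching the intuition that higher mutual information between predictor variables and the target means lower residual forecasting uncertainty.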