Multi-Variable Time Series Decoding with Long Short-Term Memory and Mixture Attention

Journal Title: Acadlore Transactions on AI and Machine Learning - Year 2023, Vol 2, Issue 3

Abstract

The task of interpreting multi-variable time series data, while also forecasting outcomes accurately, is an ongoing challenge within the machine learning domain. This study presents an advanced method of utilizing Long Short-Term Memory (LSTM) recurrent neural networks in the analysis of such data, with specific attention to both target and exogenous variables. The novel approach aims to extract hidden states that are unique to individual variables, thereby capturing the distinctive dynamics inherent in multi-variable time series and allowing the elucidation of each variable's contribution to predictive outcomes. A pioneering mixture attention mechanism is introduced, which, by leveraging the aforementioned variable-specific hidden states, characterizes the generative process of the target variable. The study further enhances this methodology by formulating associated training techniques that permit concurrent learning of network parameters, variable interactions, and temporal significance with respect to the target prediction. The effectiveness of this approach is empirically validated through rigorous experimentation on three real-world datasets, including the 2022 closing prices of three major stocks - Apple (AAPL), Amazon (AMZN), and Microsoft (MSFT). The results demonstrated superior predictive performance, attributable to the successful encapsulation of the diverse dynamics of different variables. Furthermore, the study provides a comprehensive evaluation of the interpretability outcomes, both qualitatively and quantitatively. The presented framework thus holds substantial promise as a comprehensive solution that not only enhances prediction accuracy but also aids in the extraction of valuable insights from complex multi-variable datasets.
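The core idea in the abstract — variable-specific hidden states combined through a mixture attention over variables — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the authors' implementation: the dimensions are arbitrary, random arrays stand in for trained per-variable LSTM states, and the scoring vectors (`w_t`, `w_g`, `w_y`) are hypothetical parameters that would be learned jointly in the real model.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: N input variables, T time steps, d hidden units.
N, T, d = 3, 10, 8

# Variable-specific hidden state sequences, as if produced by per-variable
# partitions of an LSTM (random placeholders here).
H = rng.standard_normal((N, T, d))

# Temporal attention within each variable: score every time step, then pool
# into one context vector per variable.
w_t = rng.standard_normal(d)
alpha = softmax(H @ w_t, axis=1)              # (N, T) temporal weights
context = (alpha[..., None] * H).sum(axis=1)  # (N, d) variable-wise summaries

# Mixture over variables: each variable proposes a prediction of the target,
# and a gating softmax decides each variable's contribution.
w_y = rng.standard_normal(d)                  # toy shared predictor
w_g = rng.standard_normal(d)                  # toy gating vector
preds = context @ w_y                         # (N,) variable-specific predictions
g = softmax(context @ w_g)                    # (N,) mixture weights, sum to 1
y_hat = float(g @ preds)                      # final mixed target estimate
```

Inspecting `alpha` and `g` after training is what would yield the temporal and variable-level interpretability the abstract refers to: `g` ranks variables by contribution, while each row of `alpha` ranks time steps within a variable.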

Authors and Affiliations

Soukaina Seddik, Hayat Routaib, Anass Elhaddadi

  • EP ID EP731892
  • DOI https://doi.org/10.56578/ataiml020304

How To Cite

Soukaina Seddik, Hayat Routaib, Anass Elhaddadi (2023). Multi-Variable Time Series Decoding with Long Short-Term Memory and Mixture Attention. Acadlore Transactions on AI and Machine Learning, 2(3), -. https://europub.co.uk./articles/-A-731892