Found 22 repositories (showing 22)
BLarzalere
Probabilistic Forecast of a Multivariate Time Series using the Temporal Fusion Transformer & PyTorch Lightning
AlfredT15
Multivariate time-series forecasting on stock market data using a transformer neural network
AlgazinovAleksandr
Multivariate time series analysis and forecasting. Uses multiple models, such as CatBoost, Prophet, LSTM, Seq2Seq, Transformer, and AutoML, to compare classical, ML-based, and deep learning approaches.
This project develops a Transformers-based neural network for modeling and forecasting multivariate time series data from a COVID-19 dataset in Poland. The implementation is in Python, utilizing the Keras library for neural network design and training, along with numpy, pandas, matplotlib, and sklearn for data analysis and model evaluation.
MadaraSemini
A multivariate time series Transformer model to forecast financial markets using historical data.
gavisangavi2502-max
This project focuses on advanced time series forecasting using deep learning models such as LSTM with Attention and Transformers. It includes synthetic multivariate data generation, preprocessing, feature engineering, SARIMAX baseline comparison, model training, and evaluation using MAE, RMSE, and MAPE.
Advanced multivariate time-series forecasting project using deep learning with attention mechanisms. Includes data preprocessing, a baseline LSTM model, a Transformer-based attention model, hyperparameter tuning, comparative evaluation using RMSE/MAE, and attention-weight interpretation for feature importance.
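Several of the repositories above and below evaluate forecasts with MAE, RMSE, and MAPE. As a reference, these three metrics can be sketched as follows; this is a minimal illustration, not code taken from any of the listed projects:

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the errors
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    # Root Mean Squared Error: penalizes large errors more heavily
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mape(y_true, y_pred, eps=1e-8):
    # Mean Absolute Percentage Error, in percent; eps guards
    # against division by zero for near-zero targets
    return np.mean(np.abs((y_true - y_pred) / (y_true + eps))) * 100
```

MAE and RMSE are scale-dependent, while MAPE is scale-free, which is why these projects typically report all three side by side.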
gavisangavi2502-max
Advanced multivariate time-series forecasting project using deep learning and attention mechanisms. Implements a full Transformer encoder-decoder, LSTM and ARIMA baselines, rigorous preprocessing, hyperparameter tuning, and detailed evaluation using RMSE, MAE, and MAPE with attention-based insights.
JillianWatson
Multivariate time series forecasting of Alberta livestock trade using Transformers and PySpark.
troy3977-blip
Production-grade multivariate time-series forecasting using Temporal Fusion Transformer, MLflow, and Azure.
Advanced multivariate time series forecasting project using LSTM/Transformer models with uncertainty quantification and explainability (SHAP/IG).
Marudha-Nayagam
A deep learning framework for forecasting complex, non-stationary time series using attention-based models such as Transformers, Temporal Fusion Transformer, and Informer. Includes multivariate data generation, windowing, hyperparameter tuning, and advanced metrics.
This repository contains the source code and research for the M.Sc. thesis, "Normalizing Flows for Regular Time Series Forecasting". It explores the use of RNNs, Transformers, and the Shiesh activation function for multivariate probabilistic forecasting.
A deep learning-based multivariate time series project comparing Transformer, TCN, N-BEATS, TFT, TimeLLM, and TimesFM architectures for forecasting flight arrival delays using operational and meteorological features.
Transformer-based multivariate time series forecasting using PyTorch with sliding window modeling. Compares deep learning performance against a SARIMA baseline using RMSE, MAE, and MAPE metrics. Includes preprocessing, training, evaluation, and visualization pipeline.
Built a production-ready Transformer-based deep learning model for long-range multivariate time series forecasting on the Electricity Transformer Temperature (ETT) dataset. The model predicts the next 24-hour oil temperature using 96-step historical multivariate inputs, outperforming LSTM and ARIMA baselines in RMSE and MAE.
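The sliding-window modeling these entries describe (e.g. a 96-step multivariate history predicting a 24-step horizon) can be sketched roughly as below. The window sizes mirror the ETT entry above, but the code is an illustrative assumption, not taken from any listed repository:

```python
import numpy as np

def make_windows(series, history=96, horizon=24):
    """Slice a (T, F) multivariate series into (history, F) inputs
    and (horizon, F) targets using a stride-1 sliding window."""
    X, y = [], []
    for start in range(len(series) - history - horizon + 1):
        X.append(series[start : start + history])
        y.append(series[start + history : start + history + horizon])
    return np.stack(X), np.stack(y)

# Example: 200 time steps of 3 features yields 200-96-24+1 = 81 windows
data = np.random.randn(200, 3)
X, y = make_windows(data)
# X.shape == (81, 96, 3), y.shape == (81, 24, 3)
```

In practice the windows are then split chronologically (not shuffled) into train/validation/test sets to avoid leakage across time.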
alageshmithun-salem
This project builds an advanced deep learning model for multivariate time series forecasting using Transformer and LSTM-Attention architectures. It includes synthetic data generation, sliding-window preprocessing, Optuna hyperparameter tuning, and evaluation using RMSE and MAE.
This project builds an advanced multivariate time series forecasting system using LSTM and Transformer models. It predicts future demand from historical features and includes preprocessing, model training, evaluation, and SHAP-based explainability for feature importance.
An advanced multivariate time-series forecasting project using a custom Transformer-based attention model. Includes data generation, preprocessing, model training, baseline comparisons with LSTM and ARIMA, and interpretability through attention-weight analysis for deeper temporal insights.
A Transformer-based model was developed for multivariate time-series forecasting, using attention to capture long-term patterns. A synthetic dataset with trend and seasonality was created, and the model was compared against SARIMA and XGBoost, showing improved accuracy for multi-step prediction.
shobanaganesan005
This project builds a multivariate time-series forecasting pipeline using an attention-based Transformer. A synthetic dataset with trend and seasonality is generated, and a seq2seq model with positional encoding predicts long-horizon targets. Performance is rigorously evaluated against an ARIMA baseline using MAE and RMSE.
This project implements an end-to-end pipeline for multivariate time series forecasting using a Transformer-based deep learning model with attention mechanisms. We benchmark against traditional models (ARIMA, Exponential Smoothing) and interpret the learned attention weights to understand temporal feature importance.
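The sinusoidal positional encoding referenced in these last entries (needed because attention alone is order-agnostic) follows the standard formulation from "Attention Is All You Need". A minimal sketch, with illustrative dimensions that are not taken from any listed repository:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))"""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]     # (1, d_model // 2)
    angle = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle)              # even dims get sine
    pe[:, 1::2] = np.cos(angle)              # odd dims get cosine
    return pe

# One encoding row per time step, added to the input embeddings
pe = positional_encoding(96, 64)
```

Each position gets a unique pattern across frequencies, letting the attention layers distinguish earlier from later time steps in the window.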