Authors: Gerald Woo, Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, Doyen Sahoo
Published on: February 04, 2024
Impact Score: 8.22
Arxiv code: arXiv:2402.02592
Summary
- What is new: Introduction of the Moirai architecture and LOTSA, achieving universal forecasting with a single model.
- Why this is important: Time series forecasting has long been constrained by a one-model-per-dataset framework, in which a separate model must be trained and tuned for every dataset.
- What the research proposes: Moirai, a novel Transformer architecture designed for universal forecasting, trained on LOTSA, a large-scale archive of open time series data.
- Results: Moirai demonstrates competitive or superior zero-shot forecasting performance compared with models trained specifically on the target datasets.
Technical Details
Technological frameworks used: Transformer architecture
Models used: Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai)
Data used: Large-scale Open Time Series Archive (LOTSA)
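To make the masked-encoder idea concrete, below is a minimal, illustrative PyTorch sketch, not the authors' implementation: the observed context is split into patches, the forecast horizon is represented by a learnable mask token, and a Transformer encoder predicts the values at the masked positions. Class names, hyperparameters, and the simple point-forecast head are assumptions chosen for brevity; the actual Moirai additionally uses multiple patch sizes, any-variate attention, and a mixture-distribution output head.

```python
# Minimal sketch (illustrative only) of a masked-encoder forecasting Transformer:
# embed past patches, append mask tokens for the horizon, encode, decode the masks.
import torch
import torch.nn as nn


class MaskedEncoderForecaster(nn.Module):
    def __init__(self, patch_len: int = 16, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, max_patches: int = 128):
        super().__init__()
        self.patch_len = patch_len
        self.patch_embed = nn.Linear(patch_len, d_model)       # embed observed patches
        self.mask_token = nn.Parameter(torch.zeros(d_model))   # placeholder for future patches
        self.pos_embed = nn.Embedding(max_patches, d_model)    # learned positional embeddings
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)               # map hidden states back to values

    def forward(self, context: torch.Tensor, horizon_patches: int) -> torch.Tensor:
        # context: (batch, context_len), with context_len divisible by patch_len
        b, t = context.shape
        ctx_patches = context.reshape(b, t // self.patch_len, self.patch_len)
        tokens = self.patch_embed(ctx_patches)                  # (b, n_ctx, d_model)
        masks = self.mask_token.expand(b, horizon_patches, -1)  # (b, n_hor, d_model)
        seq = torch.cat([tokens, masks], dim=1)
        pos = torch.arange(seq.size(1), device=context.device)
        hidden = self.encoder(seq + self.pos_embed(pos))
        # decode only the masked (future) positions into point forecasts
        return self.head(hidden[:, -horizon_patches:]).reshape(b, -1)


if __name__ == "__main__":
    model = MaskedEncoderForecaster()
    past = torch.randn(8, 128)                 # 8 series, 128 past time steps
    forecast = model(past, horizon_patches=2)  # predict 2 patches = 32 future steps
    print(forecast.shape)                      # torch.Size([8, 32])
```

Because the horizon is expressed as a variable number of mask tokens, the same pretrained encoder can, in principle, serve arbitrary forecast lengths without retraining, which is the property that enables the zero-shot usage described above.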
Potential Impact
A single pretrained universal forecaster could reduce the cost and effort of deploying forecasting across finance, supply chain management, weather forecasting, and other domains that currently rely on dataset-specific models.
Want to implement this idea in a business?
We have generated a startup concept here: ForeCastAI.