Update README.md
README.md CHANGED
@@ -29,11 +29,11 @@ tags:

**Update** (2024.5) [Timer](https://arxiv.org/abs/2402.02368), a large-scale pre-trained time series Transformer, was accepted by ICML 2024.

-
+**Timer** is a large time-series model introduced in this [paper](https://arxiv.org/abs/2402.02368) and enhanced with [further work](https://arxiv.org/abs/2410.04803).



-This version is pre-trained on **260B** univariate time points with **84M** parameters, a lightweight generative Transformer for zero-shot point forecasting.
+This version is pre-trained on **260B** univariate time points with **84M** parameters, a lightweight generative Transformer for **zero-shot** point forecasting.

We evaluate the model on the following benchmark: [TSLib Dataset](https://cdn-uploads.huggingface.co/production/uploads/64fbe24a2d20ced4e91de38a/AXhZLVGR8Cnuxe8CVK4Fu.png).

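For readers of the updated card, a zero-shot point forecast with this checkpoint would look roughly as below. This is a minimal sketch, not a confirmed recipe: the checkpoint id `thuml/timer-base-84m` and the float-tensor `generate` interface are assumptions about the model's remote code that this diff does not state.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed checkpoint id; this diff does not name one.
model = AutoModelForCausalLM.from_pretrained(
    "thuml/timer-base-84m", trust_remote_code=True
)

# One univariate series, up to the 2880-point context limit.
context = torch.randn(1, 2880)

# Normalize per series; the remote code is assumed to accept raw
# float tensors and to generate point forecasts autoregressively.
mean = context.mean(dim=-1, keepdim=True)
std = context.std(dim=-1, keepdim=True)
forecast = model.generate((context - mean) / std, max_new_tokens=96)
forecast = forecast * std + mean  # undo the normalization
print(forecast.shape)  # expected (1, 96) under these assumptions
```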
@@ -68,12 +68,12 @@ A notebook example is also provided [here](https://github.com/thuml/Large-Time-S

## Specification

-* Architecture
-* Pre-training Scale
-* Context Length
-* Parameter Count
-* Patch Length
-* Number of Layers
+* **Architecture**: Causal Transformer (Decoder-only)
+* **Pre-training Scale**: 260B time points
+* **Context Length**: up to 2880
+* **Parameter Count**: 84M
+* **Patch Length**: 96
+* **Number of Layers**: 8

## Adaptation

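One implication of the new specification: assuming non-overlapping patching as in the paper, a patch length of 96 means the maximum context of 2880 points is only 30 tokens for the causal Transformer. A small sketch of that bookkeeping (the helper name is illustrative, not from the repository):

```python
def num_patches(context_length: int, patch_length: int = 96) -> int:
    # Timer tokenizes a series into non-overlapping patches, so the
    # token count seen by the Transformer is the integer ratio.
    assert context_length % patch_length == 0, "context must align to patch length"
    return context_length // patch_length

print(num_patches(2880))  # 30 tokens at the maximum context length
```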