End of training
README.md CHANGED

```diff
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_tinystoriesppl:
-- eval_loss:
-- eval_runtime: 6.
-- eval_samples_per_second: 76.
-- eval_steps_per_second: 9.
+- eval_enwikippl: 104.6652
+- eval_frwikippl: 13772.8643
+- eval_zhwikippl: 74161.4531
+- eval_tinystoriesppl: 5.5260
+- eval_loss: 0.7819
+- eval_runtime: 6.5261
+- eval_samples_per_second: 76.616
+- eval_steps_per_second: 9.654
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -62,17 +62,17 @@ Peak GPU Memory: 6.6064 GB
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
-| 0 | 0 |
-| 5000 | 0.1010 |
-| 10000 | 0.2020 |
-| 15000 | 0.3030 |
-| 20000 | 0.4040 |
-| 25000 | 0.5051 |
-| 30000 | 0.6061 |
-| 35000 | 0.7071 |
-| 40000 | 0.8081 |
-| 45000 | 0.9091 |
-| 49500 | 1.0 |
+| 0 | 0 | 25306.5312 | 80342.6562 | 6.4738 | 6.54 | 76.453 | 9.633 | 14565.9658 | 71518.8438 |
+| 5000 | 0.1010 | 104.6652 | 13772.8643 | 0.7819 | 6.5261 | 76.616 | 9.654 | 5.5260 | 74161.4531 |
+| 10000 | 0.2020 | 144.5553 | 13569.7109 | 0.7842 | 6.5111 | 76.792 | 9.676 | 8.9185 | 62270.8359 |
+| 15000 | 0.3030 | 105.4526 | 12598.8818 | 0.7708 | 6.5186 | 76.704 | 9.665 | 5.6194 | 53872.625 |
+| 20000 | 0.4040 | 121.5509 | 12060.5781 | 0.7610 | 6.5194 | 76.694 | 9.663 | 7.1313 | 52133.4336 |
+| 25000 | 0.5051 | 111.6548 | 13016.4775 | 0.7537 | 6.5166 | 76.727 | 9.668 | 6.0700 | 53485.9688 |
+| 30000 | 0.6061 | 101.6823 | 11441.6719 | 0.7577 | 6.5294 | 76.577 | 9.649 | 5.7104 | 48007.9258 |
+| 35000 | 0.7071 | 97.8760 | 10992.2207 | 0.7519 | 6.5151 | 76.745 | 9.67 | 5.5543 | 47549.0430 |
+| 40000 | 0.8081 | 114.8546 | 11104.2744 | 0.7378 | 6.5634 | 76.18 | 9.599 | 6.9089 | 42804.5273 |
+| 45000 | 0.9091 | 112.3336 | 11524.9678 | 0.7228 | 6.5648 | 76.164 | 9.597 | 6.6096 | 46781.4727 |
+| 49500 | 1.0 | 110.1023 | 10899.7119 | 0.7097 | 6.5487 | 76.351 | 9.62 | 6.5616 | 49450.9141 |
 
 ### Framework versions
 - Distily 0.2.0
```
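The eval_*ppl values filled in above appear to be perplexities on English/French/Chinese Wikipedia text and on TinyStories. As a rough illustration only, here is a minimal sketch of how a causal-LM perplexity can be computed for the distilled student with Hugging Face Transformers; the repo id and the sample sentence are placeholders (assumptions), not values taken from this card.

```python
# Minimal sketch (not from this card): token-level perplexity of a causal LM.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-username/your-distilled-student"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID).eval()


def perplexity(text: str) -> float:
    """exp of the mean cross-entropy loss of the model on `text`."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return math.exp(out.loss.item())


# Arbitrary TinyStories-style probe sentence, not part of any eval split.
print(perplexity("Once upon a time, there was a little dog who loved to play."))
```

A single-sentence probe like this will not reproduce the card's aggregate numbers, which come from the trainer's evaluation loop over full evaluation splits; it only illustrates the metric.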
logs/copy_teacher_modules=_(_lm_head___True)_, dropout=0, learning_rate=4e-05, warmup_ratio=0, weight_decay=0.01/events.out.tfevents.1723998203.5f530b1cf724 ADDED

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d8813fe3fae697f2f0640c456e3db71924747a0db2c576e2494376e348b7e060
+size 312
```
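The three added lines are a Git LFS pointer, not the log itself; the actual TensorBoard event file is fetched with `git lfs pull`. A minimal sketch of inspecting its logged scalars with the `tensorboard` package, assuming the local checkout mirrors the repo layout:

```python
# Minimal sketch (assumes `git lfs pull` has fetched the event file):
# read scalar series from the TensorBoard run directory added in this commit.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

RUN_DIR = (
    "logs/copy_teacher_modules=_(_lm_head___True)_, dropout=0, "
    "learning_rate=4e-05, warmup_ratio=0, weight_decay=0.01"
)

acc = EventAccumulator(RUN_DIR)  # accepts a run directory or a single event file
acc.Reload()                     # parse the events from disk

# Print every logged scalar tag with its (step, value) series.
for tag in acc.Tags()["scalars"]:
    for event in acc.Scalars(tag):
        print(tag, event.step, event.value)
```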