lapp0 committed on
Commit 5a1d8bd · verified · 1 Parent(s): d30b7e8

End of training

README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
- - eval_enwikippl: 18151.8379
- - eval_frwikippl: 38363.0352
- - eval_zhwikippl: 56660.7266
- - eval_loss: 0.0004
- - eval_runtime: 0.0556
- - eval_samples_per_second: 17.976
- - eval_steps_per_second: 17.976
+ - eval_enwikippl: 18261.1387
+ - eval_frwikippl: 38633.1055
+ - eval_zhwikippl: 52085.4805
+ - eval_loss: 0.0005
+ - eval_runtime: 0.0656
+ - eval_samples_per_second: 15.248
+ - eval_steps_per_second: 15.248
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -57,19 +57,19 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
- Peak GPU Memory: 1.2477 GB
+ Peak GPU Memory: 1.2411 GB
 
 ### Model Results
 `eval_` metrics:
 
- | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl | epoch | step |
+ | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
- | | | | | | | | | **teacher eval** |
- | | | | | | | | 0 | 0 |
- | | | | | | | | 0.3030 | 30 |
- | | | | | | | | 0.6061 | 60 |
- | | | | | | | | 0.9091 | 90 |
- | | | | | | | | 1.0 | 99 |
+ | **teacher eval** | | 30.2266 | 57.3005 | | | | | 18.1903 |
+ | 0 | 0 | 58974.8945 | 59857.6992 | 0.0042 | 0.1173 | 8.525 | 8.525 | 60252.3672 |
+ | 30 | 0.3030 | 26646.1797 | 43684.125 | 0.0006 | 0.0661 | 15.123 | 15.123 | 53511.3242 |
+ | 60 | 0.6061 | 18083.6934 | 38626.9922 | 0.0005 | 0.0647 | 15.459 | 15.459 | 53146.3672 |
+ | 90 | 0.9091 | 18261.8535 | 38627.6914 | 0.0005 | 0.0656 | 15.248 | 15.248 | 52085.4805 |
+ | 99 | 1.0 | 18261.1387 | 38633.1055 | 0.0005 | 0.0656 | 15.248 | 15.248 | 52085.4805 |
 
 ### Framework versions
 - Distily 0.1.0
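The `eval_enwikippl`, `eval_frwikippl`, and `eval_zhwikippl` metrics in the diff above are perplexities on English, French, and Chinese Wikipedia text. As background (not Distily's code): perplexity is the exponential of the mean per-token cross-entropy, so it is computed from a language-modeling loss rather than from the distillation `eval_loss` shown here. A minimal sketch, with an illustrative helper name and sample values:

```python
import math

def perplexity(mean_cross_entropy: float) -> float:
    """Perplexity = exp(mean per-token cross-entropy in nats).

    Note: this takes a language-modeling loss; the distillation
    eval_loss in the card above is a different (teacher-matching)
    objective and does not convert to perplexity this way.
    """
    return math.exp(mean_cross_entropy)

# A mean cross-entropy of 0 nats/token means the model is certain
# of every token, giving the minimum possible perplexity of 1.
print(perplexity(0.0))  # → 1.0
```

This is why perplexities in the tens of thousands (as in the early training steps above) correspond to a cross-entropy around 10-11 nats per token.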
runs/Aug05_22-19-50_232a0f8c3879/events.out.tfevents.1722896645.232a0f8c3879 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a2a57109d18fbda93bc35dca9b3c8d3ead3dc68411b6dd80e1b03646e70f628a
+ size 245