lapp0 committed
Commit 1ee9733 · verified · 1 Parent(s): b67d23b

End of training

Files changed (1)
  1. README.md +74 -0
README.md CHANGED
@@ -0,0 +1,74 @@
---
base_model: gpt2
library_name: distily
license: mit
tags:
- generated_from_trainer
model-index:
- name: gpt2_model_card_distily_test
  results: []
---

# gpt2_model_card_distily_test

This student model was distilled from the teacher model [gpt2](https://huggingface.co/gpt2) using the dataset (unspecified).

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
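
Since the student keeps GPT-2's architecture and tokenizer, it loads like any other `transformers` causal LM. A minimal usage sketch follows; the Hub repo id is an assumption inferred from the committer and model name, not confirmed by this card:

```python
# Minimal usage sketch. The repo id is assumed from the committer and
# model name; substitute the actual Hub path if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "lapp0/gpt2_model_card_distily_test"  # assumed, not confirmed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Knowledge distillation is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```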

It achieves the following results on the evaluation set:
- eval_enwikippl: 3305.2227
- eval_frwikippl: 13155.6016
- eval_zhwikippl: 73644.4141
- eval_loss: 2480.0
- eval_runtime: 0.0549
- eval_samples_per_second: 18.218
- eval_steps_per_second: 18.218
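
The `eval_*ppl` metrics are perplexities on English, French, and Chinese Wikipedia evaluation slices. As a rough sketch of how such a number is computed (standard causal-LM perplexity; Distily's exact evaluation code may differ):

```python
import torch

def perplexity(model, tokenizer, text: str) -> float:
    # Standard causal-LM perplexity: exponentiated mean token-level
    # negative log-likelihood. A generic sketch, not Distily's exact metric.
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return torch.exp(out.loss).item()
```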

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
-->

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- distillation_strategy: logits_activations
- loss_fn: reverse_kl (see the sketch after this list)
- train_embeddings: True
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 1.0
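
The `reverse_kl` objective flips the arguments of the usual distillation KL: the divergence KL(student ∥ teacher) is taken under the student's own distribution, which tends to make the student mode-seeking rather than mean-covering. A minimal sketch of such a loss over logits (a generic formulation, not necessarily Distily's exact implementation):

```python
import torch.nn.functional as F

def reverse_kl_loss(student_logits, teacher_logits, temperature: float = 1.0):
    # KL(student || teacher): the expectation is taken under the *student*
    # distribution. Generic sketch; Distily's actual loss_fn may differ.
    s_logp = F.log_softmax(student_logits / temperature, dim=-1)
    t_logp = F.log_softmax(teacher_logits / temperature, dim=-1)
    # sum_x p_s(x) * (log p_s(x) - log p_t(x)), averaged over positions
    return (s_logp.exp() * (s_logp - t_logp)).sum(dim=-1).mean()
```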

### Resource Usage
Peak GPU Memory: 1.2453 GB

### Model Results
| epoch | eval_enwikippl | eval_frwikippl | eval_loss | eval_runtime | eval_samples_per_second | eval_steps_per_second | eval_zhwikippl | step |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0 | 58716.1836 | 59308.4531 | 6848.0 | 0.078 | 12.815 | 12.815 | 56780.0039 | 0 |
| 0.2513 | 3305.2227 | 13155.6016 | 2480.0 | 0.0549 | 18.218 | 18.218 | 73644.4141 | 50 |
| 0.5025 | 2729.5120 | 12198.3467 | 2288.0 | 0.0559 | 17.885 | 17.885 | 86770.4375 | 100 |
| 0.7538 | 2523.9104 | 11865.6045 | 2240.0 | 0.0557 | 17.969 | 17.969 | 91730.1328 | 150 |

### Framework versions
- Distily 0.1.0
- Transformers 4.43.3
- Pytorch 2.3.0
- Datasets 2.20.0