Imadken committed
Commit 56958c5 · verified · 1 Parent(s): 8c6681f

Imadken/mistral-7b-platypus-lamini-vxxiii-chat-real_augmented_costumer-dpo

Files changed (3)
  1. README.md +1 -1
  2. adapter_model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -56,7 +56,7 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: cosine
  - lr_scheduler_warmup_steps: 100
- - training_steps: 400
+ - training_steps: 800
  - mixed_precision_training: Native AMP
 
  ### Training results
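
The only substantive README change is raising `training_steps` from 400 to 800. As a rough, hedged illustration of how the hyperparameters listed above could map onto a Hugging Face `transformers` configuration (the repo's actual training script is not part of this commit, so everything below other than the listed values is an assumption):

```python
# Hypothetical sketch: the README's hyperparameters expressed as TrainingArguments.
# Only the values shown in the diff are grounded; output_dir is made up.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mistral-7b-dpo-adapter",  # hypothetical output path
    max_steps=800,                        # training_steps: raised from 400 to 800 in this commit
    lr_scheduler_type="cosine",           # lr_scheduler_type: cosine
    warmup_steps=100,                     # lr_scheduler_warmup_steps: 100
    adam_beta1=0.9,                       # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                    # epsilon=1e-08
    fp16=True,                            # "Native AMP" mixed-precision training
)
```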
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:10be0e1243568b2f972051083208d7e5a117a978d66b043a2038f0304ae5dab7
+ oid sha256:06373a9219f4fca45aa53630c954d04f2e0654aad58f6d6d04d9679a8db0648e
  size 113271504
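
Only the LFS pointer changes here: the retrained adapter weights get a new SHA-256 while the size stays 113271504 bytes. Since the `oid` in a Git LFS pointer is the SHA-256 of the file's contents, a locally pulled copy can be checked against the new pointer (a minimal sketch, assuming the file has been fetched with `git lfs pull` into the working directory):

```python
# Verify the pulled adapter file against the sha256 oid from this commit's LFS pointer.
import hashlib

EXPECTED_OID = "06373a9219f4fca45aa53630c954d04f2e0654aad58f6d6d04d9679a8db0648e"

h = hashlib.sha256()
with open("adapter_model.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert h.hexdigest() == EXPECTED_OID, "adapter_model.safetensors does not match the LFS pointer"
```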
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ff5ac31dac759157f3e007a0012435d14971c8457cca5fdd900aadeab2243665
+ oid sha256:eeb1cc69fce4b8ed14bde5482bc287425c4117ef4dc3a509ec483f3f2ec10ddc
  size 4792