Model mistralai/Mistral-7B-v0.1, full fine-tuning of all layers on ~4B tokens from the dataset.

Training took 130 hours on 2× Tesla H100.
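For a rough sense of throughput, divide the token budget by the wall-clock time. This is a back-of-the-envelope sketch: the ~4B-token and 130-hour figures come from this card, the rest is arithmetic.

```python
# Rough training throughput estimate from the figures above.
tokens = 4e9             # ~4B tokens (approximate, per the card)
hours = 130              # wall-clock time on 2x H100
tokens_per_second = tokens / (hours * 3600)
print(f"~{tokens_per_second:,.0f} tokens/s across both GPUs")
```

That works out to roughly 8,500 tokens/s aggregate, i.e. about 4,300 tokens/s per H100.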

```yaml
batch_size: 20
epochs: 1
optimizer:
  _component_: torch.optim.AdamW
  lr: 5e-6
  weight_decay: 0.01
loss:
  _component_: torch.nn.CrossEntropyLoss
max_steps_per_epoch: null
gradient_accumulation_steps: 5
```

Sequence length: 1024 tokens.
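With the config above, the effective batch per optimizer step is batch size × gradient-accumulation steps × number of GPUs. A small sketch of the arithmetic; treating `batch_size: 20` as per-GPU is an assumption:

```python
batch_size = 20   # from the config above (assumed per-GPU)
grad_accum = 5    # gradient_accumulation_steps
seq_len = 1024    # sequence length
num_gpus = 2      # 2x H100

seqs_per_step = batch_size * grad_accum * num_gpus   # sequences per optimizer step
tokens_per_step = seqs_per_step * seq_len            # tokens per optimizer step
print(seqs_per_step, tokens_per_step)
```

That is 200 sequences, or about 205K tokens, per optimizer step.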

Loss curve: (PNG image on the model page)

Evaluated with https://github.com/NLP-Core-Team/mmlu_ru

With 4-bit quantization: accuracy_total=41.86218134391028
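The model can be loaded 4-bit quantized via `transformers` with `bitsandbytes`; a minimal config sketch, not the card's exact evaluation setup (the NF4 quant type and fp16 compute dtype are assumptions; the card only says 4-bit):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit quantization config (assumed settings; the card only states "4-bit")
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    "kirv/Mistral-7b-tokens4b-v1",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("kirv/Mistral-7b-tokens4b-v1")
```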

Weights: Safetensors, 7.24B params, FP16.

Model: kirv/Mistral-7b-tokens4b-v1, finetuned from mistralai/Mistral-7B-v0.1.