Llama 3 Translation 8B v0.170

  • Eval Loss: 0.70520
  • Train Loss: 0.49367
  • Learning rate: 9e-05
  • Optimizer: AdamW
  • LR scheduler type: cosine
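
These training settings map directly onto a standard Hugging Face `TrainingArguments` configuration. The sketch below is illustrative only, assuming the model was fine-tuned with the transformers Trainer: the learning rate, optimizer, and scheduler come from the values above, while the output directory, batch size, and epoch count are placeholders, not documented settings.

```python
# Illustrative TrainingArguments matching the hyperparameters listed above.
# Only learning_rate, optim, and lr_scheduler_type come from this card;
# the remaining fields are placeholder assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="llama-3-translation-8b",  # placeholder path
    learning_rate=9e-05,                  # lr from the card
    optim="adamw_torch",                  # AdamW optimizer
    lr_scheduler_type="cosine",           # cosine learning-rate schedule
    bf16=True,                            # matches the BF16 tensor type listed below
    per_device_train_batch_size=1,        # assumption, not documented
    num_train_epochs=1,                   # assumption, not documented
)
```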

Prompt Template

<|begin_of_text|><|start_header_id|>user<|end_header_id|>

Translate into Korean:Hamsters don't eat cats.<|eot_id|><|start_header_id|>assistant<|end_header_id|>

ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.<|eot_id|><|end_of_text|>
<|begin_of_text|><|start_header_id|>user<|end_header_id|>

Translate into English:ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.<|eot_id|><|start_header_id|>assistant<|end_header_id|>

Hamsters do not eat cats.<|eot_id|><|end_of_text|>
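
To run the model with this template, the minimal transformers sketch below builds the prompt string verbatim and stops generation at `<|eot_id|>`. The repository id comes from this card; the dtype, device placement, and generation settings (such as `max_new_tokens`) are assumptions rather than documented values.

```python
# Minimal inference sketch for the prompt template above, using Hugging Face
# transformers. Generation settings are assumptions, not documented values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "lemon-mint/Llama-3-Translation-8B-v0.170"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the prompt exactly as in the template: one user turn, then the
# assistant header that the model completes with the translation.
text = "Hamsters don't eat cats."
prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    f"Translate into Korean:{text}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)

# add_special_tokens=False because the template already includes <|begin_of_text|>.
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,  # assumption
    eos_token_id=tokenizer.convert_tokens_to_ids("<|eot_id|>"),
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```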

Model Description

  • Model size: 8.03B params (Safetensors)
  • Tensor type: BF16
