---
license: other
language:
- en
- fr
- es
- hi
- zh
- code
base_model: microsoft/Orca-2-13b
datasets:
- HuggingFaceH4/no_robots
- mlabonne/guanaco-llama2-1k
- OpenAssistant/oasst_top1_2023-08-25
- totally-not-an-llm/EverythingLM-data-V3
---
The "microsoft/Orca-2-13b" model fully fine-tuned on HuggingFaceH4/no_robots, totally-not-an-llm/EverythingLM-data-V3, mlabonne/guanaco-llama2-1k, and OpenAssistant/oasst_top1_2023-08-25. This model achieved a test loss of 0.18.

Make sure to comply with the Microsoft Research License. Please read it before using this model.

This model was trained using the following chat template:
"<|USER|> message <|ASSISTANT|> message"