llama-moe / LLaMA-MoE-v2-3_8B-2_8-sft
Safetensors · English · mixtral · MoE · custom_code · arXiv: 2411.15708 · License: apache-2.0
LLaMA-MoE-v2-3_8B-2_8-sft / generation_config.json
Committed by huxy912 (d2f45e6, 11 months ago) · 194 Bytes
{
  "bos_token_id": 128000,
  "do_sample": true,
  "eos_token_id": [128001, 128009],
  "max_length": 4096,
  "temperature": 0.6,
  "top_p": 0.9,
  "transformers_version": "4.42.4"
}
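As a minimal sketch of what these fields mean in practice, the config can be parsed with plain Python; the string below simply mirrors the file contents above, and the comments state the standard interpretation of these sampling parameters (this is an illustrative parse, not part of the repo):

```python
import json

# Contents of generation_config.json as shown above.
raw = '''{
  "bos_token_id": 128000,
  "do_sample": true,
  "eos_token_id": [128001, 128009],
  "max_length": 4096,
  "temperature": 0.6,
  "top_p": 0.9,
  "transformers_version": "4.42.4"
}'''

config = json.loads(raw)

# do_sample=true with temperature 0.6 and top_p 0.9 selects nucleus
# (top-p) sampling rather than greedy decoding; generation stops when
# either eos_token_id value (128001 or 128009) is produced, or at
# max_length (4096) tokens.
assert config["do_sample"] is True
assert 128009 in config["eos_token_id"]
```

When the model is loaded through the transformers library, this file is picked up automatically; it can also be fetched explicitly with `GenerationConfig.from_pretrained("llama-moe/LLaMA-MoE-v2-3_8B-2_8-sft")`.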