Dataset preview (row schema)

Each row of the dataset preview describes one Hugging Face model (for example `claritylab/zero-shot-vanilla-gpt2` or `onnx-community/Qwen2.5-0.5B-Instruct`) with its release date, tags, structural configuration, and Hub statistics. The columns and their types are:

| Column | Type |
|---|---|
| model | string |
| date | string |
| gated | string |
| disabled | bool |
| base_model | string |
| tags_str | string |
| size | float64 |
| act | string |
| d_model | float64 |
| d_ffn | float64 |
| heads | float64 |
| layers | float64 |
| kv_heads | string |
| vocab_size | float64 |
| pos | float64 |
| n_exp | null |
| selected_exp | null |
| shared_exp | null |
| dtype | string |
| model_type | string |
| multimodal | bool |
| downloads | float64 |
| likes | float64 |
| moe | bool |
| type | string |
From Parameters to Performance: A Data-Driven Study on LLM Structure and Development
This dataset is the official companion to the paper "From Parameters to Performance: A Data-Driven Study on LLM Structure and Development". It provides a comprehensive collection of structural configurations and performance metrics for a wide range of open-source Large Language Models (LLMs), enabling data-driven research on how structural choices impact model performance.
Paper: https://huggingface.co/papers/2509.18136
Code: https://github.com/DX0369/llm-structure-performance
Abstract
Large language models (LLMs) have achieved remarkable success across various domains, driving significant technological advancements and innovations. Despite the rapid growth in model scale and capability, systematic, data-driven research on how structural configurations affect performance remains scarce. To address this gap, we present a large-scale dataset encompassing diverse open-source LLM structures and their performance across multiple benchmarks. Leveraging this dataset, we conduct a systematic, data mining-driven analysis to validate and quantify the relationship between structural configurations and performance. Our study begins with a review of the historical development of LLMs and an exploration of potential future trends. We then analyze how various structural choices impact performance across benchmarks and further corroborate our findings using mechanistic interpretability techniques. By providing data-driven insights into LLM optimization, our work aims to guide the targeted development and application of future models. We will release our dataset at this https URL
Main Contributions
- Large-Scale LLM Structure and Performance Dataset: We introduce a comprehensive dataset containing structural configurations and performance metrics for a wide range of open-source LLMs. The dataset is available at Hugging Face.
- Quantitative Study on Structure's Impact: We provide a large-scale, quantitative validation of how structural configurations (e.g., layer depth, FFN size) influence LLM performance across different benchmarks.
- Mechanistic Interpretability Validation: We use layer-pruning and gradient analysis techniques to validate and provide deeper insights into our data-driven findings (a generic layer-pruning sketch is shown after this list).
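As a concrete illustration of the layer-pruning idea, the snippet below removes the last few transformer blocks of a small causal LM and compares perplexity before and after. This is a generic sketch, not the paper's exact procedure; the model name `gpt2` is only a convenient small example.

```python
# Generic layer-pruning sketch (not the paper's exact procedure):
# drop the last k transformer blocks of a small causal LM and compare perplexity.
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "gpt2"  # any small causal LM serves for illustration
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name).eval()

def perplexity(m, text):
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = m(ids, labels=ids).loss
    return math.exp(loss.item())

text = "Large language models have achieved remarkable success across various domains."
print("full model perplexity:", perplexity(model, text))

# GPT-2 keeps its blocks in model.transformer.h; slicing the ModuleList removes layers.
model.transformer.h = model.transformer.h[:-2]
model.config.n_layer = len(model.transformer.h)
print("pruned model perplexity:", perplexity(model, text))
```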
Usage and Reproduction
Follow these steps to reproduce the analysis and figures from the paper.
Step 1: Prepare the Data
The repository includes two of the three necessary data files (`file/merge_performance_parameter.csv` and `file/performance.csv`). The third file, `file/model_info.csv`, can be downloaded from: .
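Once the three files are in place, they can be inspected with pandas. The sketch below assumes the structural column names shown in the dataset preview (e.g. `d_model`, `layers`); the exact columns of `performance.csv` and the merged file may differ.

```python
# Minimal loading sketch; column names follow the dataset preview and may need adjusting.
import pandas as pd

model_info = pd.read_csv("file/model_info.csv")               # per-model structural configuration
performance = pd.read_csv("file/performance.csv")             # benchmark results
merged = pd.read_csv("file/merge_performance_parameter.csv")  # joined table used by the notebooks

print(model_info[["model", "d_model", "d_ffn", "heads", "layers", "model_type"]].head())
```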
(Optional) Fetching New Model Data
The data_obtain.py script is provided for users who wish to gather the latest model information directly from the Hugging Face Hub. This step is not necessary to reproduce the original paper's results.
1. Create `models_list.txt`: this file should contain the list of Hugging Face model IDs you want to analyze, with one ID per line. The `data_obtain.py` script will read from this file.
2. Set your Hugging Face token: for reliable access to the Hugging Face API, set your token as an environment variable: `export HF_TOKEN='your_hf_token_here'`
3. Run the data fetching script: `python data_obtain.py`. This will create the `file/model_info.csv` file (the sketch after this list illustrates what such a fetch involves).
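For orientation, the sketch below shows the kind of Hub query such a script can make with the public `huggingface_hub` API. It is only illustrative: the actual `data_obtain.py` may collect different fields, map config keys differently, and handle gated or missing models.

```python
# Illustrative metadata fetch; the real data_obtain.py may differ in fields and error handling.
import json
import os

import pandas as pd
from huggingface_hub import HfApi, hf_hub_download

api = HfApi(token=os.environ.get("HF_TOKEN"))

with open("models_list.txt") as f:
    model_ids = [line.strip() for line in f if line.strip()]

rows = []
for model_id in model_ids:
    info = api.model_info(model_id)                      # tags, downloads, likes, ...
    cfg_path = hf_hub_download(model_id, "config.json",  # structural configuration
                               token=os.environ.get("HF_TOKEN"))
    with open(cfg_path) as cfg_file:
        cfg = json.load(cfg_file)
    rows.append({
        "model": model_id,
        "downloads": info.downloads,
        "likes": info.likes,
        "d_model": cfg.get("hidden_size"),
        "layers": cfg.get("num_hidden_layers"),
        "heads": cfg.get("num_attention_heads"),
    })

pd.DataFrame(rows).to_csv("file/model_info.csv", index=False)
```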
Step 2: Run the Analysis Notebooks
Once all data files are in the `file/` directory, you can run the Jupyter notebooks to perform the analysis and generate the visualizations. We recommend using Jupyter Lab or Jupyter Notebook.
1. Launch Jupyter: `jupyter lab`
2. Run `analysis.ipynb`: open and run the cells in this notebook to reproduce the analysis and visualizations.
3. Run `regression.ipynb`: open and run the cells in this notebook to reproduce the regression experiments (a toy illustration of the idea follows below).
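A toy version of the structure-vs-performance regression could look like the sketch below. The feature names come from the dataset preview; the benchmark column name (`score`) is a placeholder for whichever benchmark column actually exists in `merge_performance_parameter.csv`.

```python
# Toy regression sketch; "score" is a placeholder for an actual benchmark column.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("file/merge_performance_parameter.csv")
features = ["size", "d_model", "d_ffn", "heads", "layers"]

df = df.dropna(subset=features + ["score"])
reg = LinearRegression().fit(df[features], df["score"])
print(dict(zip(features, reg.coef_)))
```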