Mungert / Qwen3-1.7B-abliterated-GGUF
Tags: Transformers · GGUF · conversational · arxiv:1910.09700
Files and versions (branch: main)
1 contributor · History: 55 commits

Latest commit (b169d51, verified, 13 days ago) by Mungert:
Delete file(s) containing ['q2_k_l.gguf', 'q3_k_l.gguf', 'q4_k_l.gguf', 'q5_k_l.gguf', 'q6_k_l.gguf', 'q4_0_l.gguf', 'q5_0_l.gguf', 'q4_1_l.gguf', 'q5_1_l.gguf', 'bf16_q6_k.gguf']
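The file listing below can also be retrieved programmatically. A minimal sketch using huggingface_hub (the same library named in the upload commit messages); the repo_id is taken from this page, everything else is standard Hub API usage:

```python
from huggingface_hub import list_repo_files

# Repository shown on this page; all GGUF quants sit at the top level of the main branch.
repo_id = "Mungert/Qwen3-1.7B-abliterated-GGUF"

# Enumerate every file tracked in the repository (the same list as the table below).
for filename in list_repo_files(repo_id):
    print(filename)
```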
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 4 kB | Upload Qwen3-1.7B-abliterated-bf16.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-bf16.gguf | 3.45 GB | Upload Qwen3-1.7B-abliterated-bf16.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-bf16_q8_0.gguf | 2.13 GB | Upload Qwen3-1.7B-abliterated-bf16_q8_0.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-f16_q8_0.gguf | 2.13 GB | Upload Qwen3-1.7B-abliterated-f16_q8_0.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq3_m.gguf | 854 MB | Upload Qwen3-1.7B-abliterated-iq3_m.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq3_s.gguf | 826 MB | Upload Qwen3-1.7B-abliterated-iq3_s.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq3_xs.gguf | 793 MB | Upload Qwen3-1.7B-abliterated-iq3_xs.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq3_xxs.gguf | 754 MB | Upload Qwen3-1.7B-abliterated-iq3_xxs.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq4_nl.gguf | 1.05 GB | Upload Qwen3-1.7B-abliterated-iq4_nl.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-iq4_xs.gguf | 1.01 GB | Upload Qwen3-1.7B-abliterated-iq4_xs.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q3_k_m.gguf | 940 MB | Upload Qwen3-1.7B-abliterated-q3_k_m.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q3_k_s.gguf | 826 MB | Upload Qwen3-1.7B-abliterated-q3_k_s.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q4_0.gguf | 974 MB | Upload Qwen3-1.7B-abliterated-q4_0.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q4_1.gguf | 1.08 GB | Upload Qwen3-1.7B-abliterated-q4_1.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q4_k_m.gguf | 1.11 GB | Upload Qwen3-1.7B-abliterated-q4_k_m.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q4_k_s.gguf | 1.06 GB | Upload Qwen3-1.7B-abliterated-q4_k_s.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q5_0.gguf | 1.19 GB | Upload Qwen3-1.7B-abliterated-q5_0.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q5_1.gguf | 1.3 GB | Upload Qwen3-1.7B-abliterated-q5_1.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q5_k_m.gguf | 1.26 GB | Upload Qwen3-1.7B-abliterated-q5_k_m.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q5_k_s.gguf | 1.23 GB | Upload Qwen3-1.7B-abliterated-q5_k_s.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q6_k_m.gguf | 1.42 GB | Upload Qwen3-1.7B-abliterated-q6_k_m.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated-q8_0.gguf | 1.83 GB | Upload Qwen3-1.7B-abliterated-q8_0.gguf with huggingface_hub | 4 months ago |
| Qwen3-1.7B-abliterated.imatrix | 2.07 MB | Upload Qwen3-1.7B-abliterated.imatrix with huggingface_hub | 4 months ago |
| README.md | 14.8 kB | Upload README.md with huggingface_hub | 2 months ago |
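To fetch one of the quantized files locally, a minimal sketch with huggingface_hub's hf_hub_download; the filename is copied from the table above, and picking the q4_k_m quant here is only an illustration, not a recommendation from this repository:

```python
from huggingface_hub import hf_hub_download

# Download a single quant from this repo; the file is cached locally and its path returned.
model_path = hf_hub_download(
    repo_id="Mungert/Qwen3-1.7B-abliterated-GGUF",
    filename="Qwen3-1.7B-abliterated-q4_k_m.gguf",  # ~1.11 GB per the table above
)

# The returned path points at a plain GGUF file, so it can be passed to any
# llama.cpp-compatible runtime (for example: llama-cli -m <model_path>).
print(model_path)
```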