Mungert/Qwen3-1.7B-abliterated-GGUF

Transformers · GGUF · conversational · arXiv: 1910.09700 · 10 likes
Files and versions (1 contributor, 22 commits)
Latest commit e115f44 (verified) by Mungert, 4 months ago: Upload Qwen3-1.7B-abliterated-q4_1.gguf with huggingface_hub
All entries were last modified 4 months ago; each model file's last commit reads "Upload <filename> with huggingface_hub" (the .gitattributes entry shows the Qwen3-1.7B-abliterated-q4_1.gguf upload commit). The .gguf files are stored via Xet and carry the Safe scan badge, except Qwen3-1.7B-abliterated-q5_k_l.gguf, which shows no badge.

File                                       Size
.gitattributes                             3.02 kB
Qwen3-1.7B-abliterated-bf16_q4_k.gguf      1.47 GB
Qwen3-1.7B-abliterated-bf16_q6_k.gguf      1.78 GB
Qwen3-1.7B-abliterated-bf16_q8_0.gguf      2.13 GB
Qwen3-1.7B-abliterated-f16.gguf            3.45 GB
Qwen3-1.7B-abliterated-f16_q4_k.gguf       1.47 GB
Qwen3-1.7B-abliterated-f16_q6_k.gguf       1.78 GB
Qwen3-1.7B-abliterated-f16_q8_0.gguf       2.13 GB
Qwen3-1.7B-abliterated-q3_k_l.gguf         1.01 GB
Qwen3-1.7B-abliterated-q3_k_m.gguf         940 MB
Qwen3-1.7B-abliterated-q3_k_s.gguf         826 MB
Qwen3-1.7B-abliterated-q4_0.gguf           974 MB
Qwen3-1.7B-abliterated-q4_1.gguf           1.08 GB
Qwen3-1.7B-abliterated-q4_k_l.gguf         1.18 GB
Qwen3-1.7B-abliterated-q4_k_m.gguf         1.11 GB
Qwen3-1.7B-abliterated-q4_k_s.gguf         1.06 GB
Qwen3-1.7B-abliterated-q5_k_l.gguf         1.33 GB
Qwen3-1.7B-abliterated-q5_k_m.gguf         1.26 GB
Qwen3-1.7B-abliterated-q5_k_s.gguf         1.23 GB
Qwen3-1.7B-abliterated-q6_k_l.gguf         1.49 GB
Qwen3-1.7B-abliterated-q6_k_m.gguf         1.42 GB
Qwen3-1.7B-abliterated-q8_0.gguf           1.83 GB
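Since every commit on this page was made with huggingface_hub, the same library can fetch a single quant without cloning the whole repository. Below is a minimal sketch, not an official usage recipe: the repo id and the q4_k_m filename come from the listing above, while the use of the third-party llama-cpp-python package to load the GGUF file is an assumption of this example, not something stated on this page.

```python
# Minimal sketch: download one quant from this repo and optionally try it locally.
# Assumes huggingface_hub is installed; llama-cpp-python is an unofficial choice
# for loading GGUF files and is not referenced anywhere in this repository.
from huggingface_hub import hf_hub_download

REPO_ID = "Mungert/Qwen3-1.7B-abliterated-GGUF"
FILENAME = "Qwen3-1.7B-abliterated-q4_k_m.gguf"  # 1.11 GB per the listing above

# Fetch just this file (cached under the standard Hugging Face cache directory).
model_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)
print(f"GGUF file downloaded to: {model_path}")

# Optional: load the quant with llama-cpp-python for a quick chat-style test.
try:
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=4096)
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
        max_tokens=64,
    )
    print(out["choices"][0]["message"]["content"])
except ImportError:
    print("Install llama-cpp-python (or use another GGUF runtime) for local inference.")
```

Any other GGUF runtime that accepts a local .gguf path, such as the llama.cpp command-line tools, could be substituted for the optional inference step.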