bartowski/google_gemma-3-1b-it-GGUF
Text Generation · GGUF · conversational · License: gemma · Likes: 6
Files and versions (branch: main)
1 contributor · History: 29 commits
Latest commit: 116f762 (verified), "Update README.md" by bartowski, about 2 months ago
| File | Size | Last commit | Updated |
|------|------|-------------|---------|
| .gitattributes | 3.23 kB | Upload google_gemma-3-1b-it.imatrix with huggingface_hub | about 2 months ago |
| README.md | 13.9 kB | Update README.md | about 2 months ago |
| google_gemma-3-1b-it-IQ2_M.gguf | 670 MB | Upload google_gemma-3-1b-it-IQ2_M.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-IQ3_M.gguf | 697 MB | Upload google_gemma-3-1b-it-IQ3_M.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-IQ3_XS.gguf | 690 MB | Upload google_gemma-3-1b-it-IQ3_XS.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-IQ3_XXS.gguf | 680 MB | Upload google_gemma-3-1b-it-IQ3_XXS.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-IQ4_NL.gguf | 722 MB | Upload google_gemma-3-1b-it-IQ4_NL.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-IQ4_XS.gguf | 714 MB | Upload google_gemma-3-1b-it-IQ4_XS.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q2_K.gguf | 690 MB | Upload google_gemma-3-1b-it-Q2_K.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q2_K_L.gguf | 690 MB | Upload google_gemma-3-1b-it-Q2_K_L.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q3_K_L.gguf | 752 MB | Upload google_gemma-3-1b-it-Q3_K_L.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q3_K_M.gguf | 722 MB | Upload google_gemma-3-1b-it-Q3_K_M.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q3_K_S.gguf | 689 MB | Upload google_gemma-3-1b-it-Q3_K_S.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q3_K_XL.gguf | 752 MB | Upload google_gemma-3-1b-it-Q3_K_XL.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q4_0.gguf | 722 MB | Upload google_gemma-3-1b-it-Q4_0.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q4_1.gguf | 764 MB | Upload google_gemma-3-1b-it-Q4_1.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q4_K_L.gguf | 806 MB | Upload google_gemma-3-1b-it-Q4_K_L.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q4_K_M.gguf | 806 MB | Upload google_gemma-3-1b-it-Q4_K_M.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q4_K_S.gguf | 781 MB | Upload google_gemma-3-1b-it-Q4_K_S.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q5_K_L.gguf | 851 MB | Upload google_gemma-3-1b-it-Q5_K_L.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q5_K_M.gguf | 851 MB | Upload google_gemma-3-1b-it-Q5_K_M.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q5_K_S.gguf | 836 MB | Upload google_gemma-3-1b-it-Q5_K_S.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q6_K.gguf | 1.01 GB | Upload google_gemma-3-1b-it-Q6_K.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q6_K_L.gguf | 1.01 GB | Upload google_gemma-3-1b-it-Q6_K_L.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-Q8_0.gguf | 1.07 GB | Upload google_gemma-3-1b-it-Q8_0.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it-bf16.gguf | 2.01 GB | Upload google_gemma-3-1b-it-bf16.gguf with huggingface_hub | about 2 months ago |
| google_gemma-3-1b-it.imatrix | 1.43 MB | Upload google_gemma-3-1b-it.imatrix with huggingface_hub | about 2 months ago |

All files carry the Hub's "Safe" badge; the .gguf and .imatrix files are stored on xet-backed storage.
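Since every quantization above was uploaded with huggingface_hub, the same library can be used to pull a single file down rather than cloning the whole repository. Below is a minimal sketch using `hf_hub_download`; the repo id and filename come from the listing above, and picking Q4_K_M here is only an illustrative choice, not a recommendation from this repo.

```python
# Sketch: download one quantization from this repo with huggingface_hub.
# Assumes `pip install huggingface_hub`; the filename is taken from the file
# listing above (any other .gguf name in the table would work the same way).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="bartowski/google_gemma-3-1b-it-GGUF",
    filename="google_gemma-3-1b-it-Q4_K_M.gguf",
)
print(local_path)  # path to the cached .gguf file on local disk
```

The returned path points into the local Hugging Face cache, so the file can then be passed directly to any GGUF-compatible runtime (for example a llama.cpp build) as its model path.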