DevQuasar/google.gemma-3-4b-it-qat-int4-unquantized-GGUF
Tags: Image-Text-to-Text · GGUF · conversational
Files and versions
1 contributor · History: 16 commits
Latest commit by csabakecskemeti: Upload mmproj.gguf (fe1f0d9, verified, 22 days ago)
| File | Size | Last commit | Committed |
|------|------|-------------|-----------|
| .gitattributes | 2.55 kB | Upload mmproj.gguf | 22 days ago |
| README.md | 882 Bytes | Update README.md | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q2_K.gguf | 1.73 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q3_K_L.gguf | 2.24 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q3_K_M.gguf | 2.1 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q3_K_S.gguf | 1.94 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q4_K_M.gguf | 2.49 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q4_K_S.gguf | 2.38 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q5_K_M.gguf | 2.83 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q5_K_S.gguf | 2.76 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q6_K.gguf | 3.19 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.Q8_0.gguf | 4.13 GB | Upload with huggingface_hub | 22 days ago |
| google.gemma-3-4b-it-qat-int4-unquantized.f16.gguf | 7.77 GB | Upload with huggingface_hub | 22 days ago |
| mmproj.gguf | 851 MB | Upload mmproj.gguf | 22 days ago |
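
The listing covers the standard llama.cpp quantization tiers (Q2_K through Q8_0, plus an f16 reference) alongside an mmproj.gguf, the vision projector file that GGUF runtimes load next to the main model for image-text-to-text use. As a minimal sketch, assuming huggingface_hub is installed (the same library the upload commits reference), one quant and the projector can be fetched like this; the Q4_K_M choice below is illustrative, not a recommendation from the repository:

```python
# Minimal sketch: download one quantized GGUF plus the vision projector
# from this repository with huggingface_hub. The Q4_K_M pick is an
# assumption for illustration; any quant in the table works the same way.
from huggingface_hub import hf_hub_download

REPO_ID = "DevQuasar/google.gemma-3-4b-it-qat-int4-unquantized-GGUF"

model_path = hf_hub_download(
    repo_id=REPO_ID,
    filename="google.gemma-3-4b-it-qat-int4-unquantized.Q4_K_M.gguf",
)
mmproj_path = hf_hub_download(repo_id=REPO_ID, filename="mmproj.gguf")

print("model :", model_path)
print("mmproj:", mmproj_path)
```

The downloaded paths can then be handed to any GGUF-capable runtime; in llama.cpp-based tools the projector is typically passed via an --mmproj option, though the exact flag depends on the tool and version.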