Upload README.md with huggingface_hub
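For context, this commit was pushed via the `huggingface_hub` client. A minimal sketch of the equivalent call (assuming a token is already configured, e.g. via `huggingface-cli login`; the repo id is taken from the links in the table below):

```python
from huggingface_hub import HfApi

# Minimal sketch: push an updated README.md to the model repo.
# Assumes an authenticated environment; none of this is shown in the commit itself.
api = HfApi()
api.upload_file(
    path_or_fileobj="README.md",   # local file to upload
    path_in_repo="README.md",      # destination path inside the repo
    repo_id="legraphista/Reflection-Llama-3.1-70B-IMat-GGUF",
    repo_type="model",
    commit_message="Upload README.md with huggingface_hub",
)
```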
README.md CHANGED
@@ -73,12 +73,12 @@ Link: [here](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GG
| [Reflection-Llama-3.1-70B.Q4_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q4_K.gguf) | Q4_K | 42.52GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [Reflection-Llama-3.1-70B.Q4_K_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q4_K_S.gguf) | Q4_K_S | 40.35GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [Reflection-Llama-3.1-70B.IQ4_NL.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ4_NL.gguf) | IQ4_NL | 40.05GB | ✅ Available | 🟢 IMatrix | 📦 No |
-| Reflection-Llama-3.1-70B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Reflection-Llama-3.1-70B.IQ4_XS.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ4_XS.gguf) | IQ4_XS | 37.90GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [Reflection-Llama-3.1-70B.Q3_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K.gguf) | Q3_K | 34.27GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [Reflection-Llama-3.1-70B.Q3_K_L.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K_L.gguf) | Q3_K_L | 37.14GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [Reflection-Llama-3.1-70B.Q3_K_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K_S.gguf) | Q3_K_S | 30.91GB | ✅ Available | 🟢 IMatrix | 📦 No |
| Reflection-Llama-3.1-70B.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | - |
-| Reflection-Llama-3.1-70B.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | - |
+| [Reflection-Llama-3.1-70B.IQ3_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ3_S.gguf) | IQ3_S | 30.91GB | ✅ Available | 🟢 IMatrix | 📦 No |
| Reflection-Llama-3.1-70B.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | - |
| Reflection-Llama-3.1-70B.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
| [Reflection-Llama-3.1-70B.Q2_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q2_K.gguf) | Q2_K | 26.38GB | ✅ Available | 🟢 IMatrix | 📦 No |
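For reference, any row marked ✅ Available above can be fetched with `hf_hub_download`. A minimal sketch, using the filename from the newly added IQ4_XS row and assuming the default cache location for a public repo:

```python
from huggingface_hub import hf_hub_download

# Minimal sketch: download one of the single-file (non-split) quants listed above.
gguf_path = hf_hub_download(
    repo_id="legraphista/Reflection-Llama-3.1-70B-IMat-GGUF",
    filename="Reflection-Llama-3.1-70B.IQ4_XS.gguf",
)
print(gguf_path)  # local path to the ~37.90GB GGUF file
```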