Commit 4f7dda9 (verified) · 1 parent: 0fbf166
Committed by legraphista

Upload README.md with huggingface_hub

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -73,12 +73,12 @@ Link: [here](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GG
  | [Reflection-Llama-3.1-70B.Q4_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q4_K.gguf) | Q4_K | 42.52GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [Reflection-Llama-3.1-70B.Q4_K_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q4_K_S.gguf) | Q4_K_S | 40.35GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [Reflection-Llama-3.1-70B.IQ4_NL.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ4_NL.gguf) | IQ4_NL | 40.05GB | ✅ Available | 🟢 IMatrix | 📦 No
- | Reflection-Llama-3.1-70B.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Reflection-Llama-3.1-70B.IQ4_XS.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ4_XS.gguf) | IQ4_XS | 37.90GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [Reflection-Llama-3.1-70B.Q3_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K.gguf) | Q3_K | 34.27GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [Reflection-Llama-3.1-70B.Q3_K_L.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K_L.gguf) | Q3_K_L | 37.14GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [Reflection-Llama-3.1-70B.Q3_K_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q3_K_S.gguf) | Q3_K_S | 30.91GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Reflection-Llama-3.1-70B.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | -
- | Reflection-Llama-3.1-70B.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | -
+ | [Reflection-Llama-3.1-70B.IQ3_S.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.IQ3_S.gguf) | IQ3_S | 30.91GB | ✅ Available | 🟢 IMatrix | 📦 No
  | Reflection-Llama-3.1-70B.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | Reflection-Llama-3.1-70B.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
  | [Reflection-Llama-3.1-70B.Q2_K.gguf](https://huggingface.co/legraphista/Reflection-Llama-3.1-70B-IMat-GGUF/blob/main/Reflection-Llama-3.1-70B.Q2_K.gguf) | Q2_K | 26.38GB | ✅ Available | 🟢 IMatrix | 📦 No
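
For reference, any row marked ✅ Available in the table above can be fetched with the `huggingface_hub` client the commit message refers to. The snippet below is only a minimal sketch: the `repo_id` and `filename` come straight from the links in the diff, while everything around the call (the print, the choice of quant) is illustrative.

```python
# Minimal sketch: download one of the newly available quants listed in the diff.
# repo_id and filename are taken from the table links; adjust to the quant you want.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="legraphista/Reflection-Llama-3.1-70B-IMat-GGUF",
    filename="Reflection-Llama-3.1-70B.IQ4_XS.gguf",
)
print(f"Downloaded to: {gguf_path}")
```

By default the file lands in the local Hugging Face cache; passing `local_dir=...` to `hf_hub_download` places it in a specific directory instead.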