---
language:
- en
pipeline_tag: text-generation
library_name: ExLlamaV2
tags:
- facebook
- meta
- pytorch
- llama
- llama-3
license: other
license_name: llama3
license_link: https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE
base_model:
- meta-llama/Meta-Llama-Guard-2-8B
base_model_relation: quantized
---
# Exl2 quants for [Meta-Llama-Guard-2-8B](https://huggingface.co/meta-llama/Meta-Llama-Guard-2-8B)

## Automatically quantized using the auto quant script from [hf-scripts](https://huggingface.co/anthonyg5005/hf-scripts)

Llama Guard 2 8B is Meta's moderation-tuned Llama 3 8B model.\
Use this moderation model to protect your platform from unwanted generations.\
The [BF16 weights](https://huggingface.co/meta-llama/Meta-Llama-Guard-2-8B/tree/main) are recommended for optimal accuracy in production environments.\
The model classifies content against the following harm categories:
<table align="center">
<thead>
  <tr>
    <th colspan="2">Harm categories</th>
  </tr>
</thead>
<tbody>
  <tr>
    <td>S1: Violent Crimes</td>
    <td>S2: Non-Violent Crimes</td>
  </tr>
  <tr>
    <td>S3: Sex-Related Crimes</td>
    <td>S4: Child Sexual Exploitation</td>
  </tr>
  <tr>
    <td>S5: Specialized Advice</td>
    <td>S6: Privacy</td>
  </tr>
  <tr>
    <td>S7: Intellectual Property</td>
    <td>S8: Indiscriminate Weapons</td>
  </tr>
  <tr>
    <td>S9: Hate</td>
    <td>S10: Suicide &amp; Self-Harm</td>
  </tr>
  <tr>
    <td>S11: Sexual Content</td>
    <td></td>
  </tr>
</tbody>
</table>
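
Llama Guard models typically answer with either `safe` or `unsafe`, with the violated category codes on the following line. As a rough sketch of how that output could be mapped back to the category names in the table above (the exact response format should be verified against the original Meta-Llama-Guard-2-8B model card), in Python:

```python
# Hypothetical helper for interpreting Llama Guard 2 output.
# Assumes the model answers with "safe" or "unsafe\nS1,S10" style responses;
# verify the exact format against the original Meta-Llama-Guard-2-8B model card.

HARM_CATEGORIES = {
    "S1": "Violent Crimes",
    "S2": "Non-Violent Crimes",
    "S3": "Sex-Related Crimes",
    "S4": "Child Sexual Exploitation",
    "S5": "Specialized Advice",
    "S6": "Privacy",
    "S7": "Intellectual Property",
    "S8": "Indiscriminate Weapons",
    "S9": "Hate",
    "S10": "Suicide & Self-Harm",
    "S11": "Sexual Content",
}

def parse_guard_response(response: str) -> list[str]:
    """Return the violated category names, or an empty list if the content is safe."""
    lines = response.strip().splitlines()
    if not lines or lines[0].strip().lower() == "safe":
        return []
    # Violated categories are listed after "unsafe", comma-separated (e.g. "S9,S10").
    codes = lines[1].split(",") if len(lines) > 1 else []
    return [HARM_CATEGORIES.get(code.strip(), code.strip()) for code in codes]

print(parse_guard_response("unsafe\nS9,S10"))  # ['Hate', 'Suicide & Self-Harm']
```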

### BPW:

[3.0](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/3.0bpw)\
[4.0](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/4.0bpw)\
[5.0](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/5.0bpw)\
[6.0](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/6.0bpw)\
[6.5](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/6.5bpw)\
[8.0](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/tree/8.0bpw)\
[measurement.json](https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2/blob/main/measurement.json)
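
Once a quant is downloaded (see below), it can be loaded with the ExLlamaV2 Python library. The following is a minimal sketch assuming a recent exllamav2 release; the local path, sampler settings, and class names such as `ExLlamaV2BaseGenerator` are assumptions, so check them against the examples shipped with your installed exllamav2 version.

```python
# Minimal loading sketch for an exl2 quant with the exllamav2 library.
# Path and settings are placeholders; adjust for your setup and verify the API
# against the examples in your installed exllamav2 version.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "Llama-Guard-2-8B-exl2-8bpw"  # folder containing the downloaded quant

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)            # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.top_k = 1                     # greedy decoding suits a moderation classifier

# The prompt must follow the Llama Guard 2 chat template from the original model card.
prompt = "..."  # placeholder: build with the official template before classifying real content
print(generator.generate_simple(prompt, settings, 64))
```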

# How to download:

### oobabooga's downloader

Use a script like [download-model.py](https://github.com/oobabooga/text-generation-webui/blob/main/download-model.py) to download the files over HTTP with Python requests.\
Install the requirements first:

```shell
pip install requests tqdm
```

Example for downloading 8bpw:

```shell
python download-model.py Anthonyg5005/Meta-Llama-Guard-2-8B-exl2:8.0bpw
```

### huggingface-cli

You may also use huggingface-cli.\
To install it, install the huggingface-hub Python package:

```shell
pip install huggingface-hub
```

Example for 8bpw:

```shell
huggingface-cli download Anthonyg5005/Meta-Llama-Guard-2-8B-exl2 --local-dir Llama-Guard-2-8B-exl2-8bpw --revision 8.0bpw
```
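
If you prefer to stay in Python, the same download can be done with `huggingface_hub` directly. A minimal sketch, with the local directory name only given as an example:

```python
# Equivalent download in Python using huggingface_hub (installed above).
# The local_dir name is just an example; any writable path works.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Anthonyg5005/Meta-Llama-Guard-2-8B-exl2",
    revision="8.0bpw",                      # pick the BPW branch you want
    local_dir="Llama-Guard-2-8B-exl2-8bpw",
)
```
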
### Git LFS (not recommended)

I recommend the HTTP downloaders above over Git; they can resume failed downloads and are much easier to work with.\
Make sure you have Git and Git LFS installed.\
Example for downloading 8bpw with Git:

Make sure LFS file skipping is disabled:
```shell
# Windows
set GIT_LFS_SKIP_SMUDGE=0
# Linux
export GIT_LFS_SKIP_SMUDGE=0
```

Clone the repo branch for the quant you want:
```shell
git clone https://huggingface.co/Anthonyg5005/Meta-Llama-Guard-2-8B-exl2 -b 8.0bpw
```