royleibov committed (verified)
Commit 279671a · 1 Parent(s): aea70a2

Fix use this model

Files changed (1)
  1. README.md +16 -3
README.md CHANGED
@@ -11,20 +11,33 @@ base_model: ai21labs/Jamba-v0.1
 
 This model is a clone of [**ai21labs/Jamba-v0.1**](https://huggingface.co/ai21labs/Jamba-v0.1) compressed using ZipNN. Compressed losslessly to 67% of its original size, ZipNN saved ~35GB in storage and potentially ~1PB in data transfer **monthly**.
 
-## Requirement
+### Requirement
 
 In order to use the model, ZipNN is necessary:
 ```bash
 pip install zipnn
 ```
 
-Then simply add at the beginning of the file
+### Use This Model
 ```python
+# Use a pipeline as a high-level helper
+from transformers import pipeline
 from zipnn import zipnn_hf
 
 zipnn_hf()
+
+pipe = pipeline("text-generation", model="royleibov/Jamba-v0.1-ZipNN-Compressed", trust_remote_code=True)
+```
+```python
+# Load model directly
+from transformers import AutoTokenizer, AutoModelForCausalLM
+from zipnn import zipnn_hf
+
+zipnn_hf()
+
+tokenizer = AutoTokenizer.from_pretrained("royleibov/Jamba-v0.1-ZipNN-Compressed", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("royleibov/Jamba-v0.1-ZipNN-Compressed", trust_remote_code=True)
 ```
-And continue as usual. The patch will take care of decompressing the model correctly and safely.
 
 # Model Card for Jamba
 
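Taken together, the updated README's instructions compose into a single flow: install ZipNN, call `zipnn_hf()` to patch Hugging Face loading so the compressed weights are decompressed transparently, then load and run the checkpoint. A minimal end-to-end sketch of that flow; the prompt text and generation length are illustrative and not part of the commit:

```python
# Assumes `pip install zipnn` (as in the README) plus an existing transformers install.
from transformers import pipeline
from zipnn import zipnn_hf

# Patch Hugging Face model loading so ZipNN-compressed weights
# are decompressed transparently when the files are fetched.
zipnn_hf()

pipe = pipeline(
    "text-generation",
    model="royleibov/Jamba-v0.1-ZipNN-Compressed",
    trust_remote_code=True,
)

# Illustrative prompt and generation length, not taken from the diff.
print(pipe("ZipNN losslessly compresses model weights so that", max_new_tokens=40)[0]["generated_text"])
```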