jacobi committed
Commit 906623a · verified · 1 Parent(s): 1894b35

Update README.md

Files changed (1)
  1. README.md +2 -16
README.md CHANGED
@@ -107,7 +107,7 @@ Load this model using the `from_pretrained` method:
 from model2vec import StaticModel
 
 # Load a pretrained Model2Vec model
-model = StaticModel.from_pretrained("nano-snowflake-arctic-v2")
+model = StaticModel.from_pretrained("jacobi/nano-snowflake-arctic-v2")
 
 # Compute text embeddings
 embeddings = model.encode(["Example sentence"])
@@ -121,26 +121,12 @@ You can also use the [Sentence Transformers library](https://github.com/UKPLab/s
 from sentence_transformers import SentenceTransformer
 
 # Load a pretrained Sentence Transformer model
-model = SentenceTransformer("nano-snowflake-arctic-v2")
+model = SentenceTransformer("jacobi/nano-snowflake-arctic-v2")
 
 # Compute text embeddings
 embeddings = model.encode(["Example sentence"])
 ```
 
-### Distilling a Model2Vec model
-
-You can distill a Model2Vec model from a Sentence Transformer model using the `distill` method. First, install the `distill` extra with `pip install model2vec[distill]`. Then, run the following code:
-
-```python
-from model2vec.distill import distill
-
-# Distill a Sentence Transformer model, in this case the BAAI/bge-base-en-v1.5 model
-m2v_model = distill(model_name="BAAI/bge-base-en-v1.5", pca_dims=256)
-
-# Save the model
-m2v_model.save_pretrained("m2v_model")
-```
-
 ## How it works
 
 Model2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.
 
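For reference, distillation is still available even though the snippet above was dropped from this README. The following is a minimal sketch based on the `model2vec.distill` call shown in the removed hunk; it assumes the extra is installed via `pip install model2vec[distill]`, and the source model and `pca_dims=256` simply mirror the values from the old snippet:

```python
from model2vec.distill import distill

# Distill a Sentence Transformer model into a static Model2Vec model
# (values mirror the snippet removed in this commit; any Sentence Transformer
# model name and PCA dimensionality can be substituted)
m2v_model = distill(model_name="BAAI/bge-base-en-v1.5", pca_dims=256)

# Save the distilled model to a local directory
m2v_model.save_pretrained("m2v_model")
```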