Commit b7d42ca (verified) by topshik · 1 parent: e23e93c

Update README.md

Files changed (1): README.md (+21 −1)
README.md CHANGED
@@ -76,8 +76,9 @@ Java Subset:
 - Security: Code suggestions should not be assumed to be secure or free of vulnerabilities.
 
 # Sample Usage
-Here’s an example of how to run and sample from the model:
+Here are examples of how to run and sample from the model.
 
+## Generic generation
 ```python
 import json
 from transformers import AutoTokenizer, AutoModelForCausalLM
@@ -129,6 +130,25 @@ print("### Prediction")
 print(tokenizer.decode(out[0][input_len:]))
 ```
 
+## Fill in the middle generation
+```python
+prefix = """
+def fibonacci(n: int) -> int:
+"""
+
+suffix = """
+if __name__ == "__main__":
+    print(fibonacci(10))
+"""
+
+encoded_input = tokenizer(f"<fim_suffix>{suffix}<fim_prefix>{prefix}<fim_middle>", return_tensors='pt', return_token_type_ids=False)
+input_len = len(encoded_input["input_ids"][0])
+out = model.generate(
+    **encoded_input,
+    max_new_tokens=100,
+)
+```
+
 # Citation
 If you use this model, please cite:
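The fill-in-the-middle prompt added in this commit is plain string formatting and can be sketched independently of the model and tokenizer. The `build_fim_prompt` helper below is hypothetical (not part of the README); it only illustrates the suffix-prefix-middle layout of the sentinel tokens:

```python
# Hypothetical helper illustrating the suffix-prefix-middle (SPM) prompt
# layout used in the fill-in-the-middle snippet above.
def build_fim_prompt(prefix: str, suffix: str) -> str:
    # The model is expected to continue after the <fim_middle> sentinel
    # with the code that belongs between prefix and suffix.
    return f"<fim_suffix>{suffix}<fim_prefix>{prefix}<fim_middle>"

prefix = "def fibonacci(n: int) -> int:\n"
suffix = 'if __name__ == "__main__":\n    print(fibonacci(10))\n'
prompt = build_fim_prompt(prefix, suffix)
```

Note that both `prefix` and `suffix` must be interpolated into the f-string; passing the literal text `suffix` instead of `{suffix}` would silently drop the suffix context from the prompt.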