Xenova (HF Staff) and whitphx (HF Staff) committed
Commit 9e658b3 · verified · 1 parent: f7d125c

Add/update the quantized ONNX model files and README.md for Transformers.js v3 (#2)


- Add/update the quantized ONNX model files and README.md for Transformers.js v3 (428b9e24aa60d9a7850857a5a501ccf182503183)


Co-authored-by: Yuichiro Tachibana <[email protected]>

Files changed (1):
  README.md  +3 −3
README.md CHANGED

````diff
@@ -7,7 +7,7 @@ tags:
 
 # text-davinci-002 Tokenizer
 
-A 🤗-compatible version of the **text-davinci-002 tokenizer** (adapted from [openai/tiktoken](https://github.com/openai/tiktoken)). This means it can be used with Hugging Face libraries including [Transformers](https://github.com/huggingface/transformers), [Tokenizers](https://github.com/huggingface/tokenizers), and [Transformers.js](https://github.com/xenova/transformers.js).
+A 🤗-compatible version of the **text-davinci-002 tokenizer** (adapted from [openai/tiktoken](https://github.com/openai/tiktoken)). This means it can be used with Hugging Face libraries including [Transformers](https://github.com/huggingface/transformers), [Tokenizers](https://github.com/huggingface/tokenizers), and [Transformers.js](https://github.com/huggingface/transformers.js).
 
 ## Example usage:
 
@@ -21,8 +21,8 @@ assert tokenizer.encode('hello world') == [31373, 995]
 
 ### Transformers.js
 ```js
-import { AutoTokenizer } from '@xenova/transformers';
+import { AutoTokenizer } from '@huggingface/transformers';
 
 const tokenizer = await AutoTokenizer.from_pretrained('Xenova/text-davinci-002');
 const tokens = tokenizer.encode('hello world'); // [31373, 995]
-```
+```
````
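For reference, the updated README snippet can be run as-is under Transformers.js v3 (installed with `npm i @huggingface/transformers`). The sketch below is the snippet from the new README plus an illustrative `decode` round-trip, which is an assumption on my part and not part of this diff.

```js
// Requires Transformers.js v3: npm i @huggingface/transformers
// (the v2 package name '@xenova/transformers' no longer applies after this commit)
import { AutoTokenizer } from '@huggingface/transformers';

const tokenizer = await AutoTokenizer.from_pretrained('Xenova/text-davinci-002');

// Encode text to token ids, as in the README example
const tokens = tokenizer.encode('hello world'); // [31373, 995]

// Illustrative addition: decode the ids back to text
const text = tokenizer.decode(tokens); // 'hello world'
console.log(tokens, text);
```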