bibproj committed on
Commit 469007c · verified · 1 Parent(s): 1267e6e

Update README.md

Files changed (1): README.md +37 -5
README.md CHANGED
@@ -1,5 +1,37 @@
- ---
- license: other
- license_name: modified-mit
- license_link: https://huggingface.co/moonshotai/Kimi-K2-Instruct-0905/blob/main/LICENSE
- ---
+ ---
+ license: other
+ license_name: modified-mit
+ library_name: mlx
+ tags:
+ - mlx
+ pipeline_tag: text-generation
+ base_model: moonshotai/Kimi-K2-Instruct-0905
+ ---
+
+ # mlx-community/moonshotai_Kimi-K2-Instruct-0905-mlx-3bit
+
+ This model [mlx-community/moonshotai_Kimi-K2-Instruct-0905-mlx-3bit](https://huggingface.co/mlx-community/moonshotai_Kimi-K2-Instruct-0905-mlx-3bit) was
+ converted to MLX format from [moonshotai/Kimi-K2-Instruct-0905](https://huggingface.co/moonshotai/Kimi-K2-Instruct-0905)
+ using mlx-lm version **0.26.3**.
+
+ ## Use with mlx
+
+ ```bash
+ pip install mlx-lm
+ ```
+
+ ```python
+ from mlx_lm import load, generate
+
+ model, tokenizer = load("mlx-community/moonshotai_Kimi-K2-Instruct-0905-mlx-3bit")
+
+ prompt = "hello"
+
+ # Apply the model's chat template if one is defined
+ if tokenizer.chat_template is not None:
+     messages = [{"role": "user", "content": prompt}]
+     prompt = tokenizer.apply_chat_template(
+         messages, add_generation_prompt=True
+     )
+
+ response = generate(model, tokenizer, prompt=prompt, verbose=True)
+ ```
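
Aside from the diff itself: the `apply_chat_template` step in the snippet above turns a list of chat messages into a single prompt using the model's built-in template. As a toy illustration of the idea only — the `<|...|>` markers below are invented for this sketch and are not Kimi-K2's actual special tokens or template — the transformation looks roughly like:

```python
# Toy sketch of what a chat template does. The <|...|> markers are
# invented for illustration; they are NOT Kimi-K2's real special tokens.
def toy_apply_chat_template(messages, add_generation_prompt=True):
    # Wrap each message in role markers, in order
    parts = [f"<|{m['role']}|>{m['content']}<|end|>" for m in messages]
    if add_generation_prompt:
        # Cue the model that an assistant reply should follow
        parts.append("<|assistant|>")
    return "".join(parts)

print(toy_apply_chat_template([{"role": "user", "content": "hello"}]))
# → <|user|>hello<|end|><|assistant|>
```

The real template (and whether it returns a string or token IDs) is defined by the model's tokenizer config, which is why the README snippet guards the call with `if tokenizer.chat_template is not None`.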