committed on
Commit d191847 · 1 Parent(s): d22a3a2
README.md CHANGED
````diff
@@ -13,16 +13,10 @@ library_name: transformers
 
 Based on our latest technological advancements, we have trained a `GLM-4-0414` series model. During pretraining, we incorporated more code-related and reasoning-related data. In the alignment phase, we optimized the model specifically for agent capabilities. As a result, the model's performance in agent tasks such as tool use, web search, and coding has been significantly improved.
 
-## Installation
-
-Install the transformers library from the source code:
-
-```shell
-pip install git+https://github.com/huggingface/transformers.git
-```
-
 ## Inference Code
 
+Make sure you are using `transformers>=4.51.3`.
+
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
````
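The added line replaces the source install with a release-version floor of `transformers>=4.51.3`. As an illustrative sketch (not part of the model card), such a floor can be checked with a small numeric-tuple comparison; `meets_minimum` is a hypothetical helper name, and this simplified comparison ignores pre-release suffixes that a full PEP 440 parser would handle:

```python
def meets_minimum(installed: str, required: str = "4.51.3") -> bool:
    """Return True if `installed` satisfies the `>=required` floor.

    Compares dotted numeric versions component by component,
    e.g. "4.51.3" -> (4, 51, 3). Pre-release suffixes (rc, dev)
    are not handled; this is a simplified sketch.
    """
    def as_tuple(version: str) -> tuple:
        return tuple(int(part) for part in version.split("."))

    return as_tuple(installed) >= as_tuple(required)
```

In practice, the installed version can be read with `importlib.metadata.version("transformers")`, and `packaging.version.Version` is the robust way to compare it against the floor.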