zR
zRzRzRzRzRzRzR
AI & ML interests
LLMs & Agents
Recent Activity
new activity
1 day ago
THUDM/CogView4-6B: Is there an image-to-image feature?
new activity
15 days ago
THUDM/cogvlm2-llama3-caption: Add support for transformers>=4.49
new activity
27 days ago
THUDM/GLM-4-32B-0414: Fix template when add_generation_prompt=true
Organizations
zRzRzRzRzRzRzR's activity
Is there an image-to-image feature?
1
#11 opened 8 days ago
by
mc112611
Add support for transformers>=4.49
8
#12 opened 29 days ago
by
Kaixuanliu
Fix template when add_generation_prompt=true
#14 opened about 1 month ago
by
matteogeniaccio
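The template fix above concerns how the repo's chat template behaves when add_generation_prompt=true. As a minimal sketch of exercising that flag with the transformers API (the model ID follows the repo above; the message content is made up for illustration):

```python
from transformers import AutoTokenizer

# Load the tokenizer that ships with the repo's chat template.
tokenizer = AutoTokenizer.from_pretrained("THUDM/GLM-4-32B-0414")

messages = [
    {"role": "user", "content": "Hello, who are you?"},
]

# add_generation_prompt=True appends the assistant-turn header so the model
# starts generating a reply instead of continuing the user turn.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```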
Update library name and base model
#2 opened 29 days ago
by
reach-vb

System prompt and inference settings
2
#7 opened about 1 month ago
by
danihend
What's the max context on this?
1
#9 opened about 1 month ago
by
ThePabli
With a single GPU I get an error saying VRAM is insufficient, but with multiple GPUs on a single machine I hit many other errors. My vLLM version is 0.8.4.
1
#6 opened about 1 month ago
by
hanson888
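For the out-of-memory / multi-GPU report above, the usual way around a single-GPU VRAM limit is vLLM's tensor parallelism. A minimal sketch follows; the model ID, GPU count, and memory fraction are illustrative assumptions, not values taken from the thread:

```python
from vllm import LLM, SamplingParams

# tensor_parallel_size shards the weights across GPUs on one machine.
llm = LLM(
    model="THUDM/GLM-4-32B-0414",
    tensor_parallel_size=4,          # illustrative: match your GPU count
    gpu_memory_utilization=0.90,
)

outputs = llm.generate(
    ["Hello, world"],
    SamplingParams(max_tokens=64),
)
print(outputs[0].outputs[0].text)
```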

Bug when using function calling with vllm==0.8.4
2
#4 opened about 1 month ago
by
waple
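The function-calling report above is against vLLM's OpenAI-compatible server. A minimal sketch of a tool-calling request with the openai client is below; the base URL, tool schema, and model name are assumptions for illustration, and the server must itself be started with tool calling enabled for the model's chat template:

```python
from openai import OpenAI

# Assumes a local `vllm serve` instance; base_url and api_key are placeholders.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="THUDM/GLM-4-32B-0414",
    messages=[{"role": "user", "content": "What's the weather in Beijing?"}],
    tools=tools,
)
print(resp.choices[0].message.tool_calls)
```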

I get too many repetitions
6
#5 opened about 1 month ago
by
JLouisBiz
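For the repetition report above, the usual first mitigation is adjusting sampling parameters. A minimal sketch with vLLM's SamplingParams; the penalty values here are illustrative assumptions, not recommendations from the model card:

```python
from vllm import LLM, SamplingParams

llm = LLM(model="THUDM/GLM-4-32B-0414")

# repetition_penalty > 1.0 plus a presence penalty discourages the model
# from looping over the same tokens.
params = SamplingParams(
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
    presence_penalty=0.5,
    max_tokens=512,
)
print(llm.generate(["Summarize the GLM-4 release."], params)[0].outputs[0].text)
```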

README information missing for this non-reasoning model
2
#4 opened about 1 month ago
by
Doctor-Chad-PhD

When will the GLM-4/Z1 series models support vLLM?
2
#6 opened about 1 month ago
by
David3698
Fails to load GLM-Z1-9B due to a tokenization issue
5
#1 opened about 1 month ago
by
Fleinstein
Bug in demo code
2
#2 opened about 1 month ago
by
undefined-x
Base model
2
#1 opened about 1 month ago
by
mrfakename

License
2
#2 opened about 1 month ago
by
mrfakename

Update modeling_chatglm.py for transformers 4.49 compatibility
#89 opened 3 months ago
by
sylwia-kuros

Does the open-source model not support structured output via with_structured_output?
1
#88 opened 3 months ago
by
RoboTerh
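The with_structured_output question above refers to LangChain's structured-output helper. A minimal sketch, assuming the model is exposed through an OpenAI-compatible endpoint (e.g., vLLM) and wrapped with langchain_openai.ChatOpenAI; the endpoint URL and schema are illustrative, and whether constrained decoding is actually honored depends on the serving stack:

```python
from pydantic import BaseModel
from langchain_openai import ChatOpenAI

class Movie(BaseModel):
    title: str
    year: int

# Illustrative endpoint and model name; with_structured_output asks the
# backend to return JSON matching the Movie schema.
llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",
    api_key="EMPTY",
    model="THUDM/GLM-4-32B-0414",
)

structured = llm.with_structured_output(Movie)
result = structured.invoke("Name a science fiction movie and its release year.")
print(result)
```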
Where is the distilled version?
1
#2 opened 3 months ago
by
sunhaha123
Create Felix
#6 opened 3 months ago
by
Carino189

diffusion_pytorch_model.bin missing?
1
#26 opened 5 months ago
by
Rafael1
