#13: What's the meaning of "s32B" and "b79K" in CLIP-ViT-H-14-laion2B-s32B-b79K? (1 reply, opened 8 months ago by xieyang233)
#12: Different results between the model Space and local deployment (4 replies, opened 9 months ago by jeff-lee)
#9: Extracting `text_encoder` from `ViT-H-14` using `open_clip_torch`? (1 reply, opened about 1 year ago by Chanuhf)
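On the `text_encoder` question above: `open_clip_torch` does not expose a standalone text-encoder module, but the text tower can be driven directly through `encode_text`. A minimal sketch, assuming the `laion2b_s32b_b79k` pretrained tag that corresponds to this repository:

```python
import torch
import open_clip

# Load the full CLIP model; "laion2b_s32b_b79k" is the OpenCLIP
# pretrained tag published for this repo.
model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-H-14", pretrained="laion2b_s32b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-H-14")
model.eval()

# Use only the text tower: tokenize, then encode.
tokens = tokenizer(["a photo of a cat", "a photo of a dog"])
with torch.no_grad():
    text_features = model.encode_text(tokens)
    text_features /= text_features.norm(dim=-1, keepdim=True)

print(text_features.shape)  # (2, 1024) for ViT-H-14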
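```

The image tower is never touched here, so for text-only workloads the visual weights could also be dropped from memory after loading.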
#8: What is the difference between `open_clip_pytorch_model.bin` and `pytorch_model.bin`? (1 reply, opened about 1 year ago by buaadwxl)
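For background on the two checkpoint files asked about above (a hedged note, not taken from the thread itself): `open_clip_pytorch_model.bin` carries the weights in OpenCLIP's native state-dict layout, while `pytorch_model.bin` holds the same weights converted to the Hugging Face Transformers `CLIPModel` layout. A sketch loading each file through the library it is meant for:

```python
import open_clip
from transformers import CLIPModel, CLIPProcessor

# OpenCLIP route: the hf-hub: prefix fetches open_clip_pytorch_model.bin.
oc_model, _, oc_preprocess = open_clip.create_model_and_transforms(
    "hf-hub:laion/CLIP-ViT-H-14-laion2B-s32B-b79K"
)

# Transformers route: fetches the HF-format pytorch_model.bin instead.
hf_model = CLIPModel.from_pretrained("laion/CLIP-ViT-H-14-laion2B-s32B-b79K")
hf_processor = CLIPProcessor.from_pretrained("laion/CLIP-ViT-H-14-laion2B-s32B-b79K")
```

Since both files are conversions of the same training run, either route should yield equivalent embeddings.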
#7: Request: DOI (2 replies, opened over 1 year ago by Ab0715)
#5: Make a Space, please (opened almost 2 years ago by micole66)
#3: Make the model load automatically without waiting (5 replies, opened almost 2 years ago by micole66)
#2: `model_max_length` might be missing from the `tokenizer_config.json` (2 replies, opened almost 2 years ago by fischcheng)
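If `model_max_length` is indeed absent from `tokenizer_config.json`, Transformers falls back to a very large sentinel value, so padding to `max_length` can misbehave. A sketch of the usual workaround, pinning the value to CLIP's 77-token context length after loading:

```python
from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("laion/CLIP-ViT-H-14-laion2B-s32B-b79K")

# When the config omits model_max_length, the tokenizer reports a huge
# sentinel; cap it at CLIP's fixed context length of 77 tokens.
if tokenizer.model_max_length > 77:
    tokenizer.model_max_length = 77

enc = tokenizer(["a photo of a cat"], padding="max_length", truncation=True)
print(len(enc["input_ids"][0]))  # 77
```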