| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 439 | 2 |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k | Image Classification | 0.6B | Jan 21 | 233 | 1 |
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 240 | 2 |
| timm/vit_large_patch14_clip_224.openai_ft_in1k | Image Classification | 0.3B | Jan 21 | 2.73k | 1 |
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 990 | 38 |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 21.6k | 2 |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 232 | |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 634k | 4 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 1.97k | 1 |
| timm/vit_base_patch16_clip_224.openai_ft_in1k | Image Classification | 0.1B | Jan 21 | 3.11k | 1 |
| timm/vit_base_patch16_clip_384.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 156 | 5 |
| timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 451 | |
| timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 1.07k | 4 |
| timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 128 | 1 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 2.66k | 2 |
| timm/vit_base_patch16_clip_224.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 2.16k | |
| timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 40 | 1 |
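The model ids above follow a consistent pattern: `vit_<size>_patch<N>_clip_<res>` names the architecture, and the tag after the dot records the CLIP pretraining corpus (`laion2b` or `openai`) followed by `_ft_` and the fine-tuning dataset(s) (`in12k`, `in1k`, or both). The sketch below parses that convention; the function name and the pattern itself are inferred from this listing, not an official timm specification:

```python
import re


def parse_timm_clip_name(name: str) -> dict:
    """Split a timm CLIP ViT model id into its parts.

    Assumed pattern (inferred from the listing above):
      [timm/]vit_<size>_patch<N>_clip_<res>.<pretrain>_ft_<dataset>[_<dataset>...]
    """
    # Drop the Hub organization prefix, if present.
    name = name.split("/")[-1]
    arch, _, tag = name.partition(".")
    m = re.match(r"vit_(?P<size>\w+?)_patch(?P<patch>\d+)_clip_(?P<res>\d+)$", arch)
    if m is None:
        raise ValueError(f"unrecognized architecture: {arch!r}")
    pretrain, _, finetune = tag.partition("_ft_")
    return {
        "size": m["size"],                 # base / large / huge
        "patch": int(m["patch"]),          # patch size in pixels
        "input_res": int(m["res"]),        # training/eval resolution
        "pretrain": pretrain,              # laion2b or openai
        "finetune": finetune.split("_") if finetune else [],
    }


info = parse_timm_clip_name("timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k")
# → {'size': 'base', 'patch': 16, 'input_res': 384,
#    'pretrain': 'laion2b', 'finetune': ['in12k', 'in1k']}
```

Reading the ids this way makes the listing easier to scan: for a given size and pretraining corpus, the variants differ only in patch size, input resolution, and whether ImageNet-12k was used as an intermediate fine-tuning stage before ImageNet-1k.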