Please use the modified model classes from the repo (see the usage sketch below):

  • VLT5Tokenizer
  • VLT5config
  • VLT5VQA
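
A minimal usage sketch follows. The import paths, the `t5-base` base vocabulary, the local checkpoint path, and `from_pretrained`-style loading are assumptions about how the accompanying repo exposes these classes, not details confirmed by this card.

```python
# Minimal sketch: wiring the three modified classes together.
# Import paths, "t5-base", and the checkpoint path below are assumptions.
from vlt5.tokenization import VLT5Tokenizer  # hypothetical module path
from vlt5.param import VLT5config            # hypothetical module path
from vlt5.vqa_model import VLT5VQA           # hypothetical module path

# Build the tokenizer and config, then load this checkpoint into the VQA model.
tokenizer = VLT5Tokenizer.from_pretrained("t5-base")   # assumed base vocab
config = VLT5config.from_pretrained("t5-base")         # assumed base config
model = VLT5VQA.from_pretrained("path/to/this/checkpoint", config=config)

# Tokenize a VQA-style prompt; visual features are passed however the repo's
# VLT5VQA forward/generate expects them (omitted here).
inputs = tokenizer("vqa: what is on the table?", return_tensors="pt")
```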