Missing configuration files for transformer support
I attempted to use this model directly with the Hugging Face transformers library, but encountered errors related to missing configuration files, such as preprocessor_config.json and tokenizer_config.json. These files appear to be essential for seamless integration with the library.
For example, the following code should ideally work without issues:
```python
# Load model directly
from transformers import AutoProcessor, AutoModelForVisualQuestionAnswering

processor = AutoProcessor.from_pretrained("microsoft/OmniParser")
model = AutoModelForVisualQuestionAnswering.from_pretrained("microsoft/OmniParser")
```
However, it fails due to the missing configuration files. This suggests that additional files, such as preprocessor_config.json, tokenizer_config.json, or other related configurations, need to be included in the repository.
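For reference, a minimal preprocessor_config.json for a transformers processor typically looks something like the sketch below. The field values here are purely illustrative placeholders, not OmniParser's actual configuration, which would need to come from the model authors:

```json
{
  "processor_class": "AutoProcessor",
  "image_processor_type": "BlipImageProcessor",
  "do_resize": true,
  "size": {"height": 384, "width": 384}
}
```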
If supporting this model through the transformers library is on your roadmap, adding these missing configuration files would make the model much easier to use and more accessible to the community.
Please let me know if I am overlooking something, or if there’s an alternative approach to make this work.
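In the meantime, one way to confirm exactly which files are absent is to fetch the repository's file listing (e.g. with huggingface_hub's `list_repo_files`) and compare it against the configs transformers expects. This is just a sketch; the required-file list below is my assumption about what AutoProcessor/AutoModel loading needs, and the exact set varies by model type:

```python
# Config files transformers generally expects when loading a model with
# AutoProcessor / AutoModel (assumption: the exact set is model-dependent).
REQUIRED_CONFIGS = [
    "config.json",
    "preprocessor_config.json",
    "tokenizer_config.json",
]

def missing_configs(repo_files):
    """Return the required config files absent from a repo file listing."""
    present = set(repo_files)
    return [f for f in REQUIRED_CONFIGS if f not in present]

# Example usage (requires network access and huggingface_hub installed):
# from huggingface_hub import list_repo_files
# files = list_repo_files("microsoft/OmniParser")
# print(missing_configs(files))
```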
I have the same error. How can we fix it?