Transformers does not recognize model type `exaone4` architecture
#3
by BrandNewGD - opened
I ran pip install git+https://github.com/lgai-exaone/transformers@add-exaone4, but I'm still getting a ValueError. (My transformers version is 4.54.0.dev0.)
ValueError: The checkpoint you are trying to load has model type `exaone4` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
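A common cause of this error is that pip reuses a previously installed or cached transformers build, so the branch install never actually takes effect. A minimal sketch of a clean reinstall and version check, assuming the branch URL from the post above (the `--no-cache-dir` flag just forces a fresh build):

```shell
# Remove any existing installation so the branch build is not shadowed
pip uninstall -y transformers

# Reinstall from the add-exaone4 branch, bypassing pip's cache
pip install --no-cache-dir git+https://github.com/lgai-exaone/transformers@add-exaone4

# Confirm the dev build is the one Python actually imports
python -c "import transformers; print(transformers.__version__)"
```

If the printed version is not the expected dev version, the import is likely resolving to a different environment (e.g. a conda env or user-site install) than the one pip wrote to.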
I hit a similar ValueError initially. To get it working with llama.cpp, I had to use the uploaded chat_template.jinja as the chat template, as well as the uploaded .txt and .json files as the tokenizer files.