AttributeError: 'DynamicCache' object has no attribute 'get_usable_length'. Did you mean: 'get_seq_length'?
sentence-transformers 5.1.0
setuptools 80.9.0
sympy 1.14.0
threadpoolctl 3.6.0
tokenizers 0.21.4
tomli 2.2.1
torch 2.8.0
torchvision 0.23.0+cu129
tqdm 4.67.1
transformers 4.55.2
past_key_values_length = past_key_values.get_usable_length(seq_length)
AttributeError: 'DynamicCache' object has no attribute 'get_usable_length'. Did you mean: 'get_seq_length'?
This was removed from transformers in this PR: https://github.com/huggingface/transformers/pull/39106. I think everything still works on transformers 4.46.0.
We've made a fix and reimplemented the removed function here: https://huggingface.co/it-just-works/stella_en_1.5B_v5_bf16/commit/03aedd040580357ec688f3467f1109af5e053249
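For reference, here is a minimal sketch of the kind of shim that restores the removed method (the linked commit may do it slightly differently): for a DynamicCache, which has no maximum length, the "usable" length is just the number of tokens already cached, which is exactly what get_seq_length() already returns.

```python
from transformers.cache_utils import DynamicCache

# Sketch only, not necessarily identical to the linked commit: re-add the
# removed method so unmodified custom modeling code keeps working. For a
# DynamicCache the usable length equals the cached sequence length.
if not hasattr(DynamicCache, "get_usable_length"):
    def _get_usable_length(self, new_seq_length, layer_idx=0):
        return self.get_seq_length(layer_idx)

    DynamicCache.get_usable_length = _get_usable_length
```

Running something like this once before the first forward pass lets the existing call to past_key_values.get_usable_length(seq_length) go through unchanged.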
This is not GitHub, otherwise I would have opened a PR back to this repo with the fix, but maybe @infgrad can apply the changes from the commit above.
You can also use this mirror which has the fix applied: https://huggingface.co/it-just-works/stella_en_1.5B_v5_bf16
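Loading the mirror should be a drop-in swap. A sketch, assuming you load it through sentence-transformers with trust_remote_code like the original repo:

```python
from sentence_transformers import SentenceTransformer

# The mirror ships the patched modeling code, so it should load on current
# transformers without the AttributeError. trust_remote_code is needed
# because the model uses custom modeling files.
model = SentenceTransformer("it-just-works/stella_en_1.5B_v5_bf16", trust_remote_code=True)
embeddings = model.encode(["The quick brown fox jumps over the lazy dog."])
print(embeddings.shape)
```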