not working with latest transformers versions
#1 by sebraun - opened
Hi, this seems to solve the `prepare_inputs_for_generation` issue, but I get another error:
```
/modeling_phi4mm.py", line 1232, in forward
    kv_seq_len += past_key_value.get_usable_length(kv_seq_len, self.layer_idx)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'DynamicCache' object has no attribute 'get_usable_length'. Did you mean: 'get_seq_length'?
```
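The hint in the `AttributeError` suggests the likely fix: recent transformers releases removed `DynamicCache.get_usable_length`, and for a `DynamicCache` (no max length or sliding window) the usable length is just the cached sequence length. A minimal sketch of the change in `modeling_phi4mm.py`, assuming no sliding-window cache is in play:

```python
# modeling_phi4mm.py, in forward() (the line from the traceback)

# old call, removed in recent transformers releases:
# kv_seq_len += past_key_value.get_usable_length(kv_seq_len, self.layer_idx)

# get_seq_length() is the surviving Cache API; for a DynamicCache it
# returns the same value get_usable_length() used to.
kv_seq_len += past_key_value.get_seq_length(self.layer_idx)
```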
For which package versions has this been tested? I am on:

- torch 2.9.0
- transformers 4.57.1
- peft 0.17.1
- accelerate 1.11.0
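If editing the remote code file isn't convenient, a compatibility shim applied before loading the model might also work. This is an untested sketch that aliases the removed method back onto `DynamicCache`:

```python
from transformers.cache_utils import DynamicCache

# Untested compatibility shim: recent transformers versions dropped
# get_usable_length(). For a DynamicCache the usable length equals the
# cached sequence length, so alias the old name to get_seq_length().
# Run this before the model is loaded so the remote code picks it up.
if not hasattr(DynamicCache, "get_usable_length"):
    def _get_usable_length(self, new_seq_length, layer_idx=0):
        return self.get_seq_length(layer_idx)

    DynamicCache.get_usable_length = _get_usable_length
```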