runtime error

Exit code: 1. Reason:

config.json: 100%|██████████| 29.0/29.0 [00:00<00:00, 143kB/s]
Using base Qwen tokenizer...
tokenizer_config.json: 100%|██████████| 7.30k/7.30k [00:00<00:00, 34.1MB/s]
vocab.json: 100%|██████████| 2.78M/2.78M [00:00<00:00, 12.0MB/s]
merges.txt: 100%|██████████| 1.67M/1.67M [00:00<00:00, 44.8MB/s]
tokenizer.json: 100%|██████████| 7.03M/7.03M [00:00<00:00, 23.9MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 51, in <module>
    tokenizer, model = load_qwen_n8n_model()
  File "/home/user/app/app.py", line 17, in load_qwen_n8n_model
    model = AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4260, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1100, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: npv2k1/Qwen2.5-7B-n8n does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
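The OSError above means transformers found a config.json in the npv2k1/Qwen2.5-7B-n8n repo but none of the checkpoint files it recognizes (this also happens when a repo holds only adapter weights rather than a full model). A minimal sketch of a pre-flight check follows; the filename list is taken from the error message itself, plus the index files used by sharded checkpoints, and `has_weight_files` is a hypothetical helper introduced here for illustration:

```python
# Checkpoint filenames transformers looks for, per the OSError message,
# plus the index files that mark a sharded checkpoint.
WEIGHT_NAMES = (
    "pytorch_model.bin",
    "model.safetensors",
    "tf_model.h5",
    "model.ckpt",
    "flax_model.msgpack",
    "pytorch_model.bin.index.json",
    "model.safetensors.index.json",
)

def has_weight_files(filenames):
    """Return True if any recognized checkpoint file is present."""
    return any(name in filenames for name in WEIGHT_NAMES)

# To check the actual repo, feed it the Hub file listing (requires the
# huggingface_hub package and network access):
#   from huggingface_hub import list_repo_files
#   has_weight_files(list_repo_files("npv2k1/Qwen2.5-7B-n8n"))

# A repo with only config/tokenizer files triggers the OSError:
print(has_weight_files(["config.json", "tokenizer.json"]))  # False
```

If the check comes back False, the fix is on the repo side (upload full model weights) or in app.py (point `from_pretrained` at a repo that actually contains them, e.g. load adapter weights with the appropriate loader instead of `AutoModelForCausalLM`).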
