runtime error
Exit code: 1. Reason:
config.json: 100%|██████████| 29.0/29.0 [00:00<00:00, 146kB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 4, in <module>
    generator = pipeline("text-generation", model="Markeaze/dashboard")
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 940, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/pipelines/base.py", line 303, in infer_framework_load_model
    raise ValueError(
ValueError: Could not load model Markeaze/dashboard with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,). See the original errors:

while loading with AutoModelForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/pipelines/base.py", line 290, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/modeling_utils.py", line 262, in _wrapper
    return func(*args, **kwargs)
  File "/home/user/.pyenv/versions/3.10.16/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3958, in from_pretrained
    raise EnvironmentError(
OSError: Markeaze/dashboard does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
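The OSError above means the Hub repository Markeaze/dashboard was reachable (its config.json downloaded fine) but contains none of the weight files the pipeline needs, so AutoModelForCausalLM cannot instantiate the model. A minimal sketch of how one might diagnose this before building the pipeline is shown below; list_repo_files and pipeline are existing huggingface_hub / transformers APIs, while the fallback model name "gpt2" is only a placeholder assumption, not part of the original app.

# Sketch: check which files the repo actually contains before constructing the
# text-generation pipeline. The fallback to "gpt2" is a placeholder assumption.
from huggingface_hub import list_repo_files
from transformers import pipeline

repo_id = "Markeaze/dashboard"
files = list_repo_files(repo_id)
print(files)  # in this failure, expected to show config.json but no weight files

# The same weight-file names that the OSError message lists.
weight_files = {"pytorch_model.bin", "model.safetensors", "tf_model.h5",
                "model.ckpt", "flax_model.msgpack"}

if weight_files & set(files):
    generator = pipeline("text-generation", model=repo_id)
else:
    # No weights in the repo, which is exactly what the traceback reports;
    # fall back to a repo known to ship weights (placeholder choice).
    generator = pipeline("text-generation", model="gpt2")

The real fix is on the Hub side: the Markeaze/dashboard repository needs its weight file (e.g. model.safetensors) uploaded alongside config.json, or the app should point at a repository that already contains one.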