runtime error

Exit code: 1. Reason:

  math                       = <module 'math' from '/usr/local/lib/python3.10/lib-dynloa…
  MODEL                      = 'meta-llama/Llama-3.1-8B-Instruct'
  MODEL_COMPLETION           = 'deepseek-ai/DeepSeek-R1-Distill-Qwen…
  OLLAMA_BASE_URL            = None
  OLLAMA_BASE_URL_COMPLETION = None
  OPENAI_BASE_URL            = None
  OPENAI_BASE_URL_COMPLETION = None
  random                     = <module 'random' from '/usr/local/lib/python3.10/random.py'>
  TOKEN_INDEX                = 1
  TOKENIZER_ID               = None
  TOKENIZER_ID_COMPLETION    = None
  VLLM_BASE_URL              = None
  VLLM_BASE_URL_COMPLETION   = None

Exception: Error loading InferenceEndpointsLLM: 404 Client Error: Not Found for url: https://api-inference.huggingface.co/status/meta-llama/Llama-3.1-8B-Instruct (Request ID: Root=1-681e796c-3ffd104b771935736fd00014;f7168571-2f00-4253-9baf-62186f51cb59)
