runtime error

Exit code: 1. Reason:

tokenizer_config.json: 100%|██████████| 26.0/26.0 [00:00<00:00, 276kB/s]
config.json: 100%|██████████| 762/762 [00:00<00:00, 3.60MB/s]
vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 32.0MB/s]
merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 23.6MB/s]
tokenizer.json: 100%|██████████| 1.36M/1.36M [00:00<00:00, 21.6MB/s]
model.safetensors: 100%|██████████| 353M/353M [00:01<00:00, 219MB/s]
generation_config.json: 100%|██████████| 124/124 [00:00<00:00, 792kB/s]

/home/user/app/app.py:101: UserWarning: You have not specified a value for the `type` parameter. Defaulting to the 'tuples' format for chatbot messages, but this is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style dictionaries with 'role' and 'content' keys.
  chatbot = gr.Chatbot(height=400, bubble_full_width=False)
/home/user/app/app.py:101: DeprecationWarning: The 'bubble_full_width' parameter is deprecated and will be removed in a future version. This parameter no longer has any effect.
  chatbot = gr.Chatbot(height=400, bubble_full_width=False)

Traceback (most recent call last):
  File "/home/user/app/app.py", line 127, in <module>
    demo.queue(concurrency_count=4).launch()
TypeError: Blocks.queue() got an unexpected keyword argument 'concurrency_count'
