Cannot find the model library that corresponds to `OpenOrca-Platypus2-13B-q4f16_1`
Hi!
I'm running Linux.
I installed mlc-chat using:
conda create -n mlc-chat-venv -c mlc-ai -c conda-forge mlc-chat-cli-nightly
conda activate mlc-chat-venv
I am able to run other models.
I have cloned the OpenOrca-Platypus2-13B-q4f16_1 model.
~/llm/mlc-chat$ mlc_chat_cli --model OpenOrca-Platypus2-13B-q4f16_1
Use MLC config: "/home/benjamin/llm/mlc-chat/dist/prebuilt/mlc-chat-OpenOrca-Platypus2-13B-q4f16_1/mlc-chat-config.json"
Use model weights: "/home/benjamin/llm/mlc-chat/dist/prebuilt/mlc-chat-OpenOrca-Platypus2-13B-q4f16_1/ndarray-cache.json"
[07:46:40] /usr/share/miniconda/envs/mlc-llm-build/conda-bld/mlc-chat-cli-nightly-package_1695791170594/work/cpp/cli_main.cc:378: Cannot find the model library that corresponds to OpenOrca-Platypus2-13B-q4f16_1.
We searched over the following possible paths:
- OpenOrca-Platypus2-13B-q4f16_1
- dist/prebuilt/lib
- dist/OpenOrca-Platypus2-13B-q4f16_1
- dist/prebuilt/OpenOrca-Platypus2-13B-q4f16_1
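The search the CLI reports above can be reproduced by hand. A minimal sketch, assuming the paths are relative to the directory mlc_chat_cli is run from and that the compiled library is a `.so` file containing the model name:

```shell
# Check each location mlc_chat_cli says it searched for the
# compiled model library (e.g. OpenOrca-Platypus2-13B-q4f16_1-vulkan.so).
MODEL=OpenOrca-Platypus2-13B-q4f16_1
for dir in "$MODEL" dist/prebuilt/lib "dist/$MODEL" "dist/prebuilt/$MODEL"; do
  # List any library file for this model in the candidate directory;
  # suppress the error when the directory or file does not exist.
  matches=$(ls "$dir"/*"$MODEL"*.so 2>/dev/null)
  if [ -n "$matches" ]; then
    echo "found library under $dir: $matches"
  fi
done
```

If the loop prints nothing, none of the searched directories contain a library for the model, which matches the error above.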
I could not find OpenOrca-Platypus2-13B-q4f16_1-vulkan.so in the dist/prebuilt/lib folder.
I also could not find it here: https://github.com/mlc-ai/binary-mlc-llm-libs
How can I get this model to work? Am I doing something wrong?
thanks,
- B.
Hello,
Thank you for the email. Please update the mlc-chat-config.json file and retry.
The team has updated this model (and the same applies to similar models) to use the Llama-2-13b-chat-hf-q4f16_1 library files. You should already have those in your lib directory.
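For reference, the field to change is `model_lib` in mlc-chat-config.json, which names the compiled library the runtime loads. A minimal sketch of the edit, assuming that field name and using the library name from the reply above (the path is hypothetical; adjust it to your own dist/prebuilt directory):

```python
import json

# Hypothetical path to the model's config file; adjust to your setup.
config_path = "dist/prebuilt/mlc-chat-OpenOrca-Platypus2-13B-q4f16_1/mlc-chat-config.json"

# Example of the relevant field before the fix (other fields omitted):
config = {"model_lib": "OpenOrca-Platypus2-13B-q4f16_1"}

# Point the model at the Llama-2 13B library already shipped in
# dist/prebuilt/lib, so the CLI reuses its .so file:
config["model_lib"] = "Llama-2-13b-chat-hf-q4f16_1"

print(json.dumps(config, indent=2))
```

In a real setup you would load the existing file with `json.load`, change only `model_lib`, and write it back rather than rebuilding the dict from scratch.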
Thanks
David
Thank you! I can confirm that the updated .json file fixes my installation :)