vLLM Deployment Issues

#4
by ArtusDev - opened

Hi!
It seems like vLLM deployment for Apertus is not working properly. A basic deployment setup (nightly vLLM and latest Transformers), with and without the experimental xielu implementation, fails during weight loading:

(VllmWorker TP1 pid=546) ERROR 09-02 04:29:26 [multiproc_executor.py:574]   File "/usr/local/lib/python3.12/dist-packages/vllm/model_executor/models/apertus.py", line 466, in load_weights
(VllmWorker TP1 pid=546) ERROR 09-02 04:29:26 [multiproc_executor.py:574]     param = params_dict[name]
(VllmWorker TP1 pid=546) ERROR 09-02 04:29:26 [multiproc_executor.py:574]             ~~~~~~~~~~~^^^^^^
(VllmWorker TP1 pid=546) ERROR 09-02 04:29:26 [multiproc_executor.py:574] KeyError: 'layers.0.mlp.act_fn.beta'
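For reference, here is a minimal sketch of the kind of setup that triggers this; the repo id and tensor-parallel size are assumptions (the `TP1` worker in the log implies tensor parallelism of at least 2):

```python
from vllm import LLM, SamplingParams

# Minimal repro sketch. Repo id and tensor_parallel_size are assumptions,
# not confirmed from the thread.
llm = LLM(
    model="swiss-ai/Apertus-8B-Instruct-2509",
    tensor_parallel_size=2,
)

# On affected vLLM builds this never gets past model load; it fails in
# load_weights with KeyError: 'layers.0.mlp.act_fn.beta'.
outputs = llm.generate(["Hello!"], SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```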

Is this a known issue, or are there specific requirements for setting up a vLLM deployment right now? I couldn't find any deployment guidelines for Apertus.

I also have issues with vLLM deployment via GPUStack:

Failed to initialize vllm candidates selector: The checkpoint you are trying to load has model type apertus but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. You can update Transformers with the command pip install --upgrade transformers. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command pip install git+https://github.com/huggingface/transformers.git.

pip show transformers
Version: 4.55.3

Edit: Only tried 8B-Instruct.
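A quick way to check whether an installed Transformers build recognizes the architecture is to load just the config; the repo id below is an assumption. On a build without Apertus support (such as the 4.55.3 shown above), this raises the same "model type apertus" error:

```python
from transformers import AutoConfig

# Repo id assumed; substitute the checkpoint you are actually deploying.
config = AutoConfig.from_pretrained("swiss-ai/Apertus-8B-Instruct-2509")
print(config.model_type)  # prints "apertus" on a build that supports it
```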

Hello, we are sorry for the inconvenience. There is an issue with vLLM (#24100) when loading the weights.

In the meantime, you can use SGLang or Transformers; they should work fine.
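For anyone who needs a stopgap, here is a minimal sketch of the Transformers fallback. The repo id is an assumption, and it needs a Transformers release new enough to include Apertus support (plus accelerate for `device_map="auto"`):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/Apertus-8B-Instruct-2509"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the checkpoint's native dtype
    device_map="auto",   # requires the accelerate package
)

messages = [{"role": "user", "content": "Hello! Who are you?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```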

Thanks!
I'll be waiting for the fix to get merged.

Swiss AI Initiative org

It was merged into vLLM main. Hopefully all is OK now.

mjaggi changed discussion status to closed
