How to define/configure a LoRA Adapter?

#1
by josmith9873 - opened

I came here from the vLLM LoRA Adapter documentation. I have a LoRA safetensors file trained with the mistral-finetune repository here, which I would like to use with a model served by vLLM. However, I can't find any documentation on how a LoRA adapter should be defined or configured. Passing the filepath of the safetensors file produced by that repository results in an error saying no adapter_config.json file was found. Are there any resources on how to provide the necessary metadata so this file can be used as a LoRA adapter on my model?
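
For context, this is roughly what I'm running (the base model name, adapter path, and prompt below are placeholders, and I'm assuming vLLM wants an adapter directory rather than the bare safetensors file):

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Base model the LoRA adapter was fine-tuned from (placeholder name).
llm = LLM(model="mistralai/Mistral-7B-v0.3", enable_lora=True)

sampling_params = SamplingParams(temperature=0, max_tokens=256)

# Pointing this at the raw safetensors file output by mistral-finetune is what
# triggers the "no adapter_config.json" error; vLLM seems to expect a directory
# containing adapter_config.json alongside the adapter weights.
lora_path = "/path/to/my/lora_adapter"

outputs = llm.generate(
    ["My test prompt"],
    sampling_params,
    lora_request=LoRARequest("my_adapter", 1, lora_path),
)
print(outputs[0].outputs[0].text)
```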
