---
license: apache-2.0
datasets:
- jeremyc/Alpaca-Lora-GPT4-Swedish
language:
- sv
---

Mixtral-8x7B-Instruct-v0.1 with merged LoRA adapters, fine-tuned on Swedish instruction data.

You will likely need the tokenizer and `tokenizer_config.json` from the original model for this checkpoint to load properly.
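A minimal loading sketch with `transformers`, assuming the base model's tokenizer is pulled separately as the card suggests. The repo id passed to `load` is a placeholder for wherever this merged checkpoint is hosted.

```python
# Base model id for the tokenizer; the merged checkpoint repo id is
# a hypothetical placeholder you must replace with the actual repo.
BASE_MODEL_ID = "mistralai/Mixtral-8x7B-Instruct-v0.1"

def load(merged_repo_id: str):
    """Load the merged checkpoint, taking the tokenizer from the base model."""
    # Imports kept local so defining this function does not require
    # transformers to be installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Tokenizer files come from the original Mixtral repo, since this
    # checkpoint may not ship its own tokenizer / tokenizer_config.json.
    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        merged_repo_id,
        device_map="auto",  # shard across available GPUs
    )
    return tokenizer, model
```

Because the LoRA weights are already merged, no `peft` adapter loading step is needed; the checkpoint loads like any plain causal LM.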