This project is sponsored by PrimeLine

Model Card

This model is a fine-tuned version for German instructions and conversations in the style of Alpaca, using the prompt markers "### User:" and "### Assistant:". The training dataset is deduplicated and cleaned and contains no code. The focus is on instruction following and conversational tasks.
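As a minimal sketch of how a prompt could be assembled from those markers (the exact whitespace/newline conventions between turns are an assumption, since the card does not specify them):

```python
# Hypothetical helper: builds an Alpaca-style conversation prompt using the
# "### User:" / "### Assistant:" markers described in this model card.
# The newline separators are an assumption, not confirmed by the card.
def build_prompt(turns):
    """turns: list of (role, text) pairs, role being "User" or "Assistant"."""
    parts = [f"### {role}: {text}" for role, text in turns]
    # End with an open assistant marker so the model generates the reply.
    return "\n".join(parts) + "\n### Assistant:"

prompt = build_prompt([("User", "Wie geht es dir?")])
print(prompt)
```

The resulting string would then be passed to the tokenizer/generation pipeline of your choice.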

The model architecture is based on CodeLlama with 34B parameters, and training was carried out on hardware powered entirely by renewable energy.

This work was contributed through the private research of flozi00.

Join the discussion about German LLM research and plan larger training runs together: https://join.slack.com/t/slack-dtc7771/shared_invite/zt-219keplqu-hLwjm0xcFAOX7enERfBz0Q

Model size: 33.7B params · Safetensors · FP16
