LLäMmlein 7B

This is a German 7B LLaMA language model trained from scratch on the German portion of RedPajama V2, using our adapted TinyLlama codebase. Find more details on our page and in our preprint!

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model weights and the matching tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_7B")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_7B")
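
As a minimal sketch of text generation with the loaded model, using the standard transformers generate API (the German prompt and sampling settings below are illustrative assumptions, not part of the model card):

# Tokenize an illustrative German prompt and generate a continuation
inputs = tokenizer("Die Würzburger Residenz ist", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))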