---
library_name: transformers
license: llama3.1
datasets:
  - ctu-aic/cs_instruction_tuning_collection
  - ctu-aic/en_instruction_tuning_collection
language:
  - cs
  - en
  - de
  - fr
  - it
  - pt
  - hi
  - es
  - th
base_model:
  - meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: text-generation
tags:
  - Unsloth
  - model adaptation
  - NLI
---

# Model Card for Llama 3.1 8B Instruct -> IT_(cs+en)

Llama 3.1 8B Instruct instruction-tuned on a mixture of the cs_instruction_tuning_collection and en_instruction_tuning_collection datasets. More information in the thesis: TBA. (The notation in the thesis is: B+IT -> IT_(cs+en))
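
Below is a minimal usage sketch with the Transformers library (standard `AutoTokenizer`/`AutoModelForCausalLM` chat-template API). The repository id `MODEL_ID` is a hypothetical placeholder, not confirmed by this card; replace it with this model's actual Hub id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical placeholder -- substitute the actual repository id of this model.
MODEL_ID = "ctu-aic/llama-3.1-8b-instruct-it-cs-en"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# The model is tuned on Czech and English instructions, so either language works.
# Czech prompt: "Briefly explain what machine learning is."
messages = [{"role": "user", "content": "Vysvětli stručně, co je strojové učení."}]

# Build the chat prompt via the tokenizer's chat template and generate a reply.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```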

## 🛑 Ethical Considerations and Limitations

This model is a Czech-adapted version of Meta's Llama 3.1 8B Instruct, developed as part of a master's thesis. It is intended solely for academic and research purposes.

  • โš ๏ธ Not Intended for Production Use: This model has not undergone extensive safety testing, fine-tuning for alignment, or robust filtering of harmful outputs. Do not deploy this model in any application or setting that impacts users or the public.
  • โ— Potential for Harm: The model may generate biased, offensive, false, or otherwise harmful content. It does not include safeguards such as moderation layers or toxicity detection.
  • ๐Ÿงช Experimental Nature: This model is an academic experiment accompanying a thesis project and may contain unintended behaviors or limitations due to limited training data, resources, or evaluation.
  • ๐Ÿ‘ค Responsibility: Any use of this model is at the userโ€™s own risk. The author does not assume responsibility for any consequences arising from the use of the model.
  • ๐Ÿ”’ Respect for Original License: This adaptation is subject to the original terms and conditions set by Meta for LLaMA models.

Researchers and practitioners using this model must ensure appropriate ethical oversight and conduct rigorous evaluations before any further deployment or fine-tuning.

## Citation

TBA