andreeatomescu committed on
Commit 5395a9a · verified · 1 Parent(s): bb7f31e

Update README.md

Files changed (1):
  1. README.md +71 -11
README.md CHANGED
@@ -1,15 +1,75 @@
  ---
- datasets:
- - klusai/ds-tf1-en-3m
- language:
- - ro
- - en
- metrics:
- - bleu
- base_model:
- - google/gemma-3-1b-it
- pipeline_tag: text-generation
  tags:
  - translation
  - fables
- ---
  ---
+ license: apache-2.0
  tags:
  - translation
+ - en-ro
+ - literary
  - fables
+ - low-resource
+ - lora
+ - gemma
+ - tinyfabulist
+ model-index:
+ - name: tf2-1b (Gemma 3 1B, EN→RO fable translator)
+   results:
+   - task:
+       type: translation
+       name: English → Romanian
+     dataset:
+       name: TinyFabulist-TF2 (15 k EN–RO fables)
+       type: klusai/tf2-en-ro-15k
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.7
+       verified: false
+     - name: LLM-Eval (5-dim average)
+       type: custom
+       value: 4.62 / 5
+       verified: false
+ language:
+ - en
+ - ro
+ ---
+
+ # 🌱 TinyFabulist-TF2-1B · Gemma 3 1B EN→RO Fable Translator
+
+ **`tf2-1b`** is a parameter-efficiently fine-tuned checkpoint (LoRA adapters merged) of **Google Gemma 3 1B** that specialises in translating *moral fables* from **English** into **Romanian**.
+ It was produced during the TinyFabulist-TF2 project and is intended as a lightweight, cost-efficient alternative to GPT-class APIs for literary translation in low-resource settings.
+
+ ---
+
+ ## 📰 Model Summary
+
+ | | |
+ |---|---|
+ | **Base model** | [`google/gemma-3-1b-it`](https://huggingface.co/google/gemma-3-1b-it) |
+ | **Architecture** | Decoder-only Transformer, 1 B params |
+ | **Fine-tuning method** | Supervised fine-tuning (full-sequence), then instruction tuning, then LoRA adapters (rank = 16)<br>Adapters merged for this release; a configuration sketch follows the table |
+ | **Training data** | 12 000 EN–RO fable pairs (train split, TinyFabulist-TF2) |
+ | **Validation** | 1 500 pairs |
+ | **Eval set** | 1 500 pairs (held-out) |
+ | **Objective** | Next-token cross-entropy on Romanian targets |
+ | **Hardware / budget** | 1× A6000 GPU (48 GB) · ~4 h · ≈ \$32 |
+ | **Intended use** | Off-line literary translation of short stories / fables |
+ | **Out-of-scope** | News, legal, medical, or very long documents; languages other than EN ↔ RO |
+
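+ The adapter setup in the table can be reproduced in outline with `peft`. Below is a minimal sketch: the rank matches the table, while the target modules, alpha, dropout, and save path are illustrative assumptions rather than the exact training recipe.
+
+ ```python
+ # Minimal LoRA sketch; rank 16 comes from the card, the rest is assumed.
+ from transformers import AutoModelForCausalLM
+ from peft import LoraConfig, get_peft_model
+
+ base = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-it")
+
+ lora_cfg = LoraConfig(
+     r=16,                                 # rank = 16, as in the table
+     lora_alpha=32,                        # assumed, not stated in the card
+     lora_dropout=0.05,                    # assumed
+     target_modules=["q_proj", "v_proj"],  # assumed attention projections
+     task_type="CAUSAL_LM",
+ )
+ model = get_peft_model(base, lora_cfg)
+ model.print_trainable_parameters()
+
+ # ... supervised fine-tuning on the 12 000 EN–RO pairs happens here ...
+
+ # Merge the adapters into the base weights, as in this release.
+ merged = model.merge_and_unload()
+ merged.save_pretrained("tf2-1b-merged")   # hypothetical output path
+ ```
+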
+ ---
+
+ ## ✨ How It Works
+
+ This model translates short English fables or moral stories into fluent, natural Romanian, capturing not just the literal meaning but also the narrative style and ethical lesson. Provide a short story in English and the model generates a Romanian version that preserves the storytelling tone and clarity, making it suitable for children’s literature, educational content, or creative writing. Designed to be lightweight, it runs well on modest hardware and is intended as a free, accessible alternative to large proprietary translation services, for teachers, students, and researchers who need high-quality literary translations in low-resource or offline settings. A minimal usage example follows.
+
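+ The sketch below uses 🤗 Transformers. The repo id `klusai/tf2-1b` and the plain instruction-style prompt are assumptions; substitute the published checkpoint name and the prompt format used during fine-tuning.
+
+ ```python
+ # Minimal inference sketch (assumed repo id and prompt format).
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ repo = "klusai/tf2-1b"  # assumed; replace with the published repo id
+ tok = AutoTokenizer.from_pretrained(repo)
+ model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
+
+ fable = "A thirsty crow found a pitcher holding only a little water..."
+ messages = [
+     {"role": "user",
+      "content": f"Translate the following fable into Romanian:\n\n{fable}"},
+ ]
+ inputs = tok.apply_chat_template(
+     messages, add_generation_prompt=True, return_tensors="pt"
+ ).to(model.device)
+
+ out = model.generate(inputs, max_new_tokens=512, do_sample=False)
+ print(tok.decode(out[0, inputs.shape[-1]:], skip_special_tokens=True))
+ ```
+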
+ ## 🚧 Limitations & Biases
+
+ - Trained exclusively on synthetic data, so it may reproduce GPT-style phrasing.
+ - Domain-specific: excels on short, moralistic narratives; underperforms on technical or colloquial prose.
+ - No guard-rails: users must filter harmful content downstream.
+ - Context window = 2 048 tokens (≈ 1 500 Romanian words); see the length-check sketch below.
+
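+ Given the 2 048-token window, it is worth checking input length before translating. A small sketch, assuming the tokenizer of the (hypothetical) `klusai/tf2-1b` checkpoint and an arbitrary head-room for the generated Romanian text:
+
+ ```python
+ # Sketch: reject fables that would overflow the 2 048-token context.
+ from transformers import AutoTokenizer
+
+ tok = AutoTokenizer.from_pretrained("klusai/tf2-1b")  # assumed repo id
+
+ MAX_CONTEXT = 2048
+ RESERVED_FOR_OUTPUT = 768  # assumed head-room for the translation
+
+ def fits_in_context(text: str) -> bool:
+     """True if prompt plus expected output fit in the model's window."""
+     n_tokens = len(tok(text).input_ids)
+     return n_tokens <= MAX_CONTEXT - RESERVED_FOR_OUTPUT
+
+ print(fits_in_context("A fox saw a crow with a piece of cheese..."))  # True
+ ```
+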
+ ## ✅ License
+
+ Released under Apache 2.0.
+ The dataset (TinyFabulist-TF2 EN–RO 15 k) is CC-BY-4.0.
+
+ Questions or feedback? Open an issue or DM @klusai. Happy translating! 🚀