This model is a LoRA fine-tuned version of meta-llama/Llama-3.2-3B-Instruct specifically trained for end-to-end aspect-based sentiment analysis on Turkish e-commerce product reviews.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
# Load base model (device_map="auto" places it on the available GPU)
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3B-Instruct",
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
    device_map="auto"
)
# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B-Instruct")
# Load LoRA adapter
peft_model = PeftModel.from_pretrained(base_model, "opdullah/Llama-3.2-3B-tr-ABSA")
# Example review ("I liked this phone's rear camera, but its battery is inadequate.")
review = "Bu telefonun arka kamerasını beğendim ama bataryası yetersiz."
# Build the chat prompt and move the input ids to the model's device
messages = [{"role": "user", "content": review}]
inp = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
input_ids = tokenizer(inp, return_tensors="pt")["input_ids"].to(peft_model.device)
# Generate and keep only the assistant's completion
outputs = peft_model.generate(input_ids, max_new_tokens=1024)
result = tokenizer.decode(outputs[0]).split("<|start_header_id|>assistant<|end_header_id|>")[-1]
result = result.replace("<|eot_id|>", "").strip()
print(result)
Expected Output:
[{"term": "arka kamerasını", "polarity": "positive"}, {"term": "bataryası", "polarity": "negative"}]
The model outputs a JSON array of aspect/polarity objects with the following structure:
[
  {
    "term": "aspect_term_in_turkish",
    "polarity": "positive|negative|neutral"
  }
]
Example outputs:
[{"term": "arka kamerasını", "polarity": "positive"}, {"term": "bataryası", "polarity": "negative"}]
[{"term": "fiyatı", "polarity": "positive"}, {"term": "kalitesi", "polarity": "negative"}]
[{"term": "teslimat hızı", "polarity": "positive"}, {"term": "ambalaj", "polarity": "positive"}]
Requirements:
torch>=2.0.0
transformers>=4.36.0
peft>=0.7.0
accelerate>=0.25.0
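With these packages installed, the LoRA adapter can optionally be merged into the base weights so the model can be saved and served as a standalone checkpoint. The lines below are a minimal sketch using peft's merge_and_unload(), assuming the peft_model and tokenizer objects from the usage snippet above; the output directory name is illustrative.

# Fold the LoRA weights into the base model (returns a plain transformers model)
merged_model = peft_model.merge_and_unload()

# Save the merged checkpoint next to its tokenizer (directory name is illustrative)
merged_model.save_pretrained("llama-3.2-3b-tr-absa-merged")
tokenizer.save_pretrained("llama-3.2-3b-tr-absa-merged")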
This model is designed for end-to-end aspect-based sentiment analysis of Turkish e-commerce product reviews: extracting the aspect terms mentioned in a review and assigning each a positive, negative, or neutral polarity.
If you use this model, please cite:
@misc{llama-turkish-absa,
  title={Llama-3.2-3B Turkish ABSA},
  author={Abdullah Koçak},
  year={2025},
  url={https://huggingface.co/opdullah/Llama-3.2-3B-tr-ABSA}
}

@misc{llama3.2,
  title={Llama 3.2: Revolutionizing edge AI and vision with open, customizable models},
  author={Meta},
  year={2024},
  publisher={Meta AI},
  url={https://ai.meta.com/blog/llama-3-2-connect-2024-vision-edge-mobile-devices/}
}
Base model: meta-llama/Llama-3.2-3B-Instruct