# anjali-mudgal/prompt_guardrail_bert-LoRA

Tags: Text Classification · PEFT · Safetensors · English

## Usage with PEFT

This model is a LoRA adapter fine-tuned on top of `google-bert/bert-base-uncased` for sequence classification. To use it, load the base model and attach the adapter:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

# Load the base model and tokenizer
base_model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-uncased", num_labels=2
)
tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")

# Attach the LoRA adapter weights
model = PeftModel.from_pretrained(base_model, "anjali-mudgal/prompt_guardrail_bert-LoRA")
model.eval()
```

For reference, the adapter was trained with the following LoRA configuration. It is stored with the adapter, so you do not need to pass it when loading:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.15,
    bias="all",
    task_type="SEQ_CLS",
    target_modules=["query", "key", "value", "output.dense"],
)
```
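After a forward pass, the classifier returns two raw logits per input, which a softmax converts into a confidence score for the predicted class. A minimal post-processing sketch in plain Python (the `id2label` mapping below is a placeholder; the card does not state the class names):

```python
import math

def classify_from_logits(logits, id2label={0: "LABEL_0", 1: "LABEL_1"}):
    """Apply a numerically stable softmax to raw logits and
    return (predicted_label, confidence)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    pred = max(range(len(probs)), key=probs.__getitem__)
    return id2label[pred], probs[pred]

# Example: a strongly positive logit for class 1
label, confidence = classify_from_logits([-2.0, 3.0])
```

In practice you would obtain the logits from `model(**tokenizer(text, return_tensors="pt")).logits` and feed them to a helper like this (or use `torch.softmax` directly).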

### Framework versions

- PEFT 0.14.0